quote: Original post by Timkin
For example... let's say you wanted to define a fluid surface, like a pond. A mesh is a good way of doing that. Now, if an object is dropped into the pond, the current state of the mesh needs to be known by the CPU so that it can be updated with the effect of a deformation wave. At each point of the simulation the CPU needs to know the state of the mesh and the graphics card needs to know the state of the mesh. Obviously some information is being thrown back and forth between the CPU and GC for this purpose.
In this particular example I see no problem with having the graphics card perform the physics simulation.
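To make the pond example concrete, here's a minimal sketch of the CPU side of such a simulation. All names are illustrative (a `Heightfield` type I've made up, not any real engine's API): the CPU owns the wave state, steps it each frame, and would then re-upload the vertices to the graphics card.

```cpp
// Hypothetical CPU-side heightfield wave step for a pond mesh.
// Each frame the CPU updates the simulation state, then the resulting
// vertex data would be re-uploaded to the graphics card for rendering.
#include <cassert>
#include <cstddef>
#include <vector>

struct Heightfield {
    int w, h;
    std::vector<float> curr, prev; // two time steps of the wave equation

    Heightfield(int w_, int h_)
        : w(w_), h(h_),
          curr(static_cast<std::size_t>(w_) * h_, 0.0f),
          prev(static_cast<std::size_t>(w_) * h_, 0.0f) {}

    float& at(std::vector<float>& b, int x, int y) { return b[y * w + x]; }

    // A dropped object disturbs the surface at (x, y).
    void splash(int x, int y, float amount) { at(curr, x, y) += amount; }

    // One explicit wave-equation step over the interior cells; the
    // damping factor keeps ripples from ringing forever.
    void step(float damping = 0.99f) {
        std::vector<float> next(curr.size(), 0.0f);
        for (int y = 1; y < h - 1; ++y)
            for (int x = 1; x < w - 1; ++x) {
                float n = (at(curr, x - 1, y) + at(curr, x + 1, y) +
                           at(curr, x, y - 1) + at(curr, x, y + 1)) * 0.5f
                          - at(prev, x, y);
                next[y * w + x] = n * damping;
            }
        prev = curr;
        curr = next;
        // ...at this point the CPU would copy `curr` into the vertex
        // buffer and send it across to the graphics card.
    }
};
```

The readback problem in the quote is exactly the last comment: if the graphics card ran this step instead, the CPU would need the resulting heights back whenever gameplay depends on them.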
I had been thinking along different lines, with different examples. A player fires a rocket. The triangles of the rocket are somehow tagged as a specific entity, and that entity is then assigned a velocity. In subsequent frames, the rocket is tested against the rest of the world geometry for collisions by the client's graphics card. But wait: this needs to be synchronized with the server. The rocket also needs to be destroyed at the appropriate time, and an explosion needs to be created.
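Here's a toy sketch of why that rocket can't live purely on the graphics card. Everything in it is made up for illustration (a flat ground plane standing in for world geometry): the point is that each tick produces a gameplay result (hit or no hit) that the CPU, and ultimately the server, has to see.

```cpp
// Hypothetical rocket tick: integrate position, test for a collision,
// and report any explosion so it can be synchronized with the server.
// The "world geometry" here is just a ground plane at y = 0.
#include <cassert>

struct Vec3 { float x, y, z; };

struct Rocket {
    Vec3 pos, vel;
    bool exploded = false;
};

// Did the rocket's path this tick cross the ground plane?
bool hitsGround(const Vec3& from, const Vec3& to) {
    return from.y > 0.0f && to.y <= 0.0f;
}

// One simulation tick. Returns true when an explosion event must be
// reported to (and confirmed by) the server.
bool tick(Rocket& r, float dt) {
    if (r.exploded) return false;
    Vec3 next = { r.pos.x + r.vel.x * dt,
                  r.pos.y + r.vel.y * dt,
                  r.pos.z + r.vel.z * dt };
    if (hitsGround(r.pos, next)) {
        r.exploded = true;  // spawn the explosion, notify the server here
        return true;
    }
    r.pos = next;
    return false;
}
```

If the collision test ran on the graphics card, the boolean result would still have to travel back to the CPU every frame, which is the synchronization cost in question.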
quote: Original post by Timkin
As I said, I'm not well versed in how GCs implement their hardware acceleration, so if this is not possible, then tell me and I'll throw my idea in the 'stupid' basket.
I doubt I'm any more knowledgeable in this than you are. The way I see it, though, most game physics affect the player relatively directly, and in ways that affect gameplay and not just graphics. Some things like parametric mesh deformation and particle systems could be separated from the CPU, but not collision detection and whether-or-not-the-rocket-hit-you.
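To sketch that distinction (names again illustrative, not any real API): a cosmetic particle update produces nothing the game logic ever reads back, which is what would make it a candidate for offloading.

```cpp
// Hypothetical fire-and-forget particle update. Gameplay never inspects
// the result, so in principle the CPU could hand this entire loop to the
// graphics card and forget about it.
#include <cassert>
#include <vector>

struct Particle {
    float x, y, z;    // position
    float vx, vy, vz; // velocity
    float life;       // seconds remaining
};

void updateParticles(std::vector<Particle>& ps, float dt, float gravity) {
    for (Particle& p : ps) {
        p.vy -= gravity * dt;
        p.x += p.vx * dt;
        p.y += p.vy * dt;
        p.z += p.vz * dt;
        p.life -= dt;
    }
    // Contrast with collision detection: there the answer (did the
    // rocket hit you?) must come back to the CPU every frame.
}
```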