Accelerated AI boards.

Started by GBGames
42 comments, last by GBGames
I remember reading about them somewhere, but I can't find any information on Google. Or maybe I was using the wrong search terms. Does anyone have any info on them? How would they be implemented so that they are general enough to run any AI algorithm, yet optimized to run quickly?
Edited by - GBGames on January 17, 2002 4:52:43 PM
-------------------------
GBGames' Blog: An Indie Game Developer's Somewhat Interesting Thoughts
Staff Reviewer for Game Tunnel
Are we talking about hardware here or what? Sorta like a graphics card for AI? What is this?

_________________________________________________________________

Drew Sikora
A.K.A. Gaiiden

ICQ #: 70449988
AOLIM: DarkPylat

Blade Edge Software
Staff Member, GDNet
Public Relations, Game Institute

3-time Contributing author, Game Design Methods, Charles River Media (coming GDC 2002)
Online column - Design Corner at Pixelate

NJ IGDA Chapter - NJ developers unite!! [Chapter Home | Chapter Forum]

Yes, exactly. I remember reading about how that would be an important step towards better games, among other things.
I just realized how unclear my post was.
It is like a graphics card for AI. Games could use one to take load off the processor, freeing it for other things, which might lead to innovations in gaming. I remember reading about Star Wars: Shadows of the Empire making use of the N64's graphics processor; the programmers said that because so much load was kept off the CPU, they could create environments without repetitive textures. Every room could have different wallpaper, and nothing looked the way it did on previous systems, where they were limited by the number of sprites they could use. One of the benefits of offloading graphics to what is basically a dedicated card is that the CPU is also freed up to do better AI.

I read somewhere about accelerated AI cards or at least the concept of them. Imagine what could be implemented when the AI is not using up much CPU time.
So basically, does anyone know anything about this?

Edited by - GBGames on January 17, 2002 9:33:30 PM
-------------------------
GBGames' Blog: An Indie Game Developer's Somewhat Interesting Thoughts
Staff Reviewer for Game Tunnel
I know of many discussions about this, both here, at Gamasutra (now at www.igda.com/forums) and elsewhere on the net where AI is discussed. The main problem it all boils down to, and the reason such hardware acceleration for AI has not been implemented (en masse; there have been neural network boards produced on an individual basis at universities and research institutes before), is that it's difficult to define an AI 'polygon' that can be thrown at the accelerator for speed improvements.

AI is such a wide field (I'm not saying graphics isn't, but there are certain invariants between systems) that finding a single way to solve the problem of 'great AI' for every type of game, in a universal way suitable for acceleration, has not been achieved. I'm sure that if one universal problem were language parsing, and the optimal solution were Chomsky-style grammar deconstruction, then you could produce a board programmed with sections of syntax and a dictionary, able to hand you back deconstructed sentences. If that process could be made faster by placing it on a separate board (there are CPU overheads to throwing information at specific hardware) to such an extent that it improved gameplay cheaply and efficiently, then it would be done.

The only generally used form of AI in every game is pathfinding, and that can be done in many ways, with optimisations reducing the overhead. However, if an AI card were put into production, this would be one of the first pieces of functionality to be included (as has been said before).
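For a concrete sense of what that first piece of functionality might look like, here is a minimal sketch of grid-based A* pathfinding in C++. The grid layout, function name and unit-cost model are illustrative assumptions rather than anything from an actual hardware proposal; the point is only that the inner loop is a small, regular kernel of the kind that could plausibly be pushed onto dedicated silicon.

// Minimal A* over a 4-connected grid -- the kind of fixed-function
// pathfinding kernel an "AI board" might expose. Illustrative sketch only.
#include <cstdlib>
#include <queue>
#include <vector>

struct Node { int x, y; float g, f; };          // g = cost so far, f = g + heuristic

struct HigherCost {
    bool operator()(const Node& a, const Node& b) const { return a.f > b.f; }
};

// Returns the length of the shortest path from (sx,sy) to (tx,ty),
// or -1 if no path exists. 'blocked' is row-major, width*height cells.
float FindPathCost(const std::vector<char>& blocked, int width, int height,
                   int sx, int sy, int tx, int ty)
{
    std::priority_queue<Node, std::vector<Node>, HigherCost> open;
    std::vector<float> best(width * height, 1e30f);  // best known cost per cell

    auto heuristic = [&](int x, int y) {             // Manhattan distance (admissible)
        return float(std::abs(x - tx) + std::abs(y - ty));
    };

    open.push({ sx, sy, 0.0f, heuristic(sx, sy) });
    best[sy * width + sx] = 0.0f;

    const int dx[4] = { 1, -1, 0, 0 };
    const int dy[4] = { 0, 0, 1, -1 };

    while (!open.empty()) {
        Node n = open.top(); open.pop();
        if (n.x == tx && n.y == ty) return n.g;      // goal reached
        if (n.g > best[n.y * width + n.x]) continue; // stale queue entry

        for (int i = 0; i < 4; ++i) {
            int nx = n.x + dx[i], ny = n.y + dy[i];
            if (nx < 0 || ny < 0 || nx >= width || ny >= height) continue;
            if (blocked[ny * width + nx]) continue;  // wall
            float g = n.g + 1.0f;                    // unit step cost
            if (g < best[ny * width + nx]) {
                best[ny * width + nx] = g;
                open.push({ nx, ny, g, g + heuristic(nx, ny) });
            }
        }
    }
    return -1.0f;                                    // unreachable
}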

What functionality would you put on an AI board?

Mike
As I've always said, a processor is all the AI acceleration I need!

Power and flexibility.

Graphics are going more and more towards dedicated cards... Sound too. Maybe give the physics its own integrator chip, then AI developers will be free to do what they please
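As a rough illustration of what such an "integrator chip" would spend its time on, here is a minimal semi-implicit Euler step in C++. The Body struct and function name are made up for the sketch, and a real physics pipeline would layer collision detection and response on top; the idea is simply that this regular, data-parallel update could run elsewhere while the CPU does something smarter.

// Semi-implicit (symplectic) Euler step over a batch of rigid bodies.
#include <vector>

struct Body {
    float x, y, z;      // position
    float vx, vy, vz;   // velocity
    float ax, ay, az;   // acceleration accumulated from this frame's forces
};

void IntegrateBodies(std::vector<Body>& bodies, float dt)
{
    for (Body& b : bodies) {
        // Update velocity first, then position -- more stable than plain Euler.
        b.vx += b.ax * dt;  b.vy += b.ay * dt;  b.vz += b.az * dt;
        b.x  += b.vx * dt;  b.y  += b.vy * dt;  b.z  += b.vz * dt;
        b.ax = b.ay = b.az = 0.0f;  // clear accumulated forces for the next frame
    }
}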


Artificial Intelligence Depot - Maybe it's not all about graphics...


Nah, I say make it a law that everyone has to run dual (insert brand name here) chips in their computers and rewrite (insert OS name here) so that the second processor can only be used for AI =P

Edited by - Great Milenko on January 18, 2002 11:22:17 AM
The Great Milenko"Don't stick a pretzel up your ass, it might get stuck in there.""Computer Programming is findding the right wrench to hammer in the correct screw."
MikeD:

Is it a problem of AI being too broad, or of AI developers being too finicky? I'm sure some of the ellipsoid and voxel proponents out there would have claimed that graphics accelerators based on polygons would suck. While doing formal studies in AI, I found that an awfully large amount of verbiage in the field ends up restating a problem in a different domain where it is "simpler", without actually adding to the solution.

Let's say you had a board centering around genetic and neural algorithms. It has a set of nodes and a set of synapses, as well as a bunch of different ideas about how to use those tools. You can solve pretty much any problem with those two tools, and you can optimize with pretty much just those two tools.
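As an illustration of what that "nodes and synapses" primitive might look like to a programmer, here is a minimal sketch in C++ of evaluating one fully connected layer: a dense multiply-accumulate per output node followed by a sigmoid. The function name, memory layout and activation are assumptions made for the example, but this kind of regular arithmetic is exactly what lends itself to dedicated hardware.

// One fully connected layer of "nodes and synapses": each output node is a
// weighted sum of the input nodes pushed through a sigmoid.
#include <cmath>
#include <cstddef>
#include <vector>

// 'weights' is outputCount x inputCount, row-major; 'biases' has one entry
// per output node.
std::vector<float> EvaluateLayer(const std::vector<float>& inputs,
                                 const std::vector<float>& weights,
                                 const std::vector<float>& biases)
{
    const std::size_t inputCount  = inputs.size();
    const std::size_t outputCount = biases.size();
    std::vector<float> outputs(outputCount);

    for (std::size_t o = 0; o < outputCount; ++o) {
        float sum = biases[o];
        for (std::size_t i = 0; i < inputCount; ++i)
            sum += weights[o * inputCount + i] * inputs[i];  // synapse weight * node value
        outputs[o] = 1.0f / (1.0f + std::exp(-sum));         // sigmoid activation
    }
    return outputs;
}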

That is, I *think* you can. You're welcome to point out shortfalls in that logic.

thank you,
ld
No Excuses
I doubt that you will ever see hardware dedicated to so-called AI computation. Even the name "AI" is a catch-all term for a huge branch of computer science. There is no standard model for computation, no "most general" approach to tackling the problems of the field.

Computer graphics, on the other hand, has converged as a field (not exactly, but much more so than most fields). Things like colour, depth buffers and polygons are ubiquitous in the area of 3D graphics. No such parallels can be made for AI (e.g. tree searching and path finding are two of the most common tactics in AI ... but are hardly "the best" or "the only" way of doing things).

While it's true that in-silicon implementations of Lisp (for example) have been used to augment the power of a normal CPU, I think that any commonplace custom AI hardware would not come in the form you might expect. There is a better way...
------When thirsty for life, drink whisky. When thirsty for water, add ice.
quote: Original post by Graylien
I doubt that you will ever see hardware dedicated to so-called AI computation.


Oooh, I'm willing to bet you'll eat those words in a few decades' time. Besides, I've already got hardware dedicated to AI computations... I keep it in my head.

FragLegs
quote: Original post by Graylien
Computer graphics, on the other hand, has converged as a field (not exactly, but much more so than most fields). Things like colour, depth buffers and polygons are ubiquitous in the area of 3D graphics. No such parallels can be made for AI (e.g. tree searching and path finding are two of the most common tactics in AI ... but are hardly "the best" or "the only" way of doing things).


The fact that CG has converged doesn't mean it's found either the best or the only solution to any given (abstract) problem in the domain. Two years ago John Carmack predicted that polygon-based modelling was on the way out, to be replaced by solid representations (the analog of 'vector' vs 'raster' graphics, to his mind). As with many great minds, he may simply have been ahead of his time.

Point being, you don't need the *best* solution to take advantage of acceleration. You just need a solution that produces results which are acceptable. Polygon-pumpers still have mathematical troubles with thin polygons & interpenetration; I imagine that lighting & transformation add a whole world of mathematical nightmare to the experience.

I'm not saying that it's automatically possible, just that it's not as simple as it may seem on the surface.

ld
No Excuses
