From: Reason (reason@exratio.com)
Date: Fri Jun 29 2001 - 19:07:31 MDT
> Human interactions rest on the assumption that much of
> the behavior of fellow humans can be predicted.
> If humans can predict it, then I see no reason why
> an "artificial intelligence" cannot be designed on
> the basis of relatively simple methods that should
> expect the norm (bus passengers should sit quietly)
> or produce it (I, a human-like AI, must sit quietly).
Having worked in the field, in association with the game industry and
otherwise, I have to agree that the term "AI" is horribly overused. The
following terms have been bandied about in my circles as replacements
for it:
1) Human Simulation or Human Behavioral Simulation
Used for relatively simple algorithms designed to mimic a small fraction of
the wide range of human behavior. For example: an automated stock analysis
tool, bus-passenger-behavior code, or some portions of first-person shooter
game NPC algorithms (e.g. issuing orders).
The game industry calls such things "AI." The non-game industry tends to
call them whatever the marketing guys think will stick (I'm embarrassed to
relate the label that was stuck to my first contribution to the field).
See www.twilightminds.com for a small open source human behavioral simulator
(the Brainiac). Sadly, it counts as very advanced insofar as such things go
outside academia; the state of the art really isn't all that impressive.
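To give a concrete feel for the level of sophistication involved, here's a
minimal Python sketch of the "pick a canned behavior from a weighted table"
approach. The behaviors and weights are invented for illustration; this is
not how the Brainiac or any particular product works.

    import random

    # Hypothetical bus passenger "simulation": norm-following behavior is
    # just a weighted pick from a small table of canned actions.
    PASSENGER_BEHAVIORS = [
        ("sit quietly",        0.85),
        ("stare out window",   0.10),
        ("chat with neighbor", 0.04),
        ("ring the stop bell", 0.01),
    ]

    def next_behavior(behaviors=PASSENGER_BEHAVIORS):
        """Pick the passenger's next action, weighted toward the norm."""
        actions, weights = zip(*behaviors)
        return random.choices(actions, weights=weights, k=1)[0]

    # Ten ticks of a "simulated" passenger: mostly sitting quietly.
    for tick in range(10):
        print(tick, next_behavior())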
2) Kinesthetic AI
Refers to the first-person shooter algorithms designed to get NPCs to move
about and perform physical actions "correctly" under various stimuli.
The game industry calls this stuff "AI" as well. The state of the art right
now would (arguably) be Unreal Tournament.
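Under the hood this usually amounts to a state machine mapping stimuli to
movement and action states. A toy Python sketch follows; the states,
stimuli, and transitions are invented, not lifted from Unreal Tournament or
any other engine.

    # Toy NPC state machine: (current state, stimulus) -> next state.
    # Unknown stimuli leave the NPC doing whatever it was already doing.
    TRANSITIONS = {
        ("patrol",     "heard_gunfire"): "take_cover",
        ("patrol",     "saw_enemy"):     "attack",
        ("take_cover", "saw_enemy"):     "attack",
        ("attack",     "low_health"):    "retreat",
        ("retreat",    "all_clear"):     "patrol",
    }

    def react(state, stimulus):
        return TRANSITIONS.get((state, stimulus), state)

    state = "patrol"
    for stimulus in ("heard_gunfire", "saw_enemy", "low_health", "all_clear"):
        state = react(state, stimulus)
        print(stimulus, "->", state)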
3) Strategy/Tactical AI
Refers to algorithms used in real-time strategy games and the like in which
the computer is expected to simulate a single human opponent playing the
game against you. Most game industry expertise and debate goes into the
varying ways of going about this. There's a whole lot of information out
there that goes into excruciating detail.
The game industry, of course, calls all of this "AI" as well.
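Much of that detail tends to boil down to variations on one theme: score a
handful of candidate orders with hand-tuned heuristics and take the best
one. A bare-bones Python sketch, with the orders, state fields, and weights
all invented for illustration:

    # Score each candidate order with a hand-tuned utility; take the best.
    def choose_order(state):
        candidates = {
            "expand":     2.0 * state["resources"] - state["pressure"],
            "build_army": 1.5 * state["pressure"] - state["army"],
            "attack":     state["army"] - 1.2 * state["enemy_army"],
        }
        return max(candidates, key=candidates.get)

    game_state = {"resources": 3, "pressure": 5, "army": 4, "enemy_army": 6}
    print(choose_order(game_state))   # "build_army" with these numbers

In practice those heuristics get layered and retuned endlessly, which is
where most of the debate lives.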
--------
There are many subcultures out there -- especially within the wider game
industry -- in which the word "AI" has taken on some very specific meanings.
These meanings do not generally have much to do with the original meaning.
In most cases it's all about simulating human behavior across a small slice
of the human experience.
Of course, a sufficiently broad and competent simulation should be able to
pass a Turing Test (and game AI/chatbots manage it every day)...so a
question: what's the list's bias on what a sufficient test for sentience is?
Many people I've spoken to seem to have an aversion to -- hypothetically --
declaring a Turing-Test-passing entity sentient if they can examine and
understand its algorithms. If they can't understand how it works, they're
happy with that. Which leads to a whole series of obvious parodies, but
enough...
Reason
http://www.exratio.com/