From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Aug 12 1997 - 18:15:21 MDT
"Emotions are the easy part. We'll have human-equivalent computational
emotions long before we have human-equivalent reasoning."
-- Eliezer S. Yudkowsky, official prediction.
I hereby open fire on the stereotype of the emotionless AI.
There's an old story that some AI ancient once told a graduate student to take
a summer off and "solve the problem of vision", figuring that it would be easy
in comparison with the problem of general reasoning. Decades later, Marr
finally cracked the visual cortex wide open, explaining the computational
algorithms used to process vision, and even assigning specific neurons to
specific tasks. We're far from knowing everything about vision... but
definite and irrevocable progress has been made.
We may have been wrong about taking off one summer... but we were completely
*right* about vision being easy in comparison to general reasoning. For
general reasoning, we still don't have the *faintest* idea of what algorithms
are being used... much less which individual neurons do the processing!
The point of all this? To provide proper context for my statement that, in
comparison to general reasoning, emotions are easy. I might even take a
summer off one of these days...
Dogs have emotions. Only humans have sophisticated general reasoning.
Emotions are more primitive than reasoning - not in the primal sense, but in
the computational. They evolved over billions rather than millions of years.
They are simple, relatively inflexible. In many cases, emotions have been
observed to correspond with particular areas of the brain.
All this leads up to my statement: We will have emotional computers before we
have human-equivalent AIs. In fact, we will have completely cracked the
problem of emotion, in toto, before we have human-equivalent AIs.
The stereotype of emotionless machines, therefore, is misplaced on
technological grounds. We might - I say *might* - be able to divorce the
emotional processes from the rational, and deliberately build emotionless
machines. Perhaps not. The emotions started out as instincts; they are
probably the single longest-evolved part of the brain. So they might be so
simple and physiological - so directly programmed - as to be easily untangled.
Or they might act as a coordination center for the rest of the brain, too
entangled with everything else to pull out. It could go either way; probably both.
I'm sure we all know the origins of the common stereotype. For some reason,
machines and computers got associated with pure logic. Now, that's not the
case - never has been, probably never will be. Deep Thought is not a human
missing the emotions of its primate ancestors. Deep Thought is far, far from
the level of an emotional dog. Deep Thought would have to be considerably
more complex to qualify as an insect. A bacterium, maybe, if not simply a
physical process.
But if a human tried to imitate Deep Thought - which is how we describe it, by
putting ourselves in its place - he would have to suppress his emotions,
because Deep Thought doesn't have any emotions. He would also have to
suppress his long-term strategic reasoning, his memories, his chunking
ability, his ability to perceive complex spatial relationships, his linguistic
capability... but we can't even imagine suppressing those. Those abilities
are too intimately bound up with ourselves, being recently evolved and
therefore indirectly programmed and therefore developmentally tangled with all
other systems... not to mention processing the self-symbol subsystem. But we
*can* suppress our emotions; we do it all the time.
The height of the stereotypical silliness occurs when a machine is portrayed
as having deeply repressed emotions. The idea is so silly it's not even
wrong. It derives from viewing humans who behave "mechanically"; they often
have deeply repressed emotions. The idea that you can generalize to all
mechanisms... it reminds me of a cartoon I have on my door, a Mr. Boffo. The
caption is "Most-Broken Award"; the cartoon shows a cuckoo-clock bird going:
"What time is it?"
My argument, in summary, is this.
Modern computers are emotionless, and utterly unintelligent. Their level of
"pattern" ranges from undirected physical processes such as Deep Thought, to
bacterium-level organisms such as a word processor, to the insect-level
mind-boggling complexity of Windows 95.
When our processing power and programming capability reaches "dog" level,
we'll start seeing emotional computers.
A while later, we'll see emotional AIs, and then very shortly thereafter
posthuman AIs, who may or may not have anything recognizable as emotions.
The only reason we don't already have emotional computers - it is a simple
problem - is that nobody perceives a profit in them. As soon as somebody
figures out how to make them sell, we'll have emotional computers.
--
sentience@pobox.com Eliezer S. Yudkowsky
http://tezcat.com/~eliezer/singularity.html
http://tezcat.com/~eliezer/algernon.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.