From: Robin Hanson (rhanson@gmu.edu)
Date: Mon Jul 02 2001 - 10:48:07 MDT
** This Contains Spoilers!! **
Eliezer Yudkowsky wrote:
>The very first thing that struck me about A.I. was the rather extreme
>stupidity of the AI *researchers*. ... David is beta software. His
>emotional responses ... show a binary, all-or-nothing quality.
>Then the AI researchers had the bright idea of putting this beta software
>into a human body, adding in those emotions calculated to produce maximal
>emotional attachment on the part of humans, and giving it as a human
>surrogate to a mother already in an emotionally unstable state because her
>own child has been in medical cryonic suspension for five years.
>... David realizes that his mother will someday die, ... His mother,
>... feels enormous emotional stress at the thought of returning David to be
>incinerated. Nobody thought of this back when they were building a
>loving, lovable, naturally immortal, allegedly disposable child?
I found the stupidity level to be plausible, and not extreme. David's
father was a company employee, not a random person, and at some point
you'd have to test the system with a real mother. Most of what you
see as thoughtless, I see as insensitive - they just didn't care.
David feels bad that his mom might die? Who cares what David feels.
Mom would feel awful giving up this product? That's the ideal product
- one customers couldn't imagine living without.
What you see as binary capabilities, I see as crude context-dependent
choices. When you don't perceive the subtleties of a situation, your
actions will have more variance, and will seem more all-or-nothing compared
to the appropriate responses.
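
To make that concrete, here is a toy numerical sketch - my own construction,
not anything from the film or from Eliezer's post - in which an agent that
perceives only a coarse yes/no cue about the situation can give only canned,
all-or-nothing responses, and its deviations from the appropriate graded
response have more variance:

import numpy as np

rng = np.random.default_rng(0)

# The appropriate response varies smoothly with the situation, but the crude
# agent perceives only a coarse yes/no summary, so the best it can do is one
# canned response per category.  (All names and numbers here are made up.)
situation = rng.uniform(0.0, 1.0, 100_000)   # fine-grained state of the world
appropriate = situation                      # ideal response tracks the situation exactly
coarse_cue = situation > 0.5                 # all the crude agent can see
crude = np.where(coarse_cue, 0.75, 0.25)     # best fixed response per coarse category

print(np.unique(crude))                # just two responses: all-or-nothing
print(np.var(crude - appropriate))     # error variance ~1/48; a graded agent's would be 0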
>... David nearly kills himself competing with his revived brother, by
>attempting to eat; the second catastrophe occurs when David nearly drowns
>his brother. ... the AI researchers should have thought of it.
You can't think of everything - that's what beta tests are for. David
killing himself wasn't a huge potential loss, though killing his brother
was; but I think we can see that as an unlikely risk the company was
quite willing to take, given the huge potential profits awaiting. Stories
focus all the time on the consequences of unlikely events.
I did find it implausible, but not crazy, that robots would be that much
denser than humans. David should have floated - androids are designed
to fit into human physical slots, and density is one obvious parameter
to fit.
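
For a rough sense of the numbers (ordinary textbook densities, nothing from
the film): an average human body with lungs inflated is slightly less dense
than water, so a density-matched android would float too, while a mostly-metal
build would sink.

# Toy buoyancy check, using approximate real-world densities (my assumptions).
WATER_DENSITY = 1000.0   # kg/m^3, fresh water
HUMAN_DENSITY = 985.0    # kg/m^3, roughly, with lungs inflated

def floats(body_density, fluid_density=WATER_DENSITY):
    # An object floats when its average density is below the fluid's.
    return body_density < fluid_density

print(floats(HUMAN_DENSITY))   # True - a density-matched android floats
print(floats(3000.0))          # False - a mostly-metal robot sinks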
>... Again, someone at the mecha corporation was being damn stupid
>and deserves to be sued into bankruptcy.
But these were company folks who signed everything asked of them.
Should no one be allowed to agree to be a beta-tester?
What was perhaps less plausible was that the company didn't have
people constantly watching the test, ready to take over in the event
of some disaster. But that would be a lot more expensive - maybe they
instead chose ten families to test David in at the same time.
Also less plausible was not having tracking devices in all robots.
But storytellers are really finding it hard to adapt to the brave
new world where all characters can always know where all the other
characters are and talk to them at any time.
>group of androids who are scavenging spare parts from a dump. ...
>Why do these nonemotional androids want to survive? ...
I think it is that these androids are actually a lot more
sophisticated than most humans around them give them credit for.
All that blather about David being the first robot to have emotions
is just marketing hype - until David, robots just weren't very good
at expressing emotions in a way that induces sympathy from humans.
But a robot like Gigolo Joe just couldn't do what he did without
having a lot of things that function as emotions internally.
>And the crowd rises and boos the ringmaster off the stage - "Mecha does
>not plead for its life!" - but their decision is correct only by
>coincidence.
Yes, that's the point. Whether humans treat you nicely has little
to do with your absolute moral worth - it is mostly about whether
you can put on a good enough emotional display.
>... surely an advanced AI knows what 'fiction' is, and an AI boy
>knows that bedtime stories aren't true.
Surely, if its makers wanted the AI to suffer the sort of confusions
that real boys do, they could construct such an AI.
>... Gigolo Joe's speech about how
>humans resent robots because they know that, in the end, robots will be
>all that's left. Where did *that* come from?
It makes more sense if you assume that Gigolo Joe is a lot more than
a chatbot and a simple sex droid. He is fully capable as an abstract
reasoner; what he lacks is more the ability to display and evoke emotions.
And some of those gaps are probably by design - just as the slightly
unreal flesh tone is probably by design - so that humans can be "racist".
>... If there are that many Davids, why are they all designed to
>have the human emotion of wanting to be unique?
I thought it was because human boys have that emotion.
>... For that matter, what possessed the idiots in Marketing to
>produce a batch of identical AIs all named David, instead of giving them
>individual faces and individual voices and maybe some quirks of
>personality? ...
The boy bodies on the rack did have different faces, I think.
>Finally, after David realizes that he is not unique, he deliberately
>topples off a window ledge into the ocean. Uh... why? How is that a
>means to the end of getting his mother to love him? ...
David doesn't seem to fully realize that he is not a human boy.
He seems to be designed to think that he is human and do what
a human would do.
>pouting gibberish about yada-yada space-time yada-yada pathways yada-yada
>DNA yada-yada only one day yada-yada. ..
Yeah, that was the obvious dumb plot device. I won't defend it.
>The Successors could easily have given David a full deck of emotions, or
>could easily have created an immortal virtual Monica that was real to the
>limit of David's limited perceptions. Why didn't they?
Yeah, that bugged me too.
>I know there is a certain style of filmmaking that holds that the viewer
>should be allowed to pick their own ending, and I hate that style with a
>fiery passion.
Like Hal, I didn't mind it.
Were any of the other fathers out there bugged by David's not imprinting on
his father? David loved only his mother; to him the father was just "Henry".
David didn't care for or miss Henry at all. I suppose we could write this off as
postulating a big swing back to strong gender roles, but even so it seemed
implausibly extreme.
One other thing that bugged me was that there was no
Robin Hanson rhanson@gmu.edu http://hanson.gmu.edu
Asst. Prof. Economics, George Mason University
MSN 1D3, Carow Hall, Fairfax VA 22030-4444
703-993-2326 FAX: 703-993-2323