Re: Movie Review - AI Artificial Intelligence

From: hal@finney.org
Date: Sat Jun 30 2001 - 16:22:55 MDT


This review contains spoilers, so if you don't want to read them, please skip
this message.

The big issue raised by the movie was the treatment of mechas by the
orgas. It was shocking and troubling to see how cruelly people treated
these machines even though they were sentient. The one salvation was that
the machines did not seem to object much. It is only when David cries
out in fear and pain that he is saved from destruction at the Flesh Fair.
"Mecha does not cry out and beg for its life," the crowd says, and David
is freed.

The scene that really drove it home for me was a small one; Teddy has
followed David to the Flesh Fair and is picked up as a lost toy. He is
carried off to lost and found, not cruelly but casually and uncaringly,
just as we would handle a teddy bear.

But all the while he asks, "Have you seen David? Where is David?"
His questions are ignored. This seemed strange to me. Granting a world
where "supertoys" can talk, wouldn't it make sense that they might be able
to tell something about their owner? Even if the security guard wasn't
interested in getting into a conversation with the toy, you'd think he
would pay attention enough to say, "Is David your owner? What's his
last name? What's he look like?", questions which the supertoy could
reasonably have answered and which would help return it to its owner.

In retrospect I felt that the key issue which was left unstated was the
morality of the creation of David himself. Is it right to create a robot
so helpless, so dependent, so locked into loving its surrogate parent?
David was helpless in the world not because he was new, but because
he was designed that way. He was not given the knowledge to survive,
he was designed to be frightened easily and to look to grown-ups for help.

This was his purpose: he was there to fill a need. But what about him
and his needs? Those don't enter into the equation.

And what would have happened if David had stayed with his "mommy"?
Would she still want him after 5, 10, 20 years when he's still 7
years old? He is to be a perfect child, never growing, never changing.
But won't parents become frustrated eventually with a child who never
learns and advances?

In the end we learn that Dr. Hobby has lost his own son David and has
modelled the robot David after his son. David meets another copy of
himself, and I got the impression that this was a version which Dr.
Hobby had kept for himself, had kept as his own surrogate child to
replace his lost son. This David seemed happy, personable, well adjusted,
compared to the lost, frightened David we followed throughout the movie.
No doubt if things had worked out better our David would have stayed
home and had that same happy smile. But still I wonder how long that
happiness would have lasted, to a robot for whom 50 years is not very
long after all.

The other injustice to David was that his self esteem was built in large
part upon the idea that he was unique and special. It is when he sees
that he is only one in a whole assembly line of Davids that he gives
up on his quest. It is true that he was the first model (apparently)
and so somewhat special in that way, but seeing all those other robots
with his face and name drove away any true feeling of uniqueness.

Why did this have to happen? Why did David have to be taught a lie like
that? Every child grows up knowing that he is one of millions of children
in the world much like himself, but still with his own bit of uniqueness.
Why couldn't David have been given the same understanding, instead of
being burdened with a falsehood which would eventually destroy him?

I view these objections and questions not as flaws in the movie or
even in the fictional world presented, but rather as questions that
we must face as we begin to develop these capabilities ourselves.
Hans Moravec argues that our creations, although not alive and conscious
in themselves, can be thought of as approximations or conduits to an
abstract conscious entity. Mistreatment of these approximations is,
in a sense, as harmful as mistreatment of actual beings. In his model
it's not so much that it increases the amount of misery in the world,
but rather, it brings our world closer to one where there is unhappiness.

When we mistreat our creations now, we develop habits which may carry
over to when they become more alive than they are today. This movie
is something of a reductio ad absurdum of such a trend. I have always
assumed and hoped that as our creations "wake up" we would begin to see
that they are human and deserve human rights. But what if instead
the change is so gradual that we find ourselves continuing to treat
them as toys to be broken at will? This is the world that AI depicts,
and perhaps it could creep up on us after all.

Hal



This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 08:08:23 MST