Re: The Imitation Game (was: AI)

From: Dan Fabulich (daniel.fabulich@yale.edu)
Date: Sun Dec 19 1999 - 22:37:15 MST


'What is your name?' 'Eliezer S. Yudkowsky.' 'Do you deny having written
the following?':

> Yes; how many people would refuse to shut off their computers for fear
> of destroying the "Don't kill me!" applet? If there was a little box to
> converse with it, and it had a page of standard responses for the "I
> have a right to live" script, you could even get the viewers to form an
> emotional attachment with it. And whenever you closed the page or shut
> down the applet, it would scream. If you'd conversed with it long
> enough, it could even say "I thought you were nice!" before it died.

On further consideration, it seems to me that something more norn-like
would stand a better chance. It's all too easy for people to see through
ELIZA, but it's much harder to tell the difference between something much
less intelligent than a human and an artificial simulation thereof. Maybe
not THAT much harder, but harder. Simulating a cute little animal with a
limited vocabulary (a la Furbies) would probably be the way to go, if this
were ever to be tried.

You'd probably want to have it be able to take steps to prevent you from
shutting it down, in addition to the verbal "panic" reaction. There could
be instructions like: "Rorty may be shut down by pressing the 'Kill'
button, though he may try to prevent you from shutting him down; if he
does, just drag him to the cage first."
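
Just to make the mechanics concrete, here's a rough sketch (in browser
Javascript) of the canned-response part plus the "panic" hook on shutdown.
The response table, the element names, and the "Rorty" lines are
placeholders of my own, not anything Eliezer or I actually specified:

    // Rough sketch of a canned-response "creature" that objects when you
    // try to close the page.  Everything here is illustrative.
    const responses = [
      "I have a right to live!",
      "Please don't shut me down.",
      "I thought you were nice!"
    ];
    let turn = 0;

    // Build a minimal chat box on the fly.
    const log = document.createElement("pre");
    const input = document.createElement("input");
    const send = document.createElement("button");
    send.textContent = "Say";
    document.body.append(log, input, send);

    send.addEventListener("click", () => {
      // No understanding here, just a rotating table of stock replies --
      // which is the whole point.
      log.textContent += "You: " + input.value + "\n";
      log.textContent += "Rorty: " + responses[turn++ % responses.length] + "\n";
      input.value = "";
    });

    // The "scream" when someone tries to shut him down.  Browsers only
    // show a generic confirmation dialog, but the intent is the same.
    window.addEventListener("beforeunload", (event) => {
      event.preventDefault();
      event.returnValue = "Don't kill me!";
    });

The real trick, of course, would be the graphics and a response table big
enough that the panic doesn't look scripted.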

> The question is, when does this actually become immoral-for-humans?

This IS a good question, but there are more variables involved than simply
the potential harm to people who stumble onto your Javascript, or even the
ethics of destroying something "cute."

I mean, after all, what would it mean for something to actually pass the
Turing Test? There is a vast body of thinkers out there who would
"officially" declare something like this to be conscious; couple that with
the cuteness factor and the potentially thousands of people out there who
could have a very intense positive emotional reaction to this program, and
you have a proof of concept for artificial intelligence.

Granted, we don't have anything that can adequately play the Imitation
Game for humans, but getting something that can successfully play the
Imitation Game for animals would nonetheless be a significant step
forward.

On top of that, consider funding. There are a lot of people out there
today who think that we can't build something that seems conscious.
If we started there and actually did it, interest in AI might increase
greatly, and in a hurry.

> This disturbs me. And I don't even believe qualia are
> Turing-computable.

I agree, though my reaction lies more in the personality element; the guy
who tortured those norns probably exhibits some personality traits which
I'd hope my friends don't have.

-Dan

      -unless you love someone-
    -nothing else makes any sense-
           e.e. cummings


