Re: When You Aren't a Person at All (was How to tell if you are a nice person)

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Jun 22 2002 - 16:57:35 MDT


Lee Corbin wrote:
> Eliezer writes
>
>>Here's a stranger and stronger form of the question: What would you do if
>>you found out that the world was a simulation and *someone else* was the
>>only real person in it? Assume that we subtract from the equation your
>>Cartesian knowledge that you are a real person and replace it with the hard
>>knowledge that you are a giant look-up table. What would you do?
>
> When I asked people to consult their intuitions about what they'd do
> as a VR solipsist, I was implicitly asking that they adopt the first
> person viewpoint: they were to imagine that it had happened to them,
> and then they were to next imagine what they'd think, what they'd
> feel, what they'd plan, and finally what they'd do.
>
> But you've pulled the rug out from under all that. There is no first person
> anymore in your experiment! So I can only imagine how I look to other
> people, and ask how poor Lee would appear to behave. It seems to me
> that he would probably vigorously deny that he wasn't conscious. He
> would probably start claiming that he had been entirely wrong about
> lookup tables!

No; the idea is that you imagine what you would do if it were proved to you
that you were a giant lookup table, even though this is not a
self-consistent description of reality, and from that you reason about what
a GLUT/wallpaper version of yourself would do. That framing may or may not
be coherent, but it's how I'm phrasing the ethical question.
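
For concreteness, here is a toy Python sketch of what "giant lookup table"
means in this context (purely hypothetical, of course; the point of the
thought experiment is that such a table is absurdly large, not that anyone
could build one): every possible history of inputs is a key, the matching
reply was fixed in advance, and so the behavior matches the original person
while nothing that deserves the name "processing" ever happens.

# Hypothetical sketch of a GLUT agent: every possible input history is a
# key, and the matching output was precomputed before the "conversation"
# ever started.  Nothing here deliberates; it only looks things up.

class GiantLookupTable:
    def __init__(self, table):
        # table: dict mapping tuples of past inputs -> the next output
        self.table = table
        self.history = ()

    def respond(self, message):
        self.history = self.history + (message,)
        return self.table[self.history]

# A toy table covering a single one-line exchange; a real GLUT would need
# an entry for every possible history.
glut_eliezer = GiantLookupTable({
    ("Are you conscious?",): "Of course I am.",
})
print(glut_eliezer.respond("Are you conscious?"))  # -> "Of course I am."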

> You, right now in your office, can act as though you accept
> the fact---were it proved to you right now---that your brain
> is in a jar in California. (Someone got the drop on you, and
> you never really made it back from the last nano conference.)
> Also, I submit, you believe that this is quite possible, and
> all you need is evidence for you to believe it. But could you
> believe, right now, that you are not conscious? (This is,
> like, telling us something VERY important, IMO.)

I couldn't, but that's because I am *not* a GLUT and have evidence to this
effect (qualia). In a strictly pragmatic sense the test is impossible:
a qualia-bearing entity has valid evidence of its own existence at any given
moment, and a GLUT simulating a qualia-bearing entity will behave exactly as
if it had that evidence, even though it does not. So I will always
validly refuse to be convinced that I am not conscious, and a GLUT Eliezer
will always invalidly refuse to be convinced that it is not conscious.
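
Spelled out as a sketch (same hypothetical framing as above, again in
Python): the real version answers from introspective evidence, the GLUT
version answers from its table, and an outside observer sees identical
transcripts, which is exactly why no behavioral test can settle the
question.

# Why no behavioral test distinguishes the two: one function stands in for
# answering from direct evidence (qualia), the other just looks the same
# words up, and the outputs are identical.

def real_agent(question):
    # The evidence (qualia) never appears in the output channel.
    return "I refuse to be convinced that I am not conscious."

LOOKUP = {
    "Are you conscious?": "I refuse to be convinced that I am not conscious.",
}

def glut_agent(question):
    return LOOKUP[question]

q = "Are you conscious?"
assert real_agent(q) == glut_agent(q)  # behaviorally indistinguishable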

Actually, in this case I should probably be assuming wallpaper rather than
GLUT. Wallpaper might have enough pseudo-processing that it would be able
to "notice" its own lack of consciousness. After all, I have qualia but I
don't *have* to have qualia; I am real but I don't *have* to be real. I can
imagine a version of myself that is wallpaper, and ask what that self would
do, without breaking identification.

I.e.: http://www.plif.com/archive/wc059.gif

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence

