Re: How will you know that you've woken up from cryogenic sleep?

From: Mike Lorrey (mlorrey@datamann.com)
Date: Wed May 01 2002 - 12:24:54 MDT


Louis Newstrom wrote:
>
> From: "Adrian Tymes" <wingcat@pacbell.net>
>
> > This almost exactly duplicates Descartes' infamous musing. If you were
> > uploaded into a VR that *perfectly* replicated your current reality, in
> > a way that you did not remember the upload, then no, it would not
> > matter: by definition, there would be no difference to you.
>
> I disagree. There may be no difference in the position of matter and energy
> that I perceive, but there IS a big difference.
>
> It would mean, for example, that those in pain could be cured, but that some
> intelligent being has decided not to. Also, our loved ones who die need
> not die. They could be saved. This is different from reality, where these
> things are beyond anyone's control.
>
> As a related analogy: Imagine a child on the railroad tracks gets killed by
> a train. Now imagine that an adult knew this was going to happen and didn't
> stop it. Would you say "the result is the same, so it doesn't matter"? I
> wouldn't.

Ah, but is virtual death as important as 'real' death? This is the
question posed in the movie "The 13th Floor". While The Matrix posited
that virtual death causes real death (without any explanation as to why,
other than the power of belief), in The 13th Floor, Vincent D'Onofrio's
virtual character flies into a homicidal rage at his lack of existence in
the "real world", concluding that since he, and everybody else in his
world, is a simulation, their deaths are unimportant to 'the real
world'. He is thus free of moral stricture, and chooses to take out his
rage at realizing his unreality upon the only representation of the
'real world' he can find.

Life is important in the world in which it exists. There is no other
standard by which it needs to be compared for importance.

At the same time, simulated life, relative to the higher-level reality
that generates it, is less important than a life in that higher-level
reality, *but only to those in that higher-level reality*. That is not
to say that shutting down a simulation containing intelligences is a
morally neutral act. Starting a simulation that generates intelligences
is an act of enormous responsibility. This is where those 'Scruples'
questions I'm known to ask come into play.

Let's say you create a world that evolves intelligences in it, a whole
world or civilization full of them. Those intelligences, however, are to
you as a mouse is to a baseline human. The Scrupulous Question to ask
then is: at what point are x many mouse lives worth one baseline human?
When do you give your life to protect that simulated civilization from
extinction? How many simulated intelligences is your life worth?



This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:13:45 MST