From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Apr 02 2003 - 13:28:24 MST
Mark Waser wrote:
> Lee Corbin wrote:
>
>>>"But as for being morally "okay", no, suffering of any form is
>
> inadmissible (contradicting rationalizations for suffering in another
> thread), including our displeasure at having apparently undergone sorrow.
>
> I just have to jump in here . . . .
>
> Suppose I'm playing a fantasy role-playing game (Dungeons & Dragons or
> something similar) and one of my characters dies a horrible death. Is this
> morally wrong?
No. I am reasonably certain that fictionally imagined characters don't
have qualia. I'm in really, really deep trouble if they do.
> Suppose it's twenty years in the future and I'm playing the newest total
> immersion version of this game. I'm booted out of the game once it's
> apparent that my character is about to die horribly but everyone else "sees"
> me die horribly. Is this morally wrong?
No. No qualia.
> Suppose that for the excitement/thrill/entertainment or for some learning
> experience that I'm willing to accept experiencing some shadow of that
> horrible death. It won't kill me or permanently damage me but it will allow
> me to stay in the game until the last possible instant, try to strive to
> accomplish nearly impossible things, have a great and realistic death scene,
> etc. Is this morally wrong? There certainly might be (minor) suffering
> involved here but it's suffering that I've chosen . . . .
I might perhaps advise against it, or try in my personal capacity to
persuade you not to do it, but I would not regard it as permissible to try
and prevent it by force.
> Suppose that for the ultimate in realism and to truly "live the game", I've
> decided to accept the temporary blockage of all outside-game knowledge.
> Until I "die" inside the game, I won't remember/know about anything outside
> the game but once I "die", I will go on living my normal life outside the
> game. Is this morally wrong? (Assume that we're advanced enough that there
> is no way in which in-game events can harm, much less traumatize, my
> outside-game self.)
Yes, because "you" and "you with your memories elided" are *two different
people* - your future self is a different person, for purposes of
volition, than your past self. The past you that decided to have its
memories elided is not the you who suffers and says "Let me out!"
Furthermore, once the present-day you has been immediately extracted from
the simulation in accordance with his volition, there is no reason why the
elided memories should be "reloaded" (perhaps effectively killing the new
you) unless you consent to it. Having your memories elided is a one-way
trip unless the new you decides to accept them back, and certainly you
can't make decisions for your elided self.
> Note: This can also explain Eliezer's asking to be let out of the "sim" but
> apparently not being let out in at least three different ways:
> a) he left instructions not to be let out
What do I care what my old self did? I am Eliezer now.
> b) the instant he got out, he asked to be returned with no in-game time
> elapsed and all memories erased again
"Erased again"? Who said anything about asking for the memories back? I
might be interested in looking them over, but whoever I am now, my
motivations are those of Eliezer, and *I* wouldn't play the video game of
me. I am not interested in having my motivations overwritten by those of
someone who is clearly so different from me, even if it's myself.
> c) someone else is now playing the character "Eliezer"
Er... "someone else"... um... okay, that's rather an interesting statement
from my perspective.
> I must admit that I don't see any way in which it can be proved to me that
> I'm not living in a sim. All the research on vision which has recently been
> referenced here clearly shows that we don't even really see what we think we
> see. I think that Bostrom's requirements for a simulation are way, WAY
> higher than they need to be because we don't experience/know anywhere near
> what we believe we experience/know (particularly if I/you are in an
> individual/solipsistic sim where everyone else is either programmed or knows
> about the sim and is manipulating it). And I know that there are all sorts
> of reasons why I would be willing to be placed in the last scenario since I
> could easily imagine it as the method by which an advanced civilization
> investigates other possibilities or even might teach their children.
But wouldn't you have decided to play Darwin or Gandhi rather than Mark Waser?
> But the final point I wish to stress is that while arguing about whether or
> not we are in a sim is amusing . . . . ultimately, if the sim is to be of
> any value, we must behave as if we are not in a sim.
> Suffering (or, at least, senseless suffering and senseless deaths) cannot be
> rationalized from OUR perspective and we need to strive against it with all
> our might.
If I could know this was a sim via a method of knowledge that definitely
discriminated between simulated Eliezers and planetary-evolved Eliezers, I
would behave very differently; I would start looking for a girlfriend, for
example.
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence