From: Anders Sandberg (asa@nada.kth.se)
Date: Fri Nov 09 2001 - 16:01:54 MST
On Fri, Nov 09, 2001 at 02:11:18PM -0500, Smigrodzki, Rafal wrote:
> "Robert J. Bradbury" <bradbury@aeiveos.com> wrote:
>
> The fact that I may construct simulation hardware, create copies
> that run on it and terminate the simulation when I've obtained
> the desired results of the experiment seems moral to me. I
> suspect that to Anders & Greg this is immoral because my
> equivalent selves obtain equivalent moral actor status the
> second I've started them running.
>
> It seems that the only logical way out of this is that once
> you discover you are running in a simulation you accept the
> fact that your moral actor rights are subject to the rights
> of the simulation creator. I.e., the simulation creator's
> moral system supersedes your own.
>
> ### I take the A&G side here, partially - sentient entities, whether copies
> or naturally evolved ones, have rights insofar as they recognize and cherish
> the rights of other entities.
Actually, I do not quite take the position Robert suggests.
First, if a simulated being begs for or at least desires its continued
existence, my view is that it is expressing its right to life, and I am
not ethically allowed to erase it - even if that simulated being is
isomorphic to me. On the other hand, if that being accepts the
situation, then there is no ethical conflict at all. Now, since I
largely think (at least right now, when I am not in the simulation!)
that I would accept an isomorphic copy of me as equal in value and a
continuation of the Anders thread, it seems that both the me outside
the simulation and the me inside would be content with the
simulate-and-erase scheme, at least if the copy did not have time for
significant divergence. But person X, who happens to have a different
view of what constitutes X-ness and how to define himself, would not
come to the same conclusions about its acceptability and would be
morally obliged not to terminate the simulation (or to refrain from
running it in the first place).
> If a copy (and by extension the original)
> believes that the existence of a self aware entity may only be terminated if
> the entity agrees to it (without duress), or if the entity is willing to
> disregard a life-wish (=the entity is immoral and therefore has no rights),
> then you may not terminate it. This is an innocent person. However, if your
> copy believes that destroying innocent entities is acceptable, then this
> copy is immoral, and may be terminated. In other words, if you think it's
> wrong to kill others, then it's wrong to kill you. If you believe that it's OK to
> kill others, then it's OK for others to kill you. If I made a copy of
> myself, I would be prohibited from killing it, because I (and my copies)
> believe you may not kill innocent others. If you make a copy of yourself (a
> copy that believes that it is OK to kill copies), then you may kill it. A
> more thorny issue is whether it's OK to kill you, and personally I think it
> would be wrong: since you kill only guilty entities (copies which are
> willing to kill), you are no more culpable than a hangman, who kills
> guilty persons. But then, if a copy escaped and managed to kill you, it
> would also be blameless. This is based on the old Kantian reciprocity,
> with a little twist to it.
I still can't come to grips with your view that it is OK to destroy
entities that disregard the life-wish of others even when they pose no
threat to you. It seems somewhat inconsistent: you respect the
life-wish of everyone - regardless of their mental states - except a
certain subset defined by their ethics. But that is a limited respect of
the life-wish (after all, even those ignoring it in others have it
themselves); if it were extended further, you would yourself fall into the
category of people whose life-wish should not be respected. In fact, it
is not obvious that you are not already in that category. Would you respect
the life-wish of someone who respected the life-wish of everybody,
except the life-wish of some small ethical subset like (say)
Scientologists?
> Of course, if you are in a simulation, your ability to enforce your morality
> might be limited, but this is a technical, not an ethical, problem.
Exactly. This is about what is *right* to do, not what we can do.
--
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y