From: Smigrodzki, Rafal (SmigrodzkiR@MSX.UPMC.EDU)
Date: Fri Nov 09 2001 - 12:11:18 MST
"Robert J. Bradbury" <bradbury@aeiveos.com> wrote:
The fact that I may construct simulation hardware, create copies
that run on it and terminate the simulation when I've obtained
the desired results of the experiment seems moral to me. I
suspect that to Anders & Greg this is immoral because my
equivalent selves obtain equivalent moral actor status the
second I've started them running.
It seems that the only logical way out of this is that once
you discover you are running in a simulation you accept the
fact that your moral actor rights are subject to the rights
of the simulation creator. I.e., the simulation creator's
moral system supersedes your own.
### I take the A&G side here, partially - sentient entities, whether copies
or naturally evolved ones, have rights insofar as they recognize and cherish
the rights of other entities. If a copy (and by extension the original)
believes that the existence of a self-aware entity may be terminated only if
the entity agrees to it (without duress), or if the entity is willing to
disregard a life-wish (i.e., the entity is immoral and therefore has no rights),
then you may not terminate it. Such a copy is an innocent person. However, if your
copy believes that destroying innocent entities is acceptable, then this
copy is immoral and may be terminated. In other words, if you think it's
wrong to kill others, then it's wrong to kill you. If you believe that it's OK
to kill others, then it's OK for others to kill you. If I made a copy of
myself, I would be prohibited from killing it, because I (and my copies)
believe you may not kill innocent others. If you make a copy of yourself (a
copy that believes it is OK to kill copies), then you may kill it. A
thornier issue is whether it's OK to kill you; personally, I think it
would be wrong - since you kill only guilty entities (copies which are
willing to kill), you are no more culpable than a hangman, who
kills only guilty persons. But then, if a copy escaped and managed to kill
you, it too would be blameless. This is based on the old Kantian reciprocity,
with a little twist to it.
Of course, if you are in a simulation, your ability to enforce your morality
might be limited, but this is a technical, not an ethical, problem.
Rafal Smigrodzki, MD-PhD
smigrodzkir@msx.upmc.edu