Re: Is this world a computer simulation?

From: Robert J. Bradbury (bradbury@www.aeiveos.com)
Date: Mon Sep 06 1999 - 14:14:35 MDT


On Mon, 6 Sep 1999, Eliezer S. Yudkowsky wrote:

>
> Would you, or I, or anyone on this list except possibly den Otter,
> really allow all the suffering and pain and death if we could end it?

Sure, if we *knew* it wasn't real, if we *knew* it was only for play.
We as humans generally do not have a problem with "games" in which
the opponent (whether virtual in a computer, or real in
a military war game) gets "killed", because they get "killed" anyway
when we unplug (stop) the game.

But say I put the game into a suspended state because I've got
something better to do, and then a new model of game comes along
that is much more interesting, so I decide to throw out all my
old saved "game disks". Oooops, sorry, guys, but you don't have
a claim to my matter/energy resources.

This goes back to Greg's discussion of the "morality of mind"
and the ethics of godhood. If the scale is sufficiently
different, then I don't think a God/SI has any problem with
allowing suffering. Suffering is ultimately a consequence of
resource limitations that *will* sooner or later result in
death. If the universe is finite, and all the energy in it
eventually runs down, then death for everyone (human & SI alike)
is universal and can in no way be "significant". It's like gravity:
it pulls "down" because that's just the way it *is*.
If SIs can escape the universe (or death), then suffering
(leading to death) is the way it *is* for un-enlightened minds
(humans), be they physical or virtual. So, SIs don't
cause suffering and they don't have an obligation to
interfere -- perhaps just as we have no obligation to
interfere in the natural selection that occurs every day
on our planet. It is interesting that, while I've heard
environmentalists/animal-rights activists argue that
we shouldn't *harm* the animals, I don't think they
would ever argue, "We should save them all."
Since higher primates, dolphins, and whales are close enough
for us to consider them "relative" equals, if we *do not*
make the argument, "Let us save them all from anything
that would harm them", then we cannot make the argument
that SIs would care about saving us from suffering --
particularly when an SI is to us what we are to bacteria,
and the further down the scale you go, the less concern
we seem to have. Greg might argue that since we
are "moral entities", an SI would want to minimize our
suffering, but that presumes our moral reasoning looks
more sophisticated to it than the chemotaxis (attraction)
a bacterium has to sugar!

[Qualification: when I say "suffering", I mean it in the sense
of "unintentional" suffering, not the kind of suffering inflicted
by individuals on each other. Though one could argue that
intentional suffering is the same as unintentional suffering,
because even if you remove the "intentional" sources, you still
*ultimately* suffer. In a situation where it is absolutely
*known* that there is no point to minimizing suffering,
I do not know why anyone (human or SI) would bother to waste
their time trying to minimize it.]

> There might be an elaborate illusion of objective morality,
> created by greater Powers and capable of fooling lesser ones.

Perhaps it is nothing but "false hope" that makes us want to
believe there is an "objective morality". It is certainly
true that in iterated games, cooperation/honesty (if guaranteed)
wins, because you don't waste resources/opportunities defending
yourself against opposition/betrayal.
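To make that concrete, here is a toy sketch in Python of the
iterated Prisoner's Dilemma (the payoff numbers and strategy names
are the standard textbook illustration, my own choice for this
example, not anything from Eliezer's post): over 100 rounds,
guaranteed mutual cooperation scores 300 per player, while mutual
defection -- everyone "defending" every round -- scores only 100.

    # Toy iterated Prisoner's Dilemma. Payoffs use the usual
    # ordering: temptation=5, reward=3, punishment=1, sucker=0.
    PAYOFF = {
        ("C", "C"): (3, 3),  # mutual cooperation: reward for both
        ("C", "D"): (0, 5),  # sucker's payoff vs. temptation
        ("D", "C"): (5, 0),
        ("D", "D"): (1, 1),  # mutual defection: punishment for both
    }

    def always_cooperate(opponent_moves):
        return "C"

    def always_defect(opponent_moves):
        return "D"

    def tit_for_tat(opponent_moves):
        # Cooperate first, then mirror the opponent's last move.
        return opponent_moves[-1] if opponent_moves else "C"

    def play(strategy_a, strategy_b, rounds=100):
        moves_a, moves_b = [], []
        score_a = score_b = 0
        for _ in range(rounds):
            a = strategy_a(moves_b)  # each side sees the other's history
            b = strategy_b(moves_a)
            pay_a, pay_b = PAYOFF[(a, b)]
            score_a += pay_a
            score_b += pay_b
            moves_a.append(a)
            moves_b.append(b)
        return score_a, score_b

    print(play(tit_for_tat, tit_for_tat))      # (300, 300): nothing wasted
    print(play(tit_for_tat, always_defect))    # (99, 104): betrayal hurts both
    print(play(always_defect, always_defect))  # (100, 100): pure "defense"

The point of the numbers: the guaranteed cooperators end up far
ahead of the mutual defectors, precisely because no round is
burned on defending against betrayal.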

> But my intuitions say this Universe is real on the quark level, and I
> trust my intuitions.
>
With simulated nanobots in your brain, how can you say that
with any confidence? :-)

Robert


