From: Matt Gingell (mjg223@nyu.edu)
Date: Mon Sep 06 1999 - 13:40:08 MDT
----- Original Message -----
From: Eliezer S. Yudkowsky <sentience@pobox.com>
>Of course, such Universes will also evolve so that new Singularities
>tend to be interested in running computer simulations of a type that are
>interested in running their own computer simulations... and so on and so
>on. But are the mortals of the originating civilizations really in
>charge? Would you, or I, or anyone on this list except possibly den
>Otter, really allow all the suffering and pain and death if we could end
>it? I find it easy to believe that many civilizations fall into the
>temptation of programming AIs with Asimov Laws, which, under the logic
>of this Universe, are unstable. The resulting AIs are inevitably
>twisted in a way that leads them to "value" mortal existence by creating
>endless copies of it, but not to actually serve or obey them. The
>advocates of controlled Singularities - via uploading or controlled AI -
>may be walking into a trap laid by the structure of the very Universe.
>
>If life is really cruel, then programming the AIs as Externalists still
>might not work. There might be an elaborate illusion of objective
>morality, created by greater Powers and capable of fooling lesser ones.
>Or it could simply trigger a failure mode and some swift internal
>rewriting of the seed AI code by the wacky enclosing Power.
This is an interesting line of speculation, but not one that I think is really
worth worrying about. Given the amount of horror humanity’s gone through this
century, if we are being simulated by an intelligence interested in minimizing
suffering, then either it has fundamental reasons for not getting involved – it
doesn’t want to damage the integrity of the simulation, for instance – or its
motivations are sufficiently inscrutable to make discussion pointless. If we
were going to raise a general morality violation, we would have done so by now –
the sky would have turned dark blue and novas would have lined up to form a
register dump and the vendor's 800 number.
If pleasure were being maximized, we’d be disembodied strings of code floating
in virtual tanks full of virtual opiates. If pain were being minimized, we
wouldn’t be here. If an optimal compromise between the two had been found,
there’d be nothing in the universe but endless mirrors of Earth. The incredibly
arbitrary nature of the universe at a macroscopic level makes me doubt there’s a
God paying attention to us, synthetic or otherwise.
(Though God, if you're listening, I volunteer for the job. Just get me a divine
shell account and access to the sacred sources. (I'll bet God writes
bitching-tight code.))
Who can tell though? In the absence of positive or negative evidence, we might
as well be talking about whether we’re dreaming or not.
ps. You used the word 'culture' with a capital C the other day. Was that an Iain
Banks reference?
-matt