Re: Keeping AI at bay (was: How to help create a singularity)

From: Eugene.Leitl@lrz.uni-muenchen.de
Date: Tue May 01 2001 - 06:38:22 MDT


Damien Broderick wrote:

> How will we know when we're ready, 'gene? What does it *mean* to be `ready'
> in this case?

What I mean by `ready' is that we have reduced our vulnerability.

There are reasons to suspect that AI is easier to do than an upload,
and that it will emerge relatively explosively, maybe even in a
catastrophic fashion: the man-made Blight scenario.

The reasoning goes as follows: the capabilities of hardware progress
much faster than the capabilities of orthodox (human) software design.
This causes a growing underutilization of hardware, exacerbated by
the impending restructuring of computer architectures towards
fine-grain reconfigurable computing, the advent of molecular
electronics, and the world-wide deployment of such hardware, well
interconnected by photonically switched networking, while running
buggy, bloated pieces of man-made code. A brain the size of a planet,
potentially.

The moment somebody creates a darwin-in-the-machine method that
utilizes the above infrastructure much, much better, uses it to breed
an AI core, and then, deliberately or accidentally, releases the
thing into the network, we've got a large problem on our hands. Very
soon, we're no longer in control. Soon after, we might be dead, as a
simple side effect of the new ecology's metabolism. It probably
happened before; it's just that the molecular moieties kicking the
bucket were incapable of reflecting on their fate while busily being
restructured by the new kids on the block.
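
For concreteness, here's a minimal sketch in Python of the kind of
darwinian loop meant above. Everything in it is a hypothetical
stand-in: a real attempt would encode configurations of the
reconfigurable hardware as genomes and measure fitness on the
hardware itself, not score bit-strings with a toy function.

import random

# Toy "darwin-in-the-machine" loop: evolve bit-string genomes against
# a fitness function by mutation, crossover, and truncation selection.
# In the scenario above, a genome would configure a slice of
# fine-grain reconfigurable hardware, and fitness would be measured
# in situ.

GENOME_LEN = 64
POP_SIZE = 100
MUTATION_RATE = 0.01

def random_genome():
    return [random.randint(0, 1) for _ in range(GENOME_LEN)]

def fitness(genome):
    # Hypothetical stand-in: reward the density of set bits.
    return sum(genome)

def mutate(genome):
    # Flip each bit independently with small probability.
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

def crossover(a, b):
    # Single-point crossover of two parent genomes.
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

population = [random_genome() for _ in range(POP_SIZE)]
for generation in range(200):
    # Keep the fitter half, breed mutated offspring to refill the pool.
    population.sort(key=fitness, reverse=True)
    parents = population[:POP_SIZE // 2]
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

print("best fitness:", fitness(max(population, key=fitness)))

The toy is beside the point; what matters is that the loop is
embarrassingly parallel, so every idle, networked node can evaluate
candidates. That's exactly what makes the underutilized planet-sized
brain above so attractive as a breeding substrate.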

In contrast to this, an upload is hard. The brain doesn't have
a clean, abstractable architecture from a human point of view.
So you're forced to do neuronal emulation, plugging in
neuroanatomy and computing its dynamics at a rather low
level of theory. This makes it slow and power hungry. Many
orders of magnitude separate such a first target upload from
an AI running on the same substrate, because the AI has evolved
a highly efficient encoding and means of processing on that
substrate. It hugs the substrate very closely, instead of being
cloaked in layer upon layer of legacy virtual machines, which
go back to what people designed in the middle of the last
century.
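
To make "a rather low level of theory" concrete: even the cheapest
point-neuron model, a leaky integrate-and-fire unit like the sketch
below, must be stepped numerically at sub-millisecond resolution.
Parameters here are illustrative, not physiological.

# Leaky integrate-and-fire neuron: about the cheapest "low level of
# theory" there is. The membrane potential leaks towards rest,
# integrates input current, and emits a spike whenever it crosses
# threshold.

DT = 0.1          # time step, ms
TAU = 10.0        # membrane time constant, ms
V_REST = -65.0    # resting potential, mV
V_THRESH = -50.0  # spike threshold, mV
V_RESET = -70.0   # post-spike reset, mV

def simulate(input_current, n_steps=1000):
    v = V_REST
    spike_times = []
    for step in range(n_steps):
        # Forward Euler step of  dv/dt = (-(v - V_REST) + I) / TAU
        v += DT * (-(v - V_REST) + input_current) / TAU
        if v >= V_THRESH:
            spike_times.append(step * DT)
            v = V_RESET
    return spike_times

# A constant suprathreshold drive yields a regular spike train.
print(simulate(20.0)[:5])

Scale this up to roughly 10^11 neurons with on the order of 10^3
synapses each, updated at that resolution, and you're past 10^16
operations per second before adding any biological realism at all.
That's where the orders-of-magnitude penalty against a
substrate-native AI comes from.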

The situation is not hopeless, because a better encoding can be
developed, using more abstract means such as statespace dynamics
instead of numerically crunching electrochemical spike propagation
along the pieces of squishy stuff which some large, complicated
machinery digitized by disassembling your frozen noggin. The problem
is, we have very little idea how to do it, and it will take a while
before we figure it out; only then will it become an option for
people to be converted into this after their death, or before it, if
we thoroughly debug and streamline the process. As I said, uploads
are hard. It will take a while.

This doesn't mean everybody is going to jump on the train; some
will choose to remain on the platform. But eventually the train
will start to move, unfortunately dramatically restructuring
everything in its wake, the platform included. Unless we choose
to drag the remaining dawdlers and recalcitrants in against their
will, kicking and screaming, they're probably going to die.
It's a tough choice: is their choice informed? Do we have to
respect their wishes, or assume they're just being difficult,
and upload the heck out of them? (Of course, you can still leave
them the choice, telling them: you're a virtual model now; you've
got a day or two to decide whether you like it, and then we'll
give you the option to terminate yourself.)

Convergent evolution says that at this stage we'll become
indistinguishable from the AIs, so we remove the threat of mass
extinction by becoming indistinguishable from the threat, in a more
or less smooth course of development.

This is iffy, because the uploaders who're ahead in the conversion
process might grow impatient and unwilling to wait for all the
dawdlers, in which case we're dead meat, too. All hail the new
master race, the stupid buggers.


