Re: Principle of Nonsuppression

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Sep 02 1999 - 12:46:53 MDT


mark@unicorn.com wrote:
>
> Eliezer S. Yudkowsky [sentience@pobox.com] wrote:
> >Once Zyvex has nanotechnology,
> >I'd be fully in favor of their immediately conquering the world to
> >prevent anyone else from getting it. That's what should've been done
> >with nuclear weapons. If the "good guys" refuse to conquer the world
> >each time a powerful new weapon is developed, sooner or later a bad guy
> >is going to get to the crux point first.
>
> You know what scares me the most about the future? All these control freaks
> and their desire to take over the world to protect themselves from the "bad
> guys"; Eliezer and den Otter the most obvious proponents on this list. We
> must all support the "good guys" in taking over the world and introducing
> their global surveillance utopia while we await The Coming Of The Glorious
> Singularity!

That's pretty much the idea, though bear in mind that what I *want* is
AI: no nanotechnology, no global surveillance necessary. Also bear in
mind that I have absolutely no interest in any form of nanotech-inspired
social power, except the ability to prevent others from using nanotech.
With violence, there's a stable Libertarian solution where people retain
the capacity to use violence, and gang up on anyone who initiates force.
In a nanotech world, the first person to use force can conceivably wipe
out all the others. It seems to me that whoever first gets nanotech has
the responsibility to prevent the proliferation of nanotech by any means
necessary; otherwise, it'll simply spread until it touches someone who's
willing to hold the world hostage for personal benefit.

Unfortunately, in practice, whoever gets nanotechnology first *probably*
isn't going to get full-scale Drextech without fairly good AI and a lot
of software written in advance, which *could* be the case but
probably won't be. That precipitates us directly into the situation
outlined in "Molecular Nanotechnology and the World System" - one of the
best reasoned, most inexorable, and bloody depressing things I've ever
read - in which the most I can hope for is that a few people manage to
evacuate the planet before things go completely to hell, and finish
building a seed AI in sealed globes orbiting the ashes of Earth.

http://www-bcf.usc.edu/~tmccarth/main.htm

> Look Eliezer, we know you're a rabid Singularitarian,

Why, thank you.

> but to those of us who
> actually work on developing advanced hardware (my employer designs chips at
> least as complicated as anything coming out of Intel) the idea that we'll
> have this new technology appear and then in a few days we'll be surrounded
> by nanotech death machines and massively intelligent AIs is blatantly absurd.

The faster nanotechnology progresses, the less risk there is. From my
perspective, anyway.

> Building hardware at nanoscales is difficult enough, but the software is
> way, way behind; there are features we've had in our chips for years which
> are only just coming to be used by applications, and developers aren't even
> beginning to use the computing power we're giving them in anything but the
> most simplistic ways. No matter how powerful the hardware, the software will
> be a long time coming, even with neural nets or genetic programming.

Yes, the state of software programming is simply hideous. I'll probably
have to do something about it just to have the proper support base for
an AI. But don't worry, I've got a plan.

> Maybe if you took a more realistic view of how technology is really developing
> and how it's likely to develop in a nanotech future, you wouldn't be so
> scared that you're willing to destroy the Earth in a nanotech war in order
> to prevent one.

Again - I am scared of nanotechnology in *inverse* proportion to how
fast it progresses once it develops.

-- 
           sentience@pobox.com          Eliezer S. Yudkowsky
        http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way

