Re: Major Technologies

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Jan 05 1999 - 19:14:47 MST


Billy Brown wrote:
>
> Eliezer S. Yudkowsky wrote:
> > Nanotechnology is a wild card that could stay unplayed or enter the game
> > at any time. Nanotech's first applications will be entirely
> > destructive. The researchers at Zyvex or Foresight will naively release
> > the information and someone will reduce the Earth to grey goo.
> <snip>
> > Most probable kill: Grey goo; nuclear war provoked by a
> > nanotech threat.
>
> Nobody does pessimism like a countersphexist, hmm? We could argue all day
> about the potential for gray goo, but I can at least assure you that the
> Foresight people don't take it lightly. They've put a good bit of thought
> into how to avoid it, and I expect they will continue to do so.

With all due respect to the people at Foresight, active shields are a
pipe dream. I don't dispute that they might be able to defend against
malfunctioning assemblers, but military red goo will win every single
time. The only halfway sane analysis I've seen is "Nanotechnology and
International Security", which lists many of the major destabilizing
factors. Nanotech is far more destructive than nuclear weapons, can
be developed with less visible lead time, has better first-strike
capability, and offers a greater payoff for actually being used. MAD
won't work: deterrence depends on a survivable second strike, and a
weapon this well suited to a first strike takes that guarantee away.
Threatened nuclear powers may decide to strike first. So the world
blows up. That is the logical consequence.

> As far as the puny stuff goes, nukes won't end civilization. This is a myth
> perpetuated by people who haven't studied the numbers. It would be feasible
> to build enough high-yield weapons to do the job, but even at the height of
> the cold war we never came close to doing it. Today, the best we could do
> would be to knock ourselves back to a pre-WWII industrial base for a couple
> of decades. The death toll would be huge, but we would still end up with a
> Singularity.

That's reassuring, but the question is whether we would stand a better
chance of survival the second time around. My guess is that the
resource base for nanotech would pop back up faster than the Internet,
and that any major technophobia would strike at nondestructive IE
(intelligence enhancement) more than at destructive nanotech. Stupid,
but so what?

I should like to ask everyone, regardless of their personal evaluation
of the effect of a nuclear war, and regardless of their personal
preference with respect to the Singularity, not to deliberately start a
nuclear war.

> > Humanity's primary hope of survival lies in a quick kill via AI, and the
> > best way I see to do that is an Open Source effort on the scale of
> > Linux, which I intend to oversee at some point. Some IE via
> > neurohacking may be developed fast enough to be decisive, and the
> > existing Specialists (such as myself) may be sufficient.
>
> Where do I sign up? You've seen my own projection by now - I want to make
> sure that if you get hit by a truck halfway through the project, the damn
> thing still has a decent chance of being sane.

You sign up a few years from now, actually; I'm still trying to build
the resource base. I won't ask for extra time, even though I'm only
19, because Zyvex won't wait; but I still think a brief detour may be
my fastest route to the target.

Sanity is more a matter of not doing stupid things than of taking
specific precautions, and I believe I put down most of the major
precautions in _Coding_. If I get hit by a truck, you can probably
build a _sane_ seed just by referring to the Page.

-- 
        sentience@pobox.com         Eliezer S. Yudkowsky
         http://pobox.com/~sentience/AI_design.temp.html
          http://pobox.com/~sentience/sing_analysis.html
Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.

