From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Nov 27 1999 - 16:45:32 MST
Spike Jones wrote:
>
> Eliezer S. Yudkowsky wrote:
>
> > http://www.eurekalert.org/releases/corn-bmo112399.html
> >
> > ...We're all going to die.
>
> Eliezer, let me make sure I understand your theory. If humans
> develop nanotech before the singularity, then the notion is that
> it will get away from us by some means, and we all die from
> grey goo?
I'm not worried about grey goo. Any half-sane designer should be able
to build systems that simply aren't vulnerable to the problem.
What I'm worried about is nanowar. Now that I'm finally studying
history (I've read _The Guns of August_ on WWI and am working on _The
Rise and Fall of the Third Reich_ on WWII), I'm becoming more and more
convinced that the weapons *will* be developed, and, having been
developed, *will* be used. Real wars, not the little pretend wars we
have nowadays, are fought with every possible tool.
> But if the singularity comes first, then the resulting
> AI develops nanotech and we [in some form] have a fighting
> chance of survival?
Yep.
> The notion sounded absurd to me at first, but I must admit
> it grows on one with consideration. spike
Yep.
-- 
sentience@pobox.com              Eliezer S. Yudkowsky
http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak          Programming with Patterns
Voting for Libertarians   Heading for Singularity   There Is A Better Way