Spike Jones wrote:
>
> Eliezer S. Yudkowsky wrote:
>
> > http://www.eurekalert.org/releases/corn-bmo112399.html
> >
> > ...We're all going to die.
>
> Eliezer, let me make sure I understand your theory. If humans
> develop nanotech before the singularity, then the notion is that
> it will get away from us by some means, and we all die from
> grey goo?
I'm not worried about grey goo. Any half-sane designer should be able to build systems that simply aren't vulnerable to the problem.
> But if the singularity comes first, then the resulting
> AI develops nanotech and we [in some form] have a fighting
> chance of survival?
Yep.
> The notion sounded absurd to me at first, but I must admit
> it grows on one with consideration. spike
Yep.
--              sentience@pobox.com          Eliezer S. Yudkowsky
          http://pobox.com/~sentience/tmol-faq/meaningoflife.html
Running on BeOS           Typing in Dvorak         Programming with Patterns
Voting for Libertarians   Heading for Singularity  There Is A Better Way