From: Spike Jones (spike66@ibm.net)
Date: Sat Nov 27 1999 - 09:45:36 MST
Eliezer S. Yudkowsky wrote:
> http://www.eurekalert.org/releases/corn-bmo112399.html
>
> ...We're all going to die.
Eliezer, let me make sure I understand your theory. If humans
develop nanotech before the singularity, then the notion is that
it will get away from us by some means, and we all die from
grey goo? But if the singularity comes first, then the resulting
AI develops nanotech and we [in some form] have a fighting
chance of survival?
The notion sounded absurd to me at first, but I must admit
it grows on one with consideration. spike