Re: Practical Cosmology Symposium--Five Papers Now Online

From: spike66 (spike66@attbi.com)
Date: Sun Jun 16 2002 - 23:53:54 MDT


  Eugen Leitl wrote:

>On Wed, 12 Jun 2002, spike66 wrote:
>
>>It is not clear to me why one cannot outrun a singularity, given a
>>few years head start. One might escape by the lack of I/O devices:
>>the AI could not get into your ship for the same reason that the
>>chip running your car's engine doesn't get viruses.
>>
>
>Yes, that's trivial, but you're stuck in interstellar space. All the solar
>systems are taken long before you can travel there. Sooner or later you
>either run out of juice or a random drone runs into you, and eats you.
>
These are all possibilities, but we left out several possible fates of
life on planet earth. The singularity is only one such possibility. It
might turn out that, in general, military use of nanotech precedes
a singularity, or for that matter bioterrorism does.

Gene, I am stuck thinking there is some unknown and unforeseen
damping mechanism that either prevents an all-out singularity or greatly
delays its onset. Some singularity thinkers don't take that possibility
seriously enough to even respond to my posts on the topic,
but I think it can happen. Every other open-loop growth process
that I know of turns out to have unforeseen damping mechanisms.
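The contrast I have in mind is the one between exponential and logistic growth. A toy sketch (my own illustration, not anything from the thread; the growth rate and carrying capacity are made-up numbers) shows how even a small damping term tames an otherwise open-loop process:

```python
# Illustrative sketch: an "open loop" exponential growth process versus
# the same process with a damping (carrying-capacity) term, as in the
# logistic equation. Parameters r and k are arbitrary, for illustration.

def exponential_step(x, r):
    """Open-loop growth: the increment is proportional to x alone."""
    return x + r * x

def logistic_step(x, r, k):
    """Same growth rate, but damped as x approaches the capacity k."""
    return x + r * x * (1.0 - x / k)

x_open = x_damped = 1.0
r, k = 0.5, 100.0

for _ in range(40):
    x_open = exponential_step(x_open, r)
    x_damped = logistic_step(x_damped, r, k)

print(f"open loop after 40 steps: {x_open:.3g}")   # grows without bound
print(f"damped after 40 steps:    {x_damped:.3g}") # levels off near k
```

The two curves are indistinguishable at first; the damping only becomes visible as the process approaches its (unforeseen) limit, which is the point being made above.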

As was suggested by a young lady (whose name I do not know)
at Eliezer's pitch at the last nanoschmooze in May, it could be that a
pre-singularity AI would spawn a virus gun that began barfing
worms everywhere, spewing viruses faster than the AV software
could respond, gumming up the works, perhaps even taking
down the internet.

A spacecraft fleeing bioterrorism *might* make it to the
next star a couple thousand years later, find nothing but
uninhabited, inviting planets, and never hear a word from distant
earth.

spike



This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:14:51 MST