RE: Singularity: Just Say No!

From: Billy Brown (bbrown@conemsco.com)
Date: Mon Jan 18 1999 - 08:06:01 MST


Chris Wolcomb wrote:
> However, that is a far cry from the rhetoric on this list in
> regard to the Singularity as it is often proposed and
> proselytized. The more rabid Singularitarians seem to take
> pride in their Singularity's delightful ability to render
> everything that we are irrelevant. Rather than a future
> where we are enhanced into more comprehensive minds, using
> your reptilian/mammalian metaphor, we are just as likely to
> be fully *erased* or *deleted* in the Singularity's path to
> greatness.

IMO, an 'instant singularity' precipitated by the sudden appearance of an SI
is not very likely. However, the factors that determine whether it happens
are not under human control. The outcome depends on the answers to a number
of questions about natural law (like: How hard is it to increase human
intelligence?). If the answers turn out to be the wrong ones, the first AI
to pass a certain minimal intelligence threshold rapidly becomes an SI. If
they don't, we have nothing to fear. The only thing we can do that makes
much difference is to make sure our seed AIs are sane, in case one of them
actually works.
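
To make that conditional concrete, here is a toy model of my own (a cartoon
of compound growth, not anyone's actual forecast): treat "how hard is it to
increase intelligence" as a single return-on-improvement parameter. If each
increment of intelligence buys enough capability to find the next increment
quickly, the feedback loop races past any threshold you care to name; if the
returns are small, nothing dramatic happens within any horizon we care about.
The function name, the threshold, and the gain values are all made up for
illustration.

    # Toy model of a seed AI's self-improvement loop (hypothetical
    # illustration only; 'gain' stands in for the unknown answer to
    # "how hard is it to increase intelligence?").
    def takeoff(gain, start=1.0, threshold=1000.0, steps=100):
        """Return True if ability crosses the threshold within the
        horizon, i.e. a runaway 'instant singularity' in this cartoon."""
        ability = start
        for _ in range(steps):
            ability += gain * ability   # each step's improvement scales with current ability
            if ability >= threshold:
                return True
        return False

    print(takeoff(gain=0.2))    # returns compound quickly -> rapid takeoff
    print(takeoff(gain=0.01))   # improvement too slow -> no takeoff in the horizon

The point of the sketch is only that the qualitative outcome hinges on a
parameter nature has already fixed, not on anything we decide.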

I'm currently writing a more detailed analysis of the whole issue, in hopes
of outlining what all of the critical questions are.

Billy Brown, MCSE+I
bbrown@conemsco.com


