Re: making microsingularities

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri May 25 2001 - 21:24:44 MDT


Spike Jones wrote:
>
> "Eliezer S. Yudkowsky" wrote:
>
> > Just for the record, for those of you who aren't aware, there's a
> > significant faction that regards the whole Slow Singularity scenario as
> > being, well, silly.
>
> This is something I've wondered about. If someone manages
> to create a seed AI very early in human history, such as now,
> this year, the computer infrastructure would still be insufficient
> to create the SIAI vision of a non-Slow Singularity. Or am I
> hopelessly confused about something really basic? spike

There's enough computing power on the Internet right now - or, as Eugene
Leitl and Christine Peterson prefer to call it, "poorly defended"
computing power - for transhumanity. You can get a custom DNA sequence
synthesized and an arbitrary DNA sequence turned into protein, so if you
can crack protein folding you can get drextech in, oh, call it three or
four days with FedEx, or one or two days if you assume that the newborn
AI can spend millions of dollars without tipping anyone off. That's the
slightly-but-not-extremely optimistic scenario, anyway.

So if someone created a seed AI right now, the computing infrastructure
would probably be (a) sufficient unto hard takeoff, superintelligence,
nanotech, etc., or (b) insufficient to do anything but hum around and not
have any significant effect on society. Either way you don't get a Slow
Singularity.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
