Re: making microsingularities

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon May 28 2001 - 20:41:04 MDT


Samantha Atkins wrote:
>
> "Eliezer S. Yudkowsky" wrote:
> >
> > For the record, for those of you who aren't aware, there's a
> > significant faction that regards the whole Slow Singularity scenario as
> > being, well, silly.
>
> But the fast path assumes a lot of "friggin magic". It assumes
> that we can build not only a true AI but an SI and instill it
> with or have it develop wisdom and Friendliness.

Not at all. The fast path says that, after you build a certain threshold
level of AI, it flashes off to superintelligence in relatively short
order. The slow path says that you achieve human-equivalent AI and then
it basically hangs around in the same place for the next fifty years. So
the word "silly" is being used advisedly, as shorthand for "screamingly
anthropomorphic" or "as seen on Star Wars". As for Friendliness, I don't
see the connection.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


