Re: Hal Finney: "Re: Some questions on the Extropy Institute philosophy..."

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Mar 20 2002 - 12:26:32 MST


John B wrote:
>
> The point of all this is that as technology
> advances, our grasp of it decreases - the
> singularity approaches. And rather than some
> mystical event occurring where humanity suddenly
> understands technology much faster (okay, not
> mystical, but potential IA or AI technology isn't
> something we can build TODAY, and we have no real
> conception of its limitations), we just keep
> falling further behind, perhaps to the point where
> we can no longer adapt to our technological changes.

The Singularity *is* IA and AI, by definition. The Singularity is the point
in time when technology improves on human intelligence, and the subsequent
breakdown of our current models. This is Vinge's original definition. The
latecomer "acceleration" version of the Singularity is a noncanonical mutant
spinoff meme.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


