Re: Singularity?

From: Matt Gingell (mjg223@is7.nyu.edu)
Date: Thu Sep 02 1999 - 13:22:46 MDT


----- Original Message -----
From: Eliezer S. Yudkowsky <sentience@pobox.com>

>See http://pobox.com/~sentience/AI_design.temp.html [343K]
>
>> Where does your 2020 figure come from? (I'm afraid I
>> don't know how to expand the 'CRNS' qualifier.)
>
>Current Rate No Singularity. The 2020 figure is how long I think it'll
>take for AI researchers to think their way out of a cardboard box if I'm
>not running things. I'd *like* to have it done by 2005.

I skimmed your document and, with all due respect, I do not see that your model,
as I understand it, differs significantly from classical AI. You have a number
of modules containing domain-specific knowledge mediated by a centralized world
model. This is a traditional paradigm. The macro-scale self-improvement you
envision is not compelling to me -- if you've written a program that can
understand and improve upon itself in a novel and open-ended way, then you've
already solved the interesting part of the problem.

Could you identify the cardboard box you think AI research is stuck in, and what
you'd change if you were in charge? (You have 5 years...)

-matt
