From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Aug 20 2002 - 21:47:41 MDT
Colin Hales wrote:
> The AGI researchers? Here they are,
> the Mindsmiths, from which some combination of talents the AGI may ensue, as
> far as I can tell, in no particular order:
>
> 3) Eliezer, http://www.optimal.org/
> If you want to judge where the AGI component of the singularity is coming
> from here's the layman's 'AGI litmus test', IMO: If the AGI wannabe lifts
> hands over the keyboard to write one line of code that will be part of the
> AGI 'final runtime program', then they have failed. Think of it this way: A
> cake recipe, to an appropriately trained human, can represent a cake really
> well to that human. _But it's not the cake_. The recipe becomes a cake when
> the human is there. The progress is slow because most seem bent on recipes
> instead of the cake, and they don't know it. We have to make cakes, not
> recipes.
Did you leave out a negative in one of those sentences? I can't parse the
reasoning as it stands.
--
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:16:18 MST