From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Mar 14 2000 - 21:48:35 MST
"Robert J. Bradbury" wrote:
>
> That is interesting information. So the pressure will be on for Eli
> in May. Even in disguise, I think we will recognize the memes
> unless he is totally silent (and somehow I don't expect that to
> be the case...) :-)
Whoa! Let's be clear about one thing: I am not showing up at this
conference to talk about AI. I am showing up at this conference to talk
about the Singularity, a subject on which, as far as I know, I have
written more - and more specific - material than anyone else in the
world. I shouldn't need to touch on the subject of AI-as-science except
to talk about (A) self-improvement curves and (B) goal systems, both of
which I've been known to analyze in quantitative terms. If anyone wants
to challenge my credentials as a Singularitarian, they can check my
qualities against the Singularitarian Principles, which I wrote, or ask
about it on the Singularitarian mailing list, which I moderate.
--
sentience@pobox.com    Eliezer S. Yudkowsky
http://pobox.com/~sentience/beyond.html
Member, Extropy Institute
Senior Associate, Foresight Institute