From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Fri Jun 04 2004 - 09:42:26 MDT
Maybe I'll write a more detailed reply to this later, but:
1) Those who dance often seem insanely self-confident to those who hear
not the music. Michael Wilson does strike me as overconfident, and I am
diligently correcting his remarks. I regard myself as moving to the beat
of a partially formed technical theory that is different in kind from the
wild guessing of my earlier days. For this reason must I sadly refuse many
of the brilliant "intuitive" insights that people seem eager to offer me.
I'm sorry if it seems that I think I know vastly more than you do, but
that's exactly what I think. The arrogance is useless and irrelevant,
though not particularly harmful, and a part of my present-day self, whoever I
might wish to be. But someone has to acquire considerably greater competence
than all this wild guessing affords, and that is what I am in the process of doing.
2) If SIAI does not scare the crap out of you, you don't take us
seriously. SIAI has scared the crap out of me since late 2000, when I first
realized it was theoretically possible for someone to screw up the
Singularity. For that matter, we should scare the crap out of you even if
you think we have a 100% probability of success on FAI.
3) If [AGI Project X] does not scare the crap out of [AGI Researcher X],
then either he doesn't believe in himself, he's still processing the potentially lethal
challenge using the part of the brain that handles movies and philosophy
books, or he hasn't realized it's possible to screw up.
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence