Re: the few, the proud...

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Dec 02 1998 - 00:35:35 MST


Spike Jones wrote:
>
> Eliezer S. Yudkowsky wrote:
>
> > I, incidentally, am a fanatic Singularitarian. I want something smarter and
> > more powerful and more ethical than I am to exist; I don't care how, or what
> > happens to the human race or myself afterwards...
>
> eliezer, you are not alone. i suspect that desire for something greater is
> hardwired into our brains by way of genetics.

Spike, you're talking to someone with Web pages about evolutionary psychology.
I appreciate the thought. However, I have done my best to filter
politicosocial imperatives out of it. What I have set out to do has a
significant probability of annihilating the human race, and I know it. You'd
better believe I'm not in it from some vague desire to be part of something
big. I grew up knowing I was a genius and knowing that I could almost
certainly be part of something big if I played my cards right. I am not, and
never have been, acting on an insignificance complex. I'll admit that it
was pride and the desire for acknowledgement that figured in my fantasies,
back when I thought I could have my cake and eat it too. But now...

I am not acting on wishes. I do not at this time project that the Singularity
will result in the gratification of any of the desires that initially
motivated me: not pride, not the acknowledgement of greatness, not the fun,
and probably not even the knowledge of success. The personal drama that once
captivated me is irrelevant. I am acting on logic.

Now, I happen to think that even from humanity's perspective, a rapid
Singularity is the best way to go, because I don't see a feasible alternative.
At this point, the only way to prevent a Singularity is to blow civilization
back to the Stone Age or wipe humanity out entirely via grey goo. Either way,
everyone now living will die. The only way that any of us
can "not die" is through a friendly Singularity. If that's impossible, well,
at least our deaths will be the ethically correct outcome. Sooner or
later human civilization will perish or go through a Singularity. This I
guarantee. What can you possibly accomplish, for yourself or for anyone, by
delaying it?

But that could all be rationalization. It's not the reasoning I use. I wish
to do the right thing, and the question of what is right is resolved by
intelligence; thus I am required to create a higher intelligence to accept
orders from.
My allegiance is to the Singularity first, humanity second, and I'm tired of
pretending otherwise to myself. If you want safeguards, if you want to
negotiate using humanity's ability to prevent a Singularity by exterminating
itself, if you would like humanity to be a game-theoretical partner whose
interests must be considered, then get a non-hypothetical negotiating position
and say so.

-- 
        sentience@pobox.com         Eliezer S. Yudkowsky
         http://pobox.com/~sentience/AI_design.temp.html
          http://pobox.com/~sentience/sing_analysis.html
Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.

