From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Mar 29 2000 - 17:34:07 MST
Lee Daniel Crocker wrote:
>
> > Yeah, but that's not realistic. I'm trying to save the world and *I*
> > never got a "threshold of the adventure" scene.
>
> Our hero sits at the keyboard, the phosphors of his monitor showing
> his young features in sharp contrast. Having just completed his work
> "Coding a Transhuman AI", he contemplates for a long while the full
> consequences of posting the document on a public site for all to see--
> including those who might use the knowledge for evil ends. But if
> he doesn't publish, will those who _should_ use this knowledge ever
> come in contact with it? After deliberation, he decides to seal his
> fate. He presses the <Enter> key and watches the moving bar on his
> display track the progress of the upload. The adventure begins...
See, now that's just what I mean! There isn't any of this great,
dramatic indecision and hesitation. The correct choice is usually an
order of magnitude better than any alternative, and obviously so; I can
recall only one or two major decisions in my life when this was not the
case, and neither of them was transhumanism-related. (And hey, if you
make the wrong decision, you make the wrong decision. Obsessing over it
won't help.) I read _Great Mambo Chicken_ when I was eleven, and I did
not "decide" that my life would be about ultratechnology; I simply knew
that it would be.
There was never a point, in all my life, where I could have plausibly
refused the quest.
--
sentience@pobox.com    Eliezer S. Yudkowsky
http://pobox.com/~sentience/beyond.html
Member, Extropy Institute
Senior Associate, Foresight Institute