From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Sun May 23 2004 - 17:06:29 MDT
Eliezer Yudkowsky wrote:
>
> I am not saying that you will end up being stuck at your current level
> forever. I am saying that if you tried self-improvement without having
> an FAI around to veto your eager plans, you'd go splat. You shall write
> down your wishlist and lo the FAI shall say: "No, no, no, no, no, no,
> yes, no, no, no, no, no, no, no, no, no, yes, no, no, no, no, no." And
> yea you shall say: "Why?" And the FAI shall say: "Because."
>
> Someday you will be grown enough to take direct control of your own
> source code, when you are ready to dance with Nature pressing her knife
> directly against your throat. Today I don't think that most
> transhumanists even realize the knife is there. "Of course there'll be
> dangers," they say, "but no one will actually get hurt or anything; I
> wanna be a catgirl."
Just in case it is not clear, I do not think I am grown enough to mess with
my own source code. At best I am grown enough to be scared away on my own,
not just because an FAI tells me it would be a bad idea.
-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence