From: den Otter (neosapient@geocities.com)
Date: Sun Dec 06 1998 - 12:12:34 MST
----------
> From: Eliezer S. Yudkowsky <sentience@pobox.com>
> Look, these forces are going to a particular place, and they are way, way,
> waaaaaayyy too big for any of us to divert. Think of the Singularity as this
> titanic, three-billion-ton truck heading for us. We can't stop it, but I
> suppose we could manage to get run over trying to slow it down.
Or you could try to hop aboard and grab the wheel. Get very rich,
focus your efforts on uploading, and there's a considerable chance
that you *cause* the Singularity and become its God. Difficult, yes,
but certainly not impossible.
> > Plus: whether it's moral or not, we would want to make
> > sure that they are kind to us humans and allow us to upload.
>
> No, we would NOT want to make sure of that. It would be immoral. Every bit
> as immoral as torturing little children to death, but with a much higher
> certainty of evil.
Obviously AI is *not* the smart way to cause a Singularity. Placing
yourself at the mercy of your creation (or anyone, for that matter) is
never a good idea. The only value of AI research is that it may help
us better understand the mechanisms of uploading.