Re: Yudkowsky's AI (again)

From: Randall Randall (wolfkin@freedomspace.net)
Date: Thu Mar 25 1999 - 13:20:09 MST


On Thu, 25 Mar 1999, Eliezer S. Yudkowsky wrote:
>den Otter wrote:
>>
>> Not necessarily. Not all of us anyway.
>
>The chance that some humans will Transcend, and have their self
>preserved in that Transcendence, while others die in Singularity - is
>effectively zero. (If your self is preserved, you wouldn't kill off
>your fellow humans, would you?) We're all in this together. There are
>no differential choices between humans.

Heh. Tell that to any number of States, mafias, and other criminal
organizations, some of which (the US govt, e.g.) are already interested
in some of the enabling tech.

*I* would not kill off anyone, but there are lots of people whom I would
expect to do so.

--
Wolfkin.
wolfkin@freedomspace.net | Libertarian webhost? www.freedomspace.net
On a visible but distant shore, a new image of man;
The shape of his own future, now in his own hands. -- Johnny Clegg.

