From: Twirlip of Greymist (phoenix@ugcs.caltech.edu)
Date: Mon Dec 09 1996 - 16:07:35 MST
On Dec 8, 10:39pm, Eliezer Yudkowsky wrote:
} > Super intelligence need not necessarily correspond with
} > sophistication of values. Our human experience shows us that.
} I'm not going to go into any more detail, but nobody should be able to
} read that sentence without sensing a possible flaw.
Certainly. Humans obviously have a wider complex of values than
animals. One might well expect entities 'beyond' humans to have even
more complex values which we can't understand -- in fact, that is
exactly part of the strong Singularity.
On the other hand, I have seen no evidence that human geniuses have more
complex values than other humans. Since I maintain that Powers would be
*in principle* on our level, they needn't be any more ethically complex.
Anyway, having complex values doesn't imply being ethical.
} THIS LINE OF THOUGHT IS THE GREATEST KNOWN DANGER TO HUMANITY!
This statement I consider rather extreme, however.
Merry part,
-xx- Damien R. Sullivan X-) <*> http://www.ugcs.caltech.edu/~phoenix
But 'twas beyond a mortal's share/To wander solitary there:
Two paradises 'twere in one,/To live in Paradise alone.
-- Marvell, "The Garden"
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 14:35:53 MST