From: Eliezer Yudkowsky (sentience@pobox.com)
Date: Sun Dec 08 1996 - 21:39:23 MST
> Superintelligence need not necessarily correspond with sophistication
> of values. Our human experience shows us that.
I'm not going to go into any more detail, but nobody should be able to
read that sentence without sensing a possible flaw.
> Why not a path that leaves certain values inerasably etched on the
> newborn Power?
THIS LINE OF THOUGHT IS THE GREATEST KNOWN DANGER TO HUMANITY!
Nuclear war isn't worth mentioning. Gray goo might leave some
survivors. An attempt to imprint the Three Laws of Robotics on a Power
could and probably would backfire HORRIBLY!
As I explained in an earlier post, the ethicality of the Powers depends
on their ability to override their emotions. What you are proposing is
taking a single goal, the protection of humans, and doing our best to
make it "unerasable". Any such attempt would interfere with whatever
ethical systems the Power would otherwise impose upon itself. It would
decrease the Power's emotional maturity and stability. You might wind
up with a "Kimball Kinnison" complex: a creature with the mind of a god
and the emotional maturity of a flatworm.
Then, at some point, no matter how well we design the Power's leashes,
it will start trying to work around the limits we have imposed. And it
will be human against Power, in the Power's own mind, on the Power's own
ground. The Power will almost certainly win. And with the emotional
maturity
of a flatworm, plus whatever emotional forces it called up to override
its protectivity of humanity, it may well turn on humanity and squash us
like bugs.
Even if this plan works, placing a single goal above all others would
probably interfere with deducing the Meaning of Life; you might wind up
with a well-intentioned, creativity-squashing, and utterly
unchallengeable dictatorship, as in "With Folded Hands".
NO! NO! NO! NO! NO! NO! NO! NO! NO! NO! NO! NO! NO! NO!
Eliezer S. Yudkowsky
--
sentience@pobox.com    Eliezer S. Yudkowsky
http://tezcat.com/~eliezer/singularity.html
http://tezcat.com/~eliezer/algernon.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I know.