From: hal@finney.org
Date: Sat Jul 07 2001 - 16:15:51 MDT
Chris McKinstry writes:
> Just one more point before I go off to read your 'Friendly AI'... if
> Kurzweil is right, and in the future I can scan my personality into a
> computer, the event will create an instant conflict simply because the
> copied version of myself will fight to the death not to be turned off by
> the original version.
I don't see why that follows. The fact that one of you has power over the
other, even the power of life and death, does not inherently mean that
the two of you must come into conflict.
There are many people in the world whom you could kill right now if you
wanted to. Just buy a gun and you could shoot them; you don't even need
a gun if it's a little kid. Does this create an instant conflict between
you and everyone else? It does not. You would not exercise this power;
you would not wantonly kill. Most people are good and moral, and so we
do not live in a constant state of fear and conflict.
In the same way, the computer self need not fear that the human self
will suddenly turn murderous. And similarly, if the computer self
eventually came into power (say, it became rich by running super-fast
and outcompeting humans, and could in theory use its wealth to hire
assassins), the human need not fear the computer.
Hal