From: hal@finney.org
Date: Thu Nov 25 1999 - 11:47:20 MST
Greg Burch writes:
> Eliezer. Just so that it's clear, are you saying that there is no question
> in your mind that letting an SI run human affairs is preferable to any
> arrangement of society humans might work out on their own?
I believe Eliezer is working from a model where there is something
called Absolute Morality, and SIs are smart enough to figure it out and
observe it. Then, by definition, anything the SIs do is "better" than
whatever humans would work out. Even if the SIs wipe out the human race
and destroy all life in the universe, that was "good" by the standards
of Absolute Morality.
In this model, SIs do not have goals of their own, hence there is no
danger that they will be selfishly motivated and take actions that
benefit themselves at the expense of humans. In following the tenets
of Absolute Morality they will do whatever is Right.
I'm not sure whether his argument works if there is no such thing as
Absolute Morality.  In that case it seems that there is a risk that SIs will
develop their own goals (just as we do) and that their actions will not
be beneficial to the human race.
The worst outcome would be if the SIs are programmed by Eliezer to have
as their only goal the search for the Holy Grail, that is, Absolute
Morality. However, smart as they are, they still haven't found it.
They have to be smarter. And to do that they have to turn all available
mass into SI computational elements, which means, regrettably, wiping out
the human race.  Then, at the end of a millennia-long development effort
that consumes half the galaxy and reaches realms of abstraction we can't
begin to imagine, they finally decide that there is no Absolute Morality.
So they all commit suicide. Oops.
From the point of view of those of us who don't believe in Absolute
Morality, Eliezer's program amounts to building an unachievable goal
into the SIs, a highly dangerous proposition and one we might well oppose.
Hal
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:05:51 MST