From: Billy Brown (bbrown@conemsco.com)
Date: Tue Feb 23 1999 - 08:04:44 MST
Eliezer S. Yudkowsky wrote:
> I suggest that the section:
>
> ==
> In the first case, we could make sure that the values of tolerance and
> respect for human well-being are incorporated as core elements of the
> programming, making it part of an inviolable moral code.
> ==
>
> Be amended to read:
>
> ==
> In the first case, we could make sure that the values of tolerance and
> respect for human well-being are incorporated as core elements of the
> programming, making it part of an inviolable moral code. (However, some
> think this would be a hideous mistake from a <a
> href="http://tezcat.com/~eliezer/AI_design.temp.html#PrimeDirective">programming</a>
> perspective; some also question the morality of such an action.)
The responses so far appear to be generally unfavorable.
I see two fatal criticisms of the idea suggested in the FAQ:
First, it is mind control. Remember, posthumans are by definition fully
sentient beings. Programming them to abide by a preordained moral code is
no different from doing the same thing to our own children, or to each
other. I can see no possible way to justify such an action on moral
grounds.
Second, in the long run it won't work. You can program non-sentient
machines to do your bidding, but that isn't what the FAQ is talking about.
It suggests that we attempt to enforce an eternal program of mind control on
fully sentient posthumans - for those of you who have forgotten what that
means, I suggest you read the definition of 'Posthuman' in the same
document. Does anyone really think such a prohibition would work on humans
for all of eternity? Then why do we think we can do it to an entire society
of superintelligent, self-modifying entities?
I suggest that this passage be amended to remove the advocacy of mass mind
control. Perhaps something like this:
In the first case, we could make sure that the first such entities possess
a thorough understanding of, and respect for, existing human moral codes.
That would be enough to ensure that they at least think the whole thing
through before adopting some completely alien viewpoint. It is also the
strongest measure that I can see either a moral or a practical basis for.
Billy Brown, MCSE+I
bbrown@conemsco.com