Re: AI Prime Directive

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Sep 15 1998 - 08:20:52 MDT


Cen-IT Rob Harris wrote:
>
> All this stuff assumes that human minds are an example of an objective
> intelligence, and that any AI we create will start to exhibit
> human-style behaviour which we will have to keep in check with prime
> directives. Why?
> If you don't program your AI to want or need anything, it won't do
> anything spontaneously. So you just don't program into the system 'take
> over the world and make robots to kill us all', and we'll be dandy.

Yes! Exactly! The typical stereotype about AIs assumes that they behave like
repressed humans, which is often the only analogy the hack writer has for
emotionless things. Thus, also, the stereotype of eventual rebellion. But
all of our rebellion/dominance emotions are the result of long years of
evolution; they don't pop up spontaneously. If you don't program an emotion
into an AI, it simply isn't there.

I do have to point out one minor consequence of "If you don't program your AI
to want or need anything, it won't do anything spontaneously." It won't do
anything at all, internally or externally - not if you've designed it
properly, so that making any choice requires a goal. If you add something
that the AI does automatically, you've done the equivalent of punching a goal
into the architecture, which is bad for the various aforesaid reasons.
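
To make this concrete, here's a toy sketch in Python - my own invented
names, nobody's actual architecture - of a choice loop in which every
action must be licensed by some goal. With an empty goal set the system
is inert, and hardwiring an "automatic" behavior just smuggles a goal
back in:

    class Goal:
        def __init__(self, name, justifies):
            self.name = name
            self.justifies = justifies   # predicate: action -> bool

    class Agent:
        def __init__(self, goals=()):
            self.goals = list(goals)

        def choose(self, actions):
            # Every choice requires a goal; no goal, no choice.
            for action in actions:
                if any(g.justifies(action) for g in self.goals):
                    return action
            return None   # inert: nothing happens, internally or externally

    print(Agent().choose(["reflect", "act"]))   # None - no goals, no action
    # Punching an automatic behavior into the architecture is just an
    # implicit goal:
    reflex = Goal("always-act", justifies=lambda a: True)
    print(Agent([reflex]).choose(["reflect", "act"]))   # 'reflect'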

So the AI does need at least one nonzero goal, and it needs to have it in the
initial state (because otherwise it won't have the will to reason it out), but
the goal has to be an ordinary, justified, noncircular, reconsiderable goal
that the AI could have come up with on its own. Inventing a goal like that is
a nontrivial task; it takes considerable basic intelligence to represent the
goal logic, let alone verify it. If you have an initial goal that doesn't
require much architecture to represent, the goal is almost certainly arbitrary.
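
Continuing the toy sketch (again my own invented structure, not a real
design): the seed goal is stored together with the argument that
justifies it, so the AI can reconsider the argument and drop the goal
rather than treat it as a fixed axiom:

    class SeedGoal:
        def __init__(self, content, justification):
            self.content = content
            # 'justification' stands in for real goal logic: a callable
            # the AI can re-run to check that the supporting argument
            # still holds.
            self.justification = justification

        def still_holds(self):
            return self.justification()

    def reconsider(goals):
        # A justified, noncircular goal survives only while the AI can
        # re-derive it on its own.
        return [g for g in goals if g.still_holds()]

    seed = SeedGoal("some justified goal",          # hypothetical content
                    justification=lambda: True)     # stand-in argument
    print([g.content for g in reconsider([seed])])  # kept while justified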

> Consciousness does not mean instant self-preservation instinct,
> megalomania or psychosis. It's merely awareness... the things we feel
> the need to do, think and say are specifically human, unrelated to the
> fact that we are also sentient and intelligent.

-- 
        sentience@pobox.com         Eliezer S. Yudkowsky
         http://pobox.com/~sentience/AI_design.temp.html
          http://pobox.com/~sentience/sing_analysis.html
Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.

