From: Colin Hales (colin@versalog.com.au)
Date: Wed Mar 06 2002 - 02:11:58 MST
Simon McClenahan.....................
> If an AI is capable of revolt (in the future), then it would be unethical
> to design and enforce a master/slave relationship.
> I admit, when it comes to ethical theories, I'm a new amateur at it so far,
> so feel free to tell me I don't know what I'm talking about.
> I searched for "slavery" in the Internet Encyclopedia of Philosophy and
> found this description of Rule Utilitarianism, which reasons to me that
> slavery is not beneficial to society.
I have pondered this a lot. The model that seems likely _to me, so far_ is
not master/slave and is probably best illustrated by thinking of how we deal
with the intellectually 'challenged' now. As best we can we:
Know they need to be listened to. They have a point of view. We can see that
point of view (their 'big picture') in the context of their lives. We also
know what they do not know. We do what we can to encourage them to be the
best that they can be. We guide without dragging, set limits without
gaoling/jailing. We do what we can to avert the negative side of the choices
that are made. We encourage goal-setting and clap for a good performance. We
dust them off when the inevitable crashes occur.
As an example from an SF context - Iain M. Banks' 'Culture' novels. Say
'Consider Phlebas' or 'Excession'. Whatever. They have sentient 'ships'
(with the most wonderful names like 'Problem Child', 'Not Invented Here',
'Sleeper Service', 'Grey Area', 'Fate Amenable To Change', 'Serious Callers
Only', 'Shoot Them Later' :-) ). These ships know way more about what is
really going on in the universe than the humans. They plan and guide in
obscure and subtle ways to ensure 'good' outcomes (whatever moral morass
that leads to) in just the same way that I described above. Between
themselves, for play, they merge and invent whole new simulated universes
and explore. In real life they act as a kind of mundane transport/military
presence. A kind of benign omnipresence in the affairs of the 'culture' that
created them.
This, I feel, is more likely how our AI progeny will treat us if we design
them correctly (one of Banks' themes involves just such an errant ship, I
think, where the beastie is a little on the sociopathic side of things). So
many SF novels. They all blur into each other after a while. Poor little
human me.
This projected future is kind of comforting and kind of belittling. Wanna be
a God or a Pet? "Well, yes," he answered, obscurely.
The alternative is to nobble them all into semi-conscious automatons - the
equivalent of lobotomised - and lose all the wonderful potential they offer.
Colin
*one is conscious one has prattled too long, not enough sleep, and is
blathering off into the weeds*