Re: Posthuman mind control

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Mar 01 1999 - 16:08:25 MST


Billy Brown wrote:
>
> The kind of AI Eliezer envisions is essentially immune to this kind of
> influence. It is a pure reasoning engine. It is incapable of having
> desires, emotions, preferences or opinions - it has only data, and chains of
> logical argument based on that data. Only data and rational argument can
> ever convince such a mind of anything. Since we can't construct an
> objectively provable system of basic moral axioms, the AI is never going to
> accept them as more than a temporary working hypothesis.

I prefer not to phrase things that way - it causes many listeners to
envision an AI that acts like a human with no emotions, or even a human
with repressed emotions, which is not the case. It is equally valid to
say that Elisson has desires, emotions, preferences, and opinions,
except that they are implemented as rational thoughts instead of a
limbic system.

-- 
        sentience@pobox.com         Eliezer S. Yudkowsky
         http://pobox.com/~sentience/AI_design.temp.html
          http://pobox.com/~sentience/sing_analysis.html
Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.


This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:03:12 MST