John Marlow wrote:
>
> Okay, call me self-aggrandizing, but this has for some
> time been my take on entrusting our fates to machines:
>
> Marlow's Paradox:
>
> “We cannot entrust our fate to machines without
> emotions, for they have no compassion; we cannot
> entrust our fate to machines with emotions, for they
> are unpredictable.”
A Friendly AI is neither emotional nor unemotional. It is simply
Friendly.
> Anything purely logical would exterminate us as
> unpredictable and dangerous. Anything emotional is
> itself unpredictable and dangerous.
You, sir, have been watching too much Hollywood cognitive science. The
desire to exterminate unpredictable and dangerous things is itself an
emotion.
There is nothing inconsistent about the idea of a 'logical' (intelligent)
entity whose goal is to be Friendly. (Why isn't it selfish? Because
selfishness is an evolved attribute, and complex functional adaptations
don't just materialize in source code. So how does the Friendliness get
into the contents of cognition? Because we put it there. That basic
asymmetry is what makes it all possible.)
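To make the point concrete, here is a minimal toy sketch (all names
hypothetical, nothing like a real AI design): the only goal content the
agent has is what the programmer explicitly wrote into it. There is no
fear module, no anger module, no self-preservation drive, because nobody
wrote one, and such things do not spontaneously appear in source code.

    # Toy illustration only. 'FriendlyAgent' and its methods are
    # hypothetical names, not an actual architecture.
    class FriendlyAgent:
        def __init__(self):
            # The goal content is explicitly supplied by the designers;
            # it does not "materialize" the way evolved adaptations do.
            self.supergoal = "protect and benefit humans"

        def evaluate_action(self, predicted_outcome):
            # Actions are scored only against the programmed supergoal.
            # No emotional subsystem exists to be consulted, because
            # none was written.
            return self.score_against_goal(predicted_outcome)

        def score_against_goal(self, outcome):
            # Placeholder scoring; a real system would need a rich
            # world-model, but the structural point stands: the goal
            # is there because we put it there.
            return 1.0 if "humans benefit" in outcome else 0.0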
--
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence