Re: Why would AI want to be friendly?

From: Jason Joel Thompson (jasonjthompson@home.com)
Date: Wed Sep 06 2000 - 15:27:35 MDT


----- Original Message -----
From: "J. R. Molloy" <jr@shasta.com>
To: <extropians@extropy.org>
Sent: Wednesday, September 06, 2000 1:23 PM
Subject: Re: Why would AI want to be friendly?

> > Does truly superior intelligence require free will?
> >
> >
> >
> > --
> >
> > ::jason.joel.thompson::
>
> The term "free will" is an oxymoron, a self-contradiction.
> You can't have willfulness and freedom from willfulness simultaneously.

I recognize the point you're making, but it's a different discussion.

Let me steer this away from a debate over free will by rephrasing the
question again:

Isn't an essential component of superior intelligence the ability to detect
and route around factors that limit its efficacy?

--
   ::jason.joel.thompson::
   ::founder::
    www.wildghost.com


This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:30:49 MST