Re: Why would AI want to be friendly?

From: Michael S. Lorrey (retroman@turbont.net)
Date: Fri Sep 08 2000 - 08:39:09 MDT


Brent Allsop wrote:
>
> "Michael S. Lorrey" <retroman@turbont.net> asked:
>
> > Who ever said that free will = freedom from willfulness?
>
> In my experience, many people insist "free will" is "having
> the ability to do otherwise," even if this "otherwise" is other than
> what someone really wants or their true "willfulness." As in,
> supposedly some people freely choose hell by their actions.
>
> If anyone really chooses eternal damnation, are they really
> free?

Since I don't believe that eternal damnation is a real possibility, I see your
question as meaningless.

Free Will is the ability to choose without outside compulsion. Do we have our
own internal programming that helps us make decisions? Yes. Is that programming
so rigidly deterministic that we cannot make choices counter to the most likely
choice it would have us make? Definitely not. Are we able to break out of our
programming and write new programming for ourselves? Most definitely.
