Re: Why would AI want to be friendly?

From: hal@finney.org
Date: Wed Sep 06 2000 - 10:25:43 MDT


Brent writes:

> To me, free will is the ability to do research to discover
> that which is the best, and then set such as a goal and seek after it,
> whatever it turns out to be. The more reliably and deterministically
> you can know and get what is truly the best, the freer you are.

But can't you argue that this is not free will, but rather enslavement
to this arbitrary goal of seeking the "best"? ("Best" at what, anyway?)

Why should seeking this particular goal be entitled to be called "free
will", while being enslaved to some other goal, like maximizing your
owner's return on investment, is not?

Hal
