Re: Why would AI want to be friendly?

From: Brent Allsop (allsop@fc.hp.com)
Date: Wed Sep 06 2000 - 11:41:46 MDT


Hal <hal@finney.org> asked:

> > To me, free will is the ability to do research to discover
> > that which is the best, and then set such as a goal and seek after it,
> > whatever it turns out to be. To the degree that you can reliably and
> > deterministically know and get what is truly the best the more you are
> > free.

> But can't you argue that this is not free will, but rather
> enslavement to this arbitrary goal of seeking the "best"? ("Best"
> at what, anyway?)

> Why should seeking this particular goal be entitled to be called
> "free will", while being enslaved to some other goal, like
> maximizing your owner's return on investment, would not?

        Ultimately, it could be the belief that there is some
omega-point-like absolute best for all. By definition, what is best is
what we all want. Anything that chooses less, for whatever reason, is
"enslaved", unfree, or unable to get what it really wants, by
definition. It makes no sense to say one wants to choose what one does
not want, or that one is "enslaved" while getting what one really
wants, does it?

        Of course, until we know the absolute, or if the absolute is
more open-ended, we will likely never know for sure what we really
want or what is of the most value. We will always run the risk of
finding something better than what we thought we wanted, and when we
do, we will become more free, since we will then be better at both
knowing what we want and getting it.

        Again, anything that causes us to deviate from what is best,
whether it be ignorance, inability, or some perverted ability to "do
otherwise", enslaves us and prevents us from getting what we really
want. It doesn't make us free, does it?

                Brent Allsop



This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:30:49 MST