Re: The Meaning of Life

From: Hal Finney (hal@rain.org)
Date: Thu Feb 13 1997 - 14:36:19 MST


From: John K Clark <johnkc@well.com>
> On Mon, 10 Feb Hal Finney <hal@rain.org> wrote:
> >I also am not clear on how to define what it means to make a
> >prediction.
>
> I ask the computer program: "Suppose you decided to search for the
> smallest even number greater than 4 that is not the sum of two primes
> (ignoring 1 and 2) and then stop. Would you ever stop?"

It says, "You bet I would. I'd get bored after only a few numbers.
I've got better things to do with my time!"
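Joking aside, the search John describes is a hunt for a counterexample to
Goldbach's conjecture (every even number greater than 4 is the sum of two
odd primes, hence "ignoring 1 and 2"). A minimal sketch, with illustrative
names not taken from the original post; whether the unbounded version ever
halts is exactly the open question:

```python
def is_prime(n):
    """Trial-division primality test; fine for a sketch."""
    if n < 2:
        return False
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return False
    return True

def goldbach_counterexample(limit):
    """Search for the smallest even n > 4 that is NOT the sum of two
    odd primes, giving up at `limit`. Returns n if found, else None.
    With no limit, nobody knows whether this search would ever stop."""
    n = 6
    while n <= limit:
        # Only odd primes >= 3 are allowed as summands ("ignoring 1 and 2")
        if not any(is_prime(p) and is_prime(n - p)
                   for p in range(3, n // 2 + 1)):
            return n  # the program would stop here; no such n is known
        n += 2
    return None  # gave up: every even number up to limit checked out

print(goldbach_counterexample(10 ** 4))  # → None
```

The conjecture has been verified far beyond any limit a quick run will
reach, so in practice the bounded search always returns None.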

> >In principle, a person with access to an accurate neural-level
> >description of his brain could do so as well [...] Would people's
> >ability to answer such questions imply that they don't have free
> >will?
>
> The electronic computer is much faster than my old steam-powered biological
> brain, so it figures out in 10 seconds what I'm going to do in 10 minutes,
> but if the computer then tells me of its prediction about me, and my
> personality is such that out of pure meanness I always do the opposite of
> what somebody tells me to do, then the prediction is wrong.

No, the prediction is right. This was the main point of my earlier mail.
The prediction was of what you would do given situation X. When you do
something different, you are not in situation X, but situation Y, where
Y = X plus the knowledge of what the prediction was.

You can write a trivial computer program that reads a number and outputs
the next number. Now we try to predict what it will type next. We input
our prediction and see what it types. Our prediction is always wrong!
No matter what we input, it outputs something different.
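A minimal sketch of such a program (the name is illustrative, not from
the original post):

```python
def contrarian(prediction):
    """Reads a number and outputs the next number, so any prediction
    fed to it as input is immediately falsified."""
    return prediction + 1

prediction = 42
actual = contrarian(prediction)
assert actual != prediction  # the "prediction" is always wrong
```

The defeat of the prediction here is purely mechanical: the output is a
function of the input, and the function happens to never equal its argument.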

This is not an example of free will by any reasonable criterion. The
fact that a system's behavior is a function of its inputs means that
many, many systems will behave differently if one of their inputs is a
prediction about their behavior. Such a trait is irrelevant to any
discussion of free will, because it is shared both by many systems that
no one would claim have free will and by systems that people do claim
have free will.

So I don't see that the fact that a person will behave differently given
a prediction of his behavior is at all relevant to questions of free will.

If you had a box which had a simulation of your mind which ran faster
than your own, then you could always find out what you were going
to do before you did it. Would the existence of such a box make you
doubt your own free will? Wouldn't it be strange to always have this
box around which knew exactly what you were going to do ahead of time?
It would make me question how "free" I really am.

Hal



This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 14:44:10 MST