RE: (level seven) Further Discussion of Identity

From: Lee Corbin (lcorbin@tsoft.com)
Date: Thu Nov 14 2002 - 01:35:52 MST


Wei Dai writes

> On Mon, Nov 11, 2002 at 11:23:19PM -0800, Lee Corbin wrote:
> > If you were Yevgeni, in my story "The Pit and the Duplicate",
> > http://www.leecorbin.com/PitAndDuplicate.html, and the alien
> > Itself placed you in the pit, how many times would you press
> > the button? What would you expect to happen when you did?
>
> Let's distinguish between two versions of your second question.
>
> 1. What would you expect to happen in objective terms?
> 2. What would you expect to experience subjectively?
>
> ...
>
> Perhaps you would also agree with my position that all decisions
> (including the decision of whether or not to press the button)
> should be made by considering the consequences in objective terms.

I concur, but with some reservation. Your entire program here,
namely that of sacrificing the subjective viewpoint for the
objective one, seems very ambitious. I believe it succeeds in
providing the best answers when used with care.

> The answer to question 2 should not matter and therefore there
> is no point in answering it or defining its semantics.

I try to answer question 2 also. I answer that I will experience
everything. I will experience being on the rim of the pit. I will
experience being in the pit even though I have pressed the button
many times. It is of no importance that these experiences happen
to occur at the same time. What do we know about time anyway?

> (I believe the difficulty of defining the semantics of expectations
> of subjective experience in the context of duplication is related to
> its inadequacy as a decision making tool.)

So far, you appear to be correct. If people do adopt your program
of never thinking in subjective terms, then I think they will
reach what I regard as the correct decision in all thought
experiments, but only provided that they have already worked out
moral maxims of which they approve and hold to them.

> You should press the button if and only if you prefer having two copies of
> yourself, one in the hole and one outside, to having one copy in the hole.
>
> Unfortunately, evolution didn't program us to make decisions this way most
> of the time. Usually we make decisions based on how we expect them to
> affect our subjective future experiences. We tend to eat when
> we feel hungry, even if the extra calories are harmful. We can
> imagine an alternate universe where human beings have conscious
> knowledge of all the internal data related to deciding when to eat (fat
> reserves, blood sugar level, etc.) and make a calculated decision each
> time. We don't because it takes more computing power, and the alternative
> worked well enough in the past. Notice that now there is an evolutionary
> pressure towards making eating decisions based on objective
> considerations, and some people are already doing it this way,
> counting calories and measuring blood sugar levels.

That's a very nice analogy. But while those who count calories and
so on are indeed consciously acting in their own best interests
(objectively speaking), they will counter with the claim that, to
the degree it has any meaning at all, one's "objective benefit" is
nothing more than the integral over all one's subjective states.
They have a point.

> I think the duplication issue is similar. Once duplication becomes
> possible, there will be an evolutionary pressure towards making
> duplication-related decisions based on objective rather than
> subjective considerations. Your story is a nice demonstration
> of how badly the latter works when duplication is involved.

Thanks. But to continue my theme: isn't there an equality between
what is best for me, objectively speaking, and the integral over
all space and time of my subjective benefit?
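
To make the equality explicit, here is a rough sketch in my own
notation (nothing in Wei's post commits him to it): treat
"subjective benefit" as a utility density u_c(t) for each copy c
of me, integrated over that copy's lifetime. The claim is then

```latex
U_{\mathrm{objective}} \;=\; \sum_{c \,\in\, \mathrm{copies}} \int u_c(t)\,\mathrm{d}t
```

On this accounting, each press of the button simply adds a new
term to the sum, which is why the objective calculus and the
summed subjective calculus would agree.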

Lastly, we are moral animals (Jef Albright urges that one read Robert
Wright's "The Moral Animal" in preference to Matt Ridley's "The Origins
of Virtue"). Our evolved altruistic circuits (IMO) lie at the
basis of much of our kindness towards other creatures, and I certainly
don't ever want to sacrifice that part of our makeup. But when some folks
learn that animals and people are only molecules in motion, then in
*objective terms*, or so they say, the happiness and suffering of others
cease to be important considerations (to the degree that they are
able to override these ancient impulses).

So I feel safest asserting the equality stated in the paragraph before
last.

Lee



This archive was generated by hypermail 2.1.5 : Wed Jan 15 2003 - 17:58:06 MST