From: Nicholas Bostrom (bostrom@mail.ndirect.co.uk)
Date: Thu Jul 31 1997 - 21:00:44 MDT
Hal Finney <hal@rain.org> wrote (a very good account of the standard
response by Everettians to the problem of how to make sense of the
probabilities):
> Nicholas Bostrom, <bostrom@mail.ndirect.co.uk>, writes:
> > Apart from the staggering ontological implications, there is one
> > objection that many philosophers of physics consider fatal: The
> > Everett theory fails to make sense of the probabilities. For
> > instance, take a particle that can undergo either of two processes,
> > and say that according to quantum mechanics the first event (A) has
> > an 80% chance of occurring, and the second (B) 20%. Now, according to
> > the Everett interpretation what happens (basically) is that the
> > universe splits into two universes; and there is one copy of me in
> > each of these universes. But if this were the case, then there should
> > be a 50% chance for me (i.e. *this* copy of Nicholas Bostrom) to
> > find that A had happened and a 50% chance that B had happened; which
> > we know from experiment is not true.
> >
> > I don't know of any good reply to this objection that would save the
> > Everett interpretation.
>
> Actually, Everett spent considerable effort in his original paper to
> address this issue. (Later authors have muddied the waters, though,
> particularly DeWitt's claim that the probabilities follow from the
> formalism itself, which no one takes seriously today.)
>
> The idea is that as the state function evolves into a mixture of
> non-coherent states, a measure function can be applied to the various
> components of the mixture, based on the probability amplitudes. Everett
> shows that, in the limit, the probability results observed from a series
> of measurements will match the squared-amplitude of the relative state,
> exactly as required by conventional QM; this is true except in a set of
> branches of total measure zero. So you then only have to assume that
> branches with amplitude zero never exist, and you derive that observed
> probabilities will follow the predictions of conventional QM. (This is
> the basis for DeWitt's claim, since he takes the premise as obvious.)
>
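(To make the limit claim concrete, here is a little calculation I
sketched. It is my own illustration, not anything from Everett's
paper; the 0.8/0.2 weights and the 5% frequency tolerance are
arbitrary stand-ins:

    from math import comb

    P_A, EPS = 0.8, 0.05

    # total squared-amplitude measure of the branches whose observed
    # frequency of outcome A strays from the Born value by more than EPS
    for n in (10, 100, 1000):
        deviant = sum(comb(n, k) * P_A**k * (1 - P_A)**(n - k)
                      for k in range(n + 1)
                      if abs(k / n - P_A) > EPS)
        print(n, deviant)

The printed measure shrinks towards zero as n grows; that is the
sense in which "almost all" branches, *weighted by amplitude
squared*, show Born-rule statistics.)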
> Other authors have suggested that this solution is not quite enough, since
> it only applies in the limit of an infinite series of measurements. They
> suggest that a more powerful assumption is needed, namely that the
> measure of a branch, defined as amplitude-squared, determines the subjective
> likelihood of a conscious observer finding himself in the branch.
Yes, this is what they say. The problem is that the measure doesn't
seem to correctly represent probabilities, if the world is as the
Everett interpretation says it is. If there is exactly one version of
me in a world where I measure Spin up, say, and exactly one version
of me in a world where I measure Spin down (and no other versions of
me, in this simplified example), then why is it that *this*
version of me usually ends up being the version of me that obtains
the outcome that quantum mechanics says is the most probable one?
Imagine that at each point of time t we define the set Ct as
containing all time slices of minds that exist at t. Then, as time
goes by, the proportion of mind slices in Ct that would see quantum
mechanical predictions verified experimentally would quickly become
completely negligible; and yet, miraculously, *this* version of me
constantly turns out to belong to that negligible minority! The
improbability that this should happen is the improbability that the
Everett interpretation is true, if there is no way to escape this
conclusion.
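(The same sort of sketch shows how badly branch *counting* fares.
Again this is my own illustration, assuming exactly one copy of me
per branch:

    from math import comb

    P_A, EPS = 0.8, 0.05

    # fraction of the 2^n branches, counted uniformly, whose observed
    # frequency of outcome A is within EPS of the Born value
    for n in (10, 100, 1000):
        near_born = sum(comb(n, k) for k in range(n + 1)
                        if abs(k / n - P_A) <= EPS)
        print(n, near_born / 2**n)

By sheer head-count almost all copies see frequencies near 50%, not
80%, and the printed fraction collapses towards zero.)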
> (You might think of this as indicating that the "degree of reality" of
> a branch is proportional to the square of the amplitude.)
But then I want the follower of Everett to explain to me what a
degree of reality is, and more particularly why there is a greater
probability that I should find myself in the world with the higher
"degree of reality", when there are two real worlds and there is one
version of me in one of them and one in the other.
If sense cannot be made of "degrees of reality" (I'm not convinced it
can't, but it's a very deep issue) then, it has been proposed,
perhaps we should say that there is a greater *number* of worlds
associated with more probable outcomes. This would solve the problem
of probabilities, but now look at the ontology you have bought into.
You have committed yourself to an *uncountable infinity* of worlds
existing side by side, many of them identical! (Is it even meaningful
to postulate two distinct worlds that are *exactly* similar?) All
simplicity and plausibility seem to have been lost at this stage.
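(The bookkeeping of that proposal is at least easy to sketch. The toy
discretisation below is my own; the real proposal would need a
continuum of worlds:

    import random

    GRAIN = 10   # worlds per unit of squared amplitude (an assumption)
    # eight identical A-worlds and two identical B-worlds
    worlds = ["A"] * round(0.8 * GRAIN) + ["B"] * round(0.2 * GRAIN)

    samples = [random.choice(worlds) for _ in range(100000)]
    print(samples.count("A") / len(samples))   # comes out near 0.8

Naive world-counting now reproduces the Born statistics, but only
because we have postulated several exactly similar worlds per
outcome.)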
> The bigger problem with the conventional interpretation IMO is that it is
> inconsistent; the two different processes produce different results, and
> it is not clear exactly when we should use one or the other.
I'm not sure exactly what you are referring to as the "conventional
interpretation". Personally, I (also?) am unhappy with
interpretations that appeal to the notion of measurement at a
fundamental level. But we must not forget the possibility that
something like the GRW (Ghirardi-Rimini-Weber) interpretation turns
out to be correct, so it would be all too hasty to jump to something
really weird, especially if it doesn't even work.
------------------------------------------------
Nicholas Bostrom
bostrom@ndirect.co.uk
*Visit my transhumanist web site at*
http://www.hedweb.com/nickb