Re: uploading and the survival hang-up

From: Eugene.Leitl@lrz.uni-muenchen.de
Date: Sat Jun 02 2001 - 10:33:04 MDT


Lee Corbin wrote:

> Personal identity in philosophy has no easy definition
> like that. Unlike the important concepts in computation

That's too bad for philosophy; perhaps it shouldn't infringe
upon the domain of science and technology, at least not without
utilizing similarly rigorous tools.

Gedanken experiments as habitually employed by philosophers are
worse than worthless: you can never get anything unexpected
out of a gedanken. If you think you absolutely have to do one,
you should at least establish a measurement procedure that
you could have implemented in practice. You're still immune
to surprises, but at the very least this eliminates fuzz
from your thinking. If you can't think of a measurement procedure,
the problem will not occur in reality and can be safely
disregarded.

> theory, the important ramifications of personal identity
> determine future courses of action that you may take.
> Philosophy's proper role is to prescribe action, and

Philosophy's (like religion's) proper role is to
1) shut up and 2) get out of the way as far as science
and technology are concerned. I'm overdoing
it on purpose, but philosophy's mandate has obviously
expired. The same applies to ethicists (thanks, but no
thanks).

> today's bafflement about
>
> * whether to teleport (assuming fictional equipment)

If it's not there, I don't have to consider it. Currently,
the only way to read out the state of a flesh creature is either
to shut it down and do an abrasive scan, destroying the original
in the process, or to instrument it with ridiculous amounts of
medical nano (which will take a long time, during which you're
absolutely immobile and not exactly pretty to look at, though
not necessarily aware of either), and then construct a numerical
model of the relevant aspects of it. Teleporting a bit critter
is trivial: it happens all the time in modern parallel
architectures, including clean shutdown, transfer and resume
at the other end (not entirely trivial to do in a mature
environment that was not designed to support such
functionality).
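
In toy form, the whole bit-critter teleport is: serialize the
state, ship the bytes, reinstantiate at the other end. A minimal
Python sketch (the Critter class and the wire format are
illustrative, not any real migration API):

    import pickle
    import socket

    class Critter:
        # Toy bit critter: all identity-relevant state lives in one dict.
        def __init__(self, state):
            self.state = state

        def step(self):
            self.state["ticks"] += 1

    def teleport(critter, host, port):
        # Clean shutdown and transfer: serialize the state and push it
        # over the wire; the caller discards the original after sending.
        blob = pickle.dumps(critter.state)
        with socket.create_connection((host, port)) as sock:
            sock.sendall(len(blob).to_bytes(8, "big") + blob)

    def resume(conn):
        # Receiving end: rebuild an identical critter and resume it.
        size = int.from_bytes(conn.recv(8), "big")
        blob = b""
        while len(blob) < size:
            blob += conn.recv(size - len(blob))
        return Critter(pickle.loads(blob))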

Either way, I'm baffled to see anything baffleworthy here.
It's ordinary, boring current technology, and projected
capabilities of current technology a few decades downstream.

> * whether to upload (assuming its immanence)

You decide. I think it makes remarkable sense,
considering that 1) you're dead to begin with, so
the only other alternative is to get you out of the dewar,
dig a shallow grave in the field and toss you in, and
2) it makes you way fitter than the average monkey,
so git thine ass onboard while you still can, as the
train is leaving.

Either way round, there is nothing remotely waffle-worthy
here; it's a purely personal decision. There's no "right"
or "wrong" about it.

> * whether to fork (after uploading)

Ditto, see above. Forking might make sense if you want to experiment
with superpersonal aggregation (a cluster of tightly coupled
persons which were initially identical), or with creating hybrid
versions of yourself, absorbing a clone previously dispatched to
accomplish a remote task.
 
> and so on, demand careful thought. People have to eventually
> confront extremely numerous and varied possible thought-
> situations in order to formulate concepts both consistent and

Consistency is a pointless requirement. We're monkeys; why should
we be consistent? Life is not consistent, it just is.

> satisfactory. The only entirely consistent and (mostly)

Satisfactory is a purely personal metric. There can't be
any blanket recommendations valid for every single Jane
and John Doe out there. Everybody has to think for themselves,
get acquainted with the issue as far as that is possible, and
make a purely personal decision on the basis of limited data,
as everybody who gets out of bed in the morning must.

> satisfactory view that I know of is the "information theory
> of identity", sometimes called the state theory, explained in
> most detail in Mike Perry's book "Forever For All". I believe

I don't see any point in reading a book; the problem set
either already occurs in current practice, or can be extrapolated
from it with ease.

> that you, Eugene, are a "statist" like me, but that for some

I don't know what a statist is, actually.

> reason you think that if even a single bit, (or a single atom
> in the physical thought experiments) is different, then
> identity goes from 1 (completely true) to 0 (completely false).

Sure, because that's the definition: identity is a boolean metric.

If you'd said similarity, I'd agree. There's a continuous
"similarity" metric over the discrete (well, we're
talking about bitvectors here) space of personas.
 
> But how could that be? If an entire hair disappeared from
> my head, or a whole neuron died (sadly that happens now and
> then), it matters not a whit to my personal identity. And
> anyone who thinks that it does isn't trying to understand
> what we are talking about.

The question is meaningless. You are you, and there can't
be any diff between you and yourself (what is the pattern
here: singleton?), also by definition. If you lose a hair or
a neuron dies on you, this all happens to the single instance
of you active at the time. If you fork, and force the other
instance of you through the same changes as you, you're of
course forcing the other guy to be you. That trajectory forcing
rapidly becomes nontrivial the longer you wait: the diff gets
larger and larger, and the bits are not labeled, so you have to
look for patterns, and these are pretty dynamic, so the
longer you wait, the more work you must do.

As soon as you allow the pair to decohere, there are two of
you. A fork has occurred, and you've started on your deviating
trajectories. (Of course you can fuse a fork by means of
trajectory constraints; this is much easier while the fork
is still very young, as the diff is easy to see and the
amplitude of the control signal on the system trajectory
stays small.)
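
In the same toy terms, a fork and its subsequent decoherence
(Python; the two-key state dict and the Gaussian noise are
stand-ins, the real state space is astronomically larger):

    import copy
    import random

    def fork(person):
        # A fork: two bit-identical instances at t=0.
        return person, copy.deepcopy(person)

    def decohere(person, rng):
        # Intrinsic noise plus different input: each instance walks
        # its own trajectory through state space.
        for k in person:
            person[k] += rng.gauss(0.0, 1.0)

    def diff(a, b):
        # The diff between the two trajectories; it grows the longer
        # you wait, which is why late fusing is so much work.
        return {k: b[k] - a[k] for k in a if a[k] != b[k]}

    a, b = fork({"memory": 0.0, "mood": 0.0})
    rng_a, rng_b = random.Random(1), random.Random(2)
    for _ in range(10):               # let the pair decohere a while
        decohere(a, rng_a)
        decohere(b, rng_b)
    print(diff(a, b))                 # nonzero everywhere: two people now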

> Of course, as I explained above, this is asking too much.
> (Much as someone might ask of a socialist or libertarian,
> "define the preference metrics you are using".)

Well, then we're talking about ships and sealing wax here,
and whether bees have stings.
 
> >Also, please tell me whether you think that your twin
> >brother is yourself (basically the same situation has
> >occurred in reality as in simulation).
>
> This is an interesting empirical question. Evidently,
> identical twins learn to distinguish themselves very
> early in life. First, though, I am not talking about

Of course, because they're different people who forked
in the womb. The very early stages of embryomorphogenesis are
surprisingly deterministic, but from a certain stage
onward intrinsic system noise and different input (you
can't be in the same place at the same time) make them
diverge.

> the animal level "it hurts when they spank me, but it
> doesn't hurt when they spank him". I'm talking about
> ---and here I am guessing a little---the way that each
> twin almost seizes upon tiny differences, and slowly
> amplifies them over time to create an actual, separate
> person.

There's no seizing, though some twins may make active
attempts to introduce extra differences (by deliberately
hanging out with different people in different places).

They can't not be different. They're only superficially
similar: they're thinking different thoughts, and if you
look at the neuronal ultrastructure it is entirely different,
and in relevant ways. They are different people.
 
> The strongest clue: people who fall in love with one
> twin, but not with the other. But duplicates, however,
> are totally different: if your wife loves you, she'd
> love your close duplicate too.

Nevertheless you're different persons, though far more
similar than any pair of identical twins (duh, you only forked,
like, five minutes ago).
 
> (Naturally, since many people are reading this, it is
> necessary to define "close duplicate". A close duplicate
> is a process running at a separate spacetime location such
> that neither is a memory superset of the other, and such that
> the differences correspond to temporal differences of just
> a few minutes, or at most a few days of a normal human's
> life.)

Very good. Computer science people might think of two instances
of the same class, with initially minor differences in internal
state (animals and people have huge amounts of internal state).
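
Something like this, in toy form (the Person class and its
two-line memory are stand-ins for the huge real thing):

    class Person:
        def __init__(self, memories):
            self.memories = list(memories)   # copied, not shared

        def experience(self, event):
            self.memories.append(event)

    original  = Person(["childhood", "first job"])
    duplicate = Person(original.memories)    # same class, same initial state

    # A few minutes of divergence: neither is now a memory superset
    # of the other, which is exactly the "close duplicate" condition.
    original.experience("watched the duplicate wake up")
    duplicate.experience("woke up in a new body")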
 
> Lee Corbin

-- 
______________________________________________________________
ICBMTO  : N48 10'07'' E011 33'53'' http://www.lrz.de/~ui22204
57F9CFD3: ED90 0433 EB74 E4A9 537F CFF5 86E7 629B 57F9 CFD3

