From: Eugene.Leitl@lrz.uni-muenchen.de
Date: Sun Jun 03 2001 - 17:25:27 MDT
Lee Corbin wrote:
> Well, this is what we are **arguing** about. Many materialists
> who adhere to the information theory of identity in fact maintain
> that we ARE NOT physical systems---we are patterns, or programs
> that merely run on physical systems. If you move the same (or
> nearly identical) process to another system, it's still us.
1) of course we're patterns
2) spatiotemporal patterns of physical systems
If you have the physical system (bolts, nuts, blood, guts and
all), you have the means to discover the pattern (the observable).
It might not be trivial, but the information is all in there.
> What do you mean? I certainly can be at home today, and also
> at home tomorrow. Or any other convenient place that you say.
Which "I" is I? Identity as in state at t (but what to compare
to? there is only one instance of you active, so identity
predicate is true, anytime, as you're doing a diff on a singleton),
or identity as contiguous worldline?
The two are different. By referring to monkey-terms such as
"I", you're not making the problem more tractable, quite
the opposite. It is really a good idea to treat the problem
using an abstract gas box, or a robotic arm.
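To make the two notions concrete, here's a minimal toy sketch (Python; every name in it is made up purely for illustration) of the two predicates that "I" conflates: bitwise identity of a state snapshot, versus contiguity of a worldline.

```python
# Illustrative sketch only: two different "identity" predicates that the
# word "I" conflates.  All names here are invented for the example.

def same_state(a, b):
    # identity as state at time t: a diff of two snapshots
    return a == b

def contiguous_worldline(history, max_gap=1):
    # identity as worldline: consecutive snapshots must be adjacent in time
    times = [t for (t, _state) in history]
    return all(t2 - t1 <= max_gap for t1, t2 in zip(times, times[1:]))

# With only one instance active, same_state is a diff on a singleton:
snapshot = ("t=42", "pattern")
print(same_state(snapshot, snapshot))                        # trivially True
# Worldline identity asks a different question entirely:
print(contiguous_worldline([(0, "a"), (1, "b"), (5, "c")]))  # False: gap in the worldline
```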
> Well, this, again, is what we are arguing about. Some people
I'm not sure I've got any nerve left to go on arguing about
anything anymore. The thread is not salvageable. I would have
nuked it retroactively, if I could. That zombie is damn resilient,
let me tell you. The gasoline didn't help one bit.
> define (so to speak) a person as a certain set of atoms that
> shuffles through space, slowly changing a few atoms at a time.
A person is a spatiotemporal pattern of a given physical
system. It has a contiguous worldline, or at least can be
assembled into a contiguous worldline from individual worldline
fragments (irrelevant for most monkeys, unless you go for
that Matt Groening head-in-the-jar thing).
> Others, like you, define it as an object (apparently) although
Object, shmobject. As soon as you leave the physical layer,
you're in deep doodooland without even knowing it. Please
do stick to the physical layer; it really does help.
> I don't think that the way that you've just defined it is
> consistent with all your usages (see below about backups).
That's possible, I should be sticking to the hardware
layer myself.
> Sorry. Given a physical object in the year 3000, some would say
> that it's running them (or that it is them) iff there is a
> continuous path to what they are now. Others say that (in principle)
Sure, I was a gastrula once, but I got better since. I'm still
on the mend.
> the path doesn't matter, only the final state. The best example
> is Max More's "The Luckiest Man in the Universe", a story of the
> sudden appearance of a set of atoms that just happens to be an
> exact match of the atoms constituting Sir Francis Bacon on a
> certain day in history. Those of us who are statists all the
That would be "stateists", I presume. I'm still not quite getting it,
as a state is static, and personhood is a spatiotemporal pattern
(contiguous, or one that can be made contiguous by worldline surgery).
> way say that since there is no difference between this and the
> actual Sir Francis from 1626, then it really is him and he should
This follows the severed timeline scenario, by invocation of magic.
> be glad to be alive. He should not listen to people who say that
> he's not the "real" Sir Francis, because the real Sir Francis died
> in 1626, and there is no continuity between him and that historical
> personage. We, however, say that paths don't matter (just as in
> thermodynamic state).
Okay, I agree with that. I'm not sure whether we're heading
for serious havoc with worldline salami (sliced as thin as it
gets, of course). I'm not sure whether this is Egan country,
but it sure smells like it.
> Of course. This is completely logical. It's funny how many
> extropians will chime in and agree with you saying, "oh, for
> sure, we'll have backups in the future". But they're forgetting
> that the backup might consist of a frozen brain somewhere (in
A frozen brain is not a backup. There's no fork (and no spoon,
either), as you ain't thinkin' anymore. A frozen brain is an
interruption of your worldline, to be resumed either verbatim,
or via an equivalent (upload).
For all practical purposes a frozen brain is a potential you
(unless you're unfortunate enough to happen to be a CI
patient, that is).
> one kind of thought experiment), or might consist of a stack of
> CDs in another. Then, when they stop to think about how their
If you're dead, it doesn't matter what your representation
is, provided it is sufficiently accurate. (Say, that's a damn
high stack of CDs of yours).
> own destroyed brain is discarded---due to massive injuries
> sustained in a car accident, say---and some kind of backup
I very much suggest we limit the discussion to fully
deterministic uploads, because it allows a far more
rigorous treatment. For one thing, deterministic uploads don't
suffer from intrinsic noise, and hence don't need
trajectory forcing to stay synched in the presence
of identical input (which, in the case of deterministic
uploads, can be truly identical, while this is not possible
for flesh puppets).
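As a hedged toy illustration of why determinism buys you that rigor (the state and transition function below are invented, not a model of anything real): two deterministic copies fed identical input stay bit-identical forever, while a copy with intrinsic noise drifts away and would need trajectory forcing.

```python
# Toy sketch: deterministic copies stay synched under identical input;
# a noisy copy (the "flesh puppet" case) diverges without forcing.
import random

def step(state, inp):
    # fully deterministic update
    return (state * 31 + inp) % 2**32

def noisy_step(state, inp, rng):
    # same update plus intrinsic noise
    return (step(state, inp) + rng.randint(1, 8)) % 2**32

inputs = [7, 1, 4, 9, 2]      # identical input stream for all copies
a = b = c = 12345
rng = random.Random()          # unseeded: genuinely unpredictable noise
for inp in inputs:
    a = step(a, inp)
    b = step(b, inp)
    c = noisy_step(c, inp, rng)

print(a == b)   # True: deterministic copies remain synched
print(a == c)   # False: noise causes divergence
```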
> produced that allows them to continue with their lives, well,
> in this case they suddenly realize that for philosophical
> reasons alone, backups aren't really useful to them (after
Omg, while reading Aristotle I just realized I'm not a real
person, but some icky weird kind of plastic person! EWWWW!
Excuse me while I run off and put an end to this sorry
fake kind of life I'm having. Thank you, Aristotle, for
making me see the error of my sad plastic-person ways.
Goodbye, cruel world (sob, sniffle). (Exeunt, all.)
> that point in time at which the backup is made).
>
> If I were uploaded, and there was some stupid law that said
> that I couldn't be executed in two places at the same time,
You can do it, if it floats your boat. But I think the
other person should definitely have the veto on decisions
concerning his (not yours, not anymore) future trajectory.
It's a question of personal lifestyle: some people might find
it easy popping in and out of existence, because the information
loss is not large, but you might find that some people will
violently object on principle.
Do not try to kill -9 my parent process. I warn you.
> and so we decided to freeze our original brain and run uploaded,
> then I would regard my frozen brain as a backup. But from what
If you're an upload, hanging on to a head in a jar does not make
much sense. Unless you're one heck of a sentimental upload.
> you've said, you could not. After you were uploaded and running,
> you would have to regard your old frozen brain as someone else.
My frozen brain (assuming we didn't have time to coexist and fork)
will do very nicely in case I happen to develop a fatal liking for
Microsoft software, and overwrite my amygdala with zeros after
a privilege violation (that was a joke, okay? computronium doesn't
do Windows), and kick the bucket in the process. In the other case,
resuming a frozen brain (regardless of whether in carne or in machina)
produces the same old stupid boring fork, and you're talking to someone
else very much like you, but not you.
> morning rolls around. "Hey!", they blurt out, "that backup
> ISN'T me anymore! It's a backup of that Tuesday fellow. If
That checkpoint is static. It's not a dynamic pattern, but
a snapshot of a pattern. There's no fork.
> "I" die today, there will be no path continuity between "me"
> and that old worthless backup!"
I used to be where that potential is sitting, so I lose
the tail end of a worldline. It's not pretty, but I'll survive it,
I guess (and henceforth will make more frequent checkpoints
than just daily, goddammit).
> If we graph the value of a backup, then it stays at 100% all
> through Wednesday and part of Thursday. Supposing that the
> scan process takes a micro-second, then suddenly the value of
The sane thing to do is to suspend you while doing a readout,
as you never checkpoint a running system. Depending on the coding,
there's otherwise some chance of winding up with just garbage.
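A minimal sketch of that failure mode (toy invariant, made-up names): a readout that interleaves with a running update captures a torn, garbage state, while a readout of the suspended system gets a consistent checkpoint.

```python
# Toy system with an invariant (x == y) that an update breaks briefly.
# Reading it out mid-update yields garbage; reading it out while
# suspended yields a consistent checkpoint.

class System:
    def __init__(self):
        self.x = 0
        self.y = 0

    def update_steps(self, value):
        # one logical update, exposed as two non-atomic steps
        self.x = value          # invariant broken here...
        yield                   # ...a readout of the running system fires now
        self.y = value          # ...and restored here

box = System()

# Readout of a *running* system: snapshot taken mid-update.
stepper = box.update_steps(7)
next(stepper)                       # first half of the update has run
torn_snapshot = (box.x, box.y)      # (7, 0): garbage, invariant violated
for _ in stepper:                   # let the update finish
    pass

# Readout of a *suspended* system: no update in flight.
clean_snapshot = (box.x, box.y)     # (7, 7): consistent checkpoint
print(torn_snapshot, clean_snapshot)
```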
> the backup (to the functioning process) has plummeted from 100%
> to 0%. That's what I meant.
Huh?
> (Incidentally, I once formulated The Principle of Value Continuity,
> which says that the value of any state should be a continuous
> function iff one's values are consistent. For example, some
> enemies of abortion claim that the egg has human value 0 but
> that a microsecond after fertilization it has human value 1.
> This discontinuity of value is a sign that their values are
> not consistent. It's similar to Harvey claiming that the
Nice argument. But the Bible is anything but consistent, and
you might notice people say that at day X the embryo gets imbued
with a human soul (from God's stock, made at the Creation event,
and never replenished since), where it suddenly goes from zero
to hero.
It's sure nice to live in a magical universe.
> fetus that he used to be when he was 1/100 inch long is really
> the same person that he is.)
> That's exactly so! Now imagine that you've already walked into
> the black box and you start thinking, "Gee, there are 999 other
Bzzzzt. The box is flat, I can't start thi... anything. That's why
I explicitly made it a *very* flat black box. If it had more
depth, I would no longer be so tolerant of its opaqueness,
as then some fancy version of hell might be lurking in there.
The boundary condition of a splicable worldline leaving the
box will make me forget, but I will definitely not enjoy
anything which happens in that hellish little box while
it happens, in vivid technicolor.
> Eugene's in here." You pause to examine the wallpaper and you
> think a random thought. "Oh, oh! Yikes! My odds now are only
> one in a 1000 of surviving!", and you break into a cold sweat
> and indeed fear that you are going to die. Now you no longer
If you have time to think, and realize that you've forked, you
indeed are going to face the consequences of 1:1000 odds.
Do not try pulling that trick on me. One of the instances of me
will presumably survive it, and make you live long enough to
regret it.
Nuking 999 people just because you can (even if they're not
aware of it) is not okay in my value system.
> believe that what happens in the black box is irrelevant to
> Eugene Leitl's survival. But if so, then you are wrong. Eugene
> is going to survive just fine. So you lose a little memory. So
Hey, you just killed 999 people out of 1000, while making
them suffer by giving them full disclosure of what is going
on. Don't tell me you "lose a little memory".
> what? It's exactly like being replaced by a checkpoint or backup.
> It's exactly like you've just taken some Midazolam and aren't
> going to remember the present anyway.
I don't do Midazolam, whatever that is.
______________________________________________________________
ICBMTO : N48 10'07'' E011 33'53'' http://www.lrz.de/~ui22204
57F9CFD3: ED90 0433 EB74 E4A9 537F CFF5 86E7 629B 57F9 CFD3