duplicates are the "same"?

From: Deniz Sarioz (ds1058@columbia.edu)
Date: Wed May 30 2001 - 16:37:28 MDT


1) re: http://www.leecorbin.com/dupproof.html
This "proof" is fatally flawed. Consider that you put Albert Einstein
and Adolf Hitler in two adjacent cells. Ask Hitler: would you rather
have Albert Einstein go through a bad experience for longer, or go
through it yourself for less time? Of course, he'll pick the former.
Then you merge their short-term memories into one another. Hitler will
realize that ow, that really hurt, I should not do that again--PROVIDED
THAT A MERGER WILL HAPPEN. Seeking his self-interest, Hitler might
decide against it, or might go ahead with it anyway, since he was not
the one actually in pain; he merely has memories of it. There is a good
evolutionary reason for pleasure and pain not being easily re-lived in
one's brain, unlike proofs of theorems. You can remember what happened
during that painful or pleasurable act, but you can't reconstruct a
similar pain or pleasure without manipulating the physical world with
your motor neurons. If someone told ME that they would give me $10 in
exchange for being transferred memories of my arm having been cut 10
minutes ago, I would probably take it.

Anyway, the claim is that the two were the same person to begin with.
If you make the bold claim that two distinct (however similar) brains
can be merged at all, you can't deny that this is possible. That
"proof" can be used to show that everyone is the same, an absurd
proposition from a materialistic point of view. The glitch in the proof
is the merging--I used to think it was possible, but I am no longer
sure that two entities that differ by only two hours of experience can
be merged on the same physical substrate. Such a merge would be
impossible without a loss of information (relative to the agents after
two hours--of course, you can never duplicate experiences themselves)
on the same physical substrate. I would think that you would need more
memory space (in whatever way it is physically implemented) to be able
to contain all the memories (hate the English language for not having
two real words for memory-space and memory-stuff!). And an increase in
memory space would mean that it would take more time for information
retrieval (I can explain why if you wish), and so the physical
substrate would have to be made more efficient for the two entities to
be adequately "distilled" into one.
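
Here is a minimal sketch of that retrieval-time point, assuming (purely
for illustration--this is not how brains literally work) that memories
are recalled by scanning an unindexed associative store; the names and
store sizes below are made up:

    # Toy model: recall as a linear scan over an unindexed memory store.
    import time

    def recall(store, cue):
        # Associative retrieval: check every stored memory against the cue.
        return [m for m in store if cue in m]

    person_a = [f"memory {i} of A" for i in range(200_000)]
    merged = person_a + [f"memory {i} of B" for i in range(200_000)]

    for label, store in (("single", person_a), ("merged", merged)):
        start = time.perf_counter()
        recall(store, "memory 199999")
        elapsed = time.perf_counter() - start
        print(f"{label} store, {len(store)} items: {elapsed:.4f}s")

Doubling the store roughly doubles the scan time, so a merged mind on
the same substrate would need faster hardware (or better indexing) to
recall things as quickly as either original did.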

2) Having a duplicate is not significantly different from having an
"identical" twin. Hypothetically, the twins can grow up in the same
environment, with the same parents, the same classroom, the same desk,
wearing the same clothes, etc. Make them Siamese twins too, so that
they are always in the same place at the same time, give or take a
meter. (Special or general relativity and light cone stuff has little
to do with this.) They would certainly not consider themselves to be
the same entity. They have been to the same places and done similar
things, but they have had room for their own thoughts.

I liken this talk to two processes running on an operating system.
Their executable is compiled from the same source code, and if you
wish, let it even reside in the same region of the hard drive. OK, so
I just spawn two processes from that executable. They have different
process IDs and have their own stack space. A lot of different stuff
can go into that stack space. They can have similar functions, but
they're not the same. One such process running on some scientist's
computer can, in its internal state, hold a brilliant patent worth a
million dollars, and I would probably wish it were the same as the one
I am running!
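
A minimal sketch of that analogy in Python (the function and names are
made up for illustration; this assumes an OS with ordinary process
isolation): two processes started from the very same code get distinct
process IDs and keep their own private state.

    import os
    from multiprocessing import Process

    def live_a_little(name):
        # Each process builds its own private state in its own memory.
        thoughts = [f"{name}: private thought {i}" for i in range(3)]
        print(f"{name}: pid={os.getpid()}, last thought={thoughts[-1]!r}")

    if __name__ == "__main__":
        # Same code, two processes, different PIDs, separate memory.
        twins = [Process(target=live_a_little, args=(n,))
                 for n in ("copy-1", "copy-2")]
        for p in twins:
            p.start()
        for p in twins:
            p.join()

Mutating a variable in one of them does nothing to the other; all they
share is the inert executable they were launched from.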

Of course, now you can argue that they would be the same if the two
processes are in exactly the same configuration (think Turing Machine)
at any given time (you *really really* don't have to worry about
relativity in a computer in which all components are necessarily in the
same reference frame!). Their abstract representation, yes, fine, that
would be the same: you could create a Turing Machine (an abstract
model) that would describe their behavior completely. But I would argue
that by the very fact that they have spatially distinct locations, they
are different. ESPECIALLY if these processes were self-conscious and
were NOT designed to perform only a certain set of functions. We, human
beings, can consider that our FUNCTIONS are to eat, live, and
reproduce, but that's not exactly a function the way a mail client has
functions.

Consider this extreme scenario: one day, some other volitional
consciousness comes to you, directly into your head, and tells you that
there are uncountably many parallel universes, that a subset of them
are identical, and that yours is in that subset... Now you have a
choice: a) die now, or b) survive at the cost of all your duplicates in
the rest of your universe subset dying. They "are you" in that they
have had exactly the same physical configuration as you, up to now. You
are to make that choice only inside your head--so there can be no fraud
involved (given what I think is the current state of technology).
Whether you are delusional or such a thing could actually exist is
irrelevant. The question is: which do you choose? You could start
thinking: now this overlord persona might be tricking me, maybe it is
asking the same question to all of us and testing us, so maybe I should
say I should die, if we reach an answer at all... now that such
ueber-beings exist, maybe there can be an afterlife... and if it means
what it says, the same thoughts will go through all of our minds and we
will eventually reach the same answer, so maybe I should just keep
quiet... OK, so that's not really the question we want then; the real
question is: which one would you really prefer? Knowing that there are
people physically implemented the same way (with whom I cannot
communicate) does not reassure me about anything, as long as I don't
keep on living, here, now, into the future as I know it.

That would be exactly the case with an upload (assuming it's possible),
or with a materialization from an upload. He would be this person with
whom I share every single PIN, every single memory of interacting with
other people. I'd try to talk to him and sort out who will have what. I
actually really would not mind that happening--especially considering
that I don't have much of a bank account and most of my wealth is
encoded in my brain--another person with similar interests (the same,
to start with), interacting with whom will make me discover facts about
myself that I cannot discover now. I would betray him if I really had
to, but otherwise I would try to watch out for his interests (as in
doing the right thing when confronted with the prisoner's dilemma,
knowing that he would most likely do likewise). The bottom line is, I
would CERTAINLY not agree to be put to death after he is materialized
just because "he has my genes and memes anyway". And yes, I would go
for a gradual transformation, my neurons slowly being replaced. That
way, not only will my genes/memes be retained, but also the continuity
of my sense of "I" (it won't be just my copy who continues, as in the
duplication scenario).

It is not a good line of argument to say "but you have more in common
with your other self than you have with the you of yesterday". The
me-of-yesterday DOES NOT EXIST ANYMORE, so it is a tautology. Since it
does not exist, it has NO ethical significance WHATSOEVER as far as I
am concerned. But in society, of course, we take responsibility for our
actions and our pasts, because they can affect us now and into the
future. I would defend my yesterday-me in court solely because of that,
not because I have compassion for him: he doesn't exist, let alone is
me. I know the bastard can get me into a lot of trouble :) I consider
all past-tense verbs to be modifiers of their subjects and objects,
which refer to now-nonexistent entities that might bear resemblance to
some entities existing now.

3) from Yevgeny, on the pit: "Moral: Imagine yourself in the pit. Both
of the following statements are true: (1) you will never get out of the
pit, (2) you will get out of the pit every time you press the button."

At time t, entity y will be in the pit. At time t+1, there will be two
entities y' and y'', sharing memory-architecture and memory-stuff but
not space, hence not memory-space, one in the pit and one outside. If
the copying is indeed performed perfectly, there is little sense in
talking about which one is the original or the "real" Yevgeny. This is
a social problem, not a physical problem. We haven't had such problems
so far because at time t, d is called Deniz, and at time t+1, d' is
also called Deniz, since d (existing only at time t) does not care to
change his name. Using "you" as something immutable over time is
nothing but cheating. We need to invent new words to talk about such
things, since we are faced with untraditional concepts of identity.

4) still from Yevgeny: "Just as the number 42 can be both in the pit and
out of the pit, so is it possible for you or me to be in two places at once."

Incoherent. The number 42 cannot have a location; it is an abstract
construct. Its representations can have locations, such as on my
monitor, in my copy of the Guide, in the circuits of my computer
implementing my mail client, and in my head. The glyphs 4 and 2
juxtaposed are not the number 42. Our human languages usually don't
have good operators such as literal and evaluate, which you would use
in some computer languages, for good reasons, but that can lead to
contrived propositions such as the number 42 being in the pit. People
are not their abstractions; Lee Corbin is not my conception of Lee
Corbin, just as a juxtaposition of glyphs is not intrinsically a
number; even if some semantic interpreter would like that to happen,
reality isn't that way. (You can say that a mark of ink or a pattern of
pixels on my screen is not intrinsically a glyph either without a
semantic recognition mechanism to interpret it as such--and I would
absolutely agree.) So we use many levels of indirection before we make
our statements, and we can fool ourselves easily. None of us can /be/
in two places at once, since we are basically 3-dimensional blotches of
dirt who have come to think that they are animals, transhumans,
whatever.
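
A small sketch of that literal/evaluate distinction in Python (my own
illustration; the variable names are made up): the juxtaposed glyphs
are only a representation, with a location, until something interprets
them as the number.

    # The glyphs "4" and "2" juxtaposed form a string: a representation
    # that occupies a location (a screen, a patch of RAM), not the
    # abstract number itself.
    glyphs = "4" + "2"
    print(glyphs == 42)   # False: the representation is not the number

    # An interpreter--here, int()--evaluates the representation.
    number = int(glyphs)
    print(number == 42)   # True: after evaluation we get the value 42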


