From: John Clark (jonkc@worldnet.att.net)
Date: Fri Aug 28 1998 - 10:09:32 MDT
Randall R Randall <rrandall6@juno.com> wrote:
>>Me:
>>Thought experiment: I made a copy of you an hour ago. One I
>>let go to live his normal life (it doesn't matter if it's the
>>copy or the original); the other one I chained to a time bomb
>>set to go off in one hour. BANG. Were you that poor fellow who
>>just got blown up?
>No, however...
Ok, put yourself in the position of the man chained to the bomb.
If you listen carefully you'll notice that over the ticking of the
bomb you can hear distant music coming from the party the other
Randall went to. He's having a wonderful time; you're having
less fun chained to that damn bomb. You know that the other
Randall is not you, and you know you have exactly 38 minutes
and 17 seconds to live. Do you see any reason why you might
be just a tad upset with the situation?
>To say that Randall-minus-an-hour's opinion is
>better than mine, you must first *assume*
>that we are not the same person.
I'm much more interested in Randall-plus-an-hour's opinion.
If he thinks you survived, that is, if he can remember being
you, then you're OK; but if not, like the poor man in my
example, then when the bomb goes off he's dead.
>This leads to the conclusion that when
>people have amnesia, someone *died*.
Yes, or at least someone mostly died; survival is a matter of
degree.
>Just to clear things up, let me rephrase what
>you seem to be saying: Two perfectly identical (at the neural
>level, anyway) people are not
>two people, but one. However, if they diverge
>as much as a normal person changes in one
>hour, the loss of that hour constitutes death (e.g.
>if you kill one).
Unless I was in a deep sleep, I'd say an hour would be more than
enough time for both copies of me to want to survive.
Joe Jenkins <joe_jenkins@yahoo.com> wrote:
>Your mind is an information processor.
>Therefore, a universal Turing machine
>can emulate your mind.
Yes.
>If you were to make a graphical representation
>where every point in its space represented one
>and only one possible state of that universal
>Turing machine you would have what I call the
>"design space of all possible information processors".
That's clear but arbitrary. The "design space" of Joe Jenkins is
just the set of all possible states the biological brain of Joe
Jenkins can be in. I don't see why you treat hardware so
differently from software, and if you define anything within that
design space to be you, it can lead to places you don't
want to go.
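
To make the picture concrete, here is a minimal sketch in Python
of such a space for a trivially small machine. This is a toy of my
own invention, not anything Joe wrote; the names CONTROL_STATES,
SYMBOLS, and TAPE_CELLS are mine. Every distinct configuration of
control state, head position, and tape is exactly one point:

    from itertools import product

    # Toy parameters, chosen only to keep the space small enough
    # to enumerate; these are my assumptions, not Joe's.
    CONTROL_STATES = ["A", "B"]   # control states of a tiny machine
    SYMBOLS = ["0", "1"]          # binary tape alphabet
    TAPE_CELLS = 3                # a fixed 3-cell tape

    def design_space():
        # Enumerate every possible configuration of the toy machine.
        # Each yielded tuple is one unique "point" in the space; two
        # configurations land on the same point only if they agree
        # in every component.
        for control, head, tape in product(
                CONTROL_STATES,
                range(TAPE_CELLS),
                product(SYMBOLS, repeat=TAPE_CELLS)):
            yield (control, head, tape)

    points = list(design_space())
    print(len(points))  # 2 states * 3 positions * 2**3 tapes = 48

Even in this toy space most points describe configurations that
have nothing in common with one another, and a definition that
calls every point "you" sweeps all of them in. Case in point: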
Thought experiment: Your identical twin brother comes to visit;
you haven't seen him since you were 3, when he was adopted
by foster parents and moved to China. You learn from an
interpreter (your brother does not speak English) that he's a
very happy man with a loving family, many friends, and lots
of money. You haven't been as fortunate: your wife ran off with
another woman, and you lost most of your money and all of your
friends when your bulk e-mail spam business went bankrupt.
According to your definition your brother is within your
"design space," and so he is you; he also occupies a
higher-rent area of that space than you do. So would it be
wise to kill yourself?
>You've just completed 10 hours of mundane work at the office
>today in your normal biological state. You then documented
>your work more thoroughly than ever before. A doctor shows
>up and completely convinces you beyond any shadow of a
>doubt that he has invented a flashlight-type device that can
>induce a perfect 10-hour amnesia with no side effects [or pain]
>whatsoever. You are 100% convinced this is safe and effective.
If I was really 100% convinced that this would safely preserve
my present identity then obviously I'd have no reason not to
do it, but I'm not convinced; this very thought experiment was
supposed to increase my confidence, but it's not working.
>How about if he said he found out that your wealth
>had been divided by 1000 because of some unfortunate
>events in the U.S. economy today, and that your family
>and friends will welcome your interactions with them 1000
>times less. This includes your friends on the Extropian and
>Transhuman lists. And then he continued to list many more
>unfortunate things that happened to you today. Then he
>says, "but I can fix it all if you just let me use the device on
>you". Would you do it if all this was true?
It sounds like I'd be leading a very bleak existence, so bleak I
might want to kill myself. The fact that somebody similar to
me would continue to live a happy life would cheer me up
a little, I think, but not nearly enough. I would not die
happy.
>The point of view that matters here is not that of a Martian
>but that of your ego. So obviously, from my ego's point of
>view, you are not an imperfect copy of me.
Exactly, and from my present ego's point of view, a copy made
an hour ago is not me.
John K Clark <jonkc@att.net>