Re: Immortality

From: Jason Joel Thompson (jasonjthompson@home.com)
Date: Thu Dec 07 2000 - 15:12:32 MST


Hello Hal.

----- Original Message -----
From: <hal@finney.org>

> Now ask, what kinds of changes or transformations to your program would be
> permissible in various models of identity? Can the program be suspended
> and then resumed? Can it be moved from block to block of memory?
> Can portions be swapped out to disk and then brought in as needed?
> Can it be suspended, copied to another computer, and then resumed there?

Thanks for introducing this, Hal-- I think this is probably the best way to
get to some of the core issues.
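For concreteness, here's roughly what those operations mean for ordinary
software-- a toy sketch in Python, where 'mind_state' and everything else
is purely hypothetical:

  import pickle

  # Stand-in for a running program's complete state (hypothetical).
  mind_state = {"memories": ["red bicycle", "the smell of rain"],
                "tick": 1048576}

  # Suspend: serialize the state, then cut the power.
  with open("checkpoint.bin", "wb") as f:
      pickle.dump(mind_state, f)
  del mind_state  # the running instance is gone

  # Resume -- possibly on a different machine entirely.
  with open("checkpoint.bin", "rb") as f:
      resumed = pickle.load(f)
  resumed["tick"] += 1  # picks up exactly where it left off

For conventional software that round trip is lossless, which is exactly
what makes the question interesting.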

My suspicion is that the above thought process is an abstraction of
intelligence (into the software model) that is employed by many people.
I've personally found great value in drawing these sorts of associations...

...and I hope you don't think I'm being unnecessarily difficult when I say
that I believe this sort of model for human intelligence is precisely the
problem. IMO, a human intellect running on any substrate is going to be the
result of some big, crazy emergent-property factors, due to the immense
complexity and adaptability of the system. Do I think that we can damage or
collapse that emergent state of intellect by arbitrarily turning aspects of
it on and off? Well, umm... yes I do. Does that belief have scientific
validity in the context of Hal's idea above, as expressed in terms of
traditional computer software? No... I don't think so.

My PERSONAL SUSPICION (as yet unfounded, given our current understanding
of the human brain) is that our consciousness runs as an emergent subset
of our current hardware and cannot be effectively reduced to its
components. We are RAM. Or rather, we are a sort of standing wave-form
generated by complex hardware-- a ghost in the machine.

If I store the state information of that standing wave-form, cut the power
(totally terminating the current iteration of said wave-form), then turn the
power back on and reinstate the wave-form, is it still (that) me?

Well... (and I can hear the groans already from the back of the theater) I'm
going to have to say "uh... sort of no."

Is there logic behind my reasoning?

Yes, mostly... but at this point, I have an admission to make: I have sort
of an intuitive understanding of why I hold this belief, but to be frank, I
feel that it is right at the edge of my intellect (with regard to my
ability to describe and fully engage it). If you press me deeply for the
science on this, I -will- eventually have to shrug and say: "Dunno. The
mind is a mysterious thing, my friend..." That IS a cop-out-- come back in
10 years.

My contention is that a *particular* iteration of the emergent wave-form
that represents my consciousness (based on the contiguous motion of a
reality experiencer along the arrow of time) is a significant, discrete
entity/process. This entity/process does not really exist independent of
the substrate (it is, in fact, the substrate).

If this entity/process -fails- to have a contiguous experience across time
(as in the situation above), it ceases to be, and its activities are resumed
by its clone (assuming one is made). (Please note that in my estimation,
the point of interface with reality is Gaussian across the time vector-- the
wave-form does not collapse across the tick of the Planck clock, for
instance. The point of interface with reality of this 'flame' may actually
be -really- fuzzy, and the requisite breadth of experience continuity may be
large and a subject for future fascinating debates.)

My belief is that I currently represent the sole (and original) iteration of
this 'flame' contained in my particular substrate (... still ticking...)
There are dozens of reasons why I may not be, and no way of verifying that
this belief is true, other than petitioning Occam. (Yes, I -could- be a
robot with transplanted memories that you are even now activating, you
sinister fiend, you.) My further belief is that any particular flame
iteration is special, unique, and contains aspects that are significantly
irreplicable. (1. I suspect we are sufficiently fine-grained for QM to play
a role. Hell, we probably -are- the QM. 2. An *exact* copy must
necessarily share -every- property (including location in physical space and
time) and that ain't significantly possible within a single reality.)

Does the -universe- care if I flick the power switch on and off like this?
Probably not. Does this iteration of the Jason/flame care? Yes, even IF it
can be shown that this is iteration #367 (although, as stated, I believe the
iterations are robust and this is still iteration #1).

So why did I say "sort of no" above?

Well, I think this model, too, is simplistic. A particular iteration of my
entity/process may be even fuzzier and more inclusive-- that is to say, I
might be able to preserve continuity of existence across the 'power outage'
described above, given robust hardware and good active storage mechanisms.
Robust fuzziness in the future is likely to allow me to turn off, copy, and
replace portions of my matrix (just not all at once, or catastrophically).
I am optimistic about the possibility of upload vectors that can support the
existence of a particular iteration across substrates.
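In software terms, that's the difference between restoring from a cold
checkpoint and hot-swapping modules in a system that never stops running--
a toy sketch, every name in it hypothetical:

  # Replace one component at a time; the main process never halts.
  components = {"visual_cortex": "wetware",
                "hippocampus": "wetware",
                "motor_cortex": "wetware"}

  def flame_alive(parts):
      # Key assumption: the emergent process persists so long as the
      # whole system keeps operating through each individual swap.
      return all(impl in ("wetware", "silicon") for impl in parts.values())

  for name in list(components):
      components[name] = "silicon"    # swap one piece...
      assert flame_alive(components)  # ...without ever going dark

  print(components)  # fully migrated; continuity (arguably) preserved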

Nine Tough Questions You Haven't Asked Me Yet:

1. "Is it logical to cling to a particular iteration?"

For some, perhaps not. I've got a bit of solipsism mixed into my philosophy,
however, so in my case, it's the difference that makes all the difference.
I have a bit of an experiential bias towards reality. I confess: I don't
really care about (or even, perhaps, believe in) external realities that
don't share a causal experiential chain with me.

2. "We build a form of uploading in which it can be definitively shown that
iterative destruction results. Do you go for it anyway?"

No, damn it. But I won't be happy about it. (Sigh... at least it will be
'me' not being happy about it, as opposed to Jason #2 having the time of his
life.)

3. " Transporter beams?"

Nope, sorry.

4. "Don't you want to be like Captain Kirk, and see the universe and make
out with green chicks on strange planets?"

Doesn't everybody?

5. "Yer just being difficult, aren't you?"

I would never never never do that. Never. How dare you suggest such a
thing? I'm shocked.

6. "Come, come, elucidate your thoughts."

Watch it.

7. "If you do upload, don't you want to make back-up copies of yourself?"

Yes, I do. But it's not going to be as simple as having the contents of my
hard drive on a DVD somewhere. I imagine an uploaded system with lots of
redundancy, capable of recovering from widespread damage on the fly via
near-instantaneously mirrored processes and data. However, there would be
an amount of damage that would be unacceptable-- a point at which the
intelligence "goes away." That would be a bad thing.
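Something like the following toy model, in other words-- the mind tolerates
any damage that leaves a working majority of mirrors, and "goes away" past
the threshold. All numbers here are hypothetical:

  MIRRORS = 64          # hypothetical: redundant copies of each process
  FATAL_FRACTION = 0.5  # hypothetical: below this, the flame gutters out

  def survives(damaged):
      # The uploaded mind persists only if enough mirrors remain
      # to rebuild every process and its data on the fly.
      return (MIRRORS - damaged) / MIRRORS >= FATAL_FRACTION

  print(survives(damaged=10))  # True: recovered on the fly
  print(survives(damaged=40))  # False: the intelligence "goes away"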

8. "Can't you see all the ways in which this is a limiting belief?"

Functionally, I appreciate the problem. But, again, can't -you- see why
it's the difference that makes all the difference? This belief doesn't
limit me; it limits the prospects of the next iteration (which, I admit,
might not be a very nice thing to do. I apologize in advance for any harm I
may be inflicting on my future iterations-- but if you guys just hold out
long enough, I might change my mind!)

9. "Are you done yet?"

Yes, goodbye.

--
   ::jason.joel.thompson::
   ::founder::
    www.wildghost.com
>
> I think this approach frees us from an excessive focus on the particular
> characteristics of our brains.  It lets us look more abstractly at the
> nature of information and how it might be associated with consciousness
> and identity.
>
> Hal

