>>Yes, but I think that most people here want to *experience* the
>>upload, not generate a bunch of clones who *think* they were
>>uploaded. In a destructive upload, our stream of consciousness would
>>cease to exist. [snips]
>> I don't think
>>too many people would terminate their own stream of consciousness to
>>create new independent ones. What would be the advantage?
Peter McCluskey replies>
> Greater wealth through the ability to work at faster clock speeds.
>More security through the ability to make distributed backups.
I still find this kind of retort utterly baffling, even though it's
undeniable that one goes happily into sleep (or perhaps less happily into
medical unconsciousness) expecting that the `reconstituted' self that later
wakes is continuous with the present person. And of course the only reason
for having one's head frozen at death is the conviction that a revived or
uploaded brain will be just as much `me' as `I' am after a snooze. But
destructive emulation--
Why should I care about *his* `greater wealth'? *Their* `security'?
Stipulate that evolutionary sieves have winnowed our genomes in favor of
phenotypes whose economic behavior is channelled by the need to sustain (and
ideally increase) the components of that genotype. On the standard
Fisher/Hamilton argument, then, we will tend to be `altruistic' toward other
bearers of large chunks of the same genotype, because in some sense They R
Us. Because evolution is mindless, this allows high-level adapted
structures such as brains and cultures to make `mistakes' of identification
and sacrifice individuals in support of the `wrong' genotypes. Still,
evolution would be tickled pink if I sacrificed my current phenotype in
order to produce a dozen copies of my exact genome, with or without memetic
and individual memories. *I*, however, might not be so ardent.
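For the record, Hamilton's rule, the standard formal statement of the
Fisher/Hamilton argument: a self-sacrificing trait is favored whenever

\[ r\,B > C \]

where r is the coefficient of relatedness between actor and beneficiary, B
the reproductive benefit to the beneficiary, and C the reproductive cost to
the actor. For an exact genetic copy r = 1, so trading one life (C = 1) for
a dozen copies (B = 12) satisfies the rule with room to spare: 1 x 12 > 1.
The rule counts genes, not streams of consciousness.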
Suppose I could arrange for a dozen exact copies of myself by cloning - a
time-lapsed set of identical n-tuples. My genes would rejoice, but if I
could only achieve this by giving up my individual life I don't think I'd be
overjoyed. Suppose, however, that these copies could also contain my exact
memories to this moment, so that I had spawned a dozen copies of myself
(which would, it's agreed, immediately split off from each other in terms of
experience, random wiring events, `identity' in short). Would this make me
more inclined to die in order to achieve this goal? Not me, bud. Why
should I care about *their* greater wealth, etc?
But suppose I were offered a `safe' neurological suspension attainable only
if I were killed in the process (it having been found that waiting for death
by senility caused the loss of too many brain cells, or something dire).
Would I go gently and immediately into that good night? Gosh... a
*deathist* proposition? These are difficult issues, the answers not at all
self-evident.
Damien Broderick