From: Damien Broderick (damien@ariel.ucs.unimelb.edu.au)
Date: Tue Nov 05 1996 - 18:19:08 MST

At 08:06 AM 11/4/96 -0800, John Clark wrote:
>Suppose I show you 2 people, you talk and get to know them, then I take
>them away and upload them. [snip of 2 methods, fast and slow]
>Is there any way you could determine which one
>was uploaded quickly and which one was not?

Of course not. But this is an operational question posed from *outside* the
skin of the two would-be uploads. An important issue unresolved by this
`Turing test' question remains, as I posed it earlier: would one be prepared
to die (sacrificing one's current instantiation) in order that an exact copy of
oneself be reconstituted elsewhere, or on a different substrate? To insist
upon this question is not to be a hostile `upload skeptic', at least not in
the sense that the philosopher John Searle might be, denying that a
consciousness could ever be instantiated on a computer system.

Greg Egan's fiction, which I keep blathering admiringly about, plays with
these questions in a series of skin-crawling gedanken experiments. His
favoured mode of upload/duplication/extended non-carbon life is a growing
`crystal' that is implanted in the brain, and echoes (in effect, mirrors or
redundantly and passively stores) every brain state from then on: `learns
to be me'. It's a bit like the two cerebral hemispheres, to the extent
(whatever that extent is) that those are not complementary but redundant
backups. Eventually, when your consciousness is running on a joint system,
with the neural net perfectly echoed by the crystal, your meaty brain is scooped out.
This, to the unprejudiced, is no cause for alarm. Had the mirroring been
done at a distance, onto a human-capable machine via radio, the same
situation would obtain.

Well, what if all twins were perfectly telepathic... but that doesn't quite
work, because they are differently situated *as visible actors* in space,
and the loss of one body would surely devastate the other locus of their
shared consciousness.

What if everyone were *given* a cloned double, with whose brain-states he or
she became redundantly resonant... No, it's hard to retain empathy for the
subjects of these thought games. Vinge's 101 dalmatians in space, in A FIRE
UPON THE DEEP, goes some way toward exploring the experience of a
disseminated consciousness some of whose modules die and get replaced by
variant plug-ins (i.e. different doggies entirely).

Bottom line: physical continuity and general brain-process continuity
jointly create our sense of continuing identity and allow us to believe that
the person who wakes up tomorrow is the same one who went to sleep tonight
(or in 10 years after a coma, but nobody denies that this case is deeply
traumatic; or after half the brain is removed due to cancer, and no one
denies that this case is even more deeply troubling). Even if *I* can't
tell the difference between you and your perfect clone, *you* surely can, as
they strip you down. Check out Algis Budrys's classic 1960 sf novel about
matter transmission, ROGUE MOON, where there's a copy at each end, or James
Patrick Kelly's much-discussed recent story `Think Like a Dinosaur' (I believe
it's called) where aliens with matter transmission tech impose a requirement that
the person left behind must be killed when the double is materialised at the
far end. I'd go kicking and screaming... not at all reconciled by the
simultaneous existence of my double in the receiving pod - *unless* I
remained, throughout, in perfect non-local continuous connection/identity
with him. In that case, I'd swiftly get used to the notion that it was just
like dropping off to sleep on the jet and waking up in another town. Of
course, I could be *fooled* into believing this was the case, even if a
destructive-uploading expert knew it wasn't, and the deception would soothe my dread.
`Just step into the nice warm shower, Damien...' But I'd rather not be
deceived in that way.

Damien Broderick