Earlier, James Swayze wrote:
>
> john grigg wrote:
>
> > I want an uploading process that actually takes my brain
> > and transforms it into a form of energy without any loss to the authentic
> > me.
>
> In my humble opinion that's not possible. I feel it is impossible for the same
> reason I don't believe in ghosts, vitalism of the soul, pure energy beings with
> complex intelligences, spirits, or any non-material mind, et cetera . . .
>
> I believe for information to be stored and manipulated one needs a complex
> material apparatus to direct energy about--to and fro--like the switches and
> gates in silicon chips or the earlier transistors. I believe neurons can be
> compared to these same gates and switches. If you smash those neurons or turn
> them to dust there is no more information or mind.
I agree with you, James, although in your next comments, I couldn't
figure out at first what you meant by "xian". You mean "Christian",
right, as in writing Xmas instead of Christmas? Kinda cute, but I
really had to think to figure it out.
> . . .
> Suppose a person, a pious priest even, has a brain injury. Before
> the injury s/he was good and pious and caring and would never consider acting
> overtly evil (by western xian definitions). After the injury . . . even murderous
> individual. Has the soul changed? In xian
> terms is s/he now considered hell bound and damned? (as seen by other
> humans--not making assumptions about godly forgiveness, so let's not go there)
> The point is if the mind or soul were "vital" no amount of damage to the brain
> could affect behavior. We know it does.
>
> Since damage to the material does affect behavior/information, then the mind is
> of the pattern in that matter not the electrochemical energy scurrying about
> from neuron to neuron.
In scientific terms, this idea of mind as an organized, causally
developing pattern is almost surely the only one that makes any sense.
Disrupt the pattern, and this brings the personal integrity of the
developing mind into question. Indeed, the practical identity of one's
original mind could quite possibly be effectively destroyed no matter
how much the *new* pattern claims to be the same person. Outside of
stories from the field of medicine, people tend to shy away from this
sort of "fate worse than death" stuff, and prefer not to deal with the
philosophical implications that disrupted or damaged minds bring up.
In my own thoughts about grand-scale philosophy, I have sometimes
forgotten that the human mind must necessarily have this basic
physical vulnerability. For instance, it's sometimes seemed like a
neat idea to me to think that I could personally survive in a "galaxy,
far away", if only a near perfect copy of me were somehow generated
*way* out there (this would be almost as in a religious afterlife, in
the sense that science wouldn't have the "ghost" of a chance of
detecting such an unlikely, faraway event). Unfortunately, for this to
work, I have to treat it as an outcome of space being infinite, which
in turn also means infinite numbers of copies that aren't really
enough like me to *be* me, as well as infinite numbers of copies that
would resemble a brain-damaged me. The result is that it's much better to deal
with my real survival as something actually, instrumentally detectable
to science on *this* planet (if technology is advanced and I am
sufficiently lucky, anyway). Referring to my "far galaxy" fantasy,
what's the good of those randomly generated person-copies, if they are
so far away as to be detached from everything currently around me, and
also apt to be somewhat like the brain-damaged "Man Who Mistook His
Wife for a Hat", more likely than not?
Getting back to high-tech uploading and the like, if we develop the
technology for making mind copies, we'll presumably take great care that
the copies are good, complete, and reliable "whole mind" copies of our
original thoughts, emotions, and dispositions. Further, if we want it
that way, we could also take care to shut down our old brains, in good
timing with starting up any new uploads of ourselves. So, no "old self"
watching the new self, and wondering if it is "really me", at least if
we care to do it the right way (OK, maybe there'll be some
doppelganger awkwardness for a few of the volunteers in the initial R&D
for the upload technique). Given that we're *not* talking about random
damage here, and given that we *are* talking about one way to
verifiably, really, survive, I don't see what the objection to a truly
*good* uploading procedure might be. Could it be that there are really
bad, lingering philosophical memes in the way of many people, memes like
"I don't need a tech answer; I'll try for a sure thing in Heaven, or
maybe in a 'random galaxy, far away'"?
Perhaps the most scientifically interesting topic that relates to this
right now is the controversy over whether cryonically frozen people
could ever really be revived, as themselves, with practical memory and
identity intact. In this regard, I was interested to read Robert J.
Bradbury's recent (Wed, March 1st) message, where he raised a concern
about lysosomal enzymes possibly destroying the mind patterns of
currently frozen cryonics patients. The concern here is one of enzymes almost
literally chewing up memories, as in "there go the neurotransmitter
proteins that marked the path to remembering who my best friend was when
I was twelve" -- that sort of thing could be a real "ouch"! The point
is, if memories and skills can be preserved by freezing, that's good,
but if there is a real problem that essentially destroys the protein
structure of brain synapses, then cryonics as we know it would be a
total loss, or else just a gamble that didn't work.
David Blenkinsop <blenl@sk.sympatico.ca>
This archive was generated by hypermail 2b29 : Thu Jul 27 2000 - 14:04:31 MDT