From: Robert J. Bradbury (bradbury@www.aeiveos.com)
Date: Sun Aug 01 1999 - 16:01:42 MDT
> Gina Miller <nanogirl@halcyon.com> wrote:
> *Say that there are two "copies" of an original person uploaded. There are
> two persons with the same uploaded (or downloaded) data, consisting of one
> original's information; what is the perspective of the consciousness?
When you make two identical upload copies, you get the same thing as if you
have an operating system that can "save" the entire memory/hardware state
of a computer. If you "reload" it (virtually intact) [presuming the hardware
is completely reliable and the computer hasn't been engineered "down" to
scales where quantum effects have a significant probability of changing the
outcome (at which point you no longer have a "reliable" computer)], then you
return to the same state each time. You should be able to do this as many
times as you want.
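In software terms this is just deterministic snapshot-and-restore. A toy sketch (the state dictionary and step function here are invented purely for illustration, not a claim about how an upload would actually be stored):

    import pickle

    def step(state, inp):
        # Purely deterministic transition: the same state plus the same
        # input always yields the same next state.
        state["count"] += inp
        state["log"].append(inp)
        return state

    state = {"count": 0, "log": []}
    snapshot = pickle.dumps(state)      # "save" the entire memory state

    run1 = pickle.loads(snapshot)       # "reload" the same snapshot twice
    run2 = pickle.loads(snapshot)
    for i in [3, 1, 4, 1, 5]:
        run1 = step(run1, i)
        run2 = step(run2, i)

    # On reliable hardware the two reloads end up bit-for-bit identical.
    assert pickle.dumps(run1) == pickle.dumps(run2)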
> Are these two copies from two different viewpoints experiencing the same
> consciousness?
Consciousness is a function of "executing" an instance. If the copies are
identical, each time you run one you get a different "consciousness". Now,
the really interesting question is *iff* you run separate copies and give
them *exactly* identical inputs, do the consciousnesses remain "identical"?
I.e. if after a period of time you "saved" the executing copies, would they
be bit-for-bit identical? I would argue that they should be (i.e. the
brain should operate like a finite state machine) *unless* there are physical
processes in the brain so sensitive to quantum (or chaotic?) effects that their
outputs diverge (e.g. processes sitting close to a 50:50 probability split, or
processes seeded from effectively random initial states). Such variations would
also have to transcend any "natural" majority-logic/self-correction feedback
loops built into the brain.
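Here is a toy sketch of that argument, treating the "brain" as a finite state
machine with three redundant sub-states and a majority-vote correction step.
The noise rate and the voting scheme are invented for illustration only, not a
model of real neurons:

    import random

    def fsm_step(state, inp, noise, rng):
        # One deterministic transition per redundant sub-state, occasionally
        # hit by a rare "quantum" upset, then pulled back together by majority
        # logic (a crude stand-in for self-correcting feedback loops).
        votes = []
        for bit in state:                       # three redundant copies of the bit
            new_bit = (bit ^ inp) & 1           # deterministic part
            if rng.random() < noise:            # rare upset near a 50:50 split
                new_bit ^= 1
            votes.append(new_bit)
        majority = 1 if sum(votes) >= 2 else 0
        return [majority, majority, majority]   # voting re-synchronizes them

    rng_inputs = random.Random(0)
    inputs = [rng_inputs.randint(0, 1) for _ in range(1000)]   # identical inputs

    copy_a, copy_b = [0, 0, 0], [0, 0, 0]
    rng_a, rng_b = random.Random(1), random.Random(2)          # independent noise
    for inp in inputs:
        copy_a = fsm_step(copy_a, inp, noise=0.001, rng=rng_a)
        copy_b = fsm_step(copy_b, inp, noise=0.001, rng=rng_b)

    # With upsets this rare the majority logic masks them and the two copies
    # almost always stay bit-for-bit identical; raise the noise and they diverge.
    print(copy_a == copy_b)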
Only if you made the "copies" from different physical brain states would
you have different "consciousnesses". The brain states would have to be
separated far enough in time (experience) that fundamental memories of
events or mental biases were altered. (Otherwise I suspect the "return-to-center"
tendency of the brain will have the final say and the copies will evolve to a
very similar state of consciousness.)
> See, I was having a conversation with someone and we were discussing
> consciousness. His position was that consciousness is not tangible enough to
> be transferred into information to be downloaded. I told him that the real
> question is "what is the definition of consciousness"; until that is clear,
> this kind of question is limited in its answers. Any ideas?
Caca. Every day when you wake up you "reboot" your consciousness. Because
your brain (& mind) are "survival & reproduction machines", they have to reboot
*essentially* the same consciousness. If you didn't do this you would forget
all of the lessons you had learned, goals you had planned, etc. Presumably
there is a fair amount of variation (subject to genetic influences) in how
successful these "reboots" are. That accounts for forgetfulness, creativity,
people driven to achieve goals, etc. But *iff* an upload is an exact copy
running on hardware that exactly duplicates the brain, then the reboot
processes should be identical (genetic variations are coded into the copy
or the hardware on which it is running).
Now, if you are put to sleep, a copy gets made, and that copy gets "rebooted",
it should be no different from the "reboot" that occurs when you wake up.
The same would be true for all identical copies.
It seems to be true that some people can master injecting their consciousness
into their dreams (lucid dreaming). I would view dreams as "partial reboots"
with the safeties "on" (kind of like a holodeck adventure).
Now, it looks from my reading of Nanomedicine that nanobots should be able
to "noninvasively" monitor neural discharges (they can detect the heat
produced). *If* they can do this without generating too much extra heat
(which could impact brain functioning) *and* if they can "outmessage"
this information (which requires a *really* high bandwidth link), *then*
you could have "active" consciousness monitoring (and backup). You could
even have this copy running in real time and probably "compare thought
sequences". This would be kind of like the computers on the space shuttle
that are constantly checking each other for errors.
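A sketch of just the comparison part (assuming the monitored discharges can be
serialized into a stream of events; the event format and function names are
invented for illustration):

    import hashlib
    from itertools import zip_longest

    def digest(event):
        # Hash one monitored "neural event" so only small digests, rather than
        # the full state, have to cross the bandwidth-limited outmessaging link.
        return hashlib.sha256(repr(event).encode()).hexdigest()

    def lockstep_compare(live_events, backup_events):
        # Compare the live stream against the running backup copy step by step,
        # returning the index of the first divergence (or None if they agree).
        for i, (a, b) in enumerate(zip_longest(live_events, backup_events)):
            if a is None or b is None or digest(a) != digest(b):
                return i
        return None

    live   = [("spike", 17, 0.004), ("spike", 42, 0.006)]
    backup = [("spike", 17, 0.004), ("spike", 42, 0.006)]
    print(lockstep_compare(live, backup))                              # None
    print(lockstep_compare(live, backup[:1] + [("spike", 42, 0.007)])) # 1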
There is a very interesting discussion of many aspects of uploading and
conscious self-editing in "Permutation City" by Greg Egan. I think there were
problems with the premise on which the book was based, but the exploration of
the various aspects of this technology was fascinating.
Robert