>
>
> >If consciousness is based upon information and information
> >processing, my above statements hold.
>
>
> Unless the religious people are right, consciousness must be based on
> information processing, that's why I don't understand your theory. You're
> saying, I think, that information exists outside our physical universe,
> if so then the information in my head and the information in my upload a
> thousand miles away must be the same, after all you also say " a physical
> translation in space of the substrate is irrelevant". Also, I don't see why
> you keep distinguishing between "originals" and "copies", if your theory is
> right then everything in the physical universe is a "copy".
>
>
> >There can be many instances of a single brain, each with its
> >own consciousness.
I believe there is a significant difference between copying a
consciousness, which then operates independently of the root
consciousness, and a gradual process of copying one's various files one
at a time, verifying operability, transferring processing to the new
file from the old, and then deleting the old. The latter would maintain
the individual stream of consciousness, while the former, in the case of
destructive copying, would not, relative to the original wetware. When
we talk about stream of consciousness, we need to remember that it is
relative to the original copy. It does not matter if the original copy
shuts down for defragging once a day; it is still itself, because it
only copied and erased minute fractions of its consciousness at any one
time, and we know that a human can operate with a significant loss of
brain capacity, just not with a total loss (although, facetiously, I can
think of some congressmen who seem to be doing just fine).
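The copy-verify-switch-delete loop above can be sketched in code. This is only an
illustration under loose assumptions: the "files" and the equality-based
verification step are hypothetical stand-ins for whatever units a mind might
actually decompose into.

```python
# Illustrative sketch of the gradual transfer described above:
# copy one unit, verify it runs on the new substrate, and only
# then erase it from the old one, so no two independent whole
# copies ever coexist.

def gradual_transfer(old_substrate, new_substrate):
    """Move units one at a time from old_substrate to new_substrate."""
    for name in list(old_substrate):             # snapshot keys; safe to delete below
        candidate = old_substrate[name]          # copy a minute fraction
        new_substrate[name] = candidate
        assert new_substrate[name] == candidate  # "verify operability"
        del old_substrate[name]                  # erase the old only after verifying
    return new_substrate

old = {"memory": "childhood", "skill": "typing"}
new = gradual_transfer(old, {})
print(old)  # {} -- the old substrate is fully vacated
print(new)  # both units now live on the new substrate
```

The point of the ordering is that at every step there is exactly one complete
stream, which is the "one leg at a time" property argued for below.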
>
>
> Yes, I believe it would be possible in theory for a brain to generate more
> than one consciousness, and I think it's possible for one consciousness to
> run on several brains.
>
>
> >Information is only encoded on objects that exist in the
> >physical universe.
>
> I don't understand why you say "only encoded" like it's not important,
> because you also say "For information to exist, it must be encoded on a
> substrate".
>
> >If every piece of substrate containing a specific piece of
> >information is destroyed, the information is destroyed
> >permanently.
>
> Not necessarily. It would be difficult to destroy the information that
> 2 + 2 = 4 because nobody knows where or how it's encoded. Even Shakespeare's
> plays could be rediscovered by a monkey banging on a typewriter, it would
> take a long time, but not an infinitely long time.
This is merely obfuscation. What he is saying is that a total loss of
one's wetware all at once equates to a total loss of one's stream of
consciousness. Small losses can be accommodated, and if copies of
individual sectors can be used to augment one's bioware, then one could
eventually, slowly, transfer one's consciousness bit by bit to a new
substrate.
Here's a metaphor: you are an octopus in a bottle with a small neck.
The bottle is opened and connected to another small-necked bottle that
has food in it (gotta have an attractor). The octopus could:
A) Bud off a clone of itself and send it to the new bottle while the
clone is still baby-sized. The downside is that the original octopus is
still in the old bottle and getting hungrier, while the new octopus
doesn't seem to give a whit.
B) Slowly move, one leg at a time, into the new bottle (I've seen this
happen) until finally it squeezes its head through the neck and is in
the new bottle, munching away happily.
>
> To my mind this does indeed give some support to the idea that information
> can exist external to the physical universe, but certainly not to the idea
> that there is a fundamental difference between a copy and an original or that
> exactly the same information can't be encoded in 2 very different ways.
>
>
> >although our intelligence could be extended fairly easily,
> >our consciousness is irremovably tied to the overall
> >structure of our brain.
>
> I think that intelligent behavior and consciousness must be inextricably
> linked, otherwise I don't see why we would be aware at all. However important
> subjective feelings may be to us, Evolution is only interested in behavior
> because only behavior enhances survival. Evolution would never have given us
> consciousness unless it was needed for intelligence, that's one reason I
> think the Turing Test works.
Right, and no intelligent being would freely give up his or her own
consciousness for another, unless the stakes were pretty grim and in
your face.
We want to be immortal and uploaded as ourselves, not as somebody else
who thinks "they is us."
Mike Lorrey