Harvey Newstrom <newstrom@newstaffinc.com> writes:
> Hal <hal@finney.org> wrote:
> > I may have been unclear about this "replay". The brain is entirely
> > normal and fully functioning. It's just that we "happen" to be giving it
> > exactly the same inputs we did on an earlier run.
>
> I must have missed something. If the second brain is normal and fully
> functioning, how is it different than the first brain or any brain? What is
> being replayed here? The test? I thought you had something hooked up to the
> brain that was forcing it to replay the same thoughts it had earlier. If its
> thoughts are being dictated from the outside, it is not conscious or
> generating thought. If its thoughts are being self-generated from the
> inside, then it is conscious and generating thought.
>
> I guess I am having a hard time following all your examples. What exactly
> is it that you are trying to prove? Every example seems to devolve into a
> discussion about the minutiae of the example. What is the overarching
> supposition you are trying to support?
I am sorry that my examples have seemed confusing. Here is what has happened: I first gave the divided-brain experiment. You responded that there is no consciousness there, because there is no flow of information; it is just a pattern being imposed from the outside.
I then said: if that were the case, if flow of information, connection, and causality were the true determining factors of consciousness, then how would you explain what is happening in this *other* experiment?
The divided-brain experiment raises this difficulty for the pattern model, and the other experiment, which I tried to describe to you yesterday, raises it for the information-processing, connection-based, causality-based model that you support.
Does that help give you an overview of what I am trying to do here?
My message with the *second* experiment, the one which attempts to lay
the groundwork for showing that causality is not a coherent concept for
a model of consciousness, is at
http://www.lucifer.com/exi-lists/extropians/4335.html. I will explain
it again and perhaps try to eliminate any superfluous elements.
In brief: we record the signals that each neuron receives during a run in which the brain is functioning normally and, by your premises, conscious. We then substitute the recorded signals for the live ones, so that each neuron is driven by the recording rather than by its neighbors, and its own responses go nowhere. Now we claim that the result is not "actually" a case of functional information processing, but merely a passive replay. This is essentially the same as the case where you said there would be no consciousness.
The point is that we took a situation where the brain was conscious, by the premise of causal-connectivity-based functionalism, and by substituting one set of signals for an identical set of signals, which is arguably no substitution at all, we produced a brain that is passively running a replay. So either this seemingly ineffectual substitution has eliminated consciousness, which seems hard to understand, or passive replays are as conscious as functional brains, which you deny.
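To make the substitution concrete, here is a toy sketch in Python. The three-unit "brain", its wiring, and its update rule are of course made up purely for illustration; the only point it shows is that feeding each unit the identical recorded signals, instead of the live ones, produces exactly the same sequence of states:

def step(state, inputs):
    # Deterministic update rule for one unit (purely illustrative).
    return (state + sum(inputs)) % 7

# Unit i receives the outputs of the units listed here.
wiring = {0: [1, 2], 1: [0, 2], 2: [0, 1]}

def run_live(initial, steps):
    s, states, tape = list(initial), [list(initial)], []
    for _ in range(steps):
        inputs = {i: [s[j] for j in wiring[i]] for i in wiring}  # live signals
        tape.append(inputs)                                      # record them
        s = [step(s[i], inputs[i]) for i in wiring]
        states.append(list(s))
    return states, tape

def run_replay(initial, tape):
    s, states = list(initial), [list(initial)]
    for inputs in tape:
        # Every unit is driven by the recording; its own output is
        # computed but never delivered to any other unit.
        s = [step(s[i], inputs[i]) for i in wiring]
        states.append(list(s))
    return states

live_states, tape = run_live([1, 2, 3], steps=5)
assert run_replay([1, 2, 3], tape) == live_states  # identical, signal for signal

The assert is the whole point: unit by unit and signal by signal, nothing distinguishes the live run from the replay, yet on the causal-connectivity view the first run is conscious and the second is not.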
Hal