From: Leevi Marttila (lm+extropians@sip.fi)
Date: Wed Dec 17 1997 - 21:36:01 MST
In another message Brent Allsop <allsop@swttools.fc.hp.com> wrote:
> I'm sorry I don't have a reference, but there was recently a
> discovery of some chemicals that neurons can release which affect
> large numbers of nearby neurons, even ones that have no direct
> synaptic connections. Any simulation would have to take such
> chemical behavior into account and properly simulate its changing
> effect on the behavior of other neurons. The fact that we are still
> discovering things like this leaves open the possibility of many things.
I did take this into account.
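To make that concrete: such diffuse chemical effects ("volume transmission") can be folded into a simulation alongside ordinary synapses. The following is only a toy sketch under invented assumptions (a leaky threshold model, an arbitrary 1/(1+d) falloff); none of it comes from a real neural model, it just shows that non-synaptic influence is simulable in principle.

```python
import math

# Toy sketch: neurons influence neighbors both through explicit synapses
# and through a diffusing chemical ("volume transmission"). All names and
# parameters here are illustrative assumptions, not a real brain model.

class Neuron:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.potential = 0.0
        self.threshold = 1.0
        self.synapses = []  # (target, weight) pairs

    def fires(self):
        return self.potential >= self.threshold

def step(neurons, release_amount=0.5, diffusion_radius=2.0):
    firing = [n for n in neurons if n.fires()]
    # Synaptic transmission: only along explicit connections.
    for n in firing:
        for target, weight in n.synapses:
            target.potential += weight
    # Volume transmission: the released chemical lowers the threshold of
    # every neuron within diffusion_radius, synaptically connected or not.
    for n in firing:
        for other in neurons:
            d = math.hypot(n.x - other.x, n.y - other.y)
            if other is not n and d <= diffusion_radius:
                other.threshold -= release_amount / (1.0 + d)
    for n in firing:
        n.potential = 0.0  # reset after firing

# Two neurons with no synapse between them: the second is still affected.
a, b = Neuron(0, 0), Neuron(1, 0)
a.potential = 1.0          # a fires this step
step([a, b])
print(b.threshold)         # lowered from 1.0 despite no synaptic link
```

The point of the sketch is only that chemical diffusion is one more state variable to simulate, not something that escapes simulation.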
===
Brent Allsop <allsop@swttools.fc.hp.com> writes:
> Leevi Marttila <lm+extropians@sip.fi>
>
> > What if some advanced civilization replaced one of your neurons with
> > an artificial one that, from the viewpoint of the rest of the brain,
> > behaved like the original? What if they replaced all your neurons,
> > one by one, with artificial ones? Would your consciousness change?
> > Would your phenomena change?
>
> Yes. I think this kind of substitution is a fallacy; it would
> not work the way many think it would.
[...]
> If you simulated the entire process abstractly, you could
> reproduce the behavior, but the subjective experience would not be
> there. Once you replace the real phenomenal red with abstractly
> represented red, the subjective experience is gone. You would
> recognize this as soon as the first part of the visual cortex switched
> to be abstract since it would then be a blind spot in your conscious
> visual awareness.
Are you claiming that replacing some neurons with artificial ones would
change the phenomena, and therefore change consciousness, yet that those
changes would not affect any other part of the brain in any way?
-- LM lm+signature@sip.fi
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 14:45:14 MST