From: Andrea Gallagher (drea@alumni.stanford.org)
Date: Wed Apr 02 1997 - 13:29:23 MST
At 11:08 PM 3/30/97 GMT, Guru George wrote:
>
>On Sun, 30 Mar 1997 21:25:49 +0200 (MET DST)
>Anders Sandberg <nv91-asa@nada.kth.se> wrote:
>
>[snip]
>>It would be meaningless to upload into a system that didn't duplicate the
>>limbic system - why upload only *part* of the brain? Uploading will most
>>likely encompass everything from the brainstem and upwards (and I'll
>>admit that I have been thinking about the spinal cord too). The point of
>>the Strong AI hypothesis is that duplicating emotion only is a matter of
>>the right software structure, the hardware is irrelevant as long as it
>>can run it.
>>
>I get you. Yes, the types of hardware used are irrelevant - it might
>even be found expedient to use some sort of genetically engineered
>brain-like substance! I just wanted to clarify (more for myself than
>anyone else) that *some* implementation of R-complex and limbic
>*functions* would be necessary for fully human experience as an
>uploaded entity. And I see what you mean in that it would make an ideal
>first base camp for AI, being relatively less complex than neocortical
>functions, therefore easier to implement.

But isn't it all a matter of where you decide to draw the boundary between
mind and other? What do you see the limbic system doing: processing input
and turning it into emotion, or packaging input into a form that lets the
rest of the brain turn it into emotion? I don't really know much about the
limbic system, so it may well be true that a lot of the necessary
processing for emotion gets done there.

I reject Gregory's argument that emotion is non-cognitive, distinct from
"thought". I suspect that what Gregory is calling "cognition" is only the
stuff that he is aware of, that sounds like language in his head, that
seems coherent and algorithmic. Cognition is really a matter of
multitudinous processes acting on a wide variety of data inputs, much of
which never percolates up to what we call consciousness. If vision is part
of this process, there's no reason to think that emotion isn't. Emotion is
just one way the mind/brain chooses to represent and respond to some of
the input it gets. If we replicate the brain, I bet we replicate emotion.

On the other hand, I also bet that it's critical that an uploaded mind get
enough sensory input, and that it's in the right format. If I were
uploaded into something that replicated my brain, I wouldn't be worried
that I couldn't feel emotions. I would be more worried that I wouldn't
have enough to "think" about to stay sane. That's the nice thing all those
nerves in the body and spinal column provide: lots of information about
the world.

Drea
(Who also recently returned to Extropians after a long hiatus, but expects
to leave as soon as she gets a job.)