From: Ramez Naam (Exchange) (ramezn@EXCHANGE.MICROSOFT.com)
Date: Tue Nov 11 1997 - 18:52:05 MST
Brent, for a moment consider this proposition:
The subjective experience of "Red" is a consequence of the
neurophysiological state of seeing (or imagining) light of a certain
frequency.
If we traced this through your brain, we would find that red light causes
a certain population of neurons in the V1 area of your occipital cortex
to fire in a particular pattern, while blue light causes a different
firing pattern.
We can verify analogous neurophysiological states for various sounds,
bodily sensations, smells, tastes, etc.
Given this, can we not reasonably suppose that the subjective experience
is a consequence of the neurophysiological state triggered by the
stimuli?
If this is the case, can we not learn to distinguish those "neuroqualia"
which are essentially the same among all humans from those neuroqualia
which vary among individuals?
If so, would not the first set represent a set of "experiences" which we
could then share with another individual by instantiating the same
neurophysiological state in that individual?
Indeed, among individuals of different species, or between (for example)
a human and an AI, could we not construct a mapping table that allowed
us to translate these experiences to the appropriate internal
representation?
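Roughly, I have something like the following in mind. This is just a toy
sketch in Python; the labels, firing-pattern names, and numeric codes are
all invented for illustration, not anything we actually know how to
measure today:

# Hypothetical mapping table between internal representations of shared
# experiences ("neuroqualia"). All names and values are invented.

# Experiences we assume are essentially the same across agents.
SHARED_QUALIA = {"red", "salty"}

# Each agent's own internal representation of those experiences.
HUMAN_MAP = {"red": "V1 firing pattern #7", "salty": "gustatory pattern #3"}
AI_MAP = {"red": 0x0A17, "salty": 0x0B02}

def translate(label, source_map, target_map):
    """Translate a shared experience from one representation to another."""
    if label not in SHARED_QUALIA:
        raise ValueError("not known to be a shared experience: " + label)
    return source_map[label], target_map[label]

print(translate("red", HUMAN_MAP, AI_MAP))
# ('V1 firing pattern #7', 2583)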
Direct answers to some of your points below>>
> From: Brent Allsop [SMTP:allsop@swttools.fc.hp.com]
>
> Anders Sandberg <asa@nada.kth.se> responded:
>
> > Actually, I consider this "stuff of consciousness" to be
> > information, or more accurately instantiated information. So I have
> > no problem imagining a computer experiencing the color red if it has
> > the right software, and transferring my mind into the computer is
> > just a problem of replicating the pattern. But I cannot be sure
> > until I try.
>
> Yes "instantiated information" is more accurate. Instantiated
> information is represented by something fundamentally real. The state
> of a transistor can be an instantiation or representation of the word
> stop. So can the color red. A distinguishably different state of
> that same transistor can be an instantiation of the word go. And
> green could similarly be taken to represent go. But, there are some
> important qualities that red/green has that the particular states of
> the transistors do not have. It doesn't matter what the fundamental
> state of the transistor is as long as it is interpreted correctly. If
> you reverse the interpretation and the setting of the value the
> abstract meaning stays the same even though the fundamental or
> physical representation is now opposite. If we swap the meaning of
> red and green, the abstract meaning can be the same, but the
> fundamental subjective or phenomenal experience is very different.
> Such phenomenal qualities of experience must be taken into account in
> order to reproduce consciousness.
I beg to differ on a number of grounds.
1) This thought experiment may in fact be impossible, much like my
imagining what would happen if I could travel at 10x the speed of light.
2) Research has demonstrated a psychological effect of color on humans.
Different colors produce different levels of stress, relaxation,
attention, arousal, etc., in fairly predictable and consistent ways. So
I ask you: if I switch red and green, and all fires are now green,
haven't I actually made some difference in the meaning of fire to the
individual, in the sense that the fire will now evoke a different
response than it did when it was red?
>
> Would you really be happy if all you knew of "salty" was
> whether or not some abstract salty bit was set or not and all you had
> was a corresponding look up table containing all the strings people
> had ever used to try to verbally describe what salty is like? "It's
> kind of a puckery non sweet taste" just contains no meaning at all
> about what salty is really phenomenally like. The difference is the
> actual fundamental and phenomenal nature of the particular
> "instantiation" and this difference is most definitely important to
> consciousness.
You assume a gulf between sensation and realization that I do not
believe exists. Imagine that sensation is a consequence of the physical
system that instantiates the information that is represented in
consciousness. Or, put another way, sensation /is/ the stuff of
consciousness. A being that represents salty internally in a
sufficiently rich manner necessarily has an "experience" of salty.
>
> How many abstract bits of information would it take to store
> the various possible responses to the question: "What is salty?" in a
> discussion kind of way? Wouldn't an abstract machine producing such
> responses really be a liar since it really had no idea what salty was
> phenomenally like?
>
> How many bits do you think the brain uses to reproduce such
> "information"? Or does it simply know what salty is like? And might
> such phenomenal abilities to "instantiate information" be one reason
> for such "common sense" intelligence that abstract machines still
> lack?
>
> Wouldn't an honest and intelligent abstract machine be able to
> recognize that it doesn't really know what salty is like, just as you
> must admit that you don't know what salty is like for me, unless salty
> is the same for me as it is for you?
Hmm. You posit an incredibly simple machine and then grant it
intelligence, which immediately strikes me as contradictory.
Clearly humans do not represent "salty" with an on/off bit. Instead we
have thousands of sensors that trigger the firing of at least tens of
thousands of neurons, with varying firing rates for each neuron and
various spatial firing patterns throughout the region. This is a great
deal of information. To me it seems completely intuitive that we would
represent this via what our folk psychology calls a "sensation" rather
than some "abstract" knowledge as you suggest. The closest thing we
have to "abstract" knowledge of salty is the association of the
word-label "salty" to the sensory state we experience upon tasting salt.
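As a toy illustration of the contrast (every number below is invented,
and Python is just a convenient notation):

# Toy contrast: an "abstract salty bit" versus the kind of high-dimensional
# population state the brain appears to use. Numbers are invented.
import random

salty_bit = True  # the abstract machine: one bit, set or not

# Stand-in for the richer representation: firing rates (spikes/sec) for
# tens of thousands of gustatory neurons.
NUM_NEURONS = 20000
firing_rates = [random.uniform(0.0, 120.0) for _ in range(NUM_NEURONS)]

print("values in the abstract representation:", 1)
print("values in the population representation:", len(firing_rates))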
So, long and short: if you had a significantly different neural
structure from mine, I would expect the experience of tasting something
salty to be quite different for you than it is for me.
Given our species-born similarities (I'm just guessing here that you're
of my species. :) ), I presume that our experiences are quite similar.
mez