> The subjective experience of "Red" is a consequence of the
> neurophysiological state of seeing (or imagining) light of a certain
> frequency.
Yes, but imagining red and actual red from real 700nm light
are different experiences, each undoubtedly the consequence of
different though possibly similar neurophysiological states and/or
systems.
> We can verify analogous neurophysiological states for various
> sounds, bodily sensations, smells, tastes, etc..
> Given this, can we not reasonably suppose that the subjective
> experience is a consequence of the neurophysiological state
> triggered by the stimuli?
Yes of course.
> If this is the case, can we not learn to distinguish those
> "neuroqualia" which are essentially the same among all humans from
> those neuroqualia which vary among individuals?
Once we understand what "neuroqualia" are yes. I can't
imagine any reason that would prevent such.
> If so, would not the first set represent a set of "experiences"
> which we could then share with another individual by instantiating
> the same neurophysiological state in that individual?
> Indeed, among individuals of different species, or between (for
> example) a human and an AI, could we not construct a mapping table
> that allowed us to translate these experiences to the appropriate
> internal representation?
I described an "effing" process earlier in this thread in
which a sensing machine observes the neurophysiological firings of all
relevant "neural correlates" to a particular sensation. Then this
information is communicated to another brain which is augmented with a
cortex that is able to produce generic sensations in the consciousness
of that person. Upon receiving the information from the sensor
observing sensation process in the other brain, it would produce in
the consciousness of the augmented brain an identical sensation. It
may be that the person doing the effing might say: "That isn't what
salt tastes like to me".
I can imagine being able to "eff" sensation in other animals,
like say a bat. We would then know what it is like to be a bat. I
also imagine machines or computers being endowed with such generic
qualia experiencing devices so that they too could experience and know
what salty is. I would imagine that computers with such phenomenal
representation abilities would be many times more intelligent and have
much more common sense than any computer that uses mere abstract
representations.
The important thing is is that salty is salty. Sure, you can
abstractly represent any and everything you might want to represent
about salty and instantiate this abstract information any way you
want, but salty is still salty. Anything that is fundamentally
different than salty is not salty even though it may abstractly
represent salty. Only salty is like salty everything else is a mere
and fundamentally different representation.
> 1) This thought experiment may in fact be impossible. Much like me
> imagining (If I could go 10x the speed of light..)
All you would have to do is something like create a city of
streets where red equals go and green equals stop. The driving would
all be the same, but the subjective experience of the drivers would be
different.
> 2) Research has demonstrated a psychological effect of color on
> humans. Different colors produce different levels of stress,
> relaxation, attention, arousal, etc.. in fairly predictable and
> consistent ways. So I ask you, if I switch red and green, and all
> fires are now green, haven't I actually made some difference in the
> meaning of fire to the individual? In the sense that the fire will
> now evoke a different response than when it was red?
I would say yes to these questions. The various levels of
"stress, relaxation, attention, arousal, etc.." is a very complex
process. Sure it could be related to particular qualia or to various
neural responses prior to and/or beyond the particular ones that
produce particular colors because of memory associations and
everything else.
> You assume a gulf between sensation and realization that I do not
> believe exists. Imagine that sensation is a consequence of the
> physical system that instantiates the information that is
> represented in consciousness. Or, put another way, sensation /is/
> the stuff of consciousness. A being that represents salty
> internally in a sufficiently rich manner necessarily has an
> "experience" of salty.
Yes, any old abstract representation can be sufficiently rich
to represent or model salty. But only my salty is precisely like my
salty.
> Hmm. You posit an incredibly simple machine and then grant it
> intelligence, which immediately strikes me as contradictory.
Sorry for the confusion. I'm talking about the kinds of
abstract machines people produce these days which use abstract look up
tables and such to try to carry on conversations like humans do. I'm
saying that a poor programmer of intelligence would try to make his AI
be a liar and try to have it spout verbiage a human might spout, via a
lookup table or whatever, when asked what salty is. A truly
intelligent AI would logically recognize that it has no real
subjective experience of salty or whatever and would try to explain
this in any conversation on the topic, unless of course it was really
trying to deceive someone that it was not a computer and intentionally
being a liar.
> So, long and short: if you had a significantly different neural
> structure from me, I would expect the experience of tasting
> something salty to be quite different for you from me.
> Given our species-born similarities (I'm just guessing here that
> you're of my species. :) ) I presume that our experiences are quite
> similar.
Yes, of course. But until we discover what salty really is
and can do things like eff such sensations, produce the same in
artificial machines and so on, we just don't yet know for sure. But
we do know what our sensations are like and that they are phenomenally
real. Red is most definitely not anything like salty. We know such
things more than we know anything else for indeed everything we
consciously know is represented by such.
Brent Allsop