Re: The copy paradox

From: Brent Allsop (allsop@swttools.fc.hp.com)
Date: Tue Nov 11 1997 - 15:46:27 MST


Anders Sandberg <asa@nada.kth.se> responded:

> Actually, I consider this "stuff of consciousness" to be
> information, or more accurately instantiated information. So I have
> no problem imagining a computer experiencing the color red if it has
> the right software, and transferring my mind into the computer is
> just a problem of replicating the pattern. But I cannot be sure
> until I try.

        Yes "instantiated information" is more accurate. Instantiated
information is represented by something fundamentally real. The state
of a transistor can be an instantiation or representation of the word
stop. So can the color red. A distinguishably different state of
that same transistor can be an instantiation of the word go. And
green could similarly be taken to represent go. But, there are some
important qualities that red/green has that the particular states of
the transistors do not have. It doesn't matter what the fundamental
state of the transistor is as long as it is interpreted correctly. If
you reverse the interpretation and the setting of the value the
abstract meaning stays the same even though the fundamental or
physical representation is now opposite. If we swap the meaning of
red and green, the abstract meaning can be the same, but the
fundamental subjective or phenomenal experience is very different.
Such phenomenal qualities of experience must be taken into account in
order to reproduce consciousness.
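
        To make the swap concrete, here is a toy sketch (the names and
values are purely my own illustration, not anything Anders proposed):
invert both the stored state and the rule for reading it back, and the
abstract meaning comes out exactly the same.

    # Toy sketch: which physical state stands for "stop" or "go" is
    # arbitrary, as long as the interpretation is swapped to match it.

    def read_signal(transistor_state, convention):
        # 'convention' maps a raw physical state to its abstract meaning.
        return convention[transistor_state]

    convention_a = {0: "stop", 1: "go"}   # low voltage means stop
    convention_b = {1: "stop", 0: "go"}   # the reversed wiring

    # Opposite physical representations, same abstract meaning:
    assert read_signal(0, convention_a) == read_signal(1, convention_b) == "stop"
    assert read_signal(1, convention_a) == read_signal(0, convention_b) == "go"

The abstract level is indifferent to the swap; the phenomenal red or
green is not.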

        Would you really be happy if all you knew of "salty" was
whether or not some abstract salty bit was set, and all you had
was a corresponding lookup table containing all the strings people
had ever used to try to verbally describe what salty is like? "It's
kind of a puckery non-sweet taste" just contains no meaning at all
about what salty is really phenomenally like. The difference is the
actual fundamental and phenomenal nature of the particular
"instantiation", and this difference is most definitely important to
consciousness.
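
        Put as a toy data structure (again, just my own illustration),
everything such a system "knows" about salty is one flag plus other
people's words; nowhere in it is the taste itself.

    # A purely abstract "knower" of salty: one bit plus borrowed strings.
    # Nothing here instantiates what salty is actually like.

    salty_bit_is_set = True

    salty_descriptions = [
        "It's kind of a puckery non-sweet taste",
        "like sea water",
        "what potato chips have that unsalted ones lack",
    ]

    def describe_salty():
        # Returns words about salty, never the phenomenal quality itself.
        return salty_descriptions if salty_bit_is_set else []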

        How many abstract bits of information would it take to store
the various possible responses to the question "What is salty?" well
enough to carry on a discussion? Wouldn't an abstract machine
producing such responses really be a liar, since it had no idea what
salty was phenomenally like?

        How many bits do you think the brain uses to reproduce such
"information"? Or does it simply know what salty is like? And might
such a phenomenal ability to "instantiate information" be one reason
for the "common sense" intelligence that abstract machines still
lack?

        Wouldn't an honest and intelligent abstract machine be able to
recognize that it doesn't really know what salty is like, just as you
must admit that you don't know what salty is like for me, unless salty
is the same for me as it is for you?

        If you get a chance, you should ask Marvin Minsky what he
thinks about this kind of stuff. I believe he once thought very
similarly to the way you now think.

                Brent Allsop


