On Fri, 27 Jun 1997, Brent Allsop <allsop@swttools.fc.hp.com> wrote:
>It doesn't matter to the abstract intelligences that we currently
>produce whether the representation is punched on paper tape or stored
>in high speed volatile transistor arrays. What the fundamental nature
>of the representation is like is not relevant to the particular
>information being represented in such a machine.
I agree, except that my intelligence has always concluded that all intelligence
is abstract. What on earth would non-abstract intelligence be like?
>This is very different than us.
I see no difference at all. You will treat this post much the same whether you
read it off a cathode ray tube or a liquid crystal screen, listen to it through
a voice synthesizer, or print it on a dead tree.
>If you completely rewire an abstract computer so that what once
>physically represented red now represents blue and vice versa, there
>would be no difference in behavior of the machine.
Why? There would be a difference in behavior, unless you changed the machine's
memory of color too. The relationship of one color to another, and how that
connects to objects in the real world, would be different. The machine once
said that red, orange and pink were similar; now it wouldn't. It once thought
red and black neckties looked great, but now they look ugly and it never wears
them. This rewiring would be just as disconcerting for an intelligent machine
as it would be for me or you.
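Here is a minimal sketch of what I mean, just an illustration in Python; the
dictionaries, the names, and the 650 nm test case are all my own invention,
nothing Brent actually specified:

    # A toy machine: colors are stored as internal codes, and its learned
    # associations ("memories") are keyed by those same codes.
    SIMILAR = {"red": {"orange", "pink"}, "blue": {"purple", "cyan"}}
    LOOKS_GOOD_WITH_BLACK = {"red"}              # a learned preference

    def perceive(wavelength_nm, wiring):
        # Map a wavelength to an internal color code through the 'wiring'.
        raw = "red" if wavelength_nm > 600 else "blue"
        return wiring.get(raw, raw)

    normal  = {}                                 # identity wiring
    rewired = {"red": "blue", "blue": "red"}     # swap the representations only

    for wiring in (normal, rewired):
        code = perceive(650, wiring)             # 650 nm light
        print(code, SIMILAR[code], code in LOOKS_GOOD_WITH_BLACK)

    # The two runs print different color codes, different "similar colors",
    # and different necktie verdicts, because the stored associations were
    # NOT swapped along with the wiring.  Swap the keys of SIMILAR and
    # LOOKS_GOOD_WITH_BLACK as well and the two runs become indistinguishable.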
>But if you did the same thing in the optic nerve of a human so that
>it now used a red quale to represent blue wavelengths of light and
>vice versa, his favorite color quale would still likely be the same
>sensation which would now represent a different wavelength of light.
Yes, but why wouldn't the same thing be true of a machine? After all,
a machine can have a favorite color too.
>What the world would be phenomenally "like" to him would be
>drastically different after such a change.
A red quale in isolation is meaningless; if all light produced a red sensation,
then color would signify nothing. The red color only has meaning if there is
contrast, if there are other colors that are not red to compare it to. If you
changed my perception of red and blue today then obviously my subjective
experience would be different, as would my behavior, just like the machine's.
If you also changed my memory of those colors, or made the switch the day I was
born, then my behavior today would be no different, just like the machine's.
You say my subjective experience would be different, but you give no evidence
to support your claim.
>His answer to what red and blue were like would become inverted.
You'll have to take my word for it, but I am not a machine, I am a human; even
so, I cannot answer what the sensations of red and blue are, and I'm quite sure
you can do no better. I can give examples but no definition; I can point to red
and blue things, but that's all. I can't say what red and blue are.
>we are trapped inside our own little spirit world and can't YET know,
>other than abstractly, the phenomenal nature of anything beyond our
>skull.
Knowledge is an abstract quantity. Knowledge of anything is abstract.
>Our senses only know abstractly what the world is like. But, this
>will not always be the case.
You don't really think the debate on this issue will EVER come to an end, do
you? OK, it's the year 2525 and you have just thought up a bright shiny new
theory explaining exactly what subjective experience is all about.
How do I know whether it's even approximately correct?
>When we finally objectively discover what and why these fundamental
>qualities of our conscious representation are we will eventually be
>able to pierce and escape this mortal spiritual veil [...] We will
>be able to endow machines with the ability to have more than abstract
>knowledge. "Oh THAT's what salt tastes like!" they will eventually
>be able to honestly say after being properly endowed.
No, you're incorrect; that's not what salt tastes like to me at all, it's not
even close. Prove I'm right. Prove I'm wrong.
Brent Allsop is an intelligent fellow, but he is not conscious; in fact, all
human beings are conscious except Brent Allsop. Brent has some pitiful little
thing he calls consciousness, but compared to the glorious subjective
experience every other human has, it's the difference between a firefly and
a supernova. Prove I'm right. Prove I'm wrong.
John K Clark    johnkc@well.com