From: John Clark (jonkc@worldnet.att.net)
Date: Mon Nov 29 1999 - 15:07:06 MST
Brent Allsop <allsop@fc.hp.com> wrote on November 29, 1999:
>Color detecting machines are far more aware, intelligent and functional
> than we are about color.
I don't believe that's true even about color in the abstract, and today's
machines certainly know much less than we do about the way color interacts
with the rest of the universe, which is much more important.
>The kicker difference is: when you ask a human what "red" is
>like, he honestly tries to express the very real and phenomenal
>sensation.
And a human always fails miserably when he tries to explain what
the sensation "red" is like, perhaps because he doesn't really
understand "red" as well as he thought he did, but more likely because
red is not "like" anything except red.
>Any abstract machine
Huh?
>that represented color with mere abstract representations rather
>than a real phenomenal qualia
It seems to me that phenomenal qualia are certainly real but about as
abstract as you can get.
>abstract representations of knowledge and awareness information aren't
>phenomenal or conscious at all.
There is absolutely no way, even in theory, that claim could ever be proven
right or wrong, so it has nothing to do with science. Yet I have a very strong
hunch it's wrong, because Darwin's theory of evolution has worked wonderfully
well in explaining things, and if what you say is true then Darwin is definitely
wrong: no creature should be conscious, but I know for a fact that at least
one of them is.
If you want to investigate consciousness, start with the only example you
know of with certainty, but don't ask yourself how it works because that's
too hard; ask yourself what physical mechanism would demand that it exist.
That's what was so revolutionary and made people so mad when
Darwin wrote his book in 1859: things that experience consciousness
could be produced by randomness and natural selection; mumbo jumbo
was no longer needed. I need hardly mention that natural selection cannot
select for consciousness, so the only alternative is that it hitches a
free ride on intelligence's coattails. The same will be true of our machines.
>If a machine really had been given the proper machinery we
>have to produce, say, a salty qualia, it could then honestly respond
>with something like: "Oh THAT'S what salt tastes like!".
But that's exactly the problem: there is no way you can ever prove the
computer has the proper machinery, and you could say exactly the
same thing about your fellow human beings.
>if it was representing the sodium chloride with abstract numbers or
>something, though it could easily act even more convincing, it would
>still be blatantly lying
How would it even know it was lying, for that matter how would you?
Perhaps you were born with no sense of taste, what you call "salty"
is just a pale pitiful reflection of the true powerful sensation everyone
else has.
>about the fact that it could feel what a salt quale
>was like
Although it can be produced in countless ways (various chemicals,
nerve stimulation, hypnotic suggestion, etc.), I see no evidence that
a salt quale is like anything at all except itself; I think that's why we
can't describe it, and neither can a machine.
John K Clark jonkc@att.net
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:05:53 MST