From: ChuckKuecker (ckuecker@mcs.net)
Date: Tue May 05 1998 - 21:23:15 MDT
At 11:34 5/5/98 -0700, you wrote:
>
>Wouldn't the functionalist perspective suggest that consciousness is not
>an "incidental" by-product, but rather an "inevitable" one? In other
>words, you automatically get consciousness when you have the kind of
>complex system Damien is describing. Any system which has a sufficiently
>rich representation of itself and the world is conscious - that is what
>consciousness is.
I think the determining factor is whether the organism has an internalized sense of
being - it 'knows' it is an individual, separate from the world outside.
This is easily felt when one works with animals - but it is terribly hard to
devise a test that proves it. I 'know' I am conscious - but how can I prove I
am not a cunningly written 'Eliza' program?
>One problem with this suggestion is that it implies that a relatively
>simple computer program, say CYC with its thousands of interrelated
>facts about the world, would be conscious to some degree. Such a
>program is far from being able to pass the Turing test, and we might
>not be comfortable setting the bar for consciousness so much lower than
>that for human equivalent intelligence. But on the other hand, it
>appears in nature that conscious awareness is in fact much easier to
>produce than human intelligence, so perhaps this is not so objectionable
>after all.
>
That is the key - awareness. A computer program filled with facts may not
actually be aware of the length of the Nile, but it can still tell you that fact
on query.
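To make the point concrete, here is a minimal sketch (Python, with a made-up
fact table and query function - not how CYC or any real knowledge base works)
of a program that can answer the question without any awareness of what it is
saying:

# Hypothetical illustration: stored facts answered on query,
# with no internal sense of what the answer means.
FACTS = {
    "length of the nile": "about 6,650 km",
    "capital of france": "Paris",
}

def answer(query: str) -> str:
    # Return a stored fact if the query matches, else admit ignorance.
    return FACTS.get(query.lower(), "I don't know.")

print(answer("Length of the Nile"))   # retrieval, not awareness

The lookup succeeds, but nothing in the program corresponds to 'knowing' it is
an individual separate from the world - which is the distinction at issue.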
Chuck Kuecker