From: Francois-Rene Rideau (fare@tunes.org)
Date: Mon Jun 11 2001 - 16:37:35 MDT
On Sun, Jun 10, 2001 at 03:38:39AM -0700, Samantha Atkins wrote:
> Mike Lorrey wrote:
>> Yes. Take humongous lookup tables, some versatile pattern matching and
>> association programs, and a self teaching chatterbot, and you've got something
>> with some sort of sentience.
> No, you have a complicated chat machine. There is nothing in
> the above construction that calls for or is likely to lead to
> any sort of self-awareness. Without that, I would not grant it
> has "sentience".
So what? The hypothesis says "humongous lookup table".
You have to argue why such a table couldn't include enough special cases
to deal with the environment of the machine, modulo the mechanisms
that reduce the I/O/S to the lookup table.
One interesting theoretical argument would be one of size and complexity:
the lookup table might have to grow exponentially in size
with the complexity of the environment, whereas the size and complexity
of the environment itself will tend to grow only asymptotically linearly
with the size of the table (since the possibilities of interaction between
the environment and the table increase with its size). Hence, there is
a technology-dependent limit on the complexity of behaviour aware of
self-vs-environment interaction that can be effectively encoded
in a lookup table, as opposed to other mechanisms that may constitute
a conscious mind. Now of course, if the limit is large enough, that
mightn't even be a problem.
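To make the size argument concrete, here is a back-of-the-envelope sketch
(my illustration, not part of the original exchange): a pure lookup table
that maps every possible input history of length n, over an alphabet of k
symbols, to a canned response needs k**n entries, so its size is
exponential in the amount of history it can distinguish.

```python
# Illustrative sketch (assumption: a chatterbot implemented as a pure
# lookup table over input histories; names here are hypothetical).
# With an alphabet of k symbols and histories of length n, the table
# needs k**n entries -- exponential in n -- which is the size blow-up
# the argument above appeals to.

def table_entries(alphabet_size: int, history_length: int) -> int:
    """Number of entries a pure lookup table needs in order to respond
    to every possible input history of the given length."""
    return alphabet_size ** history_length

# Even a binary alphabet doubles the table with each extra symbol
# of remembered history:
for n in range(1, 6):
    print(n, table_entries(2, n))
```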
In practice, who gives a damn, either way?
NB: the computer's memory is but a humongous lookup table,
and the CPU is its versatile pattern matching and association program.
Any digital pattern can be encoded as a program on a digital computer.
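A minimal sketch of the NB's point (my toy example, not the poster's
code): any finite digital function can be re-encoded as a table the
machine merely looks up, so "it's just a lookup table" does not by
itself distinguish it from any other digital computation.

```python
# Toy illustration (assumption: names and example are mine): the same
# digital behaviour expressed once as a computation and once as a
# lookup table -- here, XOR of two bits as a dict instead of a circuit.

def xor_computed(a: int, b: int) -> int:
    """XOR computed directly by the CPU."""
    return a ^ b

# The same behaviour, as a humongous-in-miniature lookup table:
XOR_TABLE = {(a, b): a ^ b for a in (0, 1) for b in (0, 1)}

# Both encodings are behaviourally indistinguishable:
for a in (0, 1):
    for b in (0, 1):
        assert XOR_TABLE[(a, b)] == xor_computed(a, b)
```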
Do you, or do you not, believe that "artificial intelligence"
can be achieved with digital computer technology?
>> Adding more pattern matching, associating, and learning programs gives
>> it more sentience. Keep in mind that a cockroach has
>> sentience dictated by less than a handful of pattern matching programs...
>> our level of sentience requires a bit more than that...
> I don't call unconscious pattern matching sentience.
You're just evading the problem with what looks like an emotional response.
Why would the pattern matching be "unconscious"?
Just what makes you "conscious"?
[ François-René ÐVB Rideau | Reflection&Cybernethics | http://fare.tunes.org ]
[ TUNES project for a Free Reflective Computing System | http://tunes.org ]
Man usually avoids attributing cleverness to somebody else -- unless it
is an enemy.
-- Albert Einstein
This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 08:08:05 MST