From: Charles D Hixson (charleshixsn@earthlink.net)
Date: Wed Aug 30 2006 - 11:45:36 MDT
Olie Lamb wrote:
> ...
> There is a large set of possible goals for an intelligent entity to
> have. A number of us happen to think that no particular goal is
> required for intelligence. Some people frequently assert that some
> goals are "necessary" for any intelligence. I've yet to have
> difficulty finding a counterexample, but I'm not quite sure how to go
> about demonstrating my contention...
> ...
> --Olie
I would assert not that goals are necessary for the existence of
intelligence, but rather that goals are necessary for an intelligence to
act, even in an act as basic as deriving a conclusion.
OTOH, goals could be either explicit or implicit. A program could be
designed to automatically derive all possible conclusions from the input
data (until it was interrupted). In this case I would assert that it
had an implicit goal, even though no explicit goal was given anywhere.
So perhaps a part of the question is "what do you mean by goal?"
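A minimal sketch of the sort of program I mean (my own illustration, so
the names and the rule format are assumptions, not anything from a real
system): it forward-chains from the input facts, deriving every
conclusion it can until it reaches a fixed point or is interrupted, with
no explicit goal stated anywhere.

    def forward_chain(facts, rules):
        """Derive all reachable conclusions; the only 'goal' is the
        implicit tendency to keep producing new facts."""
        derived = set(facts)
        changed = True
        while changed:  # runs to a fixed point (or until interrupted)
            changed = False
            for premise, conclusion in rules:
                if premise in derived and conclusion not in derived:
                    derived.add(conclusion)
                    changed = True
        return derived

    # Example: simple implication rules (premise -> conclusion).
    rules = [("socrates_is_a_man", "socrates_is_mortal"),
             ("socrates_is_mortal", "socrates_will_die")]
    print(forward_chain({"socrates_is_a_man"}, rules))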
I guess that my first stab would be "the tendency to achieve a selected
subset of results out of all possible results". I tend to model it
(mentally) as an attractive force...but that doesn't translate easily
into multiple contexts. Still, if consciousness is chaotic, think of
goals as "attractors". (Consciousness does have some features that
appear to be chaotic, but I haven't convinced myself that it really is
chaotic, and I haven't seen any advantage in investigating it closely.
Still, the mutability of memory feels like it has some implications
here. If so, then goals may literally be attractors.)
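To make the attractor reading concrete, here is a toy sketch (again my
own construction, with made-up numbers): the state is repeatedly nudged
toward a fixed point of the update map, so "achieving a selected subset
of results out of all possible results" falls out of the dynamics alone,
with no goal represented explicitly.

    def step(state, attractor, rate=0.5):
        # x' = x + rate*(g - x): g is a fixed-point attractor of this map
        return state + rate * (attractor - state)

    state, goal = 0.0, 10.0
    for t in range(10):
        state = step(state, goal)
        print("t=%d: state=%.4f" % (t, state))  # converges toward 10.0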