From: Olie Lamb (neomorphy@gmail.com)
Date: Wed Aug 30 2006 - 18:37:52 MDT
On 8/31/06, Charles D Hixson <charleshixsn@earthlink.net> wrote:
> Olie Lamb wrote:
> > ...
> > There is a large set of possible goals for an intelligent entity to
> > have. A number of us happen to think that no particular goal is
> > required for intelligence. Some people frequently assert that some
> > goals are "necessary" for any intelligence. I've yet to have
> > difficulty finding a counterexample, but I'm not quite sure how to go
> > about demonstrating my contention...
> > ...
> > --Olie
> I would assert not that goals are necessary for the existence of
> intelligence, but rather that goals are necessary for an intelligence to
> act, even as basic an act as deriving a conclusion.
Sorry, I meant that I don't think that any /particular/ goal is
necessary, as in "All intelligences must have the goal to gain
complete control over the universe" or even "all intelligences must
want to spread their memetic material".
I agree with your assertion that having some goal(s) ("explicit" or
"implicit", in your terminology) may be necessary to act. Some goals
are necessary; just not any particular one.
>
> OTOH, goals could be either explicit or implicit. A program could be
> designed to automatically derive all possible conclusions from the input
> data (until it was interrupted). In this case I would assert that it
> had an implicit goal, even though there was nowhere given an explicit
> goal. So perhaps a part of the question is "what do you mean by goal?"
> I guess that my first stab would be "the tendency to achieve a selected
> subset of results out of all possible results". I tend to model it
> (mentally) as an attractive force...but that doesn't translate easily
> into multiple contexts. ...
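Your "derive all possible conclusions" program, made concrete, might look
something like this toy forward-chainer (a sketch of my own, not anything
you wrote; the facts and rule names are invented for illustration):

def forward_chain(facts, rules):
    # facts: a set of strings; rules: a list of (premises, conclusion)
    # pairs, where premises is a tuple of strings.
    derived = set(facts)
    changed = True
    while changed:          # halt only when nothing new can be derived
        changed = False
        for premises, conclusion in rules:
            if conclusion not in derived and all(p in derived for p in premises):
                derived.add(conclusion)
                changed = True
    return derived

facts = {"socrates_is_a_man"}
rules = [(("socrates_is_a_man",), "socrates_is_mortal"),
         (("socrates_is_mortal",), "socrates_will_die")]
print(forward_chain(facts, rules))
# prints all three facts (set order may vary)

Nothing in that code names a goal explicitly, yet it still shows your
"tendency to achieve a selected subset of results out of all possible
results": the deductive closure of whatever input it is fed.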
Still, I've yet to find a decent "technical explanation" of a goal, so I'm
still confused.
This is a very important issue, methinks.
Hint, hint.
--Olie