From: Max More (maxmore@primenet.com)
Date: Tue Aug 19 1997 - 14:44:48 MDT
At 02:03 PM 8/19/97 -0500, you wrote:
>Anders wrote:
>>
>>On Tue, 19 Aug 1997, Prof. Jose Gomes Filho wrote:
>[...]
>>
>>IMHO Extropy is a rather loose term, a bit like "complexity". We have
>>an intuitive feeling for it, but when studied closely it likely falls
>>apart into some more elementary (and perhaps different) concepts such
>>as complexity and -entropy.
>
>Yes, it simply does not have a rigorous, scientific definition as yet.
As I've noted many times, "extropy" is not intended as a technical term. It
is, as Anders says, a loose term. We use it as a label, a metaphor for the
life- and intelligence-promoting things we value. The inverse of entropy
already has a term, "negentropy", which has been given both a thermodynamic
and an information-theoretic definition. I feel no need to attempt a
rigorous, scientific definition of "extropy", since it is not intended to
function in that way.
In regard to negentropy, I am still unsure how to relate the thermodynamic
definition to the information-theoretic one. The former appears to be
objectively well-defined but, as far as I understand it, the latter is not,
since the notion of "information" is not objectively defined. Perhaps
AIT fans can explain this (if they agree that information is not
objectively defined in the context of the definition of entropy). The
difficulty with objectively defining "information" arises because a
signal conveys information only when apprehended by a mind, and the
information content of any signal seems to depend on the existing contents
of that mind.
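[A small illustrative sketch, not part of the original post: the observer-dependence Max describes shows up even in standard Shannon theory, where the information content (surprisal) of the same message differs depending on the probability model the receiver brings to it. The priors below are invented for the example.]

```python
import math

def surprisal_bits(message, model):
    """Shannon self-information of a message, in bits, under a
    given probability model mapping each symbol to its probability."""
    return sum(-math.log2(model[symbol]) for symbol in message)

message = "aab"

# Two "minds" with different prior expectations about the symbols.
uniform_prior = {"a": 0.5, "b": 0.5}  # no special expectations
skewed_prior  = {"a": 0.9, "b": 0.1}  # strongly expects "a"

print(surprisal_bits(message, uniform_prior))  # 3.0 bits
print(surprisal_bits(message, skewed_prior))   # ~3.63 bits: "b" is surprising
```

The same signal carries 3.0 bits for one receiver and about 3.63 for the other; only relative to a fixed model is the quantity objectively defined.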
Max
Max More, Ph.D.
more@extropy.org
http://www.primenet.com/~maxmore
President, Extropy Institute: exi-info@extropy.org, http://www.extropy.org
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 14:44:45 MST