From: Romana Machado (romana@glamazon.com)
Date: Sun Dec 15 1996 - 23:11:05 MST
In the interest of clear understanding - and of developing a clear way of
saying exactly what extropianism is - I've been looking at the present
definitions of extropy.
Of the two definitions of extropy available on Anders' site
(http://www.aleph.se/Trans/Words/e.html#EXTROPY) I prefer "the collection
of forces which oppose entropy." It's still kind of hard to use, though,
because it assumes that the person you are talking to knows what entropy
is. That's a big assumption, because entropy stumps most people.
Michelle Coltart has published an essay on this topic at:
(http://canyon.ucsd.edu/infoville/schoolhouse/class_html/michelle.html)
>The term "entropy", with its multiple uses, is a confusing and often
>misleading term. In 1949, Claude Shannon defined entropy as a measure of
>the 'amount of information' contained in a message sent along a
>transmission line (Denbigh & Denbigh, p. 101). This is often referred to
>as informational entropy:
>
>  [1] H = -Σ p_i log2(p_i), i = 1...n
>
>where n stands for the number of different symbols. Before Shannon
>introduced the idea of informational entropy, however, the term entropy
>was used in physics. In statistical mechanics, the term entropy refers to
>a very similar equation:
>
>  [2] S_G = -k Σ p_i ln(p_i), i = 1...n
>
>where n stands for the number of different microstates. Are these two
>equations the same? They are certainly similar in appearance, but seem to
>be different in meaning. The fact that the two equations have the same
>name, coupled with the fact that they are so similar, has caused much of
>the confusion surrounding the term entropy.
...
>It is said that when Shannon came up with his equation he was inclined to
>name it either 'uncertainty' or 'information', using these terms in their
>technical sense. However he was subsequently persuaded by von Neumann to
>call it entropy! 'It is already in use under that name,' von Neumann
>[reportedly said], 'and besides it will give you a great edge in debates
>because nobody really knows what entropy is anyway.' (Denbigh & Denbigh,
>p. 105) Thus much of our confusion over entropy stems from this incident.
>Another source of confusion about entropy stems from the lack of
>understanding of physical entropy. Since people tend to erroneously think
>of physical entropy as meaning disorder and chaos, informational entropy
>gets confused with disorder and chaos as well.
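To make equation [1] from the quote concrete, here's a small Python sketch
(mine, not Coltart's or Shannon's - just an illustration of the formula)
that estimates the informational entropy of a message from its symbol
frequencies:

    from collections import Counter
    from math import log2

    def shannon_entropy(message):
        """Informational entropy H of a message, in bits per symbol."""
        counts = Counter(message)
        total = len(message)
        # p_i is the relative frequency of each distinct symbol
        return -sum((c / total) * log2(c / total) for c in counts.values())

    print(shannon_entropy("abab"))      # 1.0 -- two equally likely symbols
    print(shannon_entropy("aabbccdd"))  # 2.0 -- four equally likely symbols

A message in which every symbol is equally likely maximizes H; a message
that repeats a single symbol has H = 0, since there is nothing
unpredictable about it.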
Wasn't the second definition of extropy - "a measure of intelligence,
information, energy, life, experience, diversity, opportunity, and growth"
- recently discussed here? I missed it, so I'm bringing it up again.
If extropy is a measure - a metric - what does it measure? To what system
is it intended to be applied? What sort of thing's extent or dimensions
does it determine?
Some of the qualities it purports to measure can be measured, and some -
well, I'm in the dark.
For instance, there are rigorous and well-accepted ways of measuring
"energy" - at least within physics - and "information" (which, as we saw
above, is the amount of entropy in a signal). "Growth", a change of
dimension over time, is clear enough, as is "diversity", how many
different forms of some thing there are.
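Those last two can be pinned down with almost trivial definitions. Here is
a throwaway Python sketch; the particular choices - relative change over
time, and a count of distinct forms - are just my own illustrations, not
anything official:

    def growth(old_size, new_size):
        """Relative change of some dimension between two points in time."""
        return (new_size - old_size) / old_size

    def diversity(forms):
        """How many different forms of some thing there are."""
        return len(set(forms))

    print(growth(100.0, 130.0))                        # 0.3 -- grew by 30%
    print(diversity(["fern", "fern", "moss", "oak"]))  # 3 distinct forms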
The measure of intelligence is a hotly debated topic, though there are
metrics of questionable validity currently in use. It would be hard, I
think, to come up with a measure of life, a measure of opportunity, or a
measure of experience, though if any of you have a suggestion I'd like to
see it.
Romana Machado romana@glamazon.com
erotic site: http://www.glamazon.com/ "Romana Machado's Peek of the Week"
personal site: http://www.fqa.com/romana/ "Romana Machado World Headquarters"