John K Clark replied:
<I don't think the two are as strictly segregated as some think, and I
don't think you need humans or even intelligence for semantics. A
sequence of amino acids MEANS a shape, the way the resulting protein
folds up in water at the temperature and acidity found in life.>
OK. I should have written that these two measures are *mostly*
syntactic.
http://turing.pacss.binghamton.edu/dietrich/table-of-contents.html
provides an outline of a new book called _Thinking Computers and
Virtual Persons_ edited by Eric Dietrich. One of the papers is called
"Syntactic Semantics" by William J. Rapaport. The following is from
the summary of this paper:
< Rapaport argues, contra Searle, that computers can understand
natural language and thus be humanly intelligent . . . Rapaport agrees
that computers are syntactical, but argues that syntax suffices for
semantics, hence computers are not purely syntactical. The internal
relationships between representations turn out to be sufficient for
semantics. Rapaport's arguments are made all the more compelling
because he and his colleagues have a system that understands some
English.>
This seems similar to your argument using the DNA code. I haven’t
read the Rapaport paper, and you both seem to have a point that there
is some semantics in the syntax; however, I intuitively question
whether *that’s all* there is to semantics. (In the same spirit that I
question statements like "evolution is stupid" or "evolution is
blind". And, no, I’m not talking about God or ‘spirits’, I’m talking
about levels of control in cybernetics.)
I wrote (without really thinking through the implications) that
complexity could possibly be measured by the number of different
functional relationships a system can maintain, and might further be
weighted by the number of functions it can perform simultaneously.
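(In retrospect, here is roughly the kind of toy scoring I had in mind,
sketched in Python. Every name and number in it is invented purely for
illustration; it is not from Edmonds, and John's objection below still
applies.)

# Toy sketch of the informal measure above: count the distinct
# functional relationships a system maintains, weighted by how many
# functions it can perform at the same time. All values are made up.
system = {
    "membrane": {"transport", "signalling"},
    "ribosome": {"translation"},
    "enzyme_x": {"catalysis", "signalling"},
}
concurrent_functions = 2  # assumed to be runnable simultaneously

relationships = {(part, fn) for part, fns in system.items() for fn in fns}
score = len(relationships) * concurrent_functions
print(score)  # 5 distinct relationships * 2 concurrent functions = 10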
John replied:
<Not a useful definition because we can't know how many relationships
a system can have.>
A good point - I wasn’t really thinking about how *all* those
relationships could be *measured*. Instead, I relied on Bruce
Edmonds' somewhat more formal definition of complexity: "It is
proposed that complexity can usefully be applied only to constructions
within a given language".
John replied:
<A language doesn't need an observer; consider the genetic code. The
nucleotide triplet CAU in messenger RNA is a "real" object; it MEANS
the amino acid histidine, but, I grant you, only in the context
(language) of life; there are no special chemical characteristics that
relate one to the other.>
Perhaps ‘life’ is the observer of that language and organisms are the
speakers. My point, and perhaps Edmonds’ point, is that when we
examine any descriptive language (or formal system) *from the outside*
there is always something at another level that the language fails to
describe.
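(To see John's point in miniature: the codon-to-amino-acid assignment
is nothing but a lookup table. Here is a tiny Python sketch using a
fragment of the standard genetic code; of course the real "interpreter"
is the ribosome and its tRNAs, not a program.)

# A fragment of the standard genetic code as a bare lookup table.
# Nothing in the chemistry of the letters C, A, U forces these
# assignments; the "meaning" exists only relative to the machinery
# (the language of life) that reads the table.
codon_table = {
    "CAU": "histidine",
    "CAC": "histidine",
    "AUG": "methionine",      # also the usual start signal
    "UUU": "phenylalanine",
}

def translate(mrna):
    # Read the message three letters at a time.
    return [codon_table.get(mrna[i:i+3], "?")
            for i in range(0, len(mrna) - 2, 3)]

print(translate("AUGCAUUUU"))  # ['methionine', 'histidine', 'phenylalanine']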
My quote from Edmonds was:
<A definition of complexity is proposed which can be summarised as
"that property of a language expression which makes it difficult to
formulate its overall behaviour even when given almost complete
information about its atomic components and their inter-relations".>
John replied:
<This could be AIC or depth; I can't tell which because Edmonds
doesn't say what he means by "difficult".>
One of the main points of my post was that there are other measures of
complexity besides just the two you mentioned. Edmonds specifically
describes at least five measures: Computational Complexity, Kolmogorov
Complexity, Bennett's Logical Depth, Löfgren's Interpretation and
Descriptive Complexity, and Kauffman's number of conflicting
constraints.
John continued:
<At any rate, by this definition the gibberish produced by a monkey
would certainly be more complex than any article in the journal
"Nature". I want a definition that is not the opposite of our
intuitive understanding of the word; I haven't found it.>
Because, in the monkey example at least, you haven’t described the
system completely enough. The monkey has no knowledge of the language
for which the typewriter is a recording device. So, the monkey plays
with it and produces gibberish. If I were to sit down at a piano
(having no musical knowledge), I’d just produce a cacophony rather
than a symphony because I don’t understand the language of the piano.
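(If a crude experiment helps: using off-the-shelf compression as a
rough stand-in for algorithmic information content, only one of
Edmonds' measures, the monkey's gibberish really does come out "more
complex" than ordered text, which is exactly the counter-intuitive
ranking John objects to. A depth-like measure, which weighs the work
needed to generate a string, would rank them the other way. The sample
strings below are invented for illustration.)

import random, string, zlib

random.seed(0)

# "Monkey at the typewriter": characters drawn with no underlying language.
gibberish = "".join(random.choice(string.ascii_lowercase + " ")
                    for _ in range(2000))

# Ordered text: an (artificially) regular passage standing in for prose.
ordered = ("the protein folds into its native shape in water " * 41)[:2000]

# Compressed size is a rough upper bound on Kolmogorov complexity (AIC).
print(len(zlib.compress(gibberish.encode())))  # stays large: nearly incompressible
print(len(zlib.compress(ordered.encode())))    # collapses: highly compressible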
I’m reminded of something Hara Ra wrote in the post you were
originally responding to:
< Mathematical theorems and conjectures are statements of how to
generate and investigate strings finite in length, grammar and syntax
which describe the mathematical objects and operations in relation to
same (the strings representing the objects only!)>
I’m not very competent at either chemistry or math, but you seem to be
seeking something similar to describe human-level intelligence. I’m
suggesting that there are many levels of complexity operating in
nature and that a reduction of the kind you are suggesting might be
useful for some purposes (possibly for making upload copies) but
wouldn’t suffice for more complex (oh oh, there’s that word again!)
goals (such as the ‘automorphing’ of that upload once it was
established).
Mark Crosby