From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Mar 08 2002 - 12:20:40 MST
hal@finney.org wrote:
>
> It's also hard to see how anyone could put a message into Pi. That
> is such a simply defined number, how could it have a message? It's
> almost provably impossible. It would make more sense to put a message
> into Newton's gravitational constant G, something physical rather than
> mathematical. Of course it's a lot harder to find the 100th digit of
> G than to do so for Pi.
Pi certainly appears to be the product of a very simple mathematical
sequence, one that can be expressed in a number of simple equivalent ways.
The computational complexity of Pi is very, very low. To insert an extended
message into Pi, you would need to insert the computational complexity of
that message into Pi. The total amount of computational complexity
available to "construct" a message in Pi is the complexity of the formula
for Pi, plus the complexity of the starting point within the sequence, plus
the complexity of the base used for the sequence. Since the total
computational complexity of the formula for Pi is so small, selecting among
the possible computable processes of approximately this complexity does not
add enough degrees of freedom to place an extended complex message. The
only way to "insert" a message into Pi is if, by some impossibly fortuitous
coincidence, it is already there - if some number whose computational
complexity is in the range of Pi already contains a complex message in its
reasonably early digits, in some reasonably small base. In that case we
could imagine aliens constructing a universe that has such a number as a
mathematical constant, with the number containing a message. Now it is
theoretically possible that one of the simple numbers contains a long
message from aliens by pure coincidence, but the improbability of this
scales as 2^(message length) - the chance is on the order of
2^-(message length).
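A rough back-of-the-envelope sketch of both halves of this argument - the
degrees-of-freedom budget, and what coincidence actually buys you in the
early digits of Pi. Every specific number here (the 100-bit formula budget,
the offset and base ranges, the sample message) is an assumption chosen
purely for illustration, not anything from the original argument:

```python
import math

# --- Part 1: the degrees-of-freedom budget (all numbers are assumptions) ---
# Suppose a message-inserter may choose among short pi-like formulas (say,
# any program under ~100 bits), a starting offset up to a million digits,
# and any base up to 36. The total selection budget in bits:
formula_bits = 100
offset_bits = math.log2(10**6)   # ~19.9 bits to pick the starting offset
base_bits = math.log2(36)        # ~5.2 bits to pick the base
budget = formula_bits + offset_bits + base_bits   # ~125 bits total

# Even a modest 28-character ASCII message needs far more bits than that:
message_bits = 8 * len("CONTACT US AT ALPHA CENTAURI")  # 224 bits

# --- Part 2: what coincidence buys you in practice ---
# Gibbons' unbounded spigot algorithm streams the decimal digits of pi.
def pi_digits():
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4*q + r - t < n*t:
            yield n
            q, r, n = 10*q, 10*(r - n*t), (10*(3*q + r))//t - 10*n
        else:
            q, r, t, k, n, l = (q*k, (2*q + r)*l, t*l, k + 1,
                                (q*(7*k + 2) + r*l)//(t*l), l + 2)

gen = pi_digits()
digits = "".join(str(next(gen)) for _ in range(1000))

# Short patterns do turn up by luck: the famous run of six 9s (the
# "Feynman point") begins at decimal position 762.
position = digits.find("999999")

# Expected chance occurrences of an L-digit pattern in N random digits is
# about N / 10^L - already tiny for L = 6, and vanishing for anything
# message-sized.
expected = 1000 / 10**6
```

The point of the sketch: the selection budget grows only logarithmically in
the offset and base, while the improbability of a coincidental message
grows exponentially in its length.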
To violate these constraints and place an arbitrary, long, high-complexity
message into Pi would require violating the basic rules of information
theory, mathematics, and logic. The aliens would have to exert fine-grained
control over arbitrary details of the construction of alternate arithmetics
which appear entirely natural and computationally simple to the inhabitants
of that universe - such that the global rules appear to arise from a set of
simple local rules, with no extraneous causes perceived as interfering with
the local rules. And yet the external aliens can decide to make 12*12=160,
altering only this one fact and none of those around it, making the
universe conform to that rule in all ways, on both microscopic and
macroscopic levels, without causing the arithmetic users to perceive an
extraneous cause or an inconsistency. In other words, the aliens would need
to violate the fundamentals of logic and make simple causes with no degrees
of freedom give rise to complex effects which have many degrees of freedom.
They would need to 'make' simple causes give rise to complex effects
without the intervention being visible in any way as additional complexity.
I would sooner believe that the aliens had total control of every atom in my
computer and were messing with the computation than that they could insert
complex messages into Pi. If I personally did the arithmetic, I would
sooner believe that they were messing with my neurons. If somehow I had
absolute assurance that the arithmetic really did check out, and arithmetic
still seemed simple to me, I would sooner believe that they were messing
with my neurons to cover up the signs of their intervention in arithmetic
and causing me to falsely perceive arithmetic as simple and having no
degrees of freedom. This does not prove impossibility. It just shows that
if the aliens can mess with Pi, they have technology that is very much
distinguishable from magic, in that no imaginable magic is that powerful.
In fact, they have technology that is distinguishable from God, in that God
isn't that powerful. So your Bayesian prior on "technology more powerful
than God" - which, out of respect for the Singularity, should be at least
5% - is your Bayesian prior on "putting messages in Pi".
You Know You've Been Thinking About The Singularity Too Long When:
8) You know exactly what would be required to place arbitrary messages in
Pi.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:12:51 MST