Re: MATH/COMP/PHIL: "Omega Man"

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Apr 03 2001 - 01:46:08 MDT


Spudboy100@aol.com wrote:
>
> In a message dated 4/2/2001 3:15:18 AM Eastern Daylight Time,
> sentience@pobox.com writes:
>
> << Except for the idea that the Goldbach Conjecture (NOT the Riemann
> Hypothesis, as stated in the article) might be true but TOTALLY
> unproveable (i.e., the consequence of an infinite number of independent
> mathematical facts). I've heard this hypothesized before in connection
> with Chaitin's work, and I find it both plausible and chilling.
> >>
> I wonder why E.Y. finds this "chilling"?

Turing already demonstrated that no Turing machine can solve the *general*
halting problem. This just means that, for any given problem-solving
Turing machine, you can cook up a specific halting question that it gets
wrong. You're still allowed to cook up a *different* Turing machine that
*can* answer that particular question.
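
To make the "cook up a problem" step concrete, here's the standard
diagonal construction as a minimal Python sketch. The names halts() and
contrary() are just illustrative; halts() is the decider we're assuming
into existence, not something anyone can actually write:

    def halts(program, data):
        # Hypothetical decider: returns True iff program(data) would
        # eventually halt.  No correct, always-terminating version of
        # this can exist; that's the point of the argument.
        raise NotImplementedError

    def contrary(program):
        # Do the opposite of whatever halts() predicts about `program`
        # run on its own source.
        if halts(program, program):
            while True:       # predicted to halt, so loop forever
                pass
        else:
            return            # predicted to loop, so halt at once

    # Now ask: does contrary(contrary) halt?  Whichever answer
    # halts(contrary, contrary) gives, contrary(contrary) does the
    # opposite, so no halts() can be right about every input.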

Godel demonstrated that the Godel Statement is unprovable within
Principia Mathematica (assuming Principia is consistent). However, it's
easy enough to produce an alternate mathematical system that can prove the
Godel Statement; just add the Godel Statement to Principia Mathematica as
an axiom. This expanded "Principia" then has its own Godel Statement, of
course, which can only be proven in a doubly-expanded Principia. And so on.
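
If you like, the bookkeeping of that tower looks like this; a toy Python
sketch where godel_sentence() is a purely symbolic placeholder (no actual
arithmetization), just to show the pattern:

    def godel_sentence(theory):
        # Symbolic stand-in for "the sentence asserting its own
        # unprovability in `theory`".
        return ("unprovable-in", theory)

    theories = [("PM",)]              # start with Principia Mathematica
    for _ in range(3):
        prev = theories[-1]
        # The next theory is the previous one plus its own Godel
        # sentence as an extra axiom; it proves that sentence
        # trivially, but acquires a fresh Godel sentence of its own.
        theories.append(prev + (godel_sentence(prev),))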

If the Goldbach Conjecture is true as a consequence of an infinite number
of independent mathematical facts, it would mean that there's a *specific*
problem that *no* Turing machine can solve. *No* Turing machine would
*ever* be able to predict whether a Goldbach machine halts. *No*
mathematical system, however many axioms or axiom schemata are added (*),
would ever be able to solve that one specific question. The problem shall
remain open from now until the end of time, forever and amen, or until
someone develops an Oracle device.

Now isn't that chilling?
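
For concreteness, the "Goldbach machine" is nothing exotic; a minimal
Python sketch (my own naming) would be a brute-force counterexample
search like this, which halts exactly when GC is false and runs forever
exactly when GC is true:

    def is_prime(n):
        # Trial division; slow, but fine for a sketch.
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True

    def goldbach_machine():
        # Check every even number >= 4 for one that is NOT a sum of two
        # primes.  Returns the first counterexample if one exists;
        # otherwise it never returns.
        n = 4
        while True:
            if not any(is_prime(p) and is_prime(n - p)
                       for p in range(2, n - 1)):
                return n      # counterexample found: GC is false
            n += 2

So predicting whether goldbach_machine() halts just *is* deciding the
conjecture.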

(*) You could, of course, "cheat" by adding GC as an axiom without having
any idea whether or not it was true, just as a Turing machine might get
the halting problem for a GC machine correct by coincidence - but it's an
odd use of the word "proof" to describe a result that contributes nothing
whatsoever to your beliefs about the problem one way or the other.

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


