Re: Big Bang demiurges (was: Re: El Aleph)

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Jan 04 1999 - 14:00:31 MST


New terminology:

I dub our Universe the "Socrates Universe".
Terra::Sol::Milky Way::Local Group::Virgo supercluster::Socrates Universe

Yudkowsky's Razor: "That simplest explanation is true which, given the
past, most strongly predicts the present as a unique outcome."

Anders Sandberg wrote:
>
> "Eliezer S. Yudkowsky" <sentience@pobox.com> writes:
>
> > Let me try an analogy. Suppose that your computer had all RAM
> > randomized. After interpreting these random instructions, eventually a
> > sort of order emerges where 1s clump together and 0s clump together and
> > most computational transactions occur when two areas of 1s and 0s mix.
>
> This is not an ideal example, since the thermodynamics of this system
> is rather iffy (in addition, whether complex structures appear depends
> on the evolution rule). It has an arrow of time (if the rule is
> irreversible) but the amount of computation it can do is finite.
>
> The conditions near the big bang were closer to having the memory
> constantly randomized; in this system no order can emerge other than
> in the functionalist "stones implementing windows" sense.

What do you mean, "constantly randomized"? The only random operation I
know of is state-vector reduction, and we who are built on it seem to
get along just fine. If all the particles are bumping into each other,
you say "random"; I say "interaction". I will agree that Alpha potence
seems to depend on there being some configuration of particles that can
actively defend itself. Why is this impossible?
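
For concreteness, here's a toy version of the randomized-memory analogy
(Python; the majority-vote update rule is an arbitrary stand-in for
"interpreting random instructions", not a claim about physics):

    import random

    N, STEPS = 200, 50
    cells = [random.randint(0, 1) for _ in range(N)]

    for _ in range(STEPS):
        # Each cell adopts the majority of itself and its two neighbors
        # (wrapping at the edges). Domains of 1s and 0s condense out of
        # the noise within a few steps; along the way, the only activity
        # is at the boundaries where the two phases mix.
        cells = [
            1 if cells[i - 1] + cells[i] + cells[(i + 1) % N] >= 2 else 0
            for i in range(N)
        ]

    print("".join(map(str, cells)))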

> > From the perspective of any systems that may evolve in the 1s-and-0s
> > world, our computers are so much random noise. No useful
> > information-processing could possibly occur with 1s and 0s so mixed
> > together! There's no arrow of time, either.
>
> Now I'm confused. This seems to be more like the second paragraph
> above, but your example is a deterministic dynamical system with a
> fairly standard dynamics.

I'm trying to make a distinction about levels of abstraction. Human
life is built on several levels of abstraction, starting with quarks and
moving up to cells. As with neural networks, there's a tradeoff between
resilience and efficiency. If your fundamental operations are
statistical properties of vast groups of underlying particles, you're
less sensitive to blips in the twentieth decimal place. This makes you
more resilient and less efficient for exactly the same reason. To
distinguish between active and platonic Powers, we might add the
Hofstadterian caveat that there is top-down ("holistic") causality; the
high-level explanation compactly predicts low-level events.
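
To put a number on the resilience half of that tradeoff, here's a
sketch (Python; the counts are arbitrary):

    from fractions import Fraction

    N = 10**6                     # underlying "particles" per high-level unit
    blip = Fraction(1, 10**20)    # a blip in the twentieth decimal place

    # If the high-level property is the mean of N low-level values, a
    # disturbance to one of them is diluted N-fold before it reaches
    # the high level:
    shift = blip / N
    print(shift)                  # 1 part in 10**26

    # Robust, and inefficient for exactly the same reason: a million
    # low-level elements were spent representing one high-level quantity.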

The point I'm trying to make is twofold: First, that usable statistical
properties are not limited to "hot" and "cold"; "matter" and
"antimatter" groupings might do as well, for example. Thinking that no
computation is possible at thermal equilibrium is anthropocentrism.
Second, that operations can take place at the lowest level without any
statistical grounding at all.

> Nick gave you it, but I would like to add that you should really
> integrate the light-cones to make sure about the horizons; I think
> they get tricky near t=0.

What, you mean that any interaction at all might be impossible since
particle A is inflating away from particle B faster than light? I
thought of that, but I'm not sure how to compute it.
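
For the record, here's how far I got (Python; this assumes a
radiation-dominated scale factor a(t) proportional to sqrt(t) purely
for the sake of the sketch, and inflation would need a different a(t)):

    import math

    def a(t):
        # Toy radiation-dominated scale factor; units are arbitrary.
        return math.sqrt(t)

    def comoving_horizon(t, steps=100000):
        # Midpoint Riemann sum of the integral from 0 to t of c dt'/a(t'),
        # with c = 1. The integrand blows up at t' = 0, but the integral
        # converges; analytically it equals 2*sqrt(t).
        h = t / steps
        return sum(h / a((i + 0.5) * h) for i in range(steps))

    t = 1.0
    print(comoving_horizon(t), 2 * math.sqrt(t))   # numeric vs. analytic

    # The proper horizon a(t) * comoving_horizon(t) = 2t is finite at
    # every t > 0, so any particles A and B farther apart than that have
    # never been in causal contact: no interaction, hence no computation.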

> (Actually, Max Tegmark's paper "Is the TOE really the ultimate ensemble
> theory?" (available on the net, look in xxx.lanl.gov) suggests that
> this can be useful and actually have explanatory power).

It has too much explanatory power. If I sprout wings and fly, there was
some computation somewhere that embodied it. With Platonic computation,
anything can happen. The fact that not everything happens suggests
that this is not the case. Certainly our Universe is predicted along
with everything else, but it is not uniquely predicted.

State-vector reduction is a great ugly blemish on modern physics. So
why suppose it exists at all? Occam's Razor says there's no such thing.
If there were no state-vector reduction, our current world would exist
anyway, right? So why postulate this additional cause, which screws up
the whole of physics?

(Yes, it's Singularity Analysis, where Occam's Razor doesn't work!)

You need to use Yudkowsky's Razor: "That simplest explanation is true
which, given the past, most strongly predicts the present as a unique
outcome."
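
In toy code, the difference between the two Razors (Python; the two
hypotheses and the "complexity" scores are made up for illustration):

    def occam(hypotheses, past, present):
        # Occam: simplest hypothesis whose predictions, given the past,
        # merely *include* the present.
        ok = [h for h in hypotheses if present in h["predicts"](past)]
        return min(ok, key=lambda h: h["complexity"])

    def yudkowsky(hypotheses, past, present):
        # Yudkowsky: simplest hypothesis that, given the past, predicts
        # the present as a *unique* outcome.
        ok = [h for h in hypotheses if h["predicts"](past) == {present}]
        return min(ok, key=lambda h: h["complexity"])

    hypotheses = [
        {"name": "no state-vector reduction", "complexity": 1,
         "predicts": lambda past: {"this world", "every other branch"}},
        {"name": "state-vector reduction", "complexity": 2,
         "predicts": lambda past: {"this world"}},
    ]

    print(occam(hypotheses, "Big Bang", "this world")["name"])
    print(yudkowsky(hypotheses, "Big Bang", "this world")["name"])

    # Occam picks "no state-vector reduction"; Yudkowsky's Razor picks
    # "state-vector reduction", because only it makes our world unique.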

> > For the purpose of Alpha Point behavior, the question is not really a
> > mathematical point of defining instantiation, but whether the Alpha
> > "Powers" are powerless digits of pi or active entities (like us) capable
> > of controlling their environment, and particularly of surviving or
> > escaping the Great Freezing.
>
> Exactly.
>
> > The question is this: If you were the Singularity, could you set up the
> > Big Bang in such a way as to encode your consciousness AND your
> > technology into the initial conditions? If this is possible without
> > introducing large-scale anomalies, then it would happen as a result of
> > pure chance in some region. Remember, we're talking about infinity
> > here. Larger than 3^^^^3 or Graham's Number.
>
> I'm not quite buying that infinity factor; but even if I accept it,
> the Alpha would have a problem with encoding initial conditions due to
> the infinity of *other*, different, Alphas co-existing and doing the
> exact same thing, as well as the even larger infinity of noise
> processes corrupting the data.
>
> (If Alpha has X bits, it could randomly appear with a good chance in a
> field of 2^X random bits. Which suggests that for every Alpha there
> are 2^(X-1) bits of noise - the bigger the gods, the noisier :-)

No, I think the Big Bang is analogous to an infinite-speed CPU operating
on a large but finite amount of RAM. There are around 10^80 (?)
elementary particles; call it a max of 10^100 to leave room for
error. So when the first Alpha is born out of 10^10^7 subjective years
of randomness, it is the only Alpha around. This doesn't apply if
simple self-replicating particle groups are possible; you'd probably get
a bunch of evolved mortal Bang People writing AIs.
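
Back-of-the-envelope (Python; the figure of X = 10^7 bits to specify an
Alpha is pulled out of the air for illustration):

    import math

    N = 10**100    # generous ceiling on the bits of "RAM" (particles)
    X = 10**7      # assumed bits needed to specify an Alpha

    # Expected chance occurrences of one specific X-bit pattern in N
    # random bits: about (N - X + 1) / 2**X. In base-10 logs:
    log10_expected = math.log10(N) - X * math.log10(2)
    print(log10_expected)   # about -3010200

    # So in any one snapshot of the memory, a second simultaneous Alpha
    # is hopeless; it's the unbounded *time* (the infinite-speed CPU),
    # not the memory, that lets the first one appear at all.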

The second thing to bear in mind is that if Big Bangs are useful enough
that we would want to make our own, our Big Bang is almost certainly
artificial. That sentence was hard to parse, hence the new name: if we
Terrans (or our SIs) would want to make Big Bangs, the Socrates Big Bang
was almost certainly artificial.

Anyway, the "random chance" argument really only applies to the very
first Big Bang there ever was, which may not have been a Big Bang at
all. I find the Big Bang and our physical laws to be very contrived.
I find it very easy to believe that the actual First Cause was a lot simpler.

> > * = Mitchell Porter has suggested this is due to the inflationary
> > scenario, which could have made computing power unusable due to
> > lightspeed problems. Our Universe may be optimized so that no Alpha
> > Powers could exist!
>
> Have you told Vinge? Maybe his zones of thought are reasonable after
> all! :-)

I see four possibilities:

1) Something has kept both the Powers and other mortal civilizations
from walking all over the Milky Way. (Lightspeed limits plus the
Anthropic Principle? Infinite space plus the Anthropic Principle?)
2) The Powers have kept other mortal civilizations from using up the
Milky Way because they need it (us?) for something.
3) The mortal civilization obeys the Prime Directive and the Powers
aren't home.
4) This is a computer simulation.

Take your pick.

-- 
        sentience@pobox.com         Eliezer S. Yudkowsky
         http://pobox.com/~sentience/AI_design.temp.html
          http://pobox.com/~sentience/sing_analysis.html
Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.

