From: Ramez Naam (Exchange) (ramezn@EXCHANGE.MICROSOFT.com)
Date: Mon Oct 13 1997 - 11:52:55 MDT
Let me reiterate that I'm not positing /hard/ or unbreakable constraints
on future entities or technologies. They may exist. They may not. No
one is in a position to say definitively.
In my observation of the world, though, I note that the rate of
advancement of /technology/ often far eclipses the rate of advancement
of /theory/. Technology grows at an amazing rate for some time, till it
approaches the boundaries of our theoretical models of the world. At
that point its rate of growth slows. Eventually new theoretical models
are put forth that facilitate a new round of amazing technological
growth. Thus advancement of theory becomes a /bottleneck/ on the
advancement of technology.
That being the case, let me rephrase the question of the thread
somewhat. What are the most significant theoretical bottlenecks that
we're approaching? How far can we get before hitting those
bottlenecks?
(I'm an adherent of the philosophy of asking questions for a purpose, so
let me explain this one: These theoretical bottlenecks present likely
targets for our research and contemplation. Knowing how far off they
are helps us prioritize our allocation of resources to the various
areas.)
> From: Eliezer S. Yudkowsky [SMTP:sentience@pobox.com]
> What about negative matter? You can have an arbitrary amount of
> computing
> material in a given volume, with net mass zero.
Interesting. I'm not familiar with negative matter; any recommendations
on a primer?
> > Given the likely mass, age, and size of the universe, and the
> > constraints listed above, what is the maximum achievable computational
> > power of the universe?
>
> Infinite. There exist physical processes which are not
> simulable-to-arbitrary-accuracy by Turing machines. Even if all
> physical processes *are* simulable, they still use real numbers.
> Perhaps a clever Being could exploit a chaotic Mandelbrot-like boundary
> to perform calculations of arbitrary complexity in constant time.
Hmm. I'm not sure I agree with your premise. It seems that to really
exploit this you run up against Planck space/time again, and what you
really get is Quantum Computing, which, while it provides truly
mind-boggling computational power for certain classes of problems,
certainly does not provide infinite computing power.
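As a rough illustration of what I mean by "mind-boggling but finite"
(my own back-of-the-envelope, using Grover-style quantum search as the
example, not anything from Eliezer's post): quantum search cuts an
unstructured search over N items from roughly N steps to roughly
sqrt(N) steps. An enormous win, but a finite one.

# Rough sketch, not a definitive claim about future hardware:
# Grover-style quantum search reduces an unstructured search over N
# items from ~N classical steps to ~(pi/4)*sqrt(N) quantum steps --
# a huge but finite speedup.
import math

def classical_steps(n_items: int) -> float:
    """Expected work for brute-force search over n_items."""
    return float(n_items)

def grover_steps(n_items: int) -> float:
    """Approximate oracle calls for Grover's algorithm: ~(pi/4)*sqrt(N)."""
    return (math.pi / 4) * math.sqrt(n_items)

if __name__ == "__main__":
    for exponent in (12, 24, 48):  # search spaces of 2^12, 2^24, 2^48 items
        n = 2 ** exponent
        print(f"N = 2^{exponent}: classical ~{classical_steps(n):.3g} steps, "
              f"quantum ~{grover_steps(n):.3g} steps")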
This is what I mean by a theoretical bottleneck. Perhaps some future
model of the fundamentals of space/time/matter/energy will provide us a
way to manipulate physical processes of arbitrary complexity in constant
volume. However, our current models do not.
> > Given c, the age, size, and rate of expansion of the universe, how
> > long would it take an earth-spawned power to infest the galaxy?
> > 1/10e6 of the universe? 1% of the universe? 10% of the universe?
>
> General relativity makes the speed of light fundamentally arbitrary.
> They can infest the entire Universe in zero time, and finish before
> they started.
This is totally at odds with my understanding of GR. Relative to an
observer anywhere else in the universe, c is a very real constraint.
Certainly relative to the action here on earth (or whatever starting
ground the power has), c appears to be quite a real constraint. Please
explain.
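To put some rough numbers on why c looks like a real constraint from
here (my own figures, nothing definitive): the Milky Way is roughly
100,000 light-years across, and Andromeda is roughly 2.5 million
light-years away, so even at lightspeed the timescales for "infesting"
anything beyond our neighborhood run to hundreds of thousands or
millions of years.

# Illustrative arithmetic only (my numbers, not Eliezer's): if c is a
# hard ceiling on expansion speed, light-travel time across the galaxy
# sets a floor on how fast an earth-spawned power could spread.
GALAXY_DIAMETER_LY = 100_000       # rough Milky Way diameter, light-years
ANDROMEDA_DISTANCE_LY = 2_500_000  # rough distance to the nearest large galaxy

def travel_time_years(distance_ly: float, speed_fraction_of_c: float) -> float:
    """Years to cover distance_ly at the given fraction of c (ignoring expansion)."""
    return distance_ly / speed_fraction_of_c

if __name__ == "__main__":
    for v in (1.0, 0.5, 0.1):
        print(f"Across the galaxy at {v:.0%} of c: "
              f"~{travel_time_years(GALAXY_DIAMETER_LY, v):,.0f} years")
        print(f"To Andromeda at {v:.0%} of c: "
              f"~{travel_time_years(ANDROMEDA_DISTANCE_LY, v):,.0f} years")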