Constraints on the "singularity"

From: Ramez Naam (Exchange) (ramezn@EXCHANGE.MICROSOFT.com)
Date: Sun Oct 12 1997 - 13:19:02 MDT


> From: Dan@Clemmensen.ShireNet.com [SMTP:Dan@Clemmensen.ShireNet.com]
> I use the term "singularity" in deference to Vinge, who is a professor
> of mathematics. It's only a qualitative analogy, however: the actual
> simplest equations that describe the simplest qualitative predictions
> don't actually have mathematical singularities.

This is an important point that is often ignored in discussions of
the "singularity".  While a simple extrapolation of (for example)
available computing power may show a vertical asymptote, the
underlying equations that govern the rise of computing power are
subject to constraints that may instead flatten the curve.
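
To make the distinction concrete, here's a quick Python sketch (my
own toy illustration with arbitrary constants, nothing from Dan's
post): growth obeying dx/dt = k*x is merely exponential and stays
finite for all time, while growth obeying dx/dt = k*x^2 genuinely
diverges at the finite time t = 1/(k*x0).  From a handful of data
points near the knee, the two are hard to tell apart:

    # Toy comparison of growth laws (illustrative constants only).
    # dx/dt = k*x    -> x(t) = x0 * exp(k*t)      finite for all t
    # dx/dt = k*x**2 -> x(t) = x0 / (1 - k*x0*t)  blows up at t = 1/(k*x0)
    import math

    def exponential(x0, k, t):
        # Ordinary exponential growth: never infinite in finite time.
        return x0 * math.exp(k * t)

    def hyperbolic(x0, k, t):
        # Super-exponential growth: diverges as t -> 1/(k*x0).
        return x0 / (1.0 - k * x0 * t)

    x0, k = 1.0, 0.5
    print("asymptote of the hyperbolic fit at t =", 1.0 / (k * x0))
    for t in [0.0, 1.0, 1.9, 1.99]:
        print(t, exponential(x0, k, t), hyperbolic(x0, k, t))

The vertical asymptote belongs to the model you fit, not necessarily
to the process generating the data.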

I'd be very interested in seeing (and would write myself, if time and
knowledge allowed) an analysis of what a post-"singularity" plateau
might look like, given the most stringent constraints we know of.
Obviously these constraints are products of our current understanding
of the world, and so may prove surmountable given some future physics
& mathematics, but I think it a bit disingenuous to use a term like
"singularity" when our best current models of the world show bounded
(or at least merely exponential) growth.

There are much wiser heads out there on this list, so let me just start
off the conversation with a few constraints that I can see (again, given
our current physics & mathematics):

Size Constraints:  Our current understanding of QM, via Planck's
constant, essentially constrains the precision with which we can
build physical structures (or direct energy), creating an apparent
lower bound on the scale of our engineering.  This in turn has
consequences for Moore's Law, since to date progress has come largely
from ever-finer lithography techniques.
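
For a sense of scale, here's a back-of-the-envelope Python sketch
(standard physical constants; the 0.25 micron feature size is just my
assumption for a current process node):

    # Rough distances between today's lithography, the atomic scale,
    # and the Planck length l_P = sqrt(hbar*G/c^3).
    import math

    hbar = 1.0546e-34   # J*s
    G    = 6.674e-11    # m^3/(kg*s^2)
    c    = 2.9979e8     # m/s

    planck_length = (hbar * G / c**3) ** 0.5   # ~1.6e-35 m
    atom          = 1e-10                      # ~1 Angstrom
    feature       = 0.25e-6                    # assumed current node

    print("Planck length: %.2e m" % planck_length)
    print("orders of magnitude, features to atoms: %.1f"
          % math.log10(feature / atom))
    print("orders of magnitude, atoms to Planck:   %.1f"
          % math.log10(atom / planck_length))

If these numbers are right, the atomic scale (a few orders of
magnitude below current features) will bite long before the Planck
scale (some twenty-five orders of magnitude further down) does.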

Speed Constraints:  Barring some revolution in Relativity, c limits
the rate of expansion of even a Vingean Power.  In addition it limits
the speed of signaling within that entity's "brain", again suggesting
possible constraints on the rate of computational growth.
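
A quick sketch of the arithmetic (the sizes are illustrative
assumptions on my part): the one-way light crossing time of a
structure bounds the rate of any globally synchronized operation
within it:

    # Light-speed bound on signal transit across a computing
    # structure of a given diameter (sizes are illustrative).
    c = 2.9979e8  # m/s

    for label, diameter in [("brain-sized (0.1 m)", 0.1),
                            ("city-sized (1e4 m)", 1e4),
                            ("planet-sized (1.3e7 m)", 1.3e7)]:
        crossing = diameter / c   # one-way transit time, seconds
        print("%s: %.3g s crossing, ~%.3g Hz global sync"
              % (label, crossing, 1.0 / crossing))

So growing in size buys a Power more parallelism, but its globally
coherent "clock" gets slower, not faster.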

Chaotic Computability Constraints:  The most ambitious nanotech
scenarios posit universal assemblers that can be programmed or
designed to build specific structures.  IMHO this is a clearly
chaotic system.  The structure of any complex object created by this
sort of nanotech would seem to display an exquisite sensitivity to
tiny variations in the design of the assembler (or the assembler
"software"), and possibly to the local environment itself.  I
question whether we'll be able to design assembler protocols for very
complex objects through any means other than trial and error, which
has its own pitfalls (e.g., grey goo).
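
To illustrate the kind of sensitivity I mean, here's a standard toy
example in Python (the logistic map at r=4, a textbook chaotic
system, emphatically not a model of real assembler chemistry): two
"designs" differing by one part in a billion diverge completely
within a few dozen iterations:

    # Sensitive dependence on initial conditions in the logistic map.
    def logistic(x, r=4.0):
        return r * x * (1.0 - x)

    a, b = 0.400000000, 0.400000001   # nearly identical inputs
    for step in range(1, 61):
        a, b = logistic(a), logistic(b)
        if step % 10 == 0:
            print("step %2d: |a-b| = %.3e" % (step, abs(a - b)))

The gap roughly doubles each step until it is as large as the state
itself, which is why I doubt we could verify such designs purely by
simulation.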

(This is not to say that I'm skeptical of nanotech.  I think the more
prosaic nanotech scenarios have an enormous potential to revolutionize
our world.  If we can design nanomachines to construct individual parts
of more complex systems, for example, and then use macro-machines to
assemble them into the finished product, we will have gone a long way
towards eliminating scarcity.)

I'm very interested in thoughts from the list on:

1) Where a Spike subject to these constraints will begin to level off.

2) Just how severe and insurmountable these constraints are.  Will
superstring-engineering allow us to bypass Planck's constant?  Will some
new, higher-level theory of complex systems provide more computationally
efficient means of simulating chaotic systems?  Will quantum computing
have any benefit here?  Can a sufficiently advanced entity construct a
pocket universe where Planck's constant and c are different from our
own?  Is there any way to communicate between that universe and this
one?

cheers,
mez


