Re: Constraints on the "singularity"

From: Dan Clemmensen (Dan@Clemmensen.ShireNet.com)
Date: Sun Oct 12 1997 - 16:05:25 MDT


Ramez Naam (Exchange) wrote:
>
> > From: Dan@Clemmensen.ShireNet.com [SMTP:Dan@Clemmensen.ShireNet.com]
> > I use the term "singularity" in deference to Vinge, who is a professor
> > of mathematics. It's only a qualitative analogy, however: the actual
> > simplest equations that describe the simplest qualitative predictions
> > don't actually have mathematical singularities.
>
> This is an important point that seems to be often ignored in discussions
> of the "singularity". While a simple interpolation and future
> projection of (for example) available computing power may show a
> vertical asymptote, the underlying equations that govern the rise of
> computing power are subject to constraints that may result in a
> flattening of the curve.

Oh, dear. Just because there is no singularity in the equations
doesn't mean we should postulate a flattening. My grossly
oversimplified curve is an exponential. My next-level approximation
is an exponential with stepwise rate increases over time.
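
A rough sketch of the two curves (Python, with made-up rates chosen
only to show the shape; none of these numbers are predictions):

    import math

    def simple_exponential(t, rate=0.5):
        # First approximation: plain exponential growth.
        return math.exp(rate * t)

    def stepped_exponential(t, base_rate=0.5, step_every=10.0, step_factor=1.5):
        # Next approximation: the growth rate itself steps up every
        # step_every years, e.g. as a new technology kicks in.
        value, rate, elapsed = 1.0, base_rate, 0.0
        while elapsed + step_every < t:
            value *= math.exp(rate * step_every)
            rate *= step_factor
            elapsed += step_every
        return value * math.exp(rate * (t - elapsed))

    for year in (10, 20, 30, 40):
        print(year, simple_exponential(year), stepped_exponential(year))

The stepped curve pulls away from the plain exponential without ever
having a mathematical singularity, which is all I'm claiming.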
>
> I'd be very interested in seeing (and would write myself, if time and
> knowledge allowed) an analysis of what a post-"singularity" plateau
> might look like, given the most stringent constraints we know of.
> Obviously these constraints are products of our current understanding of
> the world, and thus may be suppressible given some future physics &
> mathematics, but I think it a bit disingenuous to use a term like
> "singularity" when our best current models of the world show bounded (or
> at least, merely exponential) growth.
>
Vinge's other reason to use the term "singularity" is in the sense of
an event horizon: in his model we cannot predict beyond it because the
superintelligences are incomprehensible. I agree with him in this.

> There are much wiser heads out there on this list, so let me just start
> off the conversation with a few constraints that I can see (again, given
> our current physics & mathematics):
>
> Size Constraints: Our current understanding of QM essentially
> constrains the precision with which we can build physical structures (or
> direct energy) via Planck's constant, thus creating an apparent lower
> bound on our engineering ambitions. This in turn has consequences for
> Moore's Law, as to date progress has been made largely via finer and
> finer lithography techniques.

Planck's constant is many orders of magnitude away from affecting
information storage structures.
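
To put rough numbers on "many orders of magnitude", a back-of-envelope
comparison (Python, approximate textbook values; the exact figures
don't matter):

    import math

    planck_length  = 1.6e-35   # metres
    atomic_spacing = 2e-10     # metres, typical interatomic distance in a solid
    feature_size   = 0.25e-6   # metres, ~current lithography

    print("lithography -> atoms:",
          round(math.log10(feature_size / atomic_spacing), 1), "orders of magnitude")
    print("atoms -> Planck length:",
          round(math.log10(atomic_spacing / planck_length), 1), "orders of magnitude")

We run out of atoms long before Planck-scale effects enter the picture.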

Moore's "law" (actually, and observation of existing trends) is
currently
driven by lithographic techniques, but it extrapolates backward fairly
well
all the way to the mechanical calculator. There are other techniques,
architectures, and technologies that can step in to take up the slack,
even before we begin to apply superintelligence to the problem. As a
trivial
example, while the current technology is lithographically at .25 micron
or
so, we use it to build chips that are placed in plastic packages that
are
placed on PC boards that are placed in chassis. The 2-dimensional
densisty
is perhaps 10% fo the chip density, and the spacing between transistors
in
the third dimension is thousands of times less dense.
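
A back-of-envelope estimate of the headroom from packaging alone (the
10% and "thousands of times" figures above are my rough guesses, so
treat the result as order-of-magnitude at best):

    board_2d_fraction = 0.10        # board-level 2D density ~10% of on-chip density
    z_density_ratio   = 1.0 / 3000  # third dimension ~thousands of times sparser

    # Gain from filling a volume at on-chip transistor pitch in all three
    # axes, relative to today's chassis-level packing:
    headroom = (1.0 / board_2d_fraction) * (1.0 / z_density_ratio)
    print(f"roughly {headroom:,.0f}x denser packing before lithography improves at all")
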
>
> Speed Constraints: Barring some revolution to Relativity, c limits the
> rate of expansion of even a Vingean Power. In addition it limits the
> speed of signaling within that entity's "brain". Again suggesting
> possible constraints on the rate of computational growth.

This is true, but I suspect the ability to acquire mass is a much
more severe constraint. For a given mass, you can decrease the effect
of the speed limit by increasing the density, at least until the
resulting material can no longer support a computational structure or
until you form a black hole. These constraints are IMO far beyond the
singularity.
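
As a rough illustration (Python, arbitrary densities, ignoring
everything except geometry), here's how packing a fixed mass denser
cuts the signal-crossing time:

    import math

    c = 3.0e8  # m/s

    def crossing_time(mass_kg, density_kg_m3):
        # Diameter / c for a sphere of the given mass and density.
        radius = (3.0 * mass_kg / (4.0 * math.pi * density_kg_m3)) ** (1.0 / 3.0)
        return 2.0 * radius / c

    earth_mass = 6.0e24  # kg
    for rho in (5.5e3, 5.5e6, 5.5e9):  # ordinary rock, 1000x denser, 10^6 x denser
        print("density %.1e kg/m^3 -> crossing time %.2f ms"
              % (rho, crossing_time(earth_mass, rho) * 1e3))

Every factor of 1000 in density buys a factor of 10 in latency, until
you hit the material or black-hole limits mentioned above.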
>
> Chaotic Computability Constraints: The most ambitious nanotech
> scenarios posit universal assemblers that can be programmed or designed
> to build specific structures. IMHO this is a clearly chaotic system.
> The structure of any complex object created by this sort of nanotech
> would seem to display an exquisite sensitivity to tiny variations in
> design of the assembler (or the assembler "software"), and possibly to
> the local environment itself. I question whether we'll be able to
> design assembler protocols for very complex objects through any means
> other than trial and error, which have their own pitfalls (e.g., grey
> goo).

I don't understand. The idea behind nanotech is design to atomic
precision. Assemblers and their output are digitally-described
objects. A system built by assemblers will be essentially perfectly
manufactured by comparison to today's systems, in the same way that
playing a CD yields the same result every time since the recording is
digital. Sure, I can conceive of an assembler process using
stochastic techniques, but this is certainly not necessary.
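
A toy illustration of the point (Python; nothing to do with real
assemblers, just the digital-vs-stochastic distinction):

    import hashlib, random

    spec = bytes(range(256)) * 4  # stand-in for an atomic-precision design file

    def digital_build(design):
        # Deterministic "assembly": the output depends only on the design.
        return hashlib.sha256(design).hexdigest()

    def sloppy_build(design):
        # Stochastic "assembly": each run picks up a little random variation.
        return hashlib.sha256(design + bytes([random.randrange(256)])).hexdigest()

    print(digital_build(spec) == digital_build(spec))  # True, every time
    print(sloppy_build(spec) == sloppy_build(spec))    # almost certainly False
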
>
> I'm very interested in thoughts from the list on:
>
> 1) Where a Spike subject to these constraints will begin to level off.

Not within the predictable future.

> 2) Just how severe and insurmountable these constraints are. Will
> superstring-engineering allow us to bypass Planck's constant? Will some
> new, higher-level theory of complex systems provide more computationally
> efficient means of simulating chaotic systems? Will quantum computing
> have any benefit here? Can a sufficiently advanced entity construct a
> pocket universe where Planck's constant and c are different from our
> own? Is there any way to communicate between that universe and this
> one?
>
Not a problem for humans. Consult an SI for the answers!


