From: Billy Brown (bbrown@conemsco.com)
Date: Mon Dec 14 1998 - 08:49:24 MST
I was reading some of the previous debates on the Singularity in the list
archives recently, and it struck me that one major factor does not seem to
have been seriously considered.
Simply put, the more advanced a technology becomes, the more work it takes
to improve it. As technology advances there is a general tendency for
everything to become more complex, which means more work for the engineers.
Sometimes you can counteract this trend with better information technology
(the Internet is a good example), but not always (look at the amount of
human effort needed to produce each successive CPU design - it's a pretty
steep upward trend).
IMO, this principle has several important implications:
First, it means that advanced nanotechnology is not possible without major
breakthroughs in automated engineering and/or intelligence enhancement. Why
not? Well, diamondoid parts might be simple repeating structures, but
something like smart matter or utility fog requires that you decide what to
do with every single atom (and a pound of matter contains on the order of
10^25 atoms!). It isn't practical to design at that scale with unaided
human minds.
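As a sanity check on that number, here is a quick back-of-the-envelope
calculation (a rough sketch, treating diamondoid as essentially pure carbon):

    # Back-of-the-envelope check: atoms per pound of carbon-based matter.
    AVOGADRO = 6.022e23          # atoms per mole
    GRAMS_PER_POUND = 453.6
    CARBON_MOLAR_MASS = 12.011   # grams per mole, assuming pure carbon

    moles = GRAMS_PER_POUND / CARBON_MOLAR_MASS   # ~37.8 mol
    atoms = moles * AVOGADRO                      # ~2.3e25 atoms
    print(f"{atoms:.1e} atoms per pound")         # -> 2.3e+25

Even if you let each design decision cover a thousand-atom part rather than
a single atom, that still leaves roughly 10^22 decisions per pound - far
beyond what any human design team could handle.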
Second, a self-enhancing AI can't expect to optimize its way into an SI
(superintelligence) unless it has SI-level hardware to run on. It might, if
it is very lucky, but it is more likely to proceed in a series of sharp
upward jumps separated by lulls while it waits for faster hardware. You
probably need nanotech to build the computers an SI would run on, and you
can't design the nanotech unless you are pretty close to being an SI anyway.
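To make the jumps-and-plateaus picture concrete, here is a toy simulation
(purely illustrative - the ceiling, doubling rates, and hardware schedule
are made-up assumptions, not predictions). The idea: software
self-optimization is fast but capped by the hardware of the day, so
capability leaps up to the ceiling and then flatlines until new chips arrive:

    # Toy model: fast self-optimization, capped by available hardware.
    # All numbers are invented for illustration.
    SOFTWARE_CEILING = 4.0   # assumed max speedup from software tricks alone

    hardware = 1.0
    capability = 1.0
    for month in range(37):
        if month > 0 and month % 18 == 0:
            hardware *= 2.0                    # new chips every 18 months
        ceiling = SOFTWARE_CEILING * hardware
        # the AI doubles its own efficiency monthly - until it hits the ceiling
        capability = min(capability * 2.0, ceiling)
        print(f"month {month:2d}: capability {capability:5.1f}x "
              f"(hardware {hardware:.0f}x)")

Capability jumps the moment new hardware lands, then sits on a plateau;
averaged over years, the curve just tracks the hardware schedule.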
Because of these factors, a Singularity is likely to have a slow takeoff.
You may have sudden jumps (when the first sentient AI goes online, or the
first general-purpose assembler goes into operation), but each improvement
simply leads to a new plateau while you wait for the rest of your tech base
to catch up. The open-ended, geometric nature of the critical enabling
technologies (computers, telecommunications, and eventually AI and
intelligence enhancement in general) means that the overall rate of progress
will continue to increase, but a sudden discontinuity is unlikely.
Something that looks like a Singularity from a human perspective is still
quite likely, but to the people who make it happen it will look like just
another day of steady progress.
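As a final bit of arithmetic on why steady geometric progress still looks
like a Singularity in hindsight (a rough illustration, assuming a sustained
18-month doubling time along the lines of Moore's Law):

    # Compounding at a steady 2x every 18 months (no discontinuity anywhere).
    per_decade = 2 ** (120 / 18)                     # 120 months per decade
    print(f"{per_decade:.0f}x per decade")           # -> 102x
    print(f"{per_decade ** 3:,.0f}x over 30 years")  # -> 1,048,576x

A millionfold improvement within a generation is unimaginable in prospect,
but nobody living through it ever sees more than about 60% growth in any
given year.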
So what do you guys think?
Billy Brown, MCSE+I
bbrown@conemsco.com