From: Billy Brown (bbrown@conemsco.com)
Date: Tue Dec 15 1998 - 09:07:53 MST
Robin Hanson wrote:
> Similar issues have been considered. For example, here is an excerpt from
> http://www.extropy.org/eo/articles/vc.html#hanson
Yes, I've read the Singularity debate. My argument is rather different
from the one in your comment about IQ enhancement, and it leads to
different conclusions.
If you make an AI that increases its own IQ by 1%, that tells you nothing
about how difficult the next 1% improvement will be. It could be 10%
harder, or 0.5% harder, or not any harder at all. The only way to find out
is to try it and see what happens.
What we can predict is that if you try to repeatedly enhance one narrow
ability, you will eventually reach a point of diminishing returns. At some
point, you will find that further improvements to ability Y are impractical,
because ability X cannot handle the increased complexity of your design.
This effect is not unique to intelligence enhancement. Something similar
occurs whenever any narrow field of research pulls ahead of the general rate
of progress. Eventually it reaches a point where further advances require
improvements in some unrelated technology, and you have to wait for those
improvements to be made.
Now, this does not mean that the whole thing is a game of diminishing
returns. Recent history demonstrates that if you research enough different
things, you can create a situation in which the ability of your society to
make technological advances increases faster than the difficulty of taking
the next step. A reasonable extrapolation of the trend would predict a
century or two of steadily accelerating progress before things begin to
change so fast that an unenhanced human can't cope.
Intelligence enhancement (IE) of any kind would, however, add a new
dimension to this saga. Roughly speaking, our rate of progress is
determined by:
                          R * P * I
Progress / unit of time = ---------
                            T * C
Where R represents the resources available to each researcher, P is the
population of researchers, I is the average intelligence of the researchers,
T is our current level of technological sophistication, and C is a measure
of the time and effort required for researchers to communicate. Most of the
increasing rate of change in recent times comes from a slow geometric
increase in both R and P, and a steady drop in C. Since the changes in R
and C are both due to technology, the whole process tends to feed on itself.
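To make this concrete, here is a toy run of the model (a rough sketch in
Python; every growth rate in it is invented for illustration, not an
estimate). T rises as progress accumulates, while R, P, and C improve
geometrically, so the rate of progress ends up climbing despite the
rising difficulty of each step:

R, P, I, T, C = 1.0, 1.0, 1.0, 1.0, 1.0

for year in range(101):
    rate = (R * P * I) / (T * C)    # progress per unit of time
    if year % 25 == 0:
        print("year %3d: rate = %.2f" % (year, rate))
    T += rate    # each advance raises the difficulty of the next step
    R *= 1.03    # assumed: slow geometric growth in resources
    P *= 1.02    # assumed: slow geometric growth in researchers
    C *= 0.97    # assumed: communication costs fall steadily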
Meaningful IE would make I increase in roughly the same fashion as R. Not
only would this dramatically speed up our rate of advance, it would also
increase the rate at which our rate of advance speeds up.
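To see this second-order effect, rerun the toy model above with one
change: I now compounds like R (again, the 3%/year figure is an
assumption, not a prediction):

R, P, I, T, C = 1.0, 1.0, 1.0, 1.0, 1.0

for year in range(101):
    rate = (R * P * I) / (T * C)
    if year % 25 == 0:
        print("year %3d: rate = %.2f" % (year, rate))
    T += rate
    R *= 1.03
    P *= 1.02
    C *= 0.97
    I *= 1.03    # the new ingredient: IE makes I grow like R

Not only is the rate higher at every step, but the factor by which it
grows from year to year is itself larger.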
This post is long enough already, so I'll leave off speculating about the
Singularity itself until next time.
Billy Brown, MCSE+I
bbrown@conemsco.com