Re: Singularity vs. Temporal Pollution

From: Anders Sandberg (asa@nada.kth.se)
Date: Fri Jul 19 2002 - 02:43:38 MDT


A good point. Vernor Vinge describes a kind of negative
singularity in _A Deepness in the Sky_, where the accumulated
software inefficiencies and bugs of a civilization finally
overtake it. Patches on patches on patches eventually become
rather unstable, and redundancy is costly.

I think many hope the singularity will clean out all the
obsolete structures, weird code and other junk when it arrives.
But that assumes it is somehow outside the system, a force not
part of the problem. In my opinion the singularity is better
seen as a change of state of global society, and that means it
will be built upon and composed of these flawed parts.
Increased intelligence and capability enables the fixing of
more bugs, but bugs and misfeatures can lock each other in
place: you can't fix bug A, because that requires fixing bug B,
which is actually necessary for the important but badly
implemented function C, which cannot be turned off even for a
second due to the legal misfeature D. And replacing the whole
system would cost more than it is worth.

On average, professional coders make 100 to 150 errors per
thousand lines of code, according to a Carnegie Mellon study.
So if a smarter coder (human + software, AI or whatever)
requires L lines of code, that will introduce around 0.1 L
bugs. Presumably most of these will be discovered through
testing, but a few will remain as ticking bombs (I would
estimate perhaps one in a hundred of the bugs).
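As a back-of-envelope check, the estimate can be written as a
one-liner (the defaults just encode the figures above, and the
35 Mloc example size is the Win2K estimate quoted further down):

```python
def latent_bugs(lines, errors_per_line=0.1, surviving_fraction=0.01):
    """Expected long-lived bugs in `lines` lines of code, using the
    figures above: ~0.1 errors per line written, of which ~1% survive
    testing as ticking bombs."""
    return lines * errors_per_line * surviving_fraction

# A 35 Mloc system (roughly the Win2K estimate) would carry on the
# order of tens of thousands of latent bugs under these assumptions:
print(round(latent_bugs(35_000_000)))  # 35000
```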

Let's look at a progression of coders of increasing "level",
where each level writes its successor. The length of code grows
as L(n), and let's assume the number of bugs introduced at
level n decreases as 0.1 L(n) / n (of which 1% will remain
undiscovered for a long time).

If L(n) is just proportional to n, then the number of bugs
introduced at each level will be constant. It is unclear how
the irreducible bugs of previous levels will affect it, but it
seems likely that, even if not directly included, they will
become part of the backwards compatibility layer or the design
assumptions. If each level must then carry workarounds for the
persistent bugs of every earlier level, the total amount of
misfeatures grows quadratically with n.
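The accounting for the linear case can be sketched as a toy
model (the 1000 lines/level constant is arbitrary; the 0.1 and
1% figures are the assumptions above):

```python
def latent_per_level(L, n):
    """Latent bugs introduced at level n: 0.1 * L / n bugs written,
    of which 1% survive testing (figures from the estimate above)."""
    return 0.1 * L / n * 0.01

def total_misfeature_burden(L_of_n, levels):
    """Total workaround burden if every level carries the latent bugs
    of all earlier levels via backwards compatibility."""
    burden = carried = 0.0
    for k in range(1, levels + 1):
        carried += latent_per_level(L_of_n(k), k)  # legacy at level k
        burden += carried
    return burden

linear = lambda n: 1000 * n  # L(n) proportional to n

# Latent bugs per level come out constant (1.0 here), the carried
# legacy at level k is ~k, and the total burden over n levels grows
# as ~n^2 / 2:
print(round(total_misfeature_burden(linear, 10)))   # 55
print(round(total_misfeature_burden(linear, 100)))  # 5050
```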

However, a linear increase of L(n) is likely too slow. Windows
3.1 was 3 Mloc, Windows 95 was 15 Mloc, Windows 98 was 18 Mloc,
NT (1992) was 4 Mloc, NT 5.0 (1998) was 20 Mloc and Win2K is
35-60 Mloc (source:
http://www.counterpane.com/crypto-gram-0003.html#SoftwareComplexityandSecurity)

If L(n) grows as n^k with k > 1, then the number of bugs
introduced at level n will increase as n^(k-1). And the legacy
misfeatures will accumulate on top of that, on the order of
n^k. This suggests that beyond a certain number of levels the
amount of debugging necessary grows to extreme amounts. Things
get even worse if code size grows exponentially.
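A quick sanity check of the power-law case (k = 3 is chosen
arbitrarily; counting in units of 0.001 bugs keeps the
arithmetic exact):

```python
k = 3  # illustrative exponent, k > 1

# New latent bugs at level n are 0.001 * L(n) / n = 0.001 * n**(k-1);
# count them in units of 0.001 bugs so everything stays an integer.
latent = [n ** (k - 1) for n in range(1, 11)]  # grows as n^(k-1) = n^2
legacy = sum(latent)                           # accumulates as ~n^k

print(latent[-1], legacy)  # 100 385
```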

The assumption that new levels reduce bugs as 1/n is of course
highly debatable. One could imagine that each new level instead
halves the bug rate. Then the linear L(n) would give us fewer
and fewer bugs for each generation, and the legacy bugs would
converge to a constant number. The power law case would have a
temporary increase in bugs per level (a threshold to the
singularity?) followed by a decrease. The exponential L(n) case
would now give constant bugs per level and quadratically
accumulating misfeatures.
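The three halving-rate variants can be compared with the same
kind of toy model (the per-line rate 0.1 * 2**-(n-1) and the
growth constants are illustrative assumptions, not data):

```python
def latent_per_level(L, n):
    """Latent bugs at level n when the per-line error rate halves each
    level: 0.1 * 2**-(n-1) errors/line, of which 1% survive testing."""
    return L * 0.1 * 2 ** -(n - 1) * 0.01

growth_laws = {
    "linear L(n)":      lambda n: 1000 * n,       # bugs shrink per level
    "power law L(n)":   lambda n: n ** 3,         # brief rise, then fall
    "exponential L(n)": lambda n: 1000 * 2 ** n,  # constant bugs per level
}

for name, L in growth_laws.items():
    series = [round(latent_per_level(L(n), n), 4) for n in range(1, 9)]
    print(f"{name}: {series}")
```

The exponential case comes out flat at 2.0 latent bugs per
level, matching the constant-bugs-per-level claim, while the
power-law series rises to a peak (around n = 4 here) before
declining.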

However, I don't know if it is likely we could get this kind of
exponential increase in programming skill. Human programmers of
different levels of intelligence don't seem to produce very
different numbers of bugs. Knowledge and experience seem to be
what prevent most bugs, while intelligence helps one understand
the problem well enough not to get sidetracked into cul-de-sacs.

So my conclusion would be that the singularity (or a seed AI,
if such is possible) would be filled with bugs and misfeatures.
Note that I have not assumed these bugs will be so bad that
they (after removing the 99% obvious ones) hinder the
development of new levels, just that they will persist as
misfeatures. It could be that a postsingularity world would be
just as defined by its idiosyncratic and highly contingent bugs
as the biosphere today is defined by the random choices made by
evolution.

-- 
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y


This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:15:34 MST