> ----------
> From: Billy Brown [SMTP:ewbrownv@mindspring.com]
> Reply To: extropians@extropy.com
> Sent: Thursday, 8 July 1999 13:24
> To: extropians@extropy.com
> Subject: RE: Human minds on Windows(?) (was Re: Web site up! (GUI vs. CLI))
>
>
> > > CountZero <count_zero@bigfoot.com> said
> >
> > > ... Windows (tm) ... is a bloated mess, likely impossible for any
> > > single human to understand, but _it works_; it gives me what I want
> > > at the moment, and I'm more than willing to throw hardware at it as
> > > long as the hardware is cheap, since the alternative is to wait
> > > (possibly for a long time) to get the same capabilities in properly
> > > optimized code.
> > >
> > This is a *very* *very* scary thought. Since we can expect the
> > hardware to keep getting cheaper at least through 2012 (when they
> > hit the five-atom gate thickness limit), and then probably to
> > transition over to nanotech (whence come 1 cm^3 nanocomputers), the
> > implication is that we will have an extended period in which to
> > develop increasingly sloppy code.
>
> Actually, Microsoft's defects-per-LOC figures (the only comparison of
> code quality that really tells you anything) are in the upper 30% of
> the software industry. The reasons why their products often seem poorly
> written have nothing to do with code quality - their problems lie in
> other areas (such as user interface design).
>
> However, you have definitely hit on the single biggest challenge facing
> the software industry today. Simply put, it is not possible for humans
> to write 100% defect-free code. Faster computers allow you to tackle
> more complicated problems, but that leads to ever-bigger programs. As
> the programs get bigger, more and more of your development effort gets
> diverted into getting the number of defects down to an acceptable level.
> Eventually you reach the point where every time you fix one bug you
> create another, and it becomes impossible to add new features to your
> program without breaking old ones.
>
> Judging from what data are currently available, this effect comes into
> play when the number of human-generated instructions gets into the
> 10^7 - 10^8 LOC region. High-level languages should therefore make it
> possible to write bigger programs (because each human-entered
> instruction generates a lot more machine code), but the level of
> abstraction in these programs is not increasing very quickly at all.
> If we want to actually be able to exploit the processing power we're
> going to have in 10-20 years, we need to get to work on this problem
> *now*, instead of sitting around pretending it doesn't exist.
>
> Billy Brown, MCSE+I
> ewbrownv@mindspring.com
>
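To put rough numbers on the defect argument above, here's a back-of-envelope sketch (Python; the defect density is an assumption I've picked purely for illustration - real figures vary wildly between shops):

    # Residual defects scale linearly with program size, so even a
    # good shop is swamped by the time it reaches 10^7 - 10^8 LOC.
    # The density below is an illustrative assumption, not a
    # measured figure for any particular vendor.
    defects_per_kloc = 2.0  # assumed residual defects per 1000 LOC

    for loc in (10**5, 10**6, 10**7, 10**8):
        defects = defects_per_kloc * loc / 1000.0
        print("%9d LOC -> ~%7d residual defects" % (loc, defects))

At the top end that's hundreds of thousands of latent defects, which is why fix-one-break-one sets in long before you get there.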
Software layering (or modularity) surely does a lot of work to combat this.
If you can build a layer of software that uses a lower level, adds value,
and is bug-free (possible with enough effort when working on a finite-sized
layer), you have firm ground on which to build. The only problem is that,
as you add layers, the cost of replacing any of them grows.
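A toy sketch of the idea (Python; all names are mine, purely illustrative): keep the lower layer's interface narrow enough to test exhaustively, then let the next layer add value without ever reaching beneath it.

    # Lower layer: a tiny key/value store with a narrow, fully
    # testable interface. Small enough to get genuinely bug-free.
    class Store:
        def __init__(self):
            self._data = {}
        def put(self, key, value):
            self._data[key] = value
        def get(self, key, default=None):
            return self._data.get(key, default)

    # Higher layer: adds access counting, built only against
    # Store's public interface - firm ground beneath it.
    class CountingStore:
        def __init__(self, store):
            self._store = store
            self.reads = 0
        def put(self, key, value):
            self._store.put(key, value)
        def get(self, key, default=None):
            self.reads += 1
            return self._store.get(key, default)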
This is closely related to the higher-level language argument above: each layer of services provides more functionality that existing languages can access, at increasing levels of abstraction. So your 3GL/4GL today looks a lot higher-level than it did 10 years ago. I would argue that in some practical sense abstraction *is* increasing strongly, just not at the level of pure languages; the OS and all the layers on top of it must be included. Paradoxically, this means that as you get more "abstract", you are also bound to more and more arbitrary technologies.
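A mundane illustration of that leverage, in today's Python - one human-entered line of application code, with the runtime, a sockets library, the OS TCP/IP stack and the drivers all doing the real work beneath it:

    from urllib.request import urlopen

    # One "instruction" at the application level; everything below
    # it - HTTP, TCP/IP, sockets, drivers - comes from lower layers.
    page = urlopen("http://www.example.com/").read()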
As the overall complexity of your total system grows, the number of layers increases, which means you become increasingly bound to the layers in which you have already invested - it would simply take too much time and money to replace them. Once, you might have built your own OS from scratch for a special-purpose task. Now, not only are you usually bound to a particular OS, you are bound to layer upon layer of software on top of that (ODBC? COM? Winsock? X-many others).
The trend is increasingly toward choosing an OS, a set of APIs, and protocols (on protocols on protocols), and sticking with them. It's the only way we can manage the complexity of the systems we want, and it must become more and more prevalent. This will lead to greater homogenisation of the IT world, and more of the monoculture problems we are already seeing (catastrophic effects from bugs, viruses, worms, crackers, monopolies, etc.).
What can we do to get on top of this?
Build AIs that can take over the job.
Then arm yourself for bear and head for the hills.
Emlyn