RE: Human minds on Windows(?) (was Re: Web site up! (GUI vs. CLI))

From: Eugene Leitl (eugene.leitl@lrz.uni-muenchen.de)
Date: Wed Jul 07 1999 - 23:50:35 MDT


Billy Brown writes:

> Actually, Microsoft's defects-per-LOC figures (the only comparison of code
> quality that really tells you anything) are in the upper 30% of the software
> industry. The reasons why their products often seem poorly written have
> nothing to do with code quality - their problems lie in other areas (such as
> user interface design).
 
If the other guys' code bombs even more often than Microshaft's, so
much the worse for the industry. Nowadays I do most of my stuff in
XEmacs, simply because it almost never crashes. GC rules --
unfortunately, languages with true GC still haven't hit the
mainstream. After all, it's such a recent invention: it dates all the
way back to Lisp, the second-oldest HLL in existence after Fortran.
It is really amusing to see how much vogues/religions dominate this
industry. An operating system from the beginning of the '70s,
developed and peddled by political zealots, eats a commercial
early-90s OS alive -- how amusing. Not. If this is the best we can do
shortly before way too kay (another blazing beacon of monkey
incompetence), god help us.
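
To make the point concrete, a minimal sketch (pure ANSI C, invented
for this mail, not from any real codebase) of the bug class a true GC
abolishes by construction:

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        int *p = malloc(sizeof *p);
        if (!p) return 1;
        *p = 42;
        free(p);              /* lifetime ends here... */
        printf("%d\n", *p);   /* ...use-after-free: may crash now,
                                 crash later, or silently corrupt
                                 memory */
        return 0;
    }

Under a tracing collector there is no free() to get wrong: the cell
stays live as long as anything references it, and this entire failure
mode disappears.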
 
> However, you have definitely hit on the single biggest challenge facing the
> software industry today. Simply put, it is not possible for humans to write
> 100% defect-free code. Faster computers allow you to tackle more

Haven't we known this since the early '70s? When was The Mythical
Man-Month published -- 1975?

> complicated problems, but that leads to ever-bigger programs. As the
> programs get bigger, more and more of your development effort gets diverted
> into getting the number of defects down to an acceptable level. Eventually
> you reach the point where every time you fix one bug you create another, and
> it becomes impossible to add new features to your program without breaking
> old ones.
 
The obvious solution would seem to be to dispense with brittleware
altogether: gracefully failing systems that suffer smooth performance
degradation instead of catastrophic failure. Of course, the IT
industry would then have to admit that the Santa Fe people are not
all freaks on acid -- which is tough to face. Also, it is somewhat
late to throw half a century of IT tradition away and start from
scratch. Taking lessons from squishware? God, I'd rather shoot myself.
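
What graceful degradation means in the small -- a toy sketch of my
own, not anything the Santa Fe crowd actually ships: when the exact
algorithm cannot get its resources, fall back to a cruder answer
instead of dying.

    #include <stdlib.h>
    #include <string.h>

    static int cmp(const void *a, const void *b)
    {
        double d = *(const double *)a - *(const double *)b;
        return (d > 0) - (d < 0);
    }

    /* Exact median needs an O(n) scratch copy; if that allocation
       fails, degrade to the midrange (assumes n >= 1) rather than
       aborting the whole process. */
    double median_or_degrade(const double *x, size_t n)
    {
        double *tmp = malloc(n * sizeof *tmp);
        if (tmp) {
            double m;
            memcpy(tmp, x, n * sizeof *tmp);
            qsort(tmp, n, sizeof *tmp, cmp);
            m = tmp[n / 2];
            free(tmp);
            return m;          /* exact answer */
        }
        /* degraded path: no allocation, still O(n), less accurate */
        {
            double lo = x[0], hi = x[0];
            size_t i;
            for (i = 1; i < n; i++) {
                if (x[i] < lo) lo = x[i];
                if (x[i] > hi) hi = x[i];
            }
            return (lo + hi) / 2.0;
        }
    }

The caller gets a usable number either way: quality of service
degrades smoothly, the system as a whole does not.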
 
> Judging from what data are currently available, this effect comes into play
> when the number of human-generated instructions gets into the 10^7 - 10^8
> LOC region. High-level languages should therefore make it possible to write
> bigger programs (because each human-entered instruction generates a lot more
> machine code), but the level of abstraction in these programs is not
> increasing very quickly at all. If we want to actually be able to exploit
> the processing power we're going to have in 10-20 years, we need to get to
> work on this problem *now*, instead of sitting around pretending it doesn't
> exist.
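
Do the arithmetic: at commonly cited industry defect densities of
roughly 1-10 defects per KLOC shipped, a 10^7-line system carries
10^4 to 10^5 latent bugs; even a best-in-class process near 0.1 per
KLOC still leaves you a thousand.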

Strangely, awareness of this among programmers is not at all
widespread. Another bunch to be put up against the wall when the
revolution comes.


