From: Emlyn O'Regan (emlyn@one.net.au)
Date: Thu Sep 14 2000 - 09:49:55 MDT
Samantha (hello, nice to meet you in the weird world of this list) wrote:
> Emlyn O'Regan wrote:
>
(snipped the first set of stuff; my response is "yes")
>
> > >
> > > I worry about another problem. Technology is still marching forward.
> > > I support this 101% of course. But, as it advances it automates more
> > > and more job levels. I predict that within 5-10 years it will
> > > automate most of what is called programming today. This is only one
> > > example. As we get more powerful computers at lower cost with more
> > > AIish abilities more and more jobs will fall to them. We had best be
> > > thinking really hard right now about how we plan to have people have
> > > a decent life without a regular for pay job. Despite the current
> > > rather slanted rosy unemployment statistics in the US, the problem is
> > > real and it will eventually hit even many of us.
> >
> > Hee hee, the old "programmers will be redundant in 5-10 years" line.
> > Haven't heard that for a while; supposedly it's been kicking around
> > since at least the 70s. It's no truer today than it has ever been.
> >
>
> Don't bet on it. What changes between then and now is the raw power of
> the hardware. As it becomes more powerful it becomes more tractable to
> automate large segments of the work programmers currently do. Of
> course, the demand for programming is so huge that it will take a while
> before the automation cuts into the need for programmers too much. But
> the level of programmer needed will continue to rise also, stranding
> many who are now in that profession. Don't believe me? Wait and see.
I don't believe you; I will have to wait and see.
Seriously, the key here is that the work that programmers *currently* do
will be automated. But no one will automate the new stuff that we 1: want to
do, and 2: can do because we automated all the stuff that used to be hard,
but 3: can't actually automate yet, because we won't be able to automate it
yet. And if there's advantage to be had, people will still *do* it, just
using more people.
I agree that the job of the programmer will probably get even harder. I
think we are already seeing the most drastic shortages in the newest areas,
especially where the work is actually really hard. That's probably going to
keep happening.
Or maybe it just seems harder as you get older; maybe the new guys & gals
will swim like fish in the latest version of the techno-whirlpool. Can fish
swim through a whirlpool? I guess we'll find out.
>
>
> > It is true that we continue to automate things which used to be done
> > manually. So why aren't all the coders on the dole queue? It's because,
> > as something is automated, it becomes qualitatively different (easier)
> > to build systems based on that something. In terms of possible systems,
> > entirely new vistas open up, which although reliant on the newly
> > automated something, are not themselves automated, and require lots of
> > people to build them.
> >
>
> Actually, having been in the software world myself for 20 years, I
> suspect the truth is a bit dimmer than that. Many of the tools I use
> have hardly evolved at all in all of that time. C led to C++ but
> the tools used to graft C++ have not exactly gone through any major
> revolution. C++ itself is quite primitive in many ways. Java?
> Interesting things have been done to exploit some of its features but
> the language itself is not that powerful and not sufficient for many
> types of problems. Any interpreted or semi-interpreted language with
> equal or more reflection could be used in most of the contexts that Java
> is used. Some of these languages, such as Lisp and Smalltalk, are or
> have been much more powerful and advanced in capability, usage, or
> development environment than Java, C++, VB and so on are today. Most of
> the central abilities in languages were first invented and explored in
> Lisp.
... and none of them are really relevant to this discussion. Sure, the ideas
have been around a long time; everything's been thought up before in any
case. It's all about how you put them together into new things.
I would point to things like the libraries of prebuilt functions and
components which are available, the infrastructure that is now a given, and
the system architectures which are commonplace now, as the places where you
see the advancement. Sure, I might write a system in C++ today, just like
someone might have done 10 years ago. But the things I can achieve with that
system now, the toolset I have available to do it, the very fact that the
system itself is less important than the mighty global super-system in which
it takes its humble place... this is the march forward. And it's not a march
so much as a sprint.
It's a long time since I got to write some software with the fundamental
assumptions that it would run on one computer and be used by a handful of
people, in isolation, sitting on top of an OS and not much more.
>
> We are beginning to address problems of programming in the large but
> frankly many of the solutions are giant kludges that are severely
> over-hyped and over-sold. I have gotten quite disgruntled with this
> industry. We spend more time trying to lock up "intellectual property"
> and out-hype the competition than we do actually designing and building
> good systems. And fixing our development tools themselves takes a
> backseat to even that. I designed and built things in the 80s (and I
> am not unique in this at all) that are as advanced as, or more advanced
> than, some parts of the current highly-hyped baseline.
>
Point is, they were a fair way from the baseline then, and probably aren't
now. I know lots of technocrats get cranky because what's just come out as
some Windows 2000 feature "was old news back when I was programming Eniac".
It doesn't mean anything. Excel uses maths, which has been around for
millennia. I still run on essentially the same wetware that the fishies have
been using forever.
I've had people tell me that the internet is nothing new, and hey, the
French had a bigger system way back when, with computers linked together in
the hands of the general populace (a centralised system I think). Sure the
net is old (relative to what?), but it's also new; it's not the same thing
that it was five years ago, two years ago, last week. Sure the French had
some automated phone book thing, big deal. Would you swap it for this?
The baseline is one of the big things. Now we've got distributed
transaction/object systems, multi-tiered system infrastructure, all these
neat things that have been around for donkey's years. So what? So, they are
turning up in the baseline. As a lowest common denominator. Admittedly it's
all heterogeneous madness still, but it's coming together. That's not a cause
for cynicism, that "yeah, been there, done that" attitude. It really ought
to blow your mind.
It's all in how you look at it... for instance, is Napster a hopelessly
technologically boorish file transfer utility, or is it a major social
revolution, the outcome of which, while unknowable, is certain to be major
and irreversible?
Because you can do things with it that you could never conceive of doing
before. This "new economy" thing is not some hyped up joke; it's happening.
The technology, important as it is, doesn't matter at all. It's what we can
do, and are doing, that matters.
> Sorry. Most of that is an aside and off-topic. I needed to rant. But
> personally I don't think software development will get significantly
> better until something like Open Source (better add Open Design) and
> changes in the basis of software business occur. I don't see how the
> current model has room to get out of its own way.
Open Source... hmmm. I've been in the MS world too long; I don't have more
than a passing knowledge of how it works. I think you are right; it is the
way forward. It's going to be a bumpy, crazy, messy, perilous way, but it'll
become clear that it's the only road open, I think.
I don't think software will get any better, however. The best we can
probably hope for is that it will remain feasible to develop software. And
that'll do fine!
>
> > Once upon a time, to write software, you had to write your own OS. Then
> > OSes started to appear, and provided basic functionality (loading a
> > program into memory, talking to disks, basic screen handling, and all
> > kinds of other stuff). Later, programmers spent a lot of time writing
> > (console-based) user interface code, reading and writing from primitive
> > disk files, all those good things. Then databases started to become
> > common, and GUIs became common, and other things became the
> > programmer's task. Now people wanted nicer printouts, programs that
> > interoperated, pretty GUI front ends, and, urm, networking! Programmers
> > wrote lots and lots of code by hand, performing those tasks; eventually
> > it was again subsumed into the OS. At some point Microsoft invented VB,
> > and programmers were redundant forever shortly thereafter.
> >
>
> GAG! VB is one of the worst blights exemplifying the very problems in
> the industry I spoke of above. The very idea of inflicting on millions
> of programmers a language environment that has a badly scaled Basic at
> its core pretending to be object oriented (laugh) and which is not even
> powerful enough to handle full COM components and on top of this has the
> IDE, your program and the debugger in one process so any problems crash
> the entire mess - well, the whole thing is simply and utterly
> infuriating. I can and have designed better programming environments in
> my sleep. Microsoft should be shot for gross ineptitude and programmer
> abuse. Of course they were never really about making the world better
> for software development. They are only in it to capture the market and
> eliminate the competition. Or at least any good tendencies in its
> people are subsumed by that bottom line over and over again.
Crocodile tears? I agree with the technological assessment; a shocking
language, horrible, the costs of which will become apparent as time wears
on, and the legacy mountain of VB grows.
But then again, from what I've seen, you can throw together some pretty
amazing stuff in a very short time. Go back to '85, and ask someone how long
it'll take to get an app together with a slick user interface, which talks
to a database backend (on another computer! gosh!), and, let's see, checks
for and downloads new config files from the internet, and, urm, displays
some slick graphs & pie charts, and, ah, yeah, plays a full motion video for
a tutorial. And it's got to generate some letters to customers in an
interchangeable word processor format. The '85 programmer will say "long".
And wait till you see that slick interface. Console apps; whoa; remember
those?
Then compare the skill required of each of those programmers to achieve the
same thing. The modern VB coder can be a relative doofus. Meanwhile, the
genius who coded this app back in '85 is now doing... something... hard and
scary. Like trying to make that damned VB app scalable (good luck!).
> Putting various things in the OS is not a very bright idea. The things
> in the OS should be the minimum that can be handled more efficiently and
> cleanly there. A web browser is not an example of such.
>
Maybe it's a question of terminology. What's an OS? What's my computer?
What's the internet? Where's the boundary? I'm starting to side with MS on
the browser issue; that browser really truly is part of the OS (well, the
GUI anyway); and why not? The computer's part of the internet... why
shouldn't the OS provide support for that?
I'm surprised to see someone who understands COM holding such an opinion,
actually.
> > Or that's the hype... oddly enough, there is such a thing as a VB
> > programmer, with possibly more positions available than any other
> > language/environment. Because the things it makes easy to do are easy
> > to do, sure. But now people can imagine even more complex systems,
> > which they would not have attempted before, which they can now just
> > accomplish by writing lots and lots of code. If they do these things
> > in VB, it's often even harder.
> >
>
> VB will not allow you to do most of these things. Back when I (shameful
> to admit) hacked some VB I had to write anything interesting in C++/DCOM
> and then do the GUI in VB and have it call the C++ COM objects. It
> works, but is totally proprietary to Microsloth.
Write it in Delphi.
There are far better development tools than MS provides for the Windows
environment. Don't believe the hype; they actually turn out fantastically
good code. Delphi, for instance, does the interface as easily as VB, and the
COM/DCOM/COM+/(add acronym of choice) as well as, and far more easily than,
C++. I assume you are talking about MS Visual C; there are other kinds, with
better toolsets available.
>
> > This continues. As far as I can see, the tools have improved incredibly
> > beyond what I was using 10 years ago when I started uni. So much more
> > is possible; it's astounding. And yet, paradoxically, it is harder than
> > ever to be a programmer. The job is getting more and more difficult.
> > Why?
> >
>
> You must be in a very different world or you don't see beneath the
> wizards and the push-button IDE. The underlying stuff has largely not
> improved at all except for the addition of binary components using CORBA
> and COM. Even those two are not a great improvement over a
> distributed/persistent peer-peer object application environment I
> created in 1986-1987. It was unfortunately ahead of its time and
> created under the auspices of a company that had no idea what to do with
> it.
>
You ought to start a dotcom to market this thing, go out and change the
world.
It's all in the baseline. In 1986, you had to build it yourself. Every
company that wanted it had to build it themselves (if they could be
bothered). Now it's out there, or an unmanageable, barely workable, hacked
up heterogeneous version of it is, anyway. Next week there'll be a new
version, a bit better, the week after that another version, even better, and
three weeks later it'll be stable and perfect, and meanwhile the next big
thing will have taken us all in a new direction; another 20-year-old
technology whose time has come, which is cantankerous and immature and
horrible for developers, and which will kick off the next quiet (?) little
revolution, and have its fifteen minutes of fame.
If it's mature, it's obsolete.
>
> Our present underlying compiler and linker technology is not much better
> than it was then. Linkers have changed almost not at all. DLLs? Came
> out in the mid 80s. The productivity tools for programmers leave a lot
> to be desired. In the Microsloth world to browse the call graphs of
> functions and objects the software entity must first be fully compiled
> with a bunch of special purpose flags set. From one component you can't
> browse into such details of another one. The information is not unified
> into some database you can query about various software artifacts and
> their interaction and inter-dependencies. What data is gathered is in
> a Microsoft proprietary format that you cannot use to develop something
> more intelligent. Yet Lisp and Smalltalk environments have had
> such abilities for the last decade or even two. I wrote such an
> information extractor myself for some <gasp> Cobol legacy stuff I got
> stuck with once in 1984.
Why aren't we using Lisp and Smalltalk then? I mean this as a serious
question; there's a serious answer there somewhere. Anyone?
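(As an aside, the kind of queryable artifact database she's describing isn't
magic. Here's a trivial sketch, in Python and purely for illustration, of an
extractor that builds a little call graph you can ask questions of; a real
tool would obviously cover more languages and go much deeper.)

    # Toy illustration of the "queryable software artifact database" idea:
    # walk some Python source, record which function calls which, and let
    # you ask simple questions of the result. Purely a sketch.
    import ast
    from collections import defaultdict

    def extract_calls(source):
        """Map each function name to the set of names it calls."""
        graph = defaultdict(set)
        for node in ast.walk(ast.parse(source)):
            if isinstance(node, ast.FunctionDef):
                for sub in ast.walk(node):
                    if isinstance(sub, ast.Call) and isinstance(sub.func, ast.Name):
                        graph[node.name].add(sub.func.id)
        return graph

    def callers_of(graph, target):
        """Query: which functions call `target`?"""
        return sorted(fn for fn, callees in graph.items() if target in callees)

    demo = '''
    def load(path):
        return parse(open(path).read())

    def parse(text):
        return text.split()

    def main():
        print(load("data.txt"))
    '''

    graph = extract_calls(demo)
    print(dict(graph))                  # e.g. {'load': {'parse', 'open'}, ...}
    print(callers_of(graph, "parse"))   # ['load']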
>
> The reason it is getting more difficult is the systems needed are more
> sophisticated and the tools and infrastructure have not kept up with
> them. It is difficult to pour energy into better programming tools when
> that kind of product doesn't pay off so well at the bank to make the VC
> happy. It is difficult to build them in-house without the sponsoring
> management looking like they don't have an eye on the bottom line. So
> we race faster and faster armed with inadequate tools and ever more
> pressing requirements, busily trying to automate everyone else's work
> except our own.
That's true; we automate everyone else's work except our own. Meanwhile
someone else is busting their behind to automate our work. It's not sensible
to say that the tools aren't there because no one can be bothered building
them; providing the best tools is a market worth a fortune. So there must be
another reason.
I think the tools are barely adequate to the task. Only just enough to be
able to build what needs building. If they weren't adequate, we wouldn't be
able to build things. If they were mature, they'd be obsolete.
Again, it's that drive to reach as far as we can possibly reach. The front
line is always out where it is barely possible to do what is required. So of
course the tools are only just coping, of course things are cobbled together
and crazy and kludgey; there's no other way to succeed. When there is a
better way, the race will have moved on.
>
> How does this tie-in to extropian interests? The future and the tech
> we all so crave is built largely on top of software. Keep most software
> proprietary and don't invest in better software tools and the future
> will be much more stunted than it could/should be.
>
So do it. Build the better tools (again, apparently). If you are right,
you'll be really, really rich.
> > Well, because people want to interoperate. Heterogeneous networks,
> > applications, software layers, layer upon layer like a crazy croissant,
> > building new, more ambitious, more interesting systems than ever
> > before. The new economy, with the 'net as the centerpiece, is a mish
> > mash of components and chunks and layers and stuff, all trying to talk
> > to each other, all idiosyncratic, brittle, cranky code stretched out
> > like a net weaved of scrap metal. Making this stuff hang together is a
> > mad job, done at a frenetic speed as old technologies die and new
> > technologies are born. There's no comfort zone. There's no pause for
> > breath. There's certainly no sign of AI taking over any time soon!
> >
>
> It is more cranky and brittle than it needs to be in part because of the
> problems I mentioned. There are also some real bearish problems in some
> of our currently dreamed up systems. Things that will take real R&D
> projects to solve and then under the current model would come out as a
> bunch of hacked up tools positioned to maximize profit instead of being
> shared across the entire industry that needs them so desperately.
This is not like aid to the poor or something. It's a wealthy industry. We
poor, struggling, underappreciated developers, crushed under the weight of
ill-informed feudal overlords, are doing very nicely indeed. There's no
martyrdom available in the world of IT.
The hacked-up tools are all about time. You can get the hacked-up tools out
there way before the wonderfully designed, standards-driven, manna-from-the-
gods, perfect tools. And weirdly, by the time those perfect tools come out,
there's no demand. They're obsolete.
Actually, that's not fair. By the time the really good stuff comes out, the
tower of technology has been built higher. But, the hacked up tools that you
are replacing are stuffed under the next couple of levels; this makes those
levels shaky and unstable (and they of course are also cobbled together out
of crap, so it's a scary place to be). If you're lucky, people will whip out
the hacked crap, replace it with the excellent new stuff, and the tower can
climb even higher.
>
> > People have been worried about the contrary point of view; that our
> > systems are getting so big, so unwieldy, that at some point we cross a
> > failure threshold, beyond which we cannot, as a bunch of humans,
> > reliably maintain the systems any more. Why would this be true?
> >
>
> Without the proper tools and without more and better automation it is
> inevitable.
I'm not holding my breath. It's impossible to do when you are running.
>
> > Well, much of the problem is that, even though basic OS features are
> > "automated", even though the user interface (gui), persistent storage
> > (databases), communications (networks) are "automated", they actually
> > aren't. These things work well now, but they are never quite perfect
> > (some people would say they are not even close!). Just ask UI
> > programmers, database programmers, OS coders, network engineers. All
> > the hordes of people, working to make/maintain the "automated" systems,
> > the stuff that is already "done".
> >
>
> My greatest expertise is in object persistence. Persistence is far, far
> from "automated". Persistence cross-cuts applications and products but
> is often done as a series of hacks within a particular project
> life-cycle. Or a product is bought that promises to take the worries
> all away but actually seriously perverts all application building
> thereafter because its needs have to be met for the application to work
> at all and its needs are too perturbing to everything else. And the
> solution ties the product and the organization often to the solution
> provider firmly. At the moment there is not a good persistent
> middleware out there that fully meets what is needed. There are various
> attempts of greater/lesser goodness. I plan to do large parts of that
> problem better and to eventually release a series of Open Source
> persistent middleware tools. I am tired of seeing ugly solutions to
> this set of problems I know well.
>
Excellent! That's the spirit! If you don't like it, fix it. Once you build
the better tools, people will build on top of them, and reach for the kinds
of possibilities you imagine opening up, given the existence of the new
tools.
And what they'll use on top of your stuff, to reach those heights, will be
shonky, hacked-together crap. But at least they'll have a firm foundation.
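For what it's worth, here's the flavour of what I'd want such middleware to
hide from application code. It's a toy sketch in Python, with invented names
and SQLite standing in for the real backend; nothing like a full solution,
which would also have to handle identity, caching, transactions, schema
evolution and so on.

    # Toy sketch of what persistence middleware tries to hide from the
    # application: mapping plain objects to storage and back.
    import sqlite3
    from dataclasses import dataclass, fields, astuple

    @dataclass
    class Customer:
        id: int
        name: str
        email: str

    class TinyStore:
        """Persist any flat dataclass whose first field is a unique id."""

        def __init__(self, path=":memory:"):
            self.db = sqlite3.connect(path)

        def _table(self, cls):
            cols = ", ".join(f.name for f in fields(cls))
            self.db.execute("CREATE TABLE IF NOT EXISTS %s (%s)" % (cls.__name__, cols))
            return cls.__name__

        def save(self, obj):
            cls = type(obj)
            marks = ", ".join("?" for _ in fields(cls))
            self.db.execute("INSERT INTO %s VALUES (%s)" % (self._table(cls), marks),
                            astuple(obj))
            self.db.commit()

        def load(self, cls, obj_id):
            key = fields(cls)[0].name
            row = self.db.execute("SELECT * FROM %s WHERE %s = ?" % (self._table(cls), key),
                                  (obj_id,)).fetchone()
            return cls(*row) if row else None

    store = TinyStore()
    store.save(Customer(1, "Ada", "ada@example.org"))
    print(store.load(Customer, 1))   # Customer(id=1, name='Ada', ...)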
> We have automated certain classes of GUIs but the GUI story is far from
> complete or adequate. Many projects are now being seriously perverted
> to use only a Web Browser as their GUI! It is as if we are stepping
> firmly into the past and putting more and more energy into centralized
> servers even though we have more power on our desktops than we dared
> dream of in the very recent past. We need good massively distributed
> peer-peer systems. Not a sleazy 21st century rework of time-sharing
> systems.
>
Yep, I'm building some of those evil server-based, browser-front-end GUI
systems. They're shonky, the technology is immature and brittle and cranky,
and it's the right thing for the times. I can write software that can be
used by zillions of people all over the world, and yet I have central
control (and all the hideous problems, like scalability, which go with it).
I can do stuff which only recently became possible. These systems,
reminiscent as they are of the mainframe days of yore, are qualitatively
different, because they are doing the right job in this particular world.
If I'm still building the same things in 2 years, someone come round and
give me a good slapping!
> > Something really different will have to happen to change this picture.
> > Super intelligent AI could do it (so could the wish fairies). Genetic
> > programming advances might have a shot, but will still require humans
> > to coordinate them at some level, especially given that our ambitions
> > will increase by orders of magnitude as our abilities increase, to keep
> > us focused on what is just out of reach. The job of configuring such
> > magical genetic programmed automated systems to reach these new goals
> > will look a lot like, well, programming.
> >
>
> Sure. Although of a quite different sort. What disturbs me is how
> often I am still doing the same tired old tasks in much the same tired
> way. There is not often enough time to both meet the current
> over-inflated deadline for an underdesigned product and automate as much
> of my own process as would satisfy me.
>
We programmers have to get past trying to automate our own tasks; it's not
our job. Our job is to do the tasks. Somewhere, someone will have the job of
trying to automate the tasks we are charged to do, and yet they will be
rankling because *their* job is less automated than they would like.
When your job is automated, it doesn't need you. The things you do will
always be unautomated, and hence a manual slog, by definition I think.
>
> > Sure the techniques will change. Sure the skill set required will
> > change. But the basic programming job will remain, and grow wider in
> > my opinion. Lots of things will begin to look like programming in the
> > future, which do not now. Biotech might get to a point of "automation"
> > where it starts using programmers. Through nanotech, even the bricks &
> > mortar world will start to become a programming concern.
> >
>
> We don't yet do enough with well-defined and trusted components and with
> good tools for finding the right components and simulating their
> interaction. Much of our code base is still language and OS dependent
> and not componentized at all. Most code is still application stovepipes
> with little/no reuse or reusability. In short, almost no
> automation or next-level engineering applied to our own work. It had
> better not continue like this.
>
I think this is getting better. Reusable components exist en masse now;
that's pretty new. They're still in their early days, but damned impressive
really. I think it'll get better; open source may be the way.
At some point, people will get genetic programming to the point where you
can make formal specs for a bit of a system (a component), with rigorous
definitions of the pre-built pieces (components) that system depends on and
the interface it will provide, and you will be able to evolve it. Arguably
that technique could build some very complex systems. Eventually you might
be able to evolve the design, as well as the implementation; and then we
begin to move into uncharted territory. Yet I still think that, even then,
there'll be a level for humans, hacking away at something which is still not
amenable to automation, standing on the generated systems and reaching for
the stars.
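To make the idea a little more concrete, here's a toy sketch (Python,
everything invented) of "evolving against a spec": the spec is just a set of
input/output cases, the candidates are tiny arithmetic expressions, and
fitness is how many cases pass. Real genetic programming would need
crossover, richer primitives and much smarter fitness measures, but the
shape is the same.

    import random

    # The "formal spec": the component must map these inputs to these outputs.
    SPEC = [(x, 3 * x + 1) for x in range(-5, 6)]
    OPS = ["+", "-", "*"]

    def random_expr(depth=2):
        """Grow a random little expression over x and small integer constants."""
        if depth == 0 or random.random() < 0.3:
            return random.choice(["x", str(random.randint(-3, 3))])
        return "(%s %s %s)" % (random_expr(depth - 1), random.choice(OPS),
                               random_expr(depth - 1))

    def fitness(expr):
        """Count how many spec cases this candidate reproduces exactly."""
        score = 0
        for x, expected in SPEC:
            try:
                if eval(expr, {"x": x}) == expected:
                    score += 1
            except Exception:
                pass
        return score

    def mutate(expr):
        """Crude mutation: either start over, or bolt a fresh subtree on."""
        if random.random() < 0.5:
            return random_expr()
        return "(%s %s %s)" % (expr, random.choice(OPS), random_expr(1))

    population = [random_expr() for _ in range(200)]
    for generation in range(60):
        population.sort(key=fitness, reverse=True)
        best = population[0]
        if fitness(best) == len(SPEC):
            break
        survivors = population[:50]
        population = survivors + [mutate(random.choice(survivors)) for _ in range(150)]

    print("generation", generation, "best", best, "fitness", fitness(best))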
Emlyn