From: Samantha Atkins (samantha@objectent.com)
Date: Thu Sep 14 2000 - 13:18:34 MDT
Emlyn O'Regan wrote:
>
> Or maybe it just seems harder as you get older; maybe the new guys & gals
> will swim like fish in the latest version of the techno-whirlpool. Can fish
> swim through a whirlpool? I guess we'll find out.
>
From what I see of people nearly *gasp* half my age doing, much of the
difficulty is experienced by all. People who have pretty much grown up
with computers, and especially the Web, are often more adept at webifying
things. But programming environment issues and infrastructure concerns
are no more transparent to them.
> >
> > Actually, having been in the software world myself for 20 years, I
> > suspect the truth is a bit dimmer than that. Many of the tools I use
> > haven't evolved hardly at all in all of that time. C lead to C++ but
> > the tools used to graft C++ have not exactly gone through any major
> > revolution. C++ itself is quite primitive in many ways. Java?
> > Interesting things have been done to exploit some of its features but
> > the language itself is not that powerful and not sufficient for many
> > types of problems. Any interpreted or semi-interpreted language with
> > equal or more reflection could be used in most of the contexts that Java
> > is used. Some of these languages, such as Lisp and Smalltalk, are or
> > have been much more powerful and advanced in capability, usage, or
> > development environment than Java, C++, VB and so on are today. Most of
> > the central abilities in languages were first invented and explored in
> > Lisp.
>
> ... and none of them are really relevant to this discussion. Sure, the ideas
> have been around a long time, everything's been thought up before in any
> case. It's all about how you put them together into new things.
>
Actually they are powerfully relevant. The shape of the tools and
languages at our disposal determines what can be thought and done, and
how easily, which is a huge determiner of our future. A huge determiner
of that shape is the socio-political/business climate, which often does
not allow the best technology, especially in development tools, to be
built, or if it is built, to win.
> I would point to things like the libraries of prebuilt functions and
> components which are available, the infrastructure that is now a given, the
> system architectures which are commonplace now, as places where you see the
> advancements. Sure, I might write a system in C++ today, just like someone
> might have done 10 years ago. But things I can achieve with that system now,
> the toolset I have available to do it, the very fact that the system itself
> is less important than the mighty global super-system in which it takes its
> humble place... this is the march forward. And it's not a march, so much as
> a sprint.
>
What libraries are you referring to? Most libraries have been around
a long time. I've been building libraries and components since 1983.
Most libraries today aren't built with much better technology and don't
address the inherent issues any better than back then, with the
exception of libraries of components: beans and COM/CORBA things.
The system architectures frankly mostly suck. They are unnecessarily
fragile and constricting. C++ is a modern assembly language, little
more. The global super-system is mere hand-waving in this context. In
the trenches of software design and development the "global
super-system" is largely irrelevant where it is not downright pernicious
to the work. A sprint? More like millions of lemmings bolting for the
sea with a few standing still or trying to strike out in a different
direction.
> It's a long time since I got to write some software with the fundamental
> assumptions that it would run on one computer and be used by a handful of
> people, in isolation, sitting on top of an OS and not much more.
>
So, I guess you don't write much VB after all? :-)
> >
> > We are beginning to address problems of programming in the large but
> > frankly many of the solutions are giant kludges that are severely
> > over-hyped and over-sold. I have gotten quite disgruntled with this
> > industry. We spend more time trying to lock up "intellectual property"
> > and out-hype the competition than we do actually designing and building
> > good systems. And fixing our development tools themselves takes a
> > backseat to even that. I designed and built things in the 80s ( and I
> > am not unique in this at all) that are as or more advanced than some
> > parts of the current highly-hyped baseline.
> >
>
> Point is, they were a fair way from the baseline then, and probably aren't
> now. I know lots of technocrats get cranky because what's just come out as
> some Windows 2000 feature "was old news back when I was programming Eniac".
> It doesn't mean anything. Excel uses maths, that's been around for millenia.
> I still run on essentially the same wetware that the fishies have been using
> for ever.
>
That rather misses a lot of the points I raised. The central question
is why the advances are largely lost and, when incorporated, why more
energy is spent/wasted over-selling them than actually fully
incorporating and extending them. This is a quite crucial question for
the future of this industry (and for my continuing part in it). I think
you are seeing played out early in software some central conflicts as
the world heads more towards information being the most essential
commodity. I see the problems in software but I don't believe they stop
there.
> I've had people tell me that the internet is nothing new, and hey, the
> French had a bigger system way back when, with computers linked together in
> the hands of the general populace (a centralised system I think). Sure the
> net is old (relative to what?), but it's also new; it's not the same thing
> that it was five years ago, two years ago, last week. Sure the french had
> some automated phone book thing, big deal. Would you swap it for this?
>
The web is new, but it is not as good as it should be and needs to be and
must be for what we wish to use it for. What is to be done
about it? How is it symptomatic of deeper concerns/currents in software
systems?
Don't get hung up on the way back then vs. now subthread. It is used to
illustrate trends rather than being primary to the point I was
attempting to make.
> The baseline is one of the big things. Now we've got distributed
> transaction/object systems, multi-tiered system infrastructure, all these
> neat things that have been around for donkey's years. So what? So, they are
> turning up in the baseline. As a lowest common denominator. Admittedly it's
> all heterogeneous madness still, but it's coming together. That's not a cause
> for cynicism, that "yeah, been there, done that" attitude. It really ought
> to blow your mind.
>
We have toy transaction/object systems that break badly on real world
problems outside of the small window they can handle and get held
together with chewing gum. Multi-tiered systems have been built (with
less hype) for 20 years minimum and we still don't solve some of the
issues that were unearthed then, although we pretend we do. J2EE and
EJB do tuck a lot of plumbing issues out of the way although there is
still much to be done to make that sort of progress less intrusive and
more responsive.
I am the last person of whom it can even remotely be suggested that I
am cynical. I am a wild, flag-waving, idealistic optimist who keeps
charging out onto the bleeding edge to make it better. I am not cynical,
but I am quite tired of assumptions that the current state of the art is
oh so admirable and that the tide of software progress is flowing well.
It isn't, and I've wrestled with enough broken software systems and
company politics to have a pretty good idea of what the damage and
problems are.
> It's all in how you look at it... for instance, is Napster a hopelessly
> technologically boorish file transfer utility, or is it a major social
> revolution, the outcome of which, while unknowable, is certain to be major
> and irreversible?
>
Both and.
> Because you can do things with it that you could never conceive of doing
> before. This "new economy" thing is not some hyped up joke; it's happening.
> The technology, important as it is, doesn't matter at all. It's what we can
> do, and are doing, that matters.
>
Without the tech you can't do shit. Without a deep understanding of the
tech you will miss more than you think of doing. Sure, we are in a
revolution in many ways, but some of us, like me, are attempting to see
where the underlying tech needs extension and cleanup and where the next
great thing will/should be. It doesn't mean we don't see what is now.
It doesn't mean something so facile as that we are "cynical".
> > Sorry. Most of that is an aside and off-topic. I needed to rant. But
> > personally I don't think software development will get significantly
> > better until something like Open Source (better add Open Design) and
> > changes in the basis of software business occur. I don't see how the
> > current model has room to get out of its own way.
>
> Open Source... hmmm. I've been in the MS world too long, I don't have more
> than a passing knowledge of how it works. I think you are right; it is the
> way forward. It's going to be a bumpy, crazy, messy, perilous way, but it'll
> become clear that it's the only road open I think.
>
> I don't think software will get any better, however. The best we can
> probably hope for is that it will remain feasible to develop software. And
> that'll do fine!
>
Now who's cynical?
Software must get significantly better. What is demanded of software
systems is ever more crucial. Serving roughly the same old slop can
kill people and economies. As we step toward ever greater automation
and to nanotech this is even more true. There is no conceivable way
that most current software practice and languages will enable building
massively fine-grained parallel, distributed programming environments
that will be very much required in many nanotech solutions dreamed of
today. There are some mostly academic things on the shelves gathering
dust that may provide some keys to solutions. Even in far less exotic
software spaces, the Web economy is running on systems that are quite
poorly adapted to the load. There are problems, like handling long
transactions and massively interacting transactions, that we don't have
even academic general solutions to, yet they are endemic to many things
we do or need to do out on the global network. Some of these things
require not just new breakthroughs but new ways of designing, developing
and fielding software. It is a very exciting time in the industry.
> > GAG! VB is one of the worst blights exemplifying the very problems in
> > the industry I spoke of above. The very idea of inflicting on millions
> > of programmers a language environment that has a badly scaled Basic at
> > its core pretending to be object oriented (laugh) and which is not even
> > powerful enough to handle full COM components and on top of this has the
> > IDE, your program and the debugger in one process so any problems crash
> > the entire mess - well, the whole thing is simply and utterly
> > infuriating. I can and have designed better programming environments in
> > my sleep. Microsoft should be shot for gross ineptitude and programmer
> > abuse. Of course they were never really about making the world better
> > for software development They are only in it to capture the market and
> > eliminate the competition. Or at least any good tendencies in its
> > people are subsumed by that bottom line over and over again.
>
> Crocodile tears? I agree with the technological assessment; a shocking
> language, horrible, the costs of which will become apparent as time wears
> on, and the legacy mountain of VB grows.
>
> But then again, from what I've seen, you can throw together some pretty
> amazing stuff in very short time. Go back to '85, and ask someone how long
> it'll take to get an app together with a slick user interface, which talks
> to a database backend (on another computer! gosh!), and, let's see, checks
> for and downloads new config files from the internet, and, urm, displays
> some slick graphs & pie charts, and, ah, yeah, plays a full motion video for
> a tutorial. And it's got to generate some letters to customers in an
> interchangeable word processor format. The '85 programmer will say "long".
> And wait till you see that slick interface. Console apps; whoa; remember
> those?
>
I remember developing a distributed persistent development and
deployment system at that time personally. The interface was Display
Postscript. Before that I was hacking Smalltalk and doing rapid
prototyping of interfaces. Before that I was working on multi-process,
multi-computer systems creating large document storage and retrieval
systems that actually did several of the things mentioned. So?
> Then compare the skill required of each of those programmers to achieve the
> same thing. The modern vb coder can be a relative doofus. Meanwhile, the
> genius who coded this app back in '85 is now doing... something... hard and
> scary. Like trying to make that damned VB app scalable (good luck!).
>
A relative doofus with a bunch of sharp tools leaves a lot of blood on
the floor. The genius is refusing to try to make VB BS be what it was
never designed to be.
> > Putting various things in the OS is not a very bright idea. The things
> > in the OS should be the minimum that can be handled more efficiently and
> > cleanly there. A web browser is not an example of such.
> >
>
> Maybe it's a question of terminology. What's an OS? What's my computer?
> What's the internet? Where's the boundary? I'm starting to side with MS on
> the browser issue; that browser really truly is part of the OS (well, the
> GUI anyway); and why not? The computer's part of the internet... why
> shouldn't the OS provide support for that?
>
Because the browser is an application rather than a set of services that
enable all applications. It is an application framework if you squint
just so, but not bedrock system services. Pulling the browser into the OS
also potentially pulls in everything you might want to associate with a
browser: rendering engines, scripting language support, various file
format supports, on and on. The OS becomes a black hole sucking in
everything else. You end up with a massively entangled system that cannot
be disentangled to build subsets for embedded systems or to easily port
the OS-swallowed things outside that OS environment. And the debugging
and maintenance get much more convoluted. You also introduce the
possibility of far deeper security holes when you move more software
into or closer to the kernel and shell.
The computer being part of the internet is supported by OS-level support
for internet protocols like TCP/IP. A particular set of HTML/CSS/XML/?
rendering tools and interfaces to the Web is not a fundamental enabling
protocol, and is something you would like to be able to examine, change,
modify, or replace without hacking your OS (which means begging
Microsoft) to do it.
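The distinction can be seen in code: the OS exposes the enabling protocol (a TCP socket), while HTTP and rendering live above it as replaceable application code. A minimal sketch, assuming nothing beyond the standard socket API (the host name and helper names are illustrative):

```python
import socket

# Application-level text: HTTP is not an OS concern. A browser, a
# crawler, or a replacement renderer can all build this differently.
def http_head_request(host):
    """Build a minimal HTTP/1.0 HEAD request; pure application code."""
    return f"HEAD / HTTP/1.0\r\nHost: {host}\r\n\r\n"

# The thin OS boundary: create_connection is the TCP/IP service the
# kernel actually provides. Everything above it is swappable without
# touching the OS.
def fetch_head(host, port=80):
    with socket.create_connection((host, port), timeout=5) as s:
        s.sendall(http_head_request(host).encode())
        return s.recv(1024).decode(errors="replace")
```

Everything in `fetch_head` except the two socket calls could be ripped out and replaced, which is exactly the property a browser-in-the-OS design gives up.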
> I'm surprised to see someone who understands COM holding such an opinion,
> actually.
>
Perhaps it is because I do understand COM and what it is and is not.
> >
> > VB will not allow you to do most of these things. Back when I (shameful
> > to admit) hacked some VB I had to write anything interesting in C++/DCOM
> > and then do the GUI in VB and have it call the C++ COM objects. It
> > works, but is totally proprietary to Microsloth.
>
> Write it in Delphi.
>
And make it proprietary to Borland...
> There are far better development tools than MS provides, for the windows
> environment. Don't believe the hype; they actually turn out fantastically
> good code. Delphi, for instance, does the interface as easily as VB, and the
> COM/DCOM/COM+/(add acronym of choice) as well as, and far more easily than,
> C++. I assume you are talking about MS Visual C; there are other kinds, with
> better toolsets available.
>
I know. As long as Microsoft doesn't hack parts of their system to
destroy compatibility yet again.
> >
> > You must be in a very different world or you don't see beneath the
> > wizards and the push-button IDE. The underlying stuff has largely not
> > improved at all except for the addition of binary components using CORBA
> > and COM. Even those two are not a great improvement over a
> > distributed/persistent peer-peer object application environment I
> > created in 1986-1987. It was unfortunately ahead of its time and
> > created under the auspices of a company that had no idea what to do with
> > it.
> >
>
> You ought to start a dotcom to market this thing, go out and change the
> world.
>
> It's all in the baseline. In 1986, you had to build it yourself. Every
> company that wanted it had to build it themselves (if they could be
> bothered). Now it's out there, or an unmanageable, barely workable, hacked
> up heterogeneous version of it is, anyway. Next week there'll be a new
> version, a bit better, the week after that another version, even better, and
> three weeks later it'll be stable and perfect, and meanwhile the next big
> thing will have taken us all in a new direction; another 20 year old
> technology whose time has come, which is cantankerous and immature and
> horrible for developers, and which will kick off the next quiet (?) little
> revolution, and have its fifteen minutes of fame.
>
These better versions won't get out there without Open Source. We are
building deeper layers of IP laws now than anything I had to deal with
in 1986. The current IP laws would have either stopped me dead in the
water or made me extremely rich at the price of a lot of innovation
being stifled. Today, if I actually took the time to do a due-diligence
search for software patents that my current project may be in violation
of, I would be stopped dead. Without Open Source the guts of these
infrastructures and frameworks are not exposed and not open to evolution
except by or with the express consent and usually prohibitively high
licensing fees of the originating company. In the meantime the
originating company is fighting to increase sales and acceptance and
casting the core technology in stone that even they will not dare break
open and reinvent because they have too much riding on it and are too
busy fixing the bugs, especially systemic ones that grow out of the
architecture's limitations and the ever expanding attempt to sell it as
all things to all people.
> If it's mature, it's obsolete.
>
Unless it is flexible enough to evolve more or less gracefully. Unless
the societal/marketing environment is such as to allow more evolution.
> >
> > Our present underlying compiler and linker technology is not much better
> > than it was then. Linkers have changed almost not at all. DLLs? Came
> > out in the mid 80s. The productivity tools for programmers leave a lot
> > to be desired. In the Microsloth world to browse the call graphs of
> > functions and objects the software entity must first be fully compiled
> > with a bunch of special purpose flags set. From one component you can't
> > browse into such details of another one. The information is not unified
> > into some database you can query about various software artifacts and
> > their interaction and inter-dependencies. What data that is gathered is
> > in a Microsoft proprietary format that you cannot use to develop
> > something more intelligent. Yet Lisp and Smalltalk environment have had
> > such abilities for the last decade or even two. I wrote such an
> > information extractor myself for some <gasp> Cobol legacy stuff I got
> > stuck with once in 1984.
>
> Why aren't we using Lisp and Smalltalk then? I mean this as a serious
> question; there's a serious answer there somewhere. Anyone?
>
Lisp is almost too powerful. It takes a rarer programmer to use it
elegantly and well than to use languages like VB, C++ and Java. In Lisp
you often design a programming environment for tackling the type of
application you are building before building the application. This is
quite good and productive if you are good at it, but that first new
application of a particular kind might take longer and be less
understandable to less talented programmers. Lisp was also mis-sold as
requiring dedicated hardware to run well, by a company in the business
of creating such dedicated hardware. In actuality Lisp interpreters,
compilers and optimizers were written for many heterogeneous platforms,
and by the early 90s optimized Lisp ran within 2 or 3 times the speed of
optimized C. That is faster than most of the Java machinery today. Lisp
also suffered from the repercussions of the great AI oversell.
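The Lisp habit of building a small language for the problem before building the application can be sketched even in Python. This is a toy illustration only; the `rule`/`all_of` names are invented for the example, and real Lisp macros go much further:

```python
# A toy internal DSL: describe validation rules declaratively, then
# "run" the little language against data. In Lisp this layer would be
# built with macros; plain higher-order functions stand in here.

def rule(name, predicate):
    """A named check: applying it to a record yields (name, pass/fail)."""
    return lambda record: (name, predicate(record))

def all_of(*rules):
    """Combine rules into a mini-program that audits a whole record."""
    return lambda record: [r(record) for r in rules]

# The "application" is now one expression in the mini-language:
audit = all_of(
    rule("has id", lambda r: "id" in r),
    rule("positive balance", lambda r: r.get("balance", 0) > 0),
)

print(audit({"id": 7, "balance": 100}))
# [('has id', True), ('positive balance', True)]
```

The payoff she describes is visible even here: once the little language exists, each new audit is a one-liner, but a newcomer must learn the mini-language before the application makes sense.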
Smalltalk got hurt, imho, by not evolving fast enough. It had, and has,
a killer programming environment, except that it was pretty much a
private user/developer sandbox, not well integrated (generally speaking)
into team-level projects. There were tools developed to get around this
limit, but too little and too late. Also, producing standalone
executables was not so well addressed. And, until say '93 or so, average
machines were less than stellar at running Smalltalk. Perhaps the
largest gap was not extending Smalltalk into the scripting space (quite
easy to do), into inter-process communication, and into applet building
when the Web came along.
Truthfully though, there are way too many business, media and cultural
factors that also played a part in the outcome for me to pretend to
untangle them all.
> >
> > The reason it is getting more difficult is the systems needed are more
> > sophisticated and the tools and infrastructure have not kept up with
> > them. It is difficult to pour energy into better programming tools when
> > that kind of product doesn't pay off so well at the bank to make the VC
> > happy. It is difficult to build them in house without the sponsoring
> > management looking like they don't have an eye on the bottom line. So
> > we race faster and faster armed with inadequate tools and ever more
> > pressing requirements, busily trying to automate everyone else's work
> > except our own.
>
> That's true; we automate everyone else's work except our own. Meanwhile
> someone else is bustin their behind to automate our work. It's not sensible
> to say that the tools are there because no one can be bothered building
> them; providing the best tools is a market worth a fortune. So there must be
> another reason.
>
Actually it is not so lucrative a business. Consider. I build the best,
say, Java environment around. If I price it in line with current
products I can charge, oh, say $600 for it. At that price a lot of
developers will not try it and probably can't get their manager to risk
their development schedule on this new tool. But assume I can get it.
Probably no more than 1% of the developers are going to buy this tool
in its first year on the shelf. Getting to that first year probably
took on the order of 2-3 years of business startup cost, development
and marketing. Assuming roughly 2 million Java developers, that gives
me 20,000 x $600 = $12 million in sales. Barely my startup costs to
date. And that is if everything goes quite well. Most VCs are not
impressed with such ROI.
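Her back-of-envelope math, as a sketch. The market size, capture rate, and price are all illustrative assumptions from the paragraph above; a 1% first-year capture is the figure consistent with the $12 million total:

```python
# Back-of-envelope ROI for a hypothetical best-in-class Java tool.
developers = 2_000_000   # rough Java developer population (assumed)
capture    = 0.01        # fraction who buy in year one (assumed)
price      = 600         # per-seat price, in line with competitors

revenue = developers * capture * price
print(f"First-year sales: ${revenue:,.0f}")  # First-year sales: $12,000,000
```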
> I think the tools are barely adequate to the task. Only just enough to be
> able to build what needs building. If they weren't adequate, we wouldn't be
> able to build things. If they were mature, they'd be obsolete.
>
They are not adequate to build some of what is needed. We hack it
together the best we can anyway. All Turing machines are equivalent in
capability, but not in what they enable at what cost. As someone who
has been around for a while, I don't buy into mature == obsolete. :-)
> Again, it's that drive to reach as far as we can possibly reach. The front
> line is always out where it is barely possible to do what is required. So of
> course the tools are only just coping, of course things are cobbled together
> and crazy and kludgey; there's no other way to succeed. When there is a
> better way, the race will have moved on.
It is not quite that simple. Bread and butter programming is being done
today with tools that suck. The results suck so much that most
companies will not even market them without huge disclaimers stating
that they are not fit for consumption. Producing really good dependable
software is not easy. The ugly truth is we don't know that terribly much
about how to do so and that we ignore a great deal of what we do know
because we are in too much of a hurry to field something/anything that
might be at least no worse than what the competition is fielding.
Beware the coming backlash.
> >
> > How does this tie-in to extropian interests? The future and the tech
> > we all so crave is built largely on top of software. Keep most software
> > proprietary and don't invest in better software tools and the future
> > will be much more stunted than it could/should be.
> >
>
> So do it. Build the better tools (again, apparently). If you are right,
> you'll be really, really rich.
>
Money is just another resource, an enabler. Thinking about it overmuch
in terms of how rich you can get takes the focus off of what you are
attempting to enable. It is a trap.
> >
> > It is more cranky and brittle than it needs to be in part because of the
> > problems I mentioned. There are also some real bearish problems in some
> > of our currently dreamed up systems. Things that will take real R&D
> > projects to solve and then under the current model would come out as a
> > bunch of hacked up tools positioned to maximize profit instead of being
> > shared across the entire industry that needs them so desperately.
>
> This is not like aid to the poor or something. It's a wealthy industry. Us
> poor, struggling, underappreciated developers, crushed under the weight of
> ill-informed feudal overlords, are doing very nicely indeed. There's no
> martyrdom available in the world of IT.
>
Nope. It is a commons, a shared bedrock of experience and knowledge
that all our success and greatness can build upon. We instead build
fenced in little claims on the commons and keep the information from
flowing freely to invigorate all our efforts.
Software developers, some of us at least, make good money. But that
does not equate automagically with our skills being well utilized to
maximize the creation of ever greater real wealth and benefits. Our
making a bushel of money does not auto-translate to maximally enabling
the extropian dream futures.
Actually, we already have a software saint who largely changed the face
of computing: Richard Stallman. He was not alone. Ask yourself why
he managed to succeed and why that sort of working outside the rich
rewards was needed to accomplish what he did. Then you will perhaps see
my point.
> >
> > > People have been worried about the contrary point of view; that our
> systems
> > > are getting so big, so unwieldy, that at some point we cross a failure
> > > threshold, beyond which we cannot, as a bunch of humans, reliably
> maintain
> > > the systems any more. Why would this be true?
> > >
> >
> > Without the proper tools and without more and better automation it is
> > inevitable.
>
> I'm not holding my breath. It's impossible to do when you are running.
Sure, but are we progressing or spinning the little wheel in the cage
ever faster? Or both? How fast would we progress if we spent less
energy spinning the wheel and going nowhere?
>
> > We have automated certain classes of GUIs but the GUI story is far from
> > complete or adequate. Many projects are now being seriously perverted
> > to use only a Web Browser as their GUI! It is as if we are stepping
> > firmly into the past and putting more and more energy into centralized
> > servers even though we have more power on our desktops than we dared
> > dream of in the very recent past. We need good massively distributed
> > peer-peer systems. Not a sleazy 21st century rework of time-sharing
> > systems.
> >
>
> Yep, I'm building some of those evil server-based, browser front-end gui
> systems. They're shonky, the technology is immature and brittle and cranky,
> and it's the right thing for the times. I can write software that can be
> used by zillions of people all over the world, and yet I have central
> control (and all the hideous problems, like scalability, which go with it).
> I can do stuff which only recently became possible. These systems,
> reminiscent as they are of the mainframe days of yore, are qualitatively
> different, because they are doing the right job in this particular world.
>
It is not the right thing for the times. It is merely what the
interlocking forces are producing now. What is is not "right" simply by
being what is.
Scalability is a problem even in massively distributed systems.
Massively distributed systems and their enabling technology are how you
deal with scalability problems. Centralization is not required.
Actually, all the stuff we are doing with client-server web applications
has been possible, and done in some environments, ever since the internet
became ubiquitous, which greatly predates the Web. I disagree that these
systems are so qualitatively different. Things like software agents are
different. But most of our web applications servers are relearning
lessons learned by timesharing systems folks some time ago. Yes some
new solutions and ways of incorporating old and new solutions are being
invented. But the overarching architecture is still a step into the
past.
> If I'm still building the same things in 2 years, someone come round and
> give me a good slapping!
>
OK. Are previews acceptable? :-)
> >
> > Sure. Although of a quite different sort. What disturbs me is how
> > often I am still doing the same tired old tasks in much the same tired
> > way. There is not often enough time to both meet the current
> > over-inflated deadline for an underdesigned product and automate as much
> > of my own process as would satisfy me.
> >
>
> We programmers have to get past trying to automate our own tasks; it's not
> our job. Our job is to do the tasks. Somewhere, someone will have the job of
> trying to automate the tasks we are charged to do, and yet they will be
> rankling because *their* job is less automated than they would like.
>
There is no one but us to do the job of automating our own work. Who
else knows as well what is needed or has the required automation skills?
> When your job is automated, it doesn't need you. The things you do will
> always be unautomated, and hence a manual slog, by definition I think.
>
When more is automated I get to move on to hopefully more interesting
and newer challenges. That is progress. I am not crying because what
is not automated is not, but because it can/should be, and I want to
get on with doing it and do not have the time and money to do so yet.
> >
> > We don't yet do enough with well-defined and trusted components and with
> > good tools for finding the right components and simulating their
> > interaction. Much of our code base is still language and OS dependent
> > and not componentized at all. Most of code is still application
> > stovepipes with little/no reuse or reuseability. In short, almost no
> > automation or next-level engineering applied to our own work. It had
> > better not continue like this.
> >
>
> I think this is getting better. Reusable components exist en-masse now;
> that's pretty new. They're still in their early days, but damned impressive
> really. I think it'll get better; open source may be the way.
>
I am not sure what you mean by components existing "en-masse". Most
code is still not written with reuse as a high priority much less
produced as full software components.
> At some point, people will get genetic programming to the point where you
> can make formal specs for a bit of a system (a component), with rigorous
> definitions of the pre-built pieces (components) that system depends on and
> the interface it will provide, and you will be able to evolve it. Arguably
> that technique could build some very complex systems. Eventually you might
> be able to evolve the design, as well as the implementation; and then we
> begin to move into uncharted territory. Yet I still think that, even then,
> there'll be a level for humans, hacking away at something which is still not
> amenable to automation, standing on the generated systems and reaching for
> the stars.
>
I agree. So why aren't we out building that system today? Wanna put
together a business plan and go looking for funding? Seriously.
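The evolve-against-a-spec idea Emlyn describes can be sketched in miniature. This is a toy in the weasel-program style, not real genetic programming over code trees: the "spec" here is just a required output, fitness counts failing positions, and mutation plus selection drives the failures to zero:

```python
import random

# Toy "evolve a candidate against its spec". A real GP system would
# mutate program trees scored by a test suite; the loop shape
# (mutate, test against the spec, keep the fitter candidate) is the same.
TARGET = "FORMAL SPEC"
CHARS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def failures(candidate):
    """How many positions of the spec the candidate gets wrong."""
    return sum(a != b for a, b in zip(candidate, TARGET))

def evolve(seed=1):
    rng = random.Random(seed)  # fixed seed for reproducibility
    best = [rng.choice(CHARS) for _ in TARGET]
    while failures(best) > 0:
        child = best[:]
        # Mutation: rewrite one random position.
        child[rng.randrange(len(child))] = rng.choice(CHARS)
        # Selection: keep the child only if it is no worse.
        if failures(child) <= failures(best):
            best = child
    return "".join(best)

print(evolve())  # FORMAL SPEC
```

The loop terminates because a mutation can never make an already-correct position worse without being rejected, so failures only ratchet downward.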
- samantha
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:30:59 MST