It doesn't have to. One of the necessary starting conditions for the
above scenario is that computers be advanced enough to have
intelligence roughly equivalent to human intelligence. Beyond that,
advances in speed alone will be enough to bring about the singularity
even if no other fundamental improvement in intelligence is made.
> 2. You'll need a bit more detailed argument to show that computer power
> doubles every two subjective years.
I can see where the "doubles every two subjective years" assumption
might originate, but I also see that it has a significant flaw. Right now,
computers are being developed by human beings. The intelligence of
the human designers isn't based on the technology, and thus isn't
increasing along with the increasing computing power. Assuming that
this is correct, it would stand to reason that if the intelligence
(or just speed) of the designers were increasing right along with the
technological advances, then the improvement that now occurs in two
real years would accelerate in real time, staying constant in
subjective time.
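That reasoning can be sketched numerically. The following is only an illustration of the argument, under assumed constants (a 2-subjective-year doubling period and designer speed that scales directly with computing power), not anything claimed in the original discussion:

```python
# Toy model of the "subjective years" argument (assumed constants, for
# illustration only). Computing power doubles every 2 *subjective* years.
# If the designers' speed tracks the current power level, subjective time
# passes `power` times faster than real time, so each successive doubling
# takes half as much real time as the one before.

def real_time_to_singularity(doubling_subjective_years=2.0, doublings=50):
    """Total real time consumed by successive doublings when the
    designers' effective speed scales with current computing power."""
    power = 1.0
    real_years = 0.0
    for _ in range(doublings):
        # A doubling needs `doubling_subjective_years` of subjective time,
        # which elapses `power` times faster than real time.
        real_years += doubling_subjective_years / power
        power *= 2.0
    return real_years

# The real-time intervals form a geometric series: 2 + 1 + 0.5 + ...
# which converges toward 4 real years no matter how many doublings occur.
print(real_time_to_singularity())
```

Under this toy model the total real time converges (here, toward 4 years), which is why constant subjective-time doubling implies unbounded progress in finite real time. By contrast, if the doubling period is constant in *real* years, the total is just 2 years per doubling and grows without bound, and no such runaway occurs.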
The flaw in this is the assumption that the technology is not already
contributing to its own advancement. In reality, it is. Although
the machines aren't actually designing their successors, they are
providing powerful tools that improve the human designers' speed
and _effective_ intelligence by performing more and more of the
routine tasks and by helping the designers manage the large quantities
of information that they must deal with. Hardware description
languages & compilers, sophisticated CAD/CAE tools with
autorouters and design rule checkers, digital and analog simulators,
thermal analysis tools -- all of these things contribute to the
development of more sophisticated computers. And all of these
things then benefit from that advancement and it goes around the
circle again. Even mechanical CAD tools and "office" applications like
spreadsheets and database software contribute to it. And on the
software front, the same thing is happening with languages, compilers,
code generators, etc. It should also be noted that the number of
designers working on the technology has increased considerably
over the last few decades -- yet another way in which the total
intelligence developing computer technology has not been constant.
The bottom line is that the "two subjective years" assumption is
probably not correct. My own inclination is that having intelligent
machines involved in their own development will yield a curve similar
to what we see now, possibly with a shorter, but still constant, time
between each 2x increase in computing power. And just as it is now,
it will not be a very consistent curve. It will have the usual lulls
and occasional bursts when breakthroughs occur. There does not
appear to be any natural law behind the 2x in 2 years (or whatever it
is) progress rate. So even that is too much to assume, let alone
assuming that it will be proportional to subjective years.
The rate of improvement is likely to be very dramatic, but I don't
think that a "singularity" is a valid assumption. I can't say that it won't
happen, but I don't think there is adequate cause to assume that it will.
Still, no need to feel disappointed. We're in for one heck of a ride
into the very near future. Like the old BTO song says: "B-b-b-baby,
you just ain't seen n-n-nuthin' yet."
---Peace, William Kitchen
bill@iglobal.net