The term "Singularity" was invented by Vernor Vinge in his novel,
"Marooned in Realtime". It did have basically the same meaning you are
applying, a case of runaway intelligence amplification. Interestingly,
in the novel it was largely a matter of the creation of a group mind, with
people able to interact directly at the mental level. This is somewhat
different from typical Extropian ideas, which are more individualistic in
nature and would picture augmented individuals rather than groups.
The notion of a singularity as the culmination of technological promise
is interesting, but the future will undoubtedly be a very different place
even if it never happens quite like this.
Also, keep in mind that if a singularity happens as a result of a
subjective speedup of human-like minds, then the qualitative effects for
participants will be different from what a non-participant sees. If
everyone is thinking and living
life ten times as fast as before, then compared to clock time progress may
occur ten times faster. But by subjective time progress will occur no
faster than before. We have just re-scaled the passage of time so that
more things happen between each clock tick.
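A toy calculation makes the rescaling concrete. The tenfold speedup
and the progress rate below are just illustrative numbers, not
predictions:

    # Toy model of subjective speedup: progress accrues at a fixed
    # rate per subjective year, while minds run faster than the clock.
    speedup = 10.0      # subjective years per clock year (assumed)
    rate = 1.0          # units of progress per subjective year (assumed)
    clock_years = 5.0

    subjective_years = clock_years * speedup
    progress = rate * subjective_years

    print(progress / clock_years)       # 10.0 -- fast by the clock
    print(progress / subjective_years)  # 1.0  -- unchanged subjectively

By the clock, progress has jumped tenfold; by experienced time, nothing
has changed.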
> Title: "History Ends In 2025, If We're Smart Enough."
> Reasons: Introduce "End of History" to grab attention, "2025" to
> emphasize immediate importance, "If We're Smart Enough" to dissociate
> from tired doomsday memes (Greenhouse Effect) and religious/apocalyptic
> memes that would only appeal to groups of a particular faith.
While the 2025 date may be effective for dramatic impact, I think there
is a lot of uncertainty over how quickly the various changes you predict
will actually occur.
> 1. Computers double in power every two subjective years.
> 2. Recursive intelligence amplification.
As I believe we have discussed before, the doubling we are already
seeing depends on the existence of ever-faster computers: each new
generation of machines is used to design its successors. So we are
already in the middle of a feedback loop, and in fact that is really
why we are able to see exponential progress (feedback is a
characteristic of such a growth curve).
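Here is a minimal sketch of that point. The feedback constant k is an
assumption on my part, chosen to give a doubling roughly every two
years; only the shape of the curve matters:

    # Capability whose rate of improvement is proportional to itself:
    # dx/dt = k*x, integrated in small Euler steps.
    k = 0.35     # assumed feedback strength (~doubling every 2 years)
    dt = 0.01    # step size in years
    x, t = 1.0, 0.0
    while t < 10.0:
        x += k * x * dt   # better machines speed their own improvement
        t += dt
    print(x)     # ~33, i.e. about five doublings in ten years

With the feedback removed (a constant improvement rate independent of
x) the same loop would give only linear growth.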
As for actually getting intelligence amplification through the literal
creation of either artificial intelligence or uploading, both of these
are very uncertain in the time frame of three decades. People have been
predicting breakthroughs in AI since the 1950s, but actual progress
has been almost non-existent. And uploading will require breakthroughs
in a very large number of areas. There are also significant ethical
questions about exposing conscious observers to the possibly painful or
horrifying experiences which this research would entail, which might
further retard experimentation in these areas. Our current AI is so
far from producing anything conscious that even considering such
hazards seems almost laughable, but if real progress is ever made this
could become relevant.
> 3. Defines "Singularity."
We have discussed different meanings for "The Singularity" on this list.
Some people take it to mean a literally infinite rate of progress, as the
name suggests. Others see it as more of an event horizon, a point in
history where things become so different that those who come before
cannot comprehend what things will be like afterwards.
Really, in a lot of ways we are in the middle of a singularity by this
definition right now. Our species goes back hundreds of thousands of
years at least. Yet people from only a few thousand years ago would
find most of our present activities utterly incomprehensible. In this
view the "comprehension horizon" will continue to shrink as progress
increases in speed.
> 4. Is this a good thing? Names factions.
> 5. Doomsday faction.
> 6. Uploading faction.
> 7. Intro to nanotechnology.
> 8. Nearness of nanotechnology.
I am a skeptic about the short-term success of nanotech. This technology
has a fundamental chicken-and-egg problem: it is very hard to build an
assembler without having an assembler. Once we get there things will be
fine, but the path there is very uncertain.
There are going to be millions of atoms in an assembler. It is going to
take a long time before we are able to put that many together accurately
using macroscopic techniques. How much progress has there been in the
last few years since they spelled out "IBM" with atoms? These things
move very slowly.
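For a rough feel of the scale, a back-of-envelope calculation. The
per-atom rate is pure assumption on my part, loosely suggested by the
IBM demonstration, which took hours for a few dozen atoms:

    # Back-of-envelope: serial atom placement, STM-style.
    atoms = 10**6            # "millions of atoms", at the low end
    seconds_per_atom = 60    # assumed: one atom per minute, no errors

    total_seconds = atoms * seconds_per_atom
    print(total_seconds / (3600.0 * 24 * 365))   # ~1.9 years, nonstop

And that is for a single flawless run; realistic error rates would
stretch it much further.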
Personally, I think the biotech or self-assembly paths are more
promising, although they get less attention from nanotech fans these
days. But whichever way you go there are a lot of research problems to
solve, and it really isn't possible to lay out a timeline with any
degree of accuracy right now.
> 9. Nanotechnology replaces economy.
> 10. Gray goo problem.
> 11. Intelligence amplification.
> 12. Runaway positive feedback of IA.
> 13. Replacement of human society: End of History.
> 14. Singularity provides Interim Meaning of Life.
> 15. Summary: History is about to end.
I'm not sure about this Meaning of Life stuff, though. By (some
people's) definition, we can't really understand what life after the
Singularity will be like. Maybe people will have more meaningful lives,
but maybe not! We really can't know.
Even where I disagree with the specifics of the timeline, I do think
this is a plausible "big picture" approach to human history, and it is
a good outline of issues worth thinking about.
Hal