From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Mon Sep 14 1998 - 18:37:36 MDT
Damien R. Sullivan wrote:
>
> Conclusion? Much of what we anticipate has already happened, at fast rates,
> and with the creation of sharp dichotomies. I already hear that no one person
> can fully understand a Boeing 747, or MS Excel. We can already produce
> superintelligences capable of producing things our minds aren't big enough to
> grasp. The consciousness isn't superhuman, but a human CEO makes decisions
> based on superhuman levels of prior processing, and with superhuman (in
> complexity, not just gross scale) consequences.
Doug Bailey wrote:
>
> Are we SIs? This question might seem silly but I'm serious. To answer this
> question we need a definition of intelligence.
We need to keep a clear distinction between Singularity::Horizon and
Singularity::Transcendence. (Which of these constitutes the "real"
Singularity is a matter of semantics.) Also, as many people have pointed out,
the Horizon isn't at all clear-cut, so the answer to whether we're currently
in the Horizon is usually "Yes" if you measure against the past and "No" if
you measure against the future.
Questions of the Horizon become much more clear-cut if you consider the
introduction of sub-Horizons which I call Speed Phases. One might view mortal
history as having five major Speed Phases: hunter-gatherer, agricultural,
printing-press, industrial, and collaborative filtering. Each is marked by a
characteristic doubling time: the hunter-gatherer phase has doubling times
measured in eons, agriculture in millennia, the printing press in centuries,
industry in a few decades, and collaborative filtering in a few years or
months. We are presently in the industrial phase, of course.
Corporations are recognizably and squarely in the Horizon, even though they
have some non-human capabilities. It might be argued, in fact, that the
quality of corporations determines the Speed Phase. And our present day
belongs to a different Speed Phase than that of an ancient Greek, but we are
still the same species.
An Internet where you can instantaneously find exactly the piece of
information you'd most want to read - an Internet with collaborative
filtering, Firefly tech - would make the world as incomprehensible to us as
our world would be to Newton, though not as incomprehensible as it would be
to Socrates. A world of
neurohackers, which belongs to the same Speed Phase as collaborative
filtering, would likewise be greatly altered.
Whether or not you call this a Singularity is a matter of taste. To keep
things clear, I call an advance in Speed Phase a Singularity::Horizon. The
total remaking of the world by superintelligences is a
Singularity::Transcendence.
The semantic question ("Are we SIs?") has now been resolved. From the
perspective of an earlier Speed Phase, this is Horizon. From the perspective
of a later Speed Phase, we're still plodding along. The question of how
corporations determine Speed Phases remains to be discussed, of course.
I happen to feel that "superintelligence" should only apply to
Singularity::Transcendence; corporations might count as enhancements, but they
aren't _super_ intelligence, which has a useful connotation of "something a
LOT greater than human".
But why are corporations kept separate from Singularity::Transcendence?
Instantaneous speed is probably the most fundamental characteristic of the
Transcendence, from our present-day viewpoint. We've come such a long way in
the past few hundred years that people who've never even heard of the
Singularity find it difficult to imagine humanity in 2050, much less 2500. If
intelligence becomes faster, all the change of the next ten thousand years
could be compressed into hours. Even if the superintelligence is no smarter
than an ordinary corporation operating at a thousand-to-one speedup, a year of
technological progress can still occur every few hours. You don't have to
believe in positive feedback for this to be an enormously wrenching change.
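To put a rough number on that: a year is about 8,766 hours, so a mind running
at a thousand-to-one speedup lives through a subjective year every 8766 /
1000, or roughly nine objective hours - hence a year of progress every few
hours, assuming progress tracks subjective time.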
I do believe in positive feedback, and I don't think there will be anything at
all left of our world - but that point is not why I'm posting; I'm posting to
try to clear up some terminology.
--
sentience@pobox.com         Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/sing_analysis.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.