From: Robert J. Bradbury (bradbury@aeiveos.com)
Date: Sat Sep 14 2002 - 16:51:34 MDT
On Sat, 14 Sep 2002 dalec@socrates.Berkeley.EDU wrote:
Aha.... Newbies for lunch... Come into my parlor said
the spider to the fly...
[Gratuitous google links:
http://www.ongoing-tales.com/SERIALS/oldtime/POETRY/spiderfly.html
or
http://fairytales4u.com/fable/poem03.htm
]
> I don't understand this very common usage of the term "singularity".
I think most extropians use it in the Vingean sense:
http://www.aeiveos.com/~bradbury/Authors/Evolution/Vinge-V/TCTSHtSitPHE.html
> I thought the idea of singularity was introduced to express the sense that
> technological development was accelerating so much that there is an
> impending horizon beyond which we cannot confidently predict what it will
> mean to live in the world.
This seems accurate to me. It implies a time when knowledge quantity and
creation rate are sufficiently great that humans without significant
enhancements are essentially irrelevant.
Of particular concern is Robin's "If Uploads Come First"...
If Uploads Come First: The Crack of a Future Dawn. Extropy 6(2):10-15 1994.
http://hanson.gmu.edu/uploads.html
>"Singularity" expresses an idea similar to Clarke's chestnut that
> any sufficiently advanced technology is indistinguishable from magic.
No. It goes a bit further than "magic". I can understand "magic" if I
educate myself sufficiently and have it explained to me. The Singularity
implies that you can never catch up. (Actually, the Singularity is a
semi-false god, since it is inherently limited by many things -- local
matter and energy resources, speed-of-light delays, the feasibility
of sub-atomic engineering -- but the immediate consequences (~1000 years
probably) are so far beyond where we are that one has gone past "magic"
and is deeply immersed in "fantasy".)
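To put one of those limits in perspective, here is a back-of-the-envelope
sketch (Python, using standard astronomical distances) of the one-way
speed-of-light delays any distributed intelligence is stuck with:

    # One-way light delay over various distances -- a hard lower bound
    # on communication latency for any post-singularity intelligence.
    C = 2.998e8  # speed of light, m/s

    distances_m = {
        "across Earth (diameter)": 1.27e7,
        "Earth to Moon":           3.84e8,
        "Earth to Mars (closest)": 5.5e10,
        "Earth to Sun":            1.50e11,
    }

    for label, meters in distances_m.items():
        print("%-25s %8.2f s" % (label, meters / C))

Milliseconds across the planet, seconds to the Moon, minutes to Mars --
no amount of clever engineering gets around that.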
> What I don't get is how "singularity" goes from a word expressing our
> perplexity about the future, to a word that expresses our confidence about
> that future.
Moore's Law?
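To spell that out: a fixed doubling time makes the future *calculable*,
which is where the confidence comes from. A minimal sketch, assuming the
usual rule-of-thumb 18-month doubling period (an assumption, not a
guarantee):

    # Moore's Law as a toy model: capacity doubles every fixed period.
    def capacity(years_out, doubling_years=1.5):
        """Relative capacity after years_out years of steady doubling."""
        return 2.0 ** (years_out / doubling_years)

    for y in (5, 10, 20, 30):
        print("%2d years out: %12.0fx today's capacity" % (y, capacity(y)))

Thirty years of steady doubling is a factor of about a million -- hence
the (perhaps overconfident) extrapolations.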
> People talk about pre and post singularity societies with the
> confidence of lab techs stimulating their squids. Why are you so sure a
> so-called "post-singularity" society would necessarily employ the same
> typology you do?
I doubt that it will. We currently live in a survival-dominated
paradigm. Post-singularity, that will not be the case: "normal"
humans will be mere insects compared to the demi-gods
who will be debating the proper allocation of the available resources.
Will they care to preserve mere humans? That is an open question.
And it raises the further question of where one sits on the sentience scale.
I will put spiders outside when it is easy to do so. When it is not,
I squash them. (Spiders are normally my "friends" since they are
consuming the other bugs -- but I have pre-programmed genetic limits
for tolerance.)
> What is the threshold event you imagine constitutes the
> singularity that is universally passed through in this way? Is it
> Drextech? Widespread uploading? Warp technology? What?
Drextech per se doesn't induce the singularity -- though it certainly
pushes things in that direction. We might very well get the singularity
with molecular electronics, without any Drextech as most envision it.
There isn't a "threshold event". Personally, I think the singularity
will be here when an increasingly significant fraction of the workforce
needs to be "enhanced" to compete. (I'll note that most
express mail delivery persons already seem to be "enhanced".)
> Why not say agriculture made us a post-singularity society?
More people survived -- but people still died.
> Bruce Sterling seems to suggest that the singularity is reached when
> longevity medicine succeeds in increasing life expectancy one year per year.
This is not a bad benchmark -- it marks the transition from living
to survive to living to evolve.
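The arithmetic behind that benchmark is worth spelling out: once medicine
adds at least one year of expected life per calendar year, remaining life
expectancy stops shrinking no matter how long you wait. A toy model (the
40-year starting figure and the gain schedule are invented purely for
illustration):

    # Remaining life expectancy for someone aging one year per calendar
    # year while medicine adds 'gain' years of expectancy per year.
    # gain >= 1.0 means remaining expectancy never decreases.
    def remaining(years, start=40.0, gain=1.0):
        r = start
        for _ in range(years):
            r += gain - 1.0   # one year spent, 'gain' years added back
        return r

    for g in (0.0, 0.5, 1.0, 1.2):
        print("gain %.1f yr/yr -> remaining after 30 yrs: %5.1f" %
              (g, remaining(30, gain=g)))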
> Vinge has it happening once computers outpace human brains.
They can't simply exceed our computational capacity (probably a petaflop);
they have to use it usefully. But slowly, bit by bit, the application of
our increasing computational capacity to "intelligent" uses *is* taking
place.
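To put the petaflop figure on a calendar: the fastest machine today
(NEC's Earth Simulator) runs at roughly 36 teraflops. If the 18-month
Moore's Law doubling holds -- an assumption, not a law of nature -- the
raw capacity is only a handful of doublings away:

    import math

    # When does raw hardware reach the ~1 petaflop guessed above for
    # the human brain?  Baseline: ~36 TFLOPS (Earth Simulator, 2002).
    base_flops = 36e12
    target = 1e15
    doubling_years = 1.5

    doublings = math.log(target / base_flops, 2)
    print("doublings needed: %.1f" % doublings)
    print("reached around:   %.0f" % (2002 + doublings * doubling_years))

That says circa 2009 for the raw capacity -- which, per the point above,
is necessary but nowhere near sufficient.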
> As a registration of the necessity for modesty and special care when we
> talk about the likelihood of unintended consequences of technology in
coming years, it seems fine to me, though this uncertainty, strictly
> speaking, seems to me already contained in the idea of "future" anyways.
Not so fast, friend Sancho (see other recent posts by me...). I think
there are key differences between the "future" humans have faced over the
previous several thousand years and the future they will face over
the next hundred. For the most part, we and our environment were
evolving at equivalent rates -- you do one thing better, I do one
thing better, and so on. Where the Singularity takes hold is in changing
not only the rate of evolution but the rate of the rate of evolution.
The game humans have been playing for the last several thousand --
or hundred thousand, or million -- years is about
to get *very* much more complex.
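The difference is easy to see numerically. In the sketch below the first
series doubles on a fixed schedule; the second shrinks its own doubling
time each step (the 0.8 shrink factor is invented for illustration) --
that is what changing the rate of the rate looks like:

    # Exponential: fixed doubling time (the rate of change grows).
    # Super-exponential: the doubling time itself shrinks each step
    # (the *rate of the rate* of change grows).
    def exponential(steps, x=1.0):
        out = []
        for _ in range(steps):
            out.append(round(x))
            x *= 2.0                     # constant doubling
        return out

    def super_exponential(steps, x=1.0, shrink=0.8):
        out, period = [], 1.0
        for _ in range(steps):
            out.append(round(x))
            x *= 2.0 ** (1.0 / period)   # more doublings per step...
            period *= shrink             # ...as the period shrinks
        return out

    print(exponential(8))          # 1, 2, 4, 8, ... doubling at a fixed rate
    print(super_exponential(8))    # 1, 2, 5, 14, 54, ... doubling ever faster

Humans are tuned for the first series; the Singularity is the second.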
Are *you* prepared for it?
> ... and to the extent that it seems a way to smuggle into scientific and
> secular outlooks the old idea of an Apocalyptic Event we can look forward to,
> to answer all kinds of deeply unfulfilled psychic longings and what have you,
> well, it looks frankly damaging, and to be an idea worth jettisoning.
I'll not disagree with the fact that the Singularity has scientific
and secular foundations. I *don't* think it will be an Apocalyptic Event
(not in the sense that 9/11 or Krakatoa were). It seems likely that it
will occur over time (years or months) and there will be people watching it,
pointing it out, saying "look -- see this -- will you please pay attention".
But hopefully the technology that enables the singularity also enables
our evolution within its vector. So the question becomes -- "Do you evolve
or do you become irrelevant?"
Robert