Re: Singularity: Just Say No!

From: Anders Sandberg (asa@nada.kth.se)
Date: Sat Jan 16 1999 - 10:26:35 MST


"Chris Wolcomb" <jini@hotbot.com> writes:

> People knowledgeable of the Singularity seem to be aligned within a
>quadrant of four extremes.
>
> 1) In one corner, we have those who believe the singularity is
> inevitable and want it to come as soon as possible.
>
> 2) Those who believe the singularity is inevitable but want it to
>come as late as possible.
>
> 3) Those who believe the singularity
>is not inevitable but want it to come as soon as possible.
>
> 4) Those who believe the singularity is not inevitable, and do not
>want it to come at all.

So what about us who don't believe in the inevitability of the
Singularity and don't have any view on whether we want it or not?

("Welcome to Uncle Eli's Shop of Technological Determinism, do you
want a Singularity Light, Medium or Goo for your history?"
"Err... don't you have anything else?" "No sir, all histories here at
the Shop come with Singularities!" :-) )

> As extropians, we live our lives within two penetrating and
>sometimes contradictory precepts. On one hand we have a strong desire
>to create and shape the future our way by any means necessary, yet we
>see those strong desires tempered by a critical understanding of the
>laws of nature. Obviously we'll continue to search for ways to break
>those laws, but in the meantime we're stuck with them.
>
> When it comes to the Singularity, the jury is still out as to
>whether it's a good, desired, or inevitable thing.
>
> As an extropian, I want to shape my future around my desires. Any
>force which attempts to curb my desires will be met with the
>strongest resistance possible.
>
> To begin, I see the Singularity as a direct assault on my freedom
>and desires as I define them.

OK, I see your point and I partially agree with it. The problem isn't
the Singularity itself, it is the *concept* of it that is often
proposed here on the list. The Singularity idea that many people seem
to throw around is surprisingly defeatist and collectivist.

Remember Vinge's original idea? That intelligence amplification
technology accelerates the rate of progress in the whole of society to
unimaginable levels. Note that this is just a historical development,
just like we can see that energy use has changed as technology and
the economy have developed. It is not a personal force; it is an abstract
process like the "industrial revolution". Vinge was aware of the
possibility that one or a few individuals could get ahead in this
process (cf. "True Names") but overall his view appears to be closer
to the "Swell" model of Singularity rather than the "Spike" model
(sorry Damien, but the term has this meaning in this context) where a
few take over. This is also a more likely outcome given the laws of
economics, what we understand of the development of technology and so
on.

But where the Singularity idea gets unhealthy is when millennialist
memes (something BIG is going to happen in our lifetime!) combine
with technological determinism (this *will* occur) and transhumanism
(it is going to be called a Singularity and involve
nano/AI/uploads). The problem is that this combination promotes an
uncritical approach: any counterargument is shrugged off with the
claim that the Singularity, or the entities involved, will be so
smart that nothing we can figure out now will be applicable, and a
complex *process* gets turned into an indivisible *thing*. Add to
this the lack of expertise in the "soft sciences" that often becomes
apparent in our discussions, and the whole mess becomes messier.

As you point out, this leads to many important transhumanist and
extropian ideas getting thrown overboard in the rapture of the
future. We want to control our lives, we want to become more. For
this we need a clear view of possible futures, but we shouldn't say
that because one possibility appears larger than the others it is
inevitable; we might actually change the future just by doing such
analyses and working against the trends leading towards them.

> Now, what if I were to start a movement to stop the singularity from
> occurring? I think I'll call this new political movement - the
> Anti-Singularity Party. Some might say I'm going down the path of
> curbing people's freedom - like the freedom to build seed AIs. Yet, how
> is this any different from *them* creating a singularity and forcing
> me to either be assimilated or weeded out through cybernetic
> natural selection? I see no difference, and I challenge you to show
> me otherwise. I say, fight tyranny with tyranny!

This is of course where you stray from what I would consider
extropian thought, or at least the fairly libertarian views expressed
in the principles. Initiating force to prevent others from initiating
force is a notoriously tricky subject. But there are better ways of
preventing the big bad singularity from borganizing you, like turning
yourself into a personal mini-singularity a la an extrosattva, able
to resist big singularities. Since you would certainly not be alone
in trying to achieve this, it might be possible to make big
borganizations uneconomical (again, this depends *a lot* on the
scaling properties of intelligence, something none of us yet has any
clue about. This is worth a good analysis).

The real tyranny to defeat is the tyranny of a meme gone bad, the big
'S' Singularity that seems to have become defined as some kind of
omnipotent borganism lurking beyond 2020. We need better analyses of
the future, not blinkered by "here be monsters" or wishful thinking.

-- 
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y


This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:02:51 MST