From: Samantha Atkins (samantha@objectent.com)
Date: Wed Jan 17 2001 - 01:36:37 MST
Jim Fehlinger wrote:
>
> Charlie Stross wrote:
> >
> > I'm just saying that new technologies have side-effects, sometimes disastrous
> > ones, and insisting that their deployment is *always* beneficial isn't going
> > to fool anyone (and is going to make us look like idiots).
>
> Also, effects that some folks on this list can contemplate with
> equanimity are events that would horrify many people outside of
> extreme sci-fi/technophilic circles. For example, Eliezer Yudkowsky
> has said on many occasions that, as long as the human race survives
> long enough to give birth to some sort of superintelligence, the
> ultimate fate of humanity is of no consequence (to him or,
> presumably, in the ultimate scheme of things). I suspect that this
> attitude is part of what gives folks like Bill Joy the willies.
Actually, Bill Joy gets the willies considering that some of this tech
in the hands of us mortals is at least as likely to get us all killed
as it is to lead to a transcendent species.
If I am going to leave this mortal coil, I would rather leave it to
transhumans (better yet, become one) than to gray goo.
>
> When I heard Ray Kurzweil speak at the PC Expo last summer, he
> showed transparencies of various graphs he had prepared of
> historical data, which he claimed showed that progress on various
> fronts is exponential. One of these graphs was of economic data, in
> which (as Kurzweil pointed out) the Great Depression was a visible
> glitch, but one which was more than made up for by the surge in
> economic growth which took place after its end. It crossed my mind
> then that, if the evolution of post-human intelligence involves the
> extermination of most or all of the human race (as in the scenarios
> of Hugo de Garis or Kevin Warwick), a retrospective Kurzweil graph
> of the event might still show it as a barely-visible glitch in the
> exponential curve -- if the Singularitarians are right, the sheer
> computational capacity of the entities swarming through the solar
> system a few years after the extinction of humanity might be such
> that to them, the loss of several billion human brains' worth of
> processing capacity might be no more than the ongoing quota of
> traffic fatalities that the human race (or the industrialized part
> of it) is willing to bear as the price of having cars. Or maybe
> even less -- no more than the unnoticed and unmourned loss of a few
> cells from an individual human being.
It is possible, but I rather doubt it. For one thing, there is no
simple way to get to a place where non-humans (not trans-humans) are
superior to us in all ways and no longer need humans at all. For
another, I would bet that diversity has some value, especially when
diverse groups are not competing as directly for precisely the same
resources.
My own grounding is in expanding the capacities of humankind endlessly
(including changing what is and is not thought of as human), not in
wiping humanity out as a useless bridge to something better. But I
cannot logically preclude the possibility that some sentients will not
be inclined to tolerate the existence of "mere humans" indefinitely.
Humans who do not wish to transcend their "natural" state might be
especially problematic if they also wish to stop others from seeking
transcendence, or if they insist on being singularly miserable
mischief-makers upon merely learning that such beings exist. At some
point their rantings will simply and safely be ignored, although they
will certainly not like this outcome and will feel terribly, even
murderously, aggrieved. Sometimes I think the nearest you could come
to keeping the peace and making everyone reasonably happy would be to
upload them into a VR meeting their specifications until they decide
they want to try something different. <sigh>
- samantha
>
> From an individual's perspective, the Great Depression was a period
> of almost unimaginable suffering, as de Garis' Cosmist-vs-Terran war
> would be (either between pro- and anti-AI humans, or between humans
> and AIs). Many mainstream people would say that anybody who can
> contemplate such an event with detachment must be a bit of a
> monster. Be that as it may, it may prove to be an unbridgeable gulf
> between Singularitarians and the rest of humanity (even
> technologically sophisticated folks like Bill Joy), if the former
> are seen as taking a cold-blooded, "che sera, sera" attitude toward
> the possibility of the extinction of the human race. I think the
> motives of enthusiastic Singularitarians are always going to be
> mistrusted by the mainstream, and Extropians and Singularitarians
> are likely to continue to be portrayed by journalists and authors as
> they are, for instance, in Ken Macleod's _The Cassini Division_.
>
> Jim F.