Re: Singularity: Individual, Borg, Death? was Re: the few, the proud...

From: Michael Lorrey (retroman@together.net)
Date: Wed Dec 02 1998 - 12:09:08 MST


Paul Hughes wrote:

> "Eliezer S. Yudkowsky" wrote:
>
> > I grew up knowing I was a genius and knowing that I could almost
> > certainly be part of something big if I played my cards right.
>
> I have no doubt of that. However, since I am probably not a genius (yet), I have up
> to this point been unable to argue with you on the finer points of your reasoning.
> After abundant contemplation, though, I'm beginning to notice what may be some
> logical inconsistencies in your position.
>
> For starters, if it came down to choosing between logic and survival, I'd choose my
> survival every time. I can think of at least three life-threatening situations
> where my gut instinct saved my life over all logical objections.

In split-second emergencies, the old fight-or-flight reflex works remarkably well.
Stopping to think tends to get you killed, but only because the logical left brain tends
to cross wires with the right brain, which is the typical seat of instinctive eye/body
coordination.

> > I am not acting on wishes. I do not at this time project that the Singularity
> > will result in the gratification of any of the desires that initially
> > motivated me, not pride, not the acknowledgement of greatness, not the fun,
> > and probably not even the knowledge of success. The personal drama that once
> > captivated me is irrelevant. I am acting on logic.
>
> Ok, so you're acting on logic. If I understand you correctly, all of the things that
> make life enjoyable (fun) are irrelevant, because logically our only outcomes are
> singularity or oblivion? Either we face the inevitable or accept extinction?

More like, accept that things are going to change. Being stubbornly against change is
only going to cost you.

>
>
> > Now, I happen to think that even from humanity's perspective, a rapid
> > Singularity is the best way to go, because I don't see a feasible alternative.
>
> Can you concisely explain why a non-rapid path to the Singularity is unfeasible?

Well, there are several possible excuses for this opinion.

One is that since the reactionary factions in society are all opposed to this change
and will tend to try to stamp it out in a luddite/fundamentalist backlash, you need a
minimum rate of growth to build an individual power base large enough to defend against
such attacks, within a short enough time period that you become too powerful for the
luddites before they even notice that you (a transhuman entity) exist. Personally, I
think that a soft singularity is the way to avoid such civil strife. Many people tend to
turn reactionary when the rate of change exceeds a certain multiple of the rate they
perceived in childhood, so, thinking rationally, a lower rate of growth will tend to
minimize interest in such reactionary opinions in the general population, thus
marginalizing their impact on society at large.

Another rationalization for this opinion is based on some faulty notions of Malthusian
scarcity. Since achieving a transhuman state will require x amount of resources, we need
to keep technology advancing fast enough that efficiency gains from technology outweigh
the dilution of the per capita resource base by population growth. People who still
think that population is growing too rapidly will tend to believe that a fast track to
the singularity is needed, otherwise the trend will stall out and civilization will fall
into a dark age. But recent studies of populations show that as populations become a)
more educated (especially the female population), b) more wealthy, and c) more healthy,
per capita child production tends to drop off, to the point that industrial nations now
have negative population growth when immigration is not counted. All population growth
worldwide is concentrated in the 3rd world, so the fewer nations that remain in 3rd
world development status, the lower the global population growth rate should be. This
sort of reasoning is supported by statistics such as energy costs: on a current-dollar
basis, energy of all kinds is cheaper today than at any time in human history, and
resource costs of all kinds have been deflating on a regular, annual basis for most of
this decade. What is good about this is that it makes achieving the singularity
affordable for a greater percentage of the population.
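
To put rough numbers on that efficiency-versus-population trade-off, here is a quick
back-of-the-envelope sketch in Python (the starting values and annual rates are made-up
illustrations, not real figures): the per capita resource base keeps growing only as
long as the efficiency gain rate stays above the population growth rate.

# Sketch of the Malthusian trade-off above: per-capita resources grow only
# while technology-driven efficiency gains outpace population growth.
# The starting values and rates are illustrative assumptions, not real data.

def per_capita_resources(years, resources=100.0, population=6.0,
                         efficiency_gain=0.02, pop_growth=0.015):
    """Yield (year, effective resources per capita) under compounding annual rates."""
    for year in range(years + 1):
        yield year, resources / population
        resources *= 1 + efficiency_gain   # technology stretches the resource base
        population *= 1 + pop_growth       # population growth dilutes it

if __name__ == "__main__":
    for year, rpc in per_capita_resources(50):
        if year % 10 == 0:
            print(f"year {year:2d}: {rpc:.2f} units per capita")

Run it and the per capita figure climbs slowly; swap the two rates and it shrinks
instead, which is the dark-age scenario the fast-track argument worries about.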

>
>
> > The only way that any of us
> > can "not die" is through a friendly Singularity. If that's impossible, well,
> > at least our deaths will be the ethically correct thing to do. Sooner or
> > later human civilization will perish or go through a Singularity. This I
> > guarantee. What can you possibly accomplish, for yourself or for anyone, by
> > delaying it?
>
> My life, for starters. Unlike you, Eliezer, I couldn't care less about the singularity
> if it means the end of my existence. What I do care about is the continued
> existence of my memetic conscious self and others in my memesphere (which includes
> even you Eliezer). Now if that means that I must embrace the Singularity or face
> death, then you and I are in agreement. I'm very willing to embrace logical
> outcomes, but only under the condition of my continued existence. If somehow
> 'logic' prevented me from surviving a situation, then 'logic' would have to go.
> Call me selfish, call me egocentric; but I'm not about to put my faith in an
> unknowable singularity (god) over my own self-directed consciousness. I would
> rather lead a long and futile quest for perfection as an individual than join a
> potentially more satisfactory collective borganism.
>
> I'll agree that if a non-rapid path to a friendly singularity is not possible, then
> the logical thing for anyone who cares about their continued existence is to embrace
> a rapid Singularity.
>
> > But that could all be rationalization. It's not the reasoning I use. I wish
> > to do the right thing, which is a question that is resolved by intelligence,
> > and thus I am required to create a higher intelligence to accept orders from.
>
> Can you accept the possibility that this higher intelligence could be internally
> generated rather than externally dictated? Assuming it's possible, which would you
> rather have?

I would obviously want to be an active participant in a transhuman existence myself,
not merely play a parental role to it.

>
>
> > My allegiance is to the Singularity first, humanity second, and I'm tired of
> > pretending otherwise to myself.
>
> My allegiance is to myself first, then that part of humanity I care about the most,
> then the Singularity. Like you, Eliezer, I'm seduced by what the Singularity portends
> and want to achieve and immerse myself in higher intelligence, but only as a
> transformation of my ego rather than a deletion of it.

This is the sort of sentiment that will be a limiting factor on a fast, hard
singularity. The more people feel that they have no personal stake in the singularity,
the less likely they are to support it.

Mike Lorrey


