Singularity: Individual, Borg, Death? was Re: the few, the proud...

From: Paul Hughes (planetp@aci.net)
Date: Wed Dec 02 1998 - 02:08:16 MST


"Eliezer S. Yudkowsky" wrote:

> I grew up knowing I was a genius and knowing that I could almost
> certainly be part of something big if I played my cards right.

I have no doubt of that. However, since I am probably not a genius (yet), I have up
to this point been unable to argue with you on the finer points of your reasoning.
Still, after abundant contemplation, I'm beginning to notice what may be some
logical inconsistencies in your position.

For starters, if it came down to a choice between logic and survival, I'd choose my
survival every time. I can think of at least three life-threatening situations
where my gut instinct saved my life over every logical objection.

> I am not acting on wishes. I do not at this time project that the Singularity
> will result in the gratification of any of the desires that initially
> motivated me, not pride, not the acknowledgement of greatness, not the fun,
> and probably not even the knowledge of success. The personal drama that once
> captivated me is irrelevant. I am acting on logic.

Ok, so you're acting on logic. If I understand you correctly, all of the things that
make life enjoyable (fun) are irrelevant because logically our only outcomes are
Singularity or oblivion? Either we face the inevitable or accept extinction?

> Now, I happen to think that even from humanity's perspective, a rapid
> Singularity is the best way to go, because I don't see a feasible alternative.

Can you concisely explain why a non-rapid path to the Singularity is unfeasible?

> The only way that any of us
> can "not die" is through a friendly Singularity. If that's impossible, well,
> at least our deaths will be the ethically correct thing to do. Sooner or
> later human civilization will perish or go through a Singularity. This I
> guarantee. What can you possibly accomplish, for yourself or for anyone, by
> delaying it?

My life, for starters. Unlike you, Eliezer, I couldn't care less about the Singularity
if it means the end of my existence. What I do care about is the continued
existence of my memetic conscious self and others in my memesphere (which includes
even you, Eliezer). Now, if that means I must embrace the Singularity or face
death, then you and I are in agreement. I'm very willing to embrace logical
outcomes, but only under the condition of my continued existence. If somehow
'logic' prevented me from surviving a situation, then 'logic' would have to go.
Call me selfish, call me egocentric; but I'm not about to put my faith in an
unknowable Singularity (god) over my own self-directed consciousness. I would
rather lead a long and futile quest for perfection as an individual than join a
potentially more satisfactory collective borganism.

I'll agree that if a non-rapid path to a friendly Singularity is not possible, then
the logical thing for anyone who cares about their continued existence is to embrace
a rapid Singularity.

> But that could all be rationalization. It's not the reasoning I use. I wish
> to do the right thing, which is a question that is resolved by intelligence,
> and thus I am required to create a higher intelligence to accept orders from.

Can you accept the possibility that this higher intelligence could be internally
generated rather than externally dictated? Assuming it's possible, which would you
rather have?

> My allegiance is to the Singularity first, humanity second, and I'm tired of
> pretending otherwise to myself.

My allegiance is to myself first, then to that part of humanity I care about the
most, then to the Singularity. Like you, Eliezer, I'm seduced by what the
Singularity portends, and I want to achieve and immerse myself in higher
intelligence, but only as a transformation of my ego rather than a deletion of it.

Paul Hughes

http://www.aci.net/planetp


