Re: Freedom or death? (Was: Re [2]: Extropy in the personal sphere)

From: Eric Watt Forste (arkuat@pobox.com)
Date: Fri Aug 08 1997 - 14:52:12 MDT


den Otter writes:
> Yes, but these people were/are hardly Extropians, were/are they,
> and their "ideals" were some of the less "humane" ones in history.

Dude, I do not surrender my self-ownership to others,
*regardless* of whether they happen to share my "ideals". Now
I'm cringing, thinking about all the media-damage you are
unthinkingly doing with this bull-in-a-china-shop style of
posting. Don't you see that implicit in your remark is the idea
that a world dictatorship would be okay if only an extropian
were in charge? That's self-contradictory thinking, as far as
I'm concerned. If you're into global dictatorship, you're no
extropian, IMHO. End of story.

> If, for example, Hitler would have forced 6 million Jews to go into
> cryonic suspension after their *natural* death, could you still
> call him *morally* a criminal?

Yes, he would still be a moral criminal for violating their
self-ownership.

> Is saving a person that's drowning
> [in a sea of ignorance and denial] a bad thing?

Saving a suicide against eir will may not be a flat-out bad thing,
but I submit that it is, at the very least, morally ambiguous. If
objective judgements of sanity and insanity were possible in every
case (they are not), then it might be a good thing to rescue insane
suicidal people, but it would be a bad thing to rescue sane suicidal
people. I do believe that there are certain people (irreparably
psychologically damaged people, for instance, who present a grave
and uncontrollable danger to others) for whom the sane thing to do
might properly be suicide. But the real point here is that a
conscious mental life is owned by that mind itself, not another.
And where I come from, self-ownership necessarily includes the
power to destroy oneself. Sometimes one's long-term strategic goals
can include short-term tactical weirdness. Whether you understand
this or not is beside the point: that some people have understandings
of the world that are currently inaccessible to others is exactly
why the principle of self-ownership is so vitally important.

> Doesn't superior knowledge almost automatically put one in charge
> [or do you have to "play stupid" to save the others' egos instead
> of their lives]?

Sure, but people claiming superior knowledge coming in with arms
(the means of *proving* their superior knowledge to me) to save me
from myself are unlikely to meet with much cooperation from me.
Unless you consider a hail of flaming lead "cooperative".

You are talking war-talk, den Otter, whether you realize it or
not. Those who talk war-talk get war-responses. I don't care for
real physical war myself (I prefer argumentation).

> they will surely perish. Now this may be fine in the case of the
> anonymous masses, but not with friends and relatives. If they
> can't/won't understand the life-bringing logic behind the transhuman
> concept, they must be deemed mentally disturbed, and [thus]
> incompetent to decide for themselves (ok, ok this sounds so arrogant
> it will knock yer socks off, but that's the way it is).

So let me get this straight: you're planning to hold a gun to
your Uncle Ernie's head and *force* him to upload, whether he
likes it or not? To save him from his own bad memes? I think you
might want to inspect your nervous system and make sure you
don't have any weird viral junk lurking there yourself before
you go holding guns to people's heads.

Seriously: how clean is your system? Have you done a deep
neural-scan lately? Are you willing to claim what few people are
willing to claim: that you have examined all the information
lurking in your soul (you know, that big chordate nervous
assemblage) and all the crufty legacy code all hooked together
in there, weird hormonal things left over from therapsids and
fish and monkeys, and you know that It's All Good? Because
that's the presumption that you're pinning your whole argument
on.

Don't *make* me get Lovecraftian on your ass, now. ;)

> The only reason there is *not* to cryonically suspend [or otherwise
> save] a reluctant person, is when you *don't really care*. Period.

I'm sorry, but I see no conflict between deeply caring about
someone and deeply respecting eir self-ownership and eir ability
to control eir own life. In fact, demonstrated ability to take
responsibility for one's own life is one of the things that
elicits caring from me when I recognize it in other people.
Inability to respect the self-ownership of other human beings,
such as you are demonstrating, is one of the few things that
tends to elicit genuine enmity in me. I'm not there yet; I just
thought I'd let you know that you are treading on ground that I
take very seriously.

> If you think that passively killing people you like/love is better
> than to "limit" their "freedom" [of decay] for awhile when they're
> dead anyway, so they too have the chance to live a full life in a
> marvellous future, you should really check your values (IMHO).

You are talking as if you have an ownership interest in the lives
of other human beings. You may have an interest, but it's not an
ownership interest.

> Most things, [including the limitation of people's freedom] are
> neutral in themselves, the context they're [used] in decides whether
> they are "good" or "bad".

You're right, den Otter. I'd better assemble a tactical
squadron, find out where you live, kidnap you, and set up a
brainwashing operation to save you from your own bad memes.

--
Eric Watt Forste ++ arkuat@pobox.com ++ expectation foils perception -pcd

