From: Robert J. Bradbury (bradbury@aeiveos.com)
Date: Sun Sep 16 2001 - 18:32:12 MDT
Harvey wrote:
> Some recent postings seem to suggest that we should sacrifice innocents to
> speed up the development of technology in the name of The Extropian
> Principles.
Now Harvey, I didn't say that. What I *attempted* to do was set up
a reasonable extropic "corollary" in line with the Principles --
i.e. the ability to save the greatest number of lives in the shortest
amount of time is "extropic". (People should feel free to attack the
corollary *or* the means of achieving it.)
I also object to the use of the term "innocents", though I do not deny
that the use of nuclear weapons would of course kill children whom
one can reasonably consider "innocent". An unbiased observer
of the demonstrations in the streets of towns in Pakistan neighboring
Afghanistan would not label the adults involved in those affairs
"innocent".
As Samantha pointed out, it may be necessary to prioritize the
Principles so that sentients have self-determination privileges
that supersede the other goals.
Now that raises a *very* sticky issue as to what rights one assigns
to (a) irrational individuals; or (b) individuals who are rational but
operating from fundamentally misinformed belief systems. [Of course
one is free to question the degree to which all Americans or "Western"
individuals are operating from misinformed belief systems.]
Given that individuals exist who have been indoctrinated with
fundamentally unextropic belief systems -- who have the power or the
will to intervene in the extropic directions of free societies -- the
question is "How much are you willing to allow such individuals
to cost you in extropic productivity?"
I think this may be the fundamental question -- "What is the value
of a human life whose read-only-memory is based fundamentally on
a set of synthetic fabrications?" *Particularly* if those
fabrications are likely to represent a threat to individuals
who base their reality on a more rational examination of the facts?
How does one trade off "human compassion" against "rational meme
triumph"?
Hal wrote:
> I think the biggest problem is the idea that the end justifies the means.
> We have a goal of an Extropian future. But what we are sometimes
> forgetting is that how we get there is important, too.
Ok, Hal, can you present a reasonable argument that "how we get
there" is really important? One of the difficulties associated
with being human are feelings such as guilt or empathy. Those
feelings need not be propagated into an advanced society. Moreover,
we can simply "delete" "how we got there". If the end justifies
the means and the means were unpleasant, then one cen simply
remove them from our collective memory.
Alternatively, one can simply "adjust" one's perspective so that
the invocation of "unpleasant" methods to produce a result is seen
as highly valuable because it decreases the time and costs of
obtaining the desired results.
Here is the challenge --
How many lives are you willing to sacrifice to do it "cleanly"
(i.e. following a logical progression of technological evolution)
vs. doing it with blood on one's hands
(pointing to demonstrable numbers of individuals who have been
"wasted" because it is clear they are pre-programmed, or likely
to be pre-programmed, from an unextropic perspective)?
Robert