From: Anders Sandberg (asa@nada.kth.se)
Date: Sun Sep 16 2001 - 17:47:51 MDT
Robert, this post was evil. Not evil in the chuckling, humorous
"we could make the potted plants of luddites start glowing in the
dark with pro-genetic slogans" sense of mischievousness, not evil
in the sense of making life harder or worse for other people, but
evil in the worst pro-entropy, hasten-the-heat-death-of-the-
universe sense. It would have been better if it had just been
stupid.
It is based on two big mistakes, one logical and one ethical. The
ethical one is the more serious: you assume that human lives can
have negative value, and perform an essentially utilitarian
calculation not of happiness but simply of utility towards a
certain goal. Humans not promoting this goal are a waste of
resources and a potential threat, so eliminate them. Hmm, that
sounds an awful lot like Skynet and other evil Hollywood AIs. But
why are you striving towards the singularity in the first place?
Clearly, it must be valuable in some way.
The core idea of transhumanism is human development, so that we
can extend our potential immensely and become something new. This
is based on the assumption that human life, in whatever form it
may take (including potential successor beings), is valuable and
worth something in itself. It must not be destroyed, because that
destroys the very thing transhumanism strives to preserve and
enhance. Even if some humans are not helpful in achieving
transhumanity, that does not mean their existence is worthless;
and even if they are an active hindrance to your (or anybody
else's) plans, destroying their lives is always wrong as long as
they do not initiate force against you. A transhumanism not based
on this core does not deserve the humanism part of the name. And
a transhumanism that takes the utilitarian approach of using
humans as means to an end rather than as ends in themselves
becomes the ideological twin of fascism, with the sole difference
that the singularity takes the place of the national community in
the ideological structure.
The logical mistake is to ignore the full consequences of your
idea and look only at the "desirable" first-order consequences.
What you miss is that if this form of "practical genocide" is
used, the likelihood of other forms of "practical genocide"
becomes far higher and harder to suppress ethically, and
resistance to the US or any other genocidal group is likely to
become *far* more violent. In short, you make the world far more
dangerous and more likely to use weapons of mass destruction, and
terrorists more desperate - while at the same time assuming
accelerating technology. Doesn't that sound like a recipe for
total disaster?
This post is going to haunt us all - it is in the archives, it has
been sent to hundreds of list participants. Be assured that in a
few years, when the current uproar has settled down, somebody is
going to drag it out and use it against extropianism in the media.
"So, Max More, is it true that *not all* extropians support the
plans for genocide you discussed on your internet list?" After
all, I would be surprised if there were no representatives of the
"luddites" on this list.
Frankly, I'm both scared by and disgusted with much of the
discussion on this list over the last week. While the emotional
distress caused by the disaster is quite excusable, some of the
behavior it has caused is not.
--
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y