Transhumanity and "inhumanity"

From: GBurch1@aol.com
Date: Sat Oct 31 1998 - 10:47:11 MST


In a recent post to a private mail list group, one of my oldest friends and
intellectual mentors equated the concept of transhumanism (that I have
discussed with this group around many a campfire now) with the term "inhuman".
An exchange of letters has followed. This is my most recent volley in that
discourse:

Picking up the cudgels this morning, I see that in a message dated 98-10-30
11:05:21 EST, Frank writes:

> My dear human friend Greg,
> I want to be very specific in my challenge.

Well, to be precise, I believe it was I who challenged you, Frank, to
explicate and justify your equation of "transhuman" with "inhuman". This is
more than a mere debating point of order; it goes to who bears the
burden of persuasion in this matter. In response to that question, let me for
a moment put words into your mouth, Frank: You would respond (I would guess)
by saying that of course I do, since I propose a change to the status quo in
my concept of a transformation of man into, ultimately, post-man. I would
disagree (not surprisingly). I do not believe that this change marks some
"unnatural" departure from the normal order of the universe but, as I have
written before, instead is simply the natural next progression of the
evolutionary process that has given rise to what we are now.

I might see the question differently if I were sure that there will be an
INSTANTANEOUS transformation, a true "singularity" in the Vingean sense, but I
am not so sure that will happen. It is, I believe, a matter of perspective as
to time. For those who will tread the path of techno-transformation, there
will likely NOT be an instantaneous or near-instantaneous transformation. For
those who choose to not make radical changes to their nature, there may SEEM
to be a near-instantaneous change.

But I digress. I daresay that most of humanity WILL place the burden of
persuasion on the advocates of transformation, so let us move on.

> Even if we stipulate,
> temporarily for the sake of this discussion, that the singularity is
> possible,

I see that you do see the importance of the time scale, Frank. Let us imagine
a dialogue:

Greg: So, Frank, what if the process of transformation of a human into a post-
human takes place over, say, one thousand generations -- a period of perhaps
ten to twenty thousand years? Would your reservations regarding and
opposition to such changes be less in such a case?

Frank: Yes. With more time to adjust, we would . . .

Greg: Let me interrupt momentarily for the sake of argument . . .

Frank: Typical. So impatient.

Greg: How about one hundred generations -- a period of two or three thousand
years? Would your objections then be less than they are now, but more than if
we took longer?

Frank: Yes, I'd say that's fair.

Greg: So, Madam, we have established that you are a whore, and we are simply
haggling over the price, no?

Frank: Anything for a laugh.

But the point is important, I continue. I believe that you are reasonable
enough, Frank, that you will not take the extreme position that augmentation
-- even radical augmentation -- is in and of itself a bad thing, but
rather that you will maintain that "we" are simply not morally ready for such
enhanced power. And of course (as you make clear below), the question comes
down to who "we" are, who will make such choices.

Let me clarify: Neither I nor any other transhumanist proposes to force any
change on any person. Whether it be the growing number of "Euro-" (and,
interestingly, "Australio-") transhumanists with their decidedly more
communitarian tendencies, or the (ironically) more "traditional", libertarian,
extropian American band of original transhumanists, no serious thinker
proposes mandatory transformation of any particular person's "human nature".
(Setting aside -- for the time being only -- the question of genetically
heritable augmentation.) So, the issue is, are the people who will choose to
transform themselves more fundamentally than is already possible morally
capable of making the change in such a way that the world is a better, rather
than a worse place for everyone to live in?

> why is there the least reason to think that the moral philosophizing that
> you engage in will in any way prevent a transhuman future from resembling
> our human past, in which differences in wealth and power have, with dreary
> regularity, led to oppression?

Well, here we get to your fear, Frank: A class of oppressive posthuman
overlords. As you close your message:

> The gods are beautiful, powerful, magnificent--but they are not good.
> Theomachus.

Before I go on, let me say that while I am a great believer in tapping the
inherited wisdom of human culture, I am by no means sure that the epigram of a
man who lived in a culture that could not invent the offset crank should be a
guide to our behavior as we consider the wisdom of altering our genomes or
augmenting our minds with implanted computational prostheses.

But let me first cast a slightly different light on the point you make. We
are talking about power conveyed by technology. And I see that power conveyed
by technology has been a little different from power conveyed by other means
(such as power conveyed by coercively imposed social orders). Again, you say
that I must distinguish the future for which I work from

> our
> human past, in which differences in wealth and power have, with dreary
> regularity, led to oppression

Let me point to some examples of technologies that have, at least, been
neutral in terms of "oppression". As we sit here today -- in 1998 -- can we
say that the following technological developments have led ONLY to
oppression?

-- writing, the printing press, the telegraph, the telephone, radio,
television and the Internet;
-- the train, the automobile, the airplane, the rocket and the spaceplane;
-- fire;
-- eyeglasses, antibiotics and antiseptic surgery

In fact, some of these -- the Internet, eyeglasses, antibiotics -- seem to me
to have NEVER been used for oppressive purposes. Some, like fire, radio and
the airplane and automobile (when equipped with weapons), have been used for
both oppressive and liberating purposes.

Now, let us blend the light from these two different perspectives: "power" and
"technology". If, as you wish to do, Frank, we can only be guided by history,
let us ask what has distinguished the different KINDS of power that have grown
from the different types and uses of technology. The answer seems clear: The
less expensive a technology has been, and the more widely distributed its
effects -- both good and bad -- the less "oppressive" has been the power it
conveys.

An automobile vastly augments the individual human's power in a basic, animal
sense. With my car, I can move an order of magnitude faster than the speed to
which "nature" has limited me. But this power is readily available to
essentially every member of the society in which I live. Of course, one can
say that I am oppressed by the automobile, because it has transformed my
society into one where I am "coerced" into buying an automobile in order to
participate fully in the fruits offered by my country's social life. Or we
can say that we automobile owners "oppress" people who live in societies where
automobile ownership is less evenly distributed. It is, we can see, a matter
of perspective. Automobiles may seem oppressive to the Amish, or to a
Peruvian peasant. But they are a liberating technology -- in a very real
sense of personal power -- to both me and my yard man, even though I am, in a
measure of money only, an order of magnitude more wealthy than my yard man.

> What would prevent transhumans from regarding humanity as subrational
> creatures undeserving of any major protections or inhibitions of their own
> transhuman desires? Moral philosophy of the variety you espouse? The track
> record of rarefied, intellectual moral philosophy is pretty effete,
> unfortunately. Unless, of course, the philosophy is linked to the irrational,
> to religion, resentment, or communitarian values. I'm thinking of the
> abolition of the slave trade, of mass revolution, and of Zionism, nationalism,
> and other such things.

What a series of great leaps! With all due respect, Frank, this sounds like
the sort of grandiose and incautious rhetorical hyperbole for which you have
been seeking to indict me for the better part of these last three
decades. Let us take a deep breath and try to make some sense of what amounts
to a rush from a catalogue of historical discontinuities, upheavals and
excesses, through a stab at "effete moral philosophy" to the oppression of
humanity by, presumably, a handful of transhuman demi-gods.

And we can start with moral philosophy. Here, let us distinguish prescription
from description. I agree that admonition is a tool of extremely limited
power. Christian preachers have railed against various flavors of immorality
for two thousand years, yet an avowedly Christian president gets a hummer from
his emotionally unstable aide, something his religious ayatollahs explicitly
condemn.

I return to the point I make above: See how people actually make use of
technological power. It depends on how widespread and evenly distributed it
is. Prohibit it, and only the wealthy and/or daring will have it. Encourage
it -- or at least allow it to grow at its own pace -- and many MAY have it.
All, of course, depending on the price. If only a few can afford to or are
allowed to have some greatly augmented power, one may find oppression. Allow
people to adopt it at their own pace, as widely as they feel comfortable so
changing, and we have less potential for oppression. The more tightly you
restrict this power, the more certain you are to put it into the hands of only
a few -- and those few will be the ones who disdain your restrictions.

> So, you don't have to prove the singularity is possible.

Good.

> You will eventually have to show that it is desirable, but that is not the
> primary concern now.

It seems like THE point.

> What you have to demonstrate, at least according to my lights, is that the
> moral structure of the future is congruent with your hopes and dreams, at
> least to the extent that these arguments, and the implementation of
> institutional sanctions for them, will provide an effective counterweight to
> the transhuman manifestations of brutality, greed, and oppression that have
> been the patrimony of us who are not yet gods.

You point to "brutality, greed, and oppression"; I point to progress. The
world IS a better place now, in the aftermath of the Enlightenment, with its
inseparable admixture of science, technology and social openness (whether
political -- as in democratic institutions -- or economic -- as in the
liberation of the market of human enterprise).

You are suspicious of "effete moral philosophy". So am I. I am suspicious of
those who would proscribe and prescribe. I see the last 300 years as a period
when a loosening of control by those who think they know best has resulted in
a real improvement in the human condition. I see humanity redeemed by itself,
not by priests or nobles or vanguards of this or that segment of humanity. I
see people defeating oppression by consensual collective action without the
assistance of their "betters".

Let me assume that you, too, are an Enlightenment man, Frank. As such, answer
this question: Which is better: People empowered, or people held to some ideal
template?

But leaving aside such broad, rhetorical questions to which only one sane
answer is possible, let us return to the crux of your question:

> What would prevent transhumans from regarding humanity as subrational
> creatures undeserving of any major protections or inhibitions of their own
> transhuman desires?

I ask, what would motivate them to do such a thing? Are we inherently cruel
to our parents? Let's do a vocabulary lesson. As much as I normally hate
turning to dictionaries as sources of authority, let's see how Merriam-Webster
defines some of the terms upon which your original statement was based:

        human [2] (noun)
        First appeared circa 1533
         : a bipedal primate mammal (Homo sapiens) :
        MAN; broadly : any living or extinct member of
        the family (Hominidae) to which the primate belongs

Well, hopefully you will not be a chauvinist about this, Frank, and place some
moral value in the mere physical form of a homo sapiens. After all, is an
amputee any the less "human" because she lacks one of the two legs upon which
this definition is based? No. The "human" to which you refer is really, I
think:

        hu*mane (adjective)
        [Middle English humain]
        First appeared circa 1500
         1 : marked by compassion, sympathy,
           or consideration for humans or animals
         2 : characterized by or tending to broad
           humanistic culture : HUMANISTIC <~ studies>

It is probably the former of these that you really mean when you speak of
"humanity", I will guess. And then, there is the Frankenstein word:

        in*hu*man (adjective)
        [Middle English inhumayne, from Middle French & Latin;
        Middle French inhumain, from Latin inhumanus,
        from in- + humanus human]
        First appeared 15th Century
         1 a : lacking pity, kindness, or mercy : SAVAGE <an ~ tyrant>
           b : COLD, IMPERSONAL <his usual quiet,
             almost ~ courtesy --F. Tennyson Jesse>
           c : not worthy of or conforming to the needs
             of human beings <~ living conditions>
         2 : of or suggesting a nonhuman class of beings

The language here suggests that the concern you have is for compassion on the
one hand and a lack of it on the other.

What will characterize the post-humans I envisage? Two things: Greater mental
abilities and greater physical power. (I assume that the technicalities are
understood here, at least in their basic outlines.)

And here lies the real question: Is there any reason to assume that smarter
people are LESS compassionate than, for want of a better word, "dumber"
people? No. In fact, I believe that the evidence of our everyday experience
indicates just the opposite, as knowledge enhances empathy: The more people
learn of gorillas, the more likely they are to empathize with them. On
average, people who see pictures of Auschwitz are less likely to engage in
atrocities. I challenge you, Frank, to an accounting of this phenomenon. I
think you will run out of examples of knowledge and comprehension leading to
cruelty before I run out of contrary examples.

Is greater raw power congruent, to use your term, with less compassion? I
admit that, in the abstract, this is a more problematic question. We are
creatures of a mixed nature. We are primates, with a strong sense of social
hierarchy imposed by force. But we are also capable of something more, as all
of the world around you indicates: In societies characterized by openness and
respect for liberty, we see overwhelming evidence that we can overcome our
primitive nature. Do we become perfect egalitarians in such circumstances?
No. That is the impractical dream of effete moral philosophers. But do we
consensually submit to regimes of limitations on our personal power, so that
we can express the compassion that comes with greater knowledge? The history
of the last 300 years indicates that we do. The very open societies that make
it possible to create the science that leads to the technology of
transhumanity are themselves the incubators of compassion.

Does this conclusion sound Panglossian to you, Frank? No doubt it does. What
if Hitler had had these powers? What if Stalin? But for accidents of
history, you may argue, we would all be the permanent thralls of transhuman
Nazis. I do not think it is an accident of history that it is
the most open society ever known that is developing this technology. Closed
societies simply could not have created the OPEN INFORMATION STRUCTURE upon
which it is premised. It would have taken many thousands of years -- if ever
-- for the kind of "science" practiced by totalitarians to create what we have
created in three short generations since we contained (and now have largely
vanquished) those monsters. Long before Nazi scientists would have solved the
riddle of the human genome, that society would have crumbled into savagery.
Long before Stalinist mathematicians would have developed supercomputers,
communist totalitarianism DID crumble into savagery.

Here I return to the quip I made some time ago: "We are all Marxists now"
(after Nixon, who is reputed to have once said, "we are all Keynesians now").
In this I mean that we have all learned that social structure is inextricably
intertwined with technology in a symbiotic, feedback relationship. Particular
technologies give rise to particular social structures and vice versa. (It is
the "vice versa" in which I and other modern observers depart from Marx.)
Real science (as opposed to received recipes for crafts) is a FUNCTION of an
open society and open societies are a FUNCTION of science. By definition, the
science and technology (and the two cannot really be distinguished in this
time) of transhumanism will be the hardest problems of natural understanding
and control humanity will ever undertake -- for in mastering them, we will
transcend "humanity" in the sense of a physical description of Homo sapiens
and its limitations.

So here is the paradoxical nut of the problem, Frank: Only openness makes
techno-transcendence possible and the only tonic for oppression is openness.
If you allow us to come to the brink, and then throttle the process with
coercively imposed restrictions, you have prescribed perhaps the ONE formula
for creating the very monsters you fear. And we are on the brink, have no
doubt whatsoever.

         Greg Burch <GBurch1@aol.com>----<burchg@liddellsapp.com>
           Attorney ::: Director, Extropy Institute ::: Wilderness Guide
        http://users.aol.com/gburch1 -or- http://members.aol.com/gburch1
                   "Good ideas are not adopted automatically. They must
                      be driven into practice with courageous impatience."
                                    -- Admiral Hyman Rickover


