Re: origin of ideas, civilization, reading list

From: Mark Walker (tap@cgocable.net)
Date: Mon Aug 06 2001 - 06:31:48 MDT


----- Original Message -----
From: Dan Clemmensen <dgc@cox.rr.com>

> > Transhumanism is a minor variant on an ancient theme. Plato and
> > Aristotle said that (1) the telos of humanity's best (i.e., philosophers)
> > is to become godlike, (2) that we ought to become godlike, and (3) that
> > dialectical reasoning is the means for philosophers to realize their
> > telos. Unlike Plato and Aristotle (Hegel, etc.), we do not believe that
> > there is a little divine element in us that needs to be nurtured. Darwin
> > killed that idea forever. So, transhumanists substitute technology (most
> > notably genetic engineering and AI) for (3). What separates us from
> > Plato et al is a minor quibble about the means to become what we ought
> > to be.
> >
>
> Sorry, but I respectfully and completely disagree. Our current concept
> of the singularity is grounded in science, not philosophy. It is a
> simple forward extrapolation of well-understood phenomena.
>
Since we disagree completely (but respectfully), let us take this in stages.
Let's work on the concept of transhumanism first, and then worry about the
connection (if any) with a singularity. The understanding of transhumanism I
offered above has a teleological and an ethical component. Perhaps we
disagree on this. Your post emphasizes the predictive nature of science.
Perhaps your understanding of transhumanism is similar to Robin Hanson's:

"Transhumanism is the idea that new technologies are likely to change the
world so much in the next century or two that our descendants will in many
ways no longer be "human." "

(This quote is from Anders' site.) This definition seems more in line with
your comments than the one I offered. For myself, I am not sure how
enlightening it is to concentrate simply on the likelihood of various future
histories. Suppose some Joyians (followers of Bill Joy) think it very likely
that our descendants will no longer be human, but see this as a terrible
catastrophe. Even though they understand that there is an extremely low
probability of stopping these changes, they nevertheless struggle valiantly
to stop the march of technology.
On the other hand, suppose there are some who think that we ought to use
technology to perfect ourselves even though they think there is a low
probability that we will do so. (Perhaps because they think that the
disparate elements of society that are afraid of such changes will galvanize
in opposition, making progress impossible.) Nevertheless, these heroic souls
bravely struggle for "the cause". It seems to me that the latter are the
transhumanists, not the former, even though they make rather bleak
predictions. In other words, it is the ethical imperative that seems to me
more central to the concept of transhumanism, as opposed to any predictions
(short of certainty) about the likelihood of various future histories.

Mark
