Re: Heresy

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Aug 09 2001 - 12:49:35 MDT


"natashavita@earthlink.net" wrote:
>
> From: Eliezer S. Yudkowsky sentience@pobox.com
>
> >I should also note that my basic viewpoint is that we should be as
> conservative as possible in applying the loaded and powerful word
> "transhuman".<
>
> And you should be very careful in trying to reinvent terms to suit your particular needs and espousing them as a given.

Sure. Obviously you have a larger creator's stake in the term
"transhuman" than I do. I've had the same problems with keeping a lid on
"Friendly AI", plus (although I didn't invent either) "superintelligence"
and "Singularity". Even so, there are some pressures sufficient even unto
shifting the meaning of a term. I think that there are powerful reasons
to fear overuse of the term "transhuman", so I'm arguing that if the
definition applies to contact lenses, then the definition ought to be
changed. I'm sorry if this isn't clear from the context; I tend to assume
that everyone already knows who Natasha is...

To be specific, the reason I fear applying the term "transhuman" to our
immediate world is that there is an extremely important and fundamental
distinction between claiming to be transhuman and seeking to become
transhuman. Even if you extend the definition to apply to contact lenses,
so that, logically, claiming "I am transhuman" becomes a weaker claim that
can be satisfied by wearing contact lenses, the claim will still tend to
be misinterpreted by most hearers as implying a level of transhumanity
with a moral dimension.  And that turns the whole endeavor into a tawdry
"my group is better than your group" argument, rather than a quest for
something we want but acknowledge we do not yet have.

I think that "transhuman" has inherent emotional impact regardless of the
formal definition, and I don't think the emotional impact of "transhuman"
can be safely weakened enough that "I am transhuman" will be interpreted
as "I wear contact lenses" rather than "I am your evolutionary superior,
foolish mortal".

-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
