From: Richard Steven Hack (richardhack@pcmagic.net)
Date: Fri Mar 08 2002 - 23:13:18 MST
At 06:48 PM 3/8/02 +0100, you wrote:
>And why do we have these scare debates? One large reason I found in the work
>of studying the gene debate was simply that many of the people developing
>the technology and industry never tried talking with the worried people (at
>least not on their level; hearing something in Higher Academese and then
>being told that this is the reason genetic engineering is safe and ethical
>is not communication and does not promote trust). When ethical or
>value-loaded arguments appeared, scientists and businesspeople either
>disregarded them (since they were outside the framework they could talk
>about as scientific experts) or viewed them as irrelevant; this was soon
>seized by some people who found that they could win debates by invoking
>"ethics" without having to do any ethical thinking.
>
>I think exactly the same thing holds true for transhumanism. As long as we
>do not deign to speak with people disagreeing with us or unaligned people,
>we will lose.
I think that is a very pessimistic viewpoint - and I'm supposed to be the
pessimist when it comes to humans :-}
You may be correct that SOMEONE should be addressing the fears of people -
I have no problem with that other than to suggest that it is not likely to
be entirely effective. What I object to is the notion that we have to
expend great effort to "make sure", as you put it elsewhere, that
Transhumanism is an acceptable part of the mainstream.
> > But I always have a dilemma
> > here...am I wanting to share information and discoveries with 'humanity' for
> > humanity's own good, or am I reassuring humanity so that it will not
> > interfere with my own chosen methods of personal development and growth and
> > ongoing survival?
>
>That depends on your personal moral system. I believe in enlightened self
>interest and network economics: the more good stuff humanity gets, the more
>good stuff it is likely to develop that I can use. Six billion brains think
>more than one (not necessarily in my direction, but at least a lot of
>thinking gets done).
My point exactly - the technology will get done whether people like it or
not, because not *everyone* will dislike it. Therefore, the overall
belief structures of humans are not relevant. If it *does* become a
problem, we'll deal with it then.
>I also have a certain aesthetic where I want to make
>sure the world is as interesting as possible, and this is usually best
>served by helping it.
My aesthetic is more along the line of "how can I get out of here?"
> > > If you regard transhumanism as a move away from humanity, it would be
> > > interesting to hear what you consider it as a move *towards*, and why
> > > that move is desirable for human individuals.
> >
> > ...humans suffer. Quite horribly, and in a lot of ways. It would be great to
> > free people from the suffering of the body, and equally if not more great to
> > free people from mental suffering. Emotional anguish cripples intelligence
> > and the ability to interact, as I'm sure anyone who's ever lost a loved one
> > (or a lover!) knows.
> > Awareness of imminent death also causes emotional anguish. I don't mean the
> > scenario of thinking you're about to die because some nutter has a gun
> > against your head (although that's pretty unpleasant too) I mean a major
> > part of being human currently is the fact that you're going to die...I think
> > it would be desirable for humanity to escape that...if it wants to, of
> > course.
>
>I don't consider suffering and limitations as relevant definitions of
>humanity
That wasn't the question - the question was: what are we moving
towards? His answer was: we are moving towards freedom from suffering and
death. You can argue that this is more a moving *away* from something, but
that is mere quibbling. The point is the same: we are transcending human
nature because part of human nature (a "core trait", one might argue) is
that we die.
>(I know some philosophers do, and they get very upset when we say
>we want to get rid of these limits - and hence our very humanity). I think
>the important part is using our possibilities instead. Most of the suffering
>above is or will likely be irrelevant, but new forms of suffering or other
>aversive emotional states may appear as we extend ourselves.
Possibly. Some speculation on that might be interesting, but I doubt we
can have any firm conclusions.
>Again, the important thing isn't what we are running from, but what we are
>running *to*.
Continuity of existence. The prerequisite for all other "to's".
> > > I think assuming transhumanists to be genetically different or the
> > > result of some special revelation overthrowing the illusions plaguing
> > > the rest of humanity is quite premature.
> >
> > ...premature to assume, or premature to talk about openly?
>
>Premature to assume. In fact, the genetic determinants are probably just as
>irrelevant as genetic determinants for political views or taste in clothing
>- there are subtle effects there, but not anything that overrides our wills.
Ah, I would not be too hasty to say that they are irrelevant. They may not
be directly determinative, but they may be relevant. See my comments in my
other post.
> > > It is just a convenient and
> > > self-congratulatory way of isolating oneself.
> >
> > ...Quite the opposite! Anyone assuming such a thing meets the very real
> > problem of how on earth to share their awareness without freaking people
> > out. This is where we need coherent, articulate speakers and
> > representatives, who don't get swayed into emotive irrationality by
> > heavily-emotionally-laden arguments and can hold their own even when shouted
> > down by luddites.
>
>But assuming that "we" are somehow fundamentally different from "them" sets
>up an insurmountable barrier for communication. We might be feeling the
>transhuman man's burden when we try to bring the poor benighted luddites to
>truth and progress, but the inherent smugness of that view will leak through
>and cause resentment and eventual backlash.
So don't bother...
> It is better to realize that we
>transhumanists aren't that different from anybody else, we just happen to
>have a few uncommon views and somewhat larger ambitions.
Well, that's true in that we *are* still human in the
biological/evolutionary sense. But our disagreements and fundamental
conflicts with humanity, I believe, set us sufficiently at odds with humans
that it will not be possible to convince them otherwise. Even if we do not
let our "smugness leak through", our viewpoints by themselves will trigger
the "flight" response.
> That helps a lot
>when the luddite is screaming - I can empathise with him, and try to see
>what kind of dialogue would help open up real communication rather than a
>shouting contest. The next time you see Bill Joy, Leon Kass or Rifkin on TV,
>think "there, for the grace of God, goes I" ;-)
When you convince one of those individuals about the value of
Transhumanism, I will consider your argument more carefully...
>We need to develop our ideas too. Exactly what
>do we want, why and how do we get it? If we can answer those questions we
>can start giving people answers to questions like "How can transhumanism
>help you?", "Why is transhumanism the right thing to do now?" and "Where do
>I sign up?"
>
>--
>-----------------------------------------------------------------------
>Anders Sandberg Towards Ascension!
>asa@nada.kth.se http://www.nada.kth.se/~asa/
>GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y
I can answer those questions for humans right now. The problem is - they
won't buy it.
Richard Steven Hack
richardhack@pcmagic.net