Re: Transhumanism vs Humanity (WAS: Singularity Card Game Alpha Test)

From: Alex Ramonsky (alex@ramonsky.com)
Date: Fri Mar 08 2002 - 02:29:55 MST


From: "Anders Sandberg" <asa@nada.kth.se>
To: <extropians@extropy.org>
Sent: Thursday, March 07, 2002 23:36
Subject: Re: Transhumanism vs Humanity (WAS: Singularity Card Game Alpha
Test)

> Are our ideas and goals served by ignoring what other people think about
> us or even ignoring them? I would say definitely no to that. In fact,
> this tendency to regard "us" transhumanists as the enlightened few who
> "get" it and the rest as either uneducated, mistaken or a luddite
> opposition is one of the factors keeping transhumanist ideas from
> becoming mainstream. If you have already decided the others won't get
> it, you will not think much about ways of helping them get it. If you
> regard them as irrelevant, you also end up regarding their economic,
> political and research impact as irrelevant. Sure, stuff we like may be
> developed by other people, but if transhumanists do not spread their
> ideas the development will be aimed by other memes - memes that most
> likely will be against transhumanist ideals.

...Everything depends on the quality of the intelligence of the particular
'bit' of humanity that you are trying to communicate with. ...If missionaries
introduced their ideas to an intelligent, thinking tribe, then that tribe
would either agree with them or say, 'we don't agree with that; we're not
interested.' A less intelligent tribe might think, 'what on earth are they
going on about? I dunno; let's eat 'em.'

>
> Saying technology is the driver puts the cart in front of the horse. It
> leaves out the circular interplay of technology and culture, and makes
> for a simplistic view that technology will advance of its own and that
> it will create a cultural climate amenable to the things we desire.
> Reading up on the history of technology and ideas is a good way of
> dispelling this simple view.

Again, there's a vast difference between sections of humanity who are
open-minded about new tech, and those who daren't leave a message on my
answerphone because they don't have one of their own and it might make their
phone go wrong...or those who ring up technical support and ask, 'where's
the ANY key?' (press any key to...) Some folks genuinely fear tech, and
these are the people we'll have big problems with. Scaremongering only feeds
that fear (witness the GM and cloning debates; the word 'Frankenstein', etc.,
etc.).

> A small step in
> the right direction, slowly building a case that will help people
> integrate ideas of technological transformation into their worldview.
> But for this to work we have to dress in suits and ties, learn to
> explain why freedom, technology and progress are good things and show
> that our vision is not just realistic but also the right thing to do.
>
...absolutely...camouflage and infiltrate. But I always have a dilemma
here...am I wanting to share information and discoveries with 'humanity' for
humanity's own good, or am I reassuring humanity so that it will not
interfere with my own chosen methods of personal development and growth and
ongoing survival?

>
> If you regard transhumanism as a move away from humanity, it would be
> interesting to hear what you consider it as a move *towards*, and why
> that move is desirable for human individuals.

...humans suffer. Quite horribly, and in a lot of ways. It would be great to
free people from the suffering of the body, and equally if not more great to
free people from mental suffering. Emotional anguish cripples intelligence
and the ability to interact, as I'm sure anyone who's ever lost a loved one
(or a lover!) knows.
Awareness of inevitable death also causes emotional anguish. I don't mean the
scenario of thinking you're about to die because some nutter has a gun
against your head (although that's pretty unpleasant too); I mean that a major
part of being human currently is the fact that you're going to die...I think
it would be desirable for humanity to escape that...if it wants to, of
course.

>
> I think assuming transhumanists to be genetically different or the
> result of some special revelation overthrowing the illusions plaguing
> the rest of humanity is quite premature.

...premature to assume, or premature to talk about openly?

> It is just a convenient and
> self-congratulatory way of isolating oneself.

...Quite the opposite! Anyone assuming such a thing meets the very real
problem of how on earth to share their awareness without freaking people
out. This is where we need coherent, articulate speakers and
representatives, who don't get swayed into emotive irrationality by
heavily-emotionally-laden arguments and can hold their own even when shouted
down by luddites.

> The groups that ended up controlling the meme pool
> were those who talked to people, that got involved in society and
> articulated their visions.
>
...absolutely. But there are going to be two groups here...on the one hand,
those trying to introduce transhumanist concepts gently...like, hey,
wouldn't it be great if we could extend our life spans a bit...and those who
just come out with it all, and, hey, let's all upload and bugger off to
another star system and build ourselves robot bodies, etc etc. This latter
group are going to cause some culture shock. But by comparison, the 'gentle'
group will seem like a safe option to listen to. Just the same way that
there are nutters out there hitting the press with stuff like 'we're going
to clone Jesus'...by comparison, some dude cloning a pet cat seems not only
blameless but desirable.
...I think we all know which group I'm in......but the point I'm making is,
we're all necessary. And those among us who are good PR speakers are at
present the most necessary of all.

Ramonsky
Entelechy Institute



This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:12:50 MST