From: Anders Sandberg (asa@nada.kth.se)
Date: Fri Mar 08 2002 - 10:48:19 MST
On Fri, Mar 08, 2002 at 09:29:55AM -0000, Alex Ramonsky wrote:
>
> > Saying technology is the driver puts the cart before the horse. It
> > leaves out the circular interplay of technology and culture, and makes
> > for a simplistic view that technology will advance on its own and that
> > it will create a cultural climate amenable to the things we desire.
> > Reading up on the history of technology and ideas is a good way of
> > dispelling this simple view.
>
> Again, there's a vast difference between sections of humanity who are
> open-minded about new tech, and those who daren't leave a message on my
> answerphone because they don't have one of their own and it might make their
> phone go wrong...or those who ring up technical support and ask, 'where's
> the ANY key?' (press any key to...) Some folks genuinely fear tech, and
> these are the people we'll have big problems with. Scaremongering doesn't
> help (witness the GM and cloning debates; the word 'Frankenstein', etc, etc).
And why do we have these scare debates? One large reason I found while
studying the gene debate was simply that many of the people developing the
technology and the industry never tried talking with the worried people (at
least not on their level; hearing something in Higher Academese and then
being told that this is the reason genetic engineering is safe and ethical
is not communication and does not promote trust). When ethical or
value-loaded arguments appeared, scientists and businesspeople either
disregarded them (since they fell outside the framework they could address
as scientific experts) or viewed them as irrelevant; this was soon seized
upon by people who found that they could win debates by invoking "ethics"
without having to do any ethical thinking.
I think exactly the same thing holds true for transhumanism. As long as we
do not deign to speak with people who disagree with us, or with the
unaligned, we will lose.
> > A small step in
> > the right direction, slowly building a case that will help people
> > integrate ideas of technological transformation into their worldview.
> > But for this to work we have to dress in suits and ties, learn to
> > explain why freedom, technology and progress are good things and show
> > that our vision is not just realistic but also the right thing to do.
> >
> ...absolutely...camouflage and infiltrate.
Actually, suits and ties are not camouflage; they are a form of
communication :-) When you speak to a tribe in the jungles of New Guinea you
have to act according to their customs if you want them to listen, and the
same goes for a roomful of decision-makers.
> But I always have a dilemma
> here...am I wanting to share information and discoveries with 'humanity' for
> humanity's own good, or am I reassuring humanity so that it will not
> interfere with my own chosen methods of personal development and growth and
> ongoing survival?
That depends on your personal moral system. I believe in enlightened
self-interest and network economics: the more good stuff humanity gets, the
more good stuff it is likely to develop that I can use. Six billion brains
think more than one (not necessarily in my direction, but at least a lot of
thinking gets done). I also have a certain aesthetic: I want to make sure
the world is as interesting as possible, and that is usually best served by
helping it.
> > If you regard transhumanism as a move away from humanity, it would be
> > interesting to hear what you consider it as a move *towards*, and why
> > that move is desirable for human individuals.
>
> ...humans suffer. Quite horribly, and in a lot of ways. It would be great to
> free people from the suffering of the body, and equally if not more great to
> free people from mental suffering. Emotional anguish cripples intelligence
> and the ability to interact, as I'm sure anyone who's ever lost a loved one
> (or a lover!) knows.
> Awareness of imminent death also causes emotional anguish. I don't mean the
> scenario of thinking you're about to die because some nutter has a gun
> against your head (although that's pretty unpleasant too); I mean a major
> part of being human currently is the fact that you're going to die...I think
> it would be desirable for humanity to escape that...if it wants to, of
> course.
I don't consider suffering and limitations to be relevant definitions of
humanity (I know some philosophers do, and they get very upset when we say
we want to get rid of these limits - and hence, in their view, our very
humanity). I think the important part is using our possibilities instead.
Most of the suffering described above is, or will likely become, irrelevant,
but new forms of suffering or other aversive emotional states may appear as
we extend ourselves.
Again, the important thing isn't what we are running from, but what we are
running *to*.
> > I think assuming transhumanists to be genetically different or the
> > result of some special revelation overthrowing the illusions plaguing
> > the rest of humanity is quite premature.
>
> ...premature to assume, or premature to talk about openly?
Premature to assume. In fact, the genetic determinants are probably just as
irrelevant as genetic determinants for political views or taste in clothing
- there are subtle effects there, but not anything that overrides our wills.
> > It is just a convenient and
> > self-congratulatory way of isolating oneself.
>
> ...Quite the opposite! Anyone assuming such a thing meets the very real
> problem of how on earth to share their awareness without freaking people
> out. This is where we need coherent, articulate speakers and
> representatives, who don't get swayed into emotive irrationality by
> heavily-emotionally-laden arguments and can hold their own even when shouted
> down by luddites.
But assuming that "we" are somehow fundamentally different from "them" sets
up an insurmountable barrier to communication. We might feel the transhuman
man's burden when we try to bring the poor benighted luddites to truth and
progress, but the inherent smugness of that view will leak through and cause
resentment and eventual backlash. It is better to realize that we
transhumanists aren't that different from anybody else; we just happen to
have a few uncommon views and somewhat larger ambitions. That helps a lot
when the luddite is screaming - I can empathise with him, and try to see
what kind of dialogue would help open up real communication rather than a
shouting contest. The next time you see Bill Joy, Leon Kass or Rifkin on TV,
think "there, but for the grace of God, go I" ;-)
> > The groups that ended up controlling the meme pool
> > were those who talked to people, who got involved in society and
> > articulated their visions.
> >
> ...absolutely. But there are going to be two groups here...on the one hand,
> those trying to introduce transhumanist concepts gently...like, hey,
> wouldn't it be great if we could extend our life spans a bit...and those who
> just come out with it all, and, hey, let's all upload and bugger off to
> another star system and build ourselves robot bodies, etc etc. This latter
> group are going to cause some culture shock. But by comparison, the 'gentle'
> group will seem like a safe option to listen to. Just the same way that
> there are nutters out there hitting the press with stuff like 'we're going
> to clone Jesus'...by comparison, some dude cloning a pet cat seems not only
> blameless but desirable.
Have you noticed that this is exactly what is happening with many other
issues? OK, we won't evacuate California, but we can institute slightly
tighter environmental regulations. OK, socialising the entire stock market
would cause problems, but we could use the workers' pension funds to buy up
everything over time. And so on. In a situation where the alternatives on
offer are biased (mainstream vs. the crazies), the perceived middle ground
ends up biased as well.
> ...I think we all know which group I'm in......but the point I'm making is,
> we're all necessary. And those among us who are good PR speakers are at
> present the most necessary of all.
True. But it is not just PR. We need to develop our ideas too. Exactly what
do we want, why do we want it, and how do we get it? If we can answer those
questions we can start giving people answers to questions like "How can
transhumanism help you?", "Why is transhumanism the right thing to do now?"
and "Where do I sign up?"
--
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y