From: Richard Steven Hack (richardhack@pcmagic.net)
Date: Fri Mar 08 2002 - 22:07:06 MST
At 05:51 PM 3/8/02 +0100, you wrote:
>Evidence, please?
>
>If you look at people's reactions to transhumanism you will find plenty of
>reasons they do not take transhumanism seriously or do not consider it a
>good idea. Some of them certainly are due to lack of understanding or
>incompatible values. But is the main reason transhumanism is not filling
>football stadiums with enormous revival meetings that people are
>irrationally resisting it, or due to how transhumanists spread their ideas?
Evidence that it lies in the way Transhumanists spread their ideas?
>After all, plenty of other ideas that have been fiercely resisted and
>regarded as both silly and evil have eventually triumphed - Christian ideas
>of mercy rather than the Roman ideal of clemency, democracy, women's
>suffrage, socialism etc. In all of these cases the vast majority considered
>them outrageous from the start, and yet they changed their opinion. That was
>hardly due to technological factors or some spontaneous mental evolution,
>but plain and simple spread of ideas. The scribblings of philosophers have
>surprising penetration power.
My point is not that ideas can be resisted and not triumph - although I
suspect it is not simply because somebody spread them around. My point is
that most of the ideas - especially most of the ones you cite - are not
particularly rational and therefore have an easier time being accepted by
irrational people.
> > > If you
> > >regard them as irrelevant, you also end up regarding their economic,
> > >political and research impact as irrelevant.
> >
> > Not necessarily - that doesn't follow. They are irrelevant in the sense
> > that what they believe is not relevant. What they do is relevant only in
> > the sense that we must be prepared to deal with it regardless of what they
> > do. If they assist us, all good, if not, we deal with it. But it is a
> > misallocation of resources to try to convince them ALL of the value of
> > Transhumanism.
> >
> > I don't see it as a misallocation of resources to *promote* Transhumanism,
> > as long as we realize that the purpose is to attract those who *can*
> > understand and who *can* contribute. But trying to convince the world
> that
> > Transhumanism is the future will merely relegate us to cult status or
> > worse, wake the states of the world up to the threat to their existence
> and
> > bring down even more oppression.
>
>The issue isn't converting everybody to transhumanism - what would the point
>of such a mental monoculture be?
>Rather, we need to create a culture where
>transhumanism is regarded as a valid point of view. Transhumanism should not
>be a silly fringe phenomenon, but rather a (possibly controversial)
>ideological position among others that participate in society.
I did not mean to suggest that everybody would be Transhumanist - I meant
that even getting to the point where Transhumanism is an acceptable concept
to a significant percentage of the population - even the population of the
Western World only - is extremely unlikely. Also it is extremely unlikely
to occur within the time frame preceding the Singularity - unless the
technologies developed for the Singularity can be used to speed up the
process (which I suppose is possible). But then we have a sort of
chicken-and-egg question: if we use technology to help us convince people
that Transhumanism is cool, do we have to convince them first about the
coolness of that technology?
> That is the only
>way we can really get our hands on the relevant technologies (and have them
>developed in the first place), avoid being arbitrarily outlawed since our
>ideological opponents happen to own the playing field (which is bound to
>happen soon if we do not work on our political act - remember that Leon Kass
>is bioethics advisor to Bush, and he has explicitly said he will fight
>against a posthuman future) and gain allies in our projects.
I disagree that it is the ONLY way. If the ONLY way we will ever get
Transhuman technology is to convince people that Transhumanism is
acceptable, we're sunk. It ain't happening. Fortunately, I don't buy
that. Most people probably won't give a damn about Transhumanism or the
technology until the technology is developed or well on the way to being
developed - by which time it may be too late (for them to stop it). Nobody
stopped the development of nuclear weapons despite the misgivings of the
scientists involved. The public had no clue and cared less - until it
became socially fashionable to be an "anti-nuke activist" - and that was an
insignificant (if vocal) part of the population...
>Ignoring the beliefs of others is not going to get us there.
It's not going to stop us getting there.
> > > Sure, stuff we like may be
> > >developed by other people, but if transhumanists do not spread their
> > >ideas the development will be aimed by other memes - memes that most
> > >likely will be against transhumanist ideals.
> >
> > We need to spread our memes to those who *can* understand and who
> > *can* develop what we need. And if necessary we need to find ways to get
> > the others to develop what we need - by whatever appeals (to greed, or
> > whatever) work.
>
>Who will pay for developing a brain-computer implant not intended for
>treating handicaps? What pharmaceutical corporation will devote resources to
>developing a creativity enhancer drug they know will not pass the FDA since
>lack of creativity is not a disease? Why is the idea of genetics tests
>usable by laypeople scaring medical regulators?
>
>There are some people who agree with transhumanism in positions where they
>can develop useful stuff, and a few who have money. But honestly they are
>not that many - friendly billionaires do not grow on trees. Even if certain
>cool technologies would be developed in this isolated way, they would be
>extremely costly if the costs were not spread out across a big paying
>customer base. This is of course where appeals to greed come in - there are
>many ideas we like that could presumably be killer applications (just think
>of life extension).
Exactly. The technology is going to be developed by people who want to
do it either because it is emotionally or intellectually important to them
to do it, or because they hope to get rich. My own intention is to work in
AI, develop some cool stuff, put Bill Gates out of business, make one or
ten billion dollars and date Jodie Foster - :-}
>But even these fields need more than just us convincing the CEO of Novartis
>that life extension is a huge market: the surrounding attitudes in society
>need to be changed in order to make that project viable. Attitudes do not
>change just because a technology appears - look at the reactions to cloning
>and performance enhancement in current society.
My point exactly. I'm not suggesting attitudes will change when the
technology appears - but again it also depends on the technology. Most
people have trouble seeing any benefit in cloning - at least in cloning
humans for the purpose of reproducing a human - hell, I have trouble seeing
any benefit in that. There may well be people who will have problems with
the notion of life extension per se - they will argue that it will cause
over-population, or whatever. But most people I think will go along with
the idea of living another 50, 100, or more healthy years - provided it can
be demonstrated.
And while some people don't like performance enhancement, some people
obviously do or it wouldn't be an issue. So if the technology can be shown
to have the benefits claimed, people will take it up. But they will likely
do so without concerning themselves about the philosophical and ethical
ramifications of it all. They'll leave that to the people who want to
establish themselves as morally superior...
> Instead it is necessary to
>spread the idea that these technologies can be used in a constructive way.
>We need to deal with the philosophical and cultural assumptions underlying
>bans and resistance to these things if we are going to see anybody develop
>them seriously.
The bans we're talking about came about because somebody DID develop the
technologies involved for whatever purpose. Obviously, the technology came
first, and the bans came second. How then can you argue that the
technology cannot be developed without social acceptance? If you are
saying that the technology will be developed, then banned, unless we have
social acceptance, I can only refer to Drexler's comments that bans will
not work. Bans may make things harder, certainly, and if we can avoid
that, it would be desirable - my quibble is with the notion that we can
avoid it, and that we must devote a substantial allocation of resources to
social acceptance in order to do so.
> > >Saying technology is the driver puts the cart in front of the horse. It
> > >leaves out the circular interplay of technology and culture, and makes
> > >for a simplistic view that technology will advance of its own and that
> > >it will create a cultural climate amenable to the things we desire.
> >
> > It will. In this century, technology will *subsume* culture. Oh, of
> > course, culture will have its influence initially. And saying that
> > "technology can advance on its own" makes no sense - obviously there
> has to
> > be motivation and intent - whether corporate greed or the love of
> > discovery; the point is that the research will continue to be
> done. And it
> > will influence the culture if people who are interested in that make it so
> > - such as the special effects in movies, or even the Net games being
> > discussed in other threads.
>
>When I talk about culture, I talk about all the "software" our civilization
>carries around - art, traditions, stories, ideologies, language,
>institutions, laws etc. It is not just about movies.
Obviously. And I am saying that art, traditions, stories, ideologies,
language, institutions, laws, etc. will all be directly affected by
technology more so than technology will be affected by them. This effect
may not be that noticeable in the near term, but will be greater as more
powerful technologies are developed.
>My point is that the motivation and intent is largely derived from the
>cultural sphere. There are of course human drives like seeking security,
>social status, survival etc, but these are channeled and shaped by culture
>(and vice versa!): we invent not just because we are curious and enjoy it,
>but because we have ideological aims, it fits symbols and ideals we have
>grown up with (the mad scientist, the wizard, the hacker, the lone champion
>of truth and all the others), we gain various forms of recognition
>(professional, coolness, membership in clubs), we get rewards (monetary and
>intellectual) and so on. The *things* we invent are even more culturally
>derived (in a sense I would say technology is a proper subset of culture,
>and cannot subsume it more than movies could) - just look at the aims of
>software, and think of how they often reflect various more or less clearly
>expressed cultural aims within western culture.
I'm afraid I didn't get your point here. "The aims of software reflect
clearly expressed cultural aims?" - could you be more specific?
>That technology can get tremendously powerful and is likely to continue as
>long as our current style of individualist culture remains is true, but
>remember the lesson of Ming China! Progress is by no means the default
>state, and can be ended by a cultural shift. If you do not make sure your
>culture will support your development, you might end up stymied.
My point is precisely that you cannot "make sure" of that - and every
example you can cite in that regard will support that point. Do you think
nobody in Ming China wanted progress? Did they argue against the
suppression of progress? Were they successful?
> > >A small step in
> > >the right direction, slowly building a case that will help people
> > >integrate ideas of technological transformation into their worldview.
> > >But for this to work we have to dress in suits and ties, learn to
> > >explain why freedom, technology and progress are good things and show
> > >that our vision is not just realistic but also the right thing to do.
> > >
> >
> > And the whole thing will be tossed out the minute it conflicts with their
> > other basic human drives, such as the fear of death.
>
>You mean like people always toss out everything they believe to be true when
>it suits them?
Yes, when it becomes apparent to them that what they "believe"
(superficially or otherwise) conflicts with their basic human fears. It is
a standard point of debate that to get your opponent to abandon his
position, you align that position with what he fears. It is not a question
of "what suits them" - except in the sense that "what suits them" is to
behave in an irrational, fearful manner.
> > No, the groups that ended up controlling the meme pool were those who
> > *could* talk to the people because they operated on the same irrational,
> > emotional basis as most people do. And those who tried to use reason to
> > oppose those people didn't succeed precisely because of that.
>
>You mix up presentation with message. To be a great orator, you need to show
>both logos (logic), pathos (emotion) and ethos (being morally just); if you
>just have one or two of these aspects in your message it will be weaker.
So Hitler had logic, emotion, and was morally just?
>That doesn't say anything about the truth of the message.
My point exactly. Those who can talk to the people share the people's
irrationality.
> Many of the most
>successful groups in affecting the meme pool have spread messages that are
>opposed to much of normal human emotions - democracy, rationality, the
>scientific method, tolerance etc (the whole enlightenment tradition). They
>did this in clever, forceful ways, but the message was not about being
>emotional.
Again, you say these people were "successful" - I look around me, I don't
see it. Once again, you seem to ASSUME that simply putting forth a
rational idea in a calm, supportive manner will somehow guarantee its
acceptance even in the absence of other factors. I see no evidence of this
in human history.
>I think you should really take a look at the history of ideas, maybe check
>out Hayek's _The Intellectuals and Socialism_. Memetics is just the
>"transhumanistically correct" (and lazy?) way of talking about culture, and
>there is a far longer tradition of studying what happens with ideas and
>ideology than the current transhumanist movement. One of the main things we
>can learn from history is actually what others have tried and why they
>failed - and when it comes to the area of changing the world, many many
>groups have done that, which provides a rather useful background database. I
>think you would find, if you explore this field, that many of the views you
>have espoused in this discussion are common to many of the failed movements.
And they failed because they tried to use reason - whereas those who
changed the world (Christianity, Islam, communism, ad nauseam) did not.
The bottom line is the old saw: "You can't rationally convince somebody
out of something he wasn't rationally convinced into."
>What you need to show is that either the situation now is sufficiently
>different for them to work, or that your combination actually manages to get
>some unique new result.
The difference is the technology level involved. We don't NEED to convince
anybody of anything - we just need the technology. And we can get the
technology without having to make Transhumanism mainstream.
> > Good post, Anders. Gave me a chance to rant a bit and that always helps
> > clarify my thinking - :-}
>
>The same. I think these are very important issues, and such issues always
>need clarifying.
>
>--
>-----------------------------------------------------------------------
>Anders Sandberg Towards Ascension!
>asa@nada.kth.se http://www.nada.kth.se/~asa/
>GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y
Richard Steven Hack
richardhack@pcmagic.net
This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:12:52 MST