Re: Transhumanism vs Humanity (WAS: Singularity Card Game Alpha Test)

From: Samantha Atkins (samantha@objectent.com)
Date: Sat Mar 09 2002 - 05:56:28 MST


Anders Sandberg wrote:

> Evidence, please?
>
> If you look at people's reactions to transhumanism you will find plenty of
> reasons they do not take transhumanism seriously or do not consider it a
> good idea. Some of them certainly are due to lack of understanding or
> incompatible values. But is the main reason transhumanism is not filling
> football stadiums with enormous revival meetings that people are
> irrationally resisting it, or due to how transhumanists spread their ideas?

Perhaps you could fill those football stadiums if it really sank
in that we could live in a world of great abundance where every
person on the planet had primo health, food, clothing, shelter
and more educational, computational and entertainment resources
than they could imagine, for the asking. Or we could, once a few
more technological steps are taken and once we stop insisting on
scarcity, creating the effects of scarcity, just because it is
so familiar and taken for granted that we can't really grok a
world largely without it. In the treasure room of the gods we
argue and act as if we were at a fire sale at Walmart.

To me the biggest difficulty is envisioning and communicating a
fundamentally different future, one that goes far beyond
hyper-technology by itself.

>>I don't see it as a misallocation of resources to *promote* Transhumanism,
>>as long as we realize that the purpose is to attract those who *can*
>>understand and who *can* contribute. But trying to convince the world that
>>Transhumanism is the future will merely relegate us to cult status or
>>worse, wake the states of the world up to the threat to their existence and
>>bring down even more oppression.
>>

Without a suitably widespread radical re-visioning of the
possibilities, I am quite pessimistic about humanity surviving.
We will literally fail to understand how to use our superior
technology to change and/or meet what most harms and threatens
us. Unless we create the uploads or AIs or SIs that take over
everything (not necessarily a good idea), we need to convince
enough people to actually create a viable, fundamentally
different society or societies. At the door to being able to
change ourselves very fundamentally on a physical level, we must
also be free to change our fundamental assumptions about how
human institutions work and how we work. Anything less is
simply hypercharging mindsets grossly out of their proper
context.

>
> The issue isn't converting everybody to transhumanism - what would the point
> of such a mental monoculture be? Rather, we need to create a culture where
> transhumanism is regarded as a valid point of view. Transhumanism should not
> be a silly fringe phenomenon, but rather a (possibly controversial)
> ideological position among others that participate in society. That is the only

It is much more than just an ideological position of course.

> way we can really get our hands on the relevant technologies (and have them
> developed in the first place), avoid being arbitrarily outlawed since our
> ideological opponents happen to own the playing field (which is bound to
> happen soon if we do not work on our political act - remember that Leon Kass
> is bioethics advisor to Bush, and he has explicitly said he will fight
> against a posthuman future) and gain allies in our projects.
>

Or we must defang governments so they cannot stand in the way.
One way is to promise significantly better circuses (to be
somewhat crass) than ever before, at ultra-low cost, and then
deliver. Sell a workable, beautiful vision and you will not
have trouble getting enough votes (assuming we can't fix it so
such things are not subject to an open vote) to defeat
anti-technology and anti-future legislation. But present it as
mainly an ideological position and not much else, and
Transhumanism is DOA culturally.

 
> Ignoring the beliefs of others is not going to get us there.
>

Presenting better, more compelling, more hopeful and
deliverable beliefs will, though. At the least, we cannot act
as if it is legitimate to keep humankind in relative misery when
the means exist to end large parts of that misery. We can take
the moral high ground.

>>We need to spread our memes to those who *can* understand and who
>>*can* develop what we need. And if necessary we need to find ways to get
>>the others to develop what we need - by whatever appeals (to greed, or
>>whatever) work.
>>
>

I agree with some of what you say, but it is a partial starting
point within the current context. By itself it will not shift
that context significantly.

 
> Who will pay for developing a brain-computer implant not intended for
> treating handicaps? What pharmaceutical corporation will devote resources to

Why not form a movement toward an abundant future, gather
donations and dues, and ourselves fund the projects that are
important to creating that future? Give enough people real hope
and a vision, and funding is not that big a deal.

> developing a creativity enhancer drug they know will not pass the FDA since
> lack of creativity is not a disease? Why is the idea of genetics tests
> usable by laypeople scaring medical regulators?
>

By the time you produce such a drug, human/computer interaction,
invisible computing, ubiquitous computing and so on are likely
to make the drug less necessary. The disease-only model of
medicine must change.

Many professions and businesses are scared because the
technology and the changes it brings can easily make their
business and business model bankrupt and meaningless. That is
one of the reasons fundamental abundance needs to be created
before many people will let go of inferior modes of "making a
living" rather than living fully.

> There are some people who agree with transhumanism in positions where they
> can develop useful stuff, and a few who have money. But honestly they are
> not that many - friendly billionaires do not grow on trees. Even if certain
> cool technologies would be developed in this isolated way, they would be
> extremely costly if the costs were not spread out across a big paying
> customer base. This is of course where appeals to greed come in - there are
> many ideas we like that could presumably be killer applications (just think
> of life extension).

Sure. So have the "movement" seed the projects and apps and
funnel a part of the profits back into the "movement".

>
> But even these fields need more than just us convincing the CEO of Novartis
> that life extension is a huge market: the surrounding attitudes in society
> need to be changed in order to make that project viable. Attitudes do not
> change just because a technology appears - look at the reactions to cloning
> and performance enhancement in current society. Instead it is necessary to
> spread the idea that these technologies can be used in a constructive way.
> We need to deal with the philosophical and cultural assumptions underlying
> bans and resistance to these things if we are going to see anybody develop
> them seriously.
>

Yes.

 
> When I talk about culture, I talk about all the "software" our civilization
> carries around - art, traditions, stories, ideologies, language,
> institutions, laws etc. It is not just about movies.
>

When I think of culture as "software" I think of the
fundamental worldviews and assumptions, our shared context, that
all our institutions, traditions and so on presuppose. These
assumptions are often nearly invisible because they are so
universally taken for granted.

> My point is that the motivation and intent is largely derived from the
> cultural sphere. There are of course human drives like seeking security,
> social status, survival etc, but these are channeled and shaped by culture
> (and vice versa!): we invent not just because we are curious and enjoy it,
> but because we have ideological aims, it fits symbols and ideals we have
> grown up with (the mad scientist, the wizard, the hacker, the lone champion
> of truth and all the others), we gain various forms of recognition
> (professional, coolness, membership in clubs), we get rewards (monetary and
> intellectual) and so on. The *things* we invent are even more culturally
> derived (in a sense I would say technology is a proper subset of culture,
> and cannot subsume it more than movies could) - just look at the aims of
> software, and think of how they often reflect various more or less clearly
> expressed cultural aims within western culture.
>

I disagree in part. Technology is not a subset of culture; its
use and limitations are shaped by culture, and these are not
the same thing. Most of these symbols, rewards, images and so
on are rife with cultural assumptions that are part of the
problem (some less so than others). For me the aims of software
are primarily the transformation of culture and the
transcendence of human limitations.

>>And the whole thing will be tossed out the minute it conflicts with their
>>other basic human drives, such as the fear of death.
>>
>

What if you show them they need not fear death because it can be
postponed indefinitely? What then? Will you not then turn this
drive in your favor?

- samantha



This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 09:12:52 MST