Converting scientists into transhumanists (was Re: seti@home ...)

From: Robert J. Bradbury (bradbury@aeiveos.com)
Date: Fri Jul 09 1999 - 10:56:00 MDT


> From: "Daniel J. Boone" <djboone@romea.com>

> Refusing to waste one's effort on their seeming folly is one thing;
> deriding it as utterly pointless is quite another, more apparently
> arrogant thing.

The problem is that "reality is a function of agreement and agreement
is a function of enrollment". There are different forms of "enrollment"
that will be more or less effective depending on the enroller and
the enrollee.

The reality seems to be that SETI@home, as a strategy for enrolling
people, works. It is a "soft sell" based on the idea that many
people want to believe there are others out there "like us"
(witness the popularity of "Contact", which I very much enjoyed).
People want to contribute if they have a simple means of doing so.

I don't have the platform to "unsell" the SETI@home idea.
Unselling an idea people *want to believe* is very difficult.
Nor do I have a platform to "educate" about Extropian ideas so people
can see some of the flaws in the SETI@home approach.

I do have a small window which is to invoke "the fear of being conned".
Nobody wants to appear stupid; by invoking this fear you may get their
attention long enough for them to listen to your side of the story or
examine their own beliefs. It is not dissimilar from the original
Proxmire strategy, except I'm not saying "this will never work",
I'm saying, "I believe what you believe and it will work if you
do it the right way".

But I have to get your attention for long enough for you to hear
that. That is where being arrogant comes in. If I say softly
"you are being conned", people will say "what do you know, I'm
busy, go away". But if I say, "WHAT YOU ARE DOING IS REALLY STUPID",
some people may stop for long enough to listen to me.

If I shoot my mouth off a lot crying "wolf", people will learn to
ignore me. But if I back up my claim with some reasoned arguments
and demonstrate that I just might know something about a topic
people will sit down and listen for a while. [In the SETI case I've
spent the last year and a half almost full time studying the problem.]

>
> Couldn't tell you. Perhaps they dunno transhumanism from transistors? It's
> not exactly a fully-entrenched meme, plus it seems to suffer from a lunatic
> fringe that gets more attention than the serious transhumanist thinkers.

Have to agree here; transhumanism isn't even a blip on the map.
In my dreams I can see the eyebrows going up on the radioastronomers'
faces when I say ... and then the logical thing for a civilization
to do in the post-singularity era is for the individuals to upload
their minds into the M-Brain, providing them essentially with immortality
(within the limits of the longevity of the universe, of course)...

>
> Seriously, I would guess the experimental design of most SETI projects is
> constrained by the Proxmirish "these looneys are looking for Bug-Eyed
> Monsters!" reaction they always get in the press.
>
SETI projects are always funding constrained, but given that much
of their funding is coming from A. Grove and P. Allen (who are, shall
we say, not computer illiterates), then they have reason to be both
interested in and afraid of people discussing "supercomputers in space"
in ways they don't understand (the core SETI people are almost entirely
radioastronomers).

> A SETI program informed by the conclusions of transhumanism would, to the
> general populace not so informed, seem even more loony, and would find it
> harder to attract and keep the attention of "serious" (i.e., leery of
> association with the unusual) scientists.

Agreed. You sell the extropians based on the memes they already know.
You then sell the hardcore SETI people by educating them to the
emerging technologies. You throw down Nanosystems and Nanomedicine
on the table and you say, "What? You don't believe that self-replicating
molecular nanomachines are possible? That's funny, you have 40 trillion
of them in or on your body." "What? You don't believe in nanocomputers
with 100,000 times the capacity of the human brain? Says so, right
here in this book [or it does when you cross Drexler & Moravec]."
"What? You don't believe that 1-hour doubling times allow you
to disassemble Mercury and harvest the entire power output of the
sun in 2 weeks? Well, here is the chart." Etc.
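For the skeptical reader, the doubling arithmetic behind that claim is easy to check. A back-of-envelope sketch, using my own illustrative numbers (a 1 kg seed of replicators and Mercury's approximate mass of 3.3e23 kg; neither figure is from the original argument), with the 1-hour doubling time from the text:

```python
import math

# Illustrative assumptions (mine, not from the original post):
SEED_MASS_KG = 1.0            # assumed initial mass of replicating machinery
MERCURY_MASS_KG = 3.3e23      # approximate mass of Mercury
DOUBLING_TIME_HOURS = 1.0     # the 1-hour doubling time quoted in the text

# Doublings needed for the replicator mass to reach Mercury's mass:
# 2**n * seed >= target  =>  n = log2(target / seed)
doublings = math.log2(MERCURY_MASS_KG / SEED_MASS_KG)
hours = doublings * DOUBLING_TIME_HOURS

print(f"doublings needed: {doublings:.1f}")
print(f"time: {hours:.1f} hours (~{hours / 24:.1f} days)")
```

Under these assumptions the replication phase alone takes on the order of 78 doublings, i.e. a few days; the 2-week figure in the text presumably also budgets time for deploying the resulting material as solar collectors.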

With the scientists, you can connect the dots (after they've gone
away long enough for them to wrestle with their unexamined "beliefs").
Ultimately we may end up with a transhumanist SETI Institute
(but it is going to take some work). Since they, as an organization,
have a fair amount of "respect" in the scientific community
(people may think what they are doing is silly, but they respect
that they are doing it well), you can then leverage transhumanist
ideas outward from that.

Robert



This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:04:26 MST