From: J. R. Molloy (jr@shasta.com)
Date: Wed Oct 11 2000 - 12:34:45 MDT
"Anders Sandberg":
> Hi again! After four months of conferences, symposia and courses on
> transhumanism, psychology, neuroscience, biomodelling and everything
> else I have finally time to settle back into the list. Seems
> everything is as usual here, everybody happy and gay and debating AI,
> guns, the death penalty and cryonics :-)
Welcome back, Anders.
Yes, the list is about the same as ever. Same old topics, same old complaints,
same old... same old. Interesting that people who engage in discussions of
accelerating change (of the extropic kind) do not themselves seem to change
their minds, their opinions, their politics, their attitudes, or their
philosophical positions.
You took courses on transhumanism!?!? Where do they offer courses on
transhumanism?
> The idea that some will go on towards Singularity (or whatever) and
> the rest will remain, is of course one of the main Scary Arguments
> Against the Singularity (cf. Stewart Brand, _The Clock of the Long Now_
> and Jaron Lanier's edge piece). Exponential growth might also imply
> exponential growth of differences, and this is generally viewed as a
> bad thing. After lecturing some first-year students on vaguely
> transhumanism-related stuff, I noticed this was one of the arguments
> against it that seemed to work best. Well worth considering.
I don't understand how the exponential growth of differences (a bad thing) works
best as an argument against technological singularity. Or do I misunderstand?
> The problem is relevant not just for the Singularity but any knowledge
> economy: if there are no strong rules against changing social class,
> but doing so depends merely on an act of will/ambition/ability, the
> resulting situation will still have some who, due to their own values,
> passivity, culture or whatever, remain down. The door is open, but they
> either think it is closed, too hard to get past, or that remaining is
> good. The problem with this division is that it cannot be redressed
> using traditional means like redistributing money or rights, but would
> need some ways of redistributing or improving ambition (the latter
> might actually be doable, by creating a more practically optimistic
> social atmosphere). And even then we would have the people who do not
> wish to rise to the heavens.
In addition, some people know that even if technological singularity occurs in
ten years, it will be too late for them... their health will not sustain them
that long, and they can't afford cryonic suspension. Others realize that, due to
their economic situation and the fact they are past their prime, they don't
stand a chance of joining the singularity clique. Anti-geezer sentiments run
high among young Turks (although you may not be as sensitive to this issue as
some of us). Sure, geezers want to "rise to the heavens"... the problem relates
to being tied down by decades of psychic baggage and an unattractive persona.
Who would we rather recruit, a bevy of nubile young extro-girls, or some lame
old duck who can't even learn computer languages, but who still wants to join
the party? (Hint: the geezers can't even make it to the party.)
> Ethically, I would say transhumanism values diversity and individual
> development. But it is not based on some narrow definition of linear
> progress - we are a bit too postmodern for that - so what kind of
> individual development is desirable is in the end up to the individual
> to decide. A bit like the subjective optimality
> definition of health in Freitas's _Nanomedicine_. Now, as I see this,
> when applied to the incomplete singularity scenario, this leads to the
> conclusion that it is completely OK to decide not to join in due to
> one's value system. In fact, it increases the diversity of
> humanity.
I think transhumanism values change and extropic progress more than it values
developing individuals who, because of their need for remedial and augmented
mental prosthetics, have difficulty keeping up (never mind making a positive
contribution). For example, I'd really like to take some courses in psychology,
neuroscience, biomodelling, robotics, computer science, and network
administration, but guess what... I have trouble mastering HTML -- and by the
time I could learn it, it would be replaced by XML or VML (voice markup language
a la Kurzweil and Co.). When it comes to diversity in the technological
singularity, "many are called, but few are chosen." Speaking just for myself
(crotchety geezers don't really like others speaking for them, you know), I've
noticed quite a bit of resistance to AI ideas and technological singularity from
Extropy list participants who have better credentials than I'll be able to
garner even if I wanted to. I see technological singularity the way Moravec
describes it in reference to _Mind Children_: I can help in some small way to
launch the process, but I'll not be able to do what SI will be able to do, so
like a proud parent giving away the bride, I just wish it the best and cherish
the promise of its bountiful future through tears of joy. Like "so long, kids...
live forever and prosper infinitely." Anyway, I know >H AI doesn't want an old
fart like me around on its honeymoon. <gag me with a virtual spoon>
> What might happen is of course that the posthumans (or people of the
> new economy or whatever) want to make sure people staying behind do
> this due to their own values as a rational decision, rather than just
> due to mistaken information, habit or something else. Hence they will
> want to give an unbiased image of what they are missing out on (here I
> will just assume ethical posthumans; posthumans with an ideological
> axe to grind is another matter of course). Education becomes very
> relevant.
Posthumans should be so friendly (like AI). I'll more than likely be staying
behind not because of values, mistaken info, habit, etc., but because there's no
way for me to go without being a burden, and besides, my heroes won't be on
board anyway (you know, Socrates, Galileo, Moses, Buddha, Mohammed, Krishna, Lao Tzu,
Osho, Kabir, et al.). The more I think about it, the more I suspect Sasha had
the right idea. When faced with years and years of toil to earn enough capital
to freely join technological singularity, it seems more reasonable, rational,
and yes, responsible to simply opt out. That way, it's like I'm joining my
heroes, the grateful dead. Furthermore, we really don't have any guarantee that
the future will not repeat the horrors of the past: Five thousand wars in the
last three thousand years. And look what's going on in the Middle East today.
It's only depressing if you plan to be around for more than a few more years.
Stay hungry,
--J. R.
3M TA3