Re: Ye Are Gods

From: Emlyn (emlyn@one.net.au)
Date: Sat Sep 23 2000 - 02:49:24 MDT


> Emlyn wrote:
>
> > Don't get me wrong; I don't oppose any of the technologies being
> > developed. I take issue only with the top level goals in some cases.
> > Particularly, this goal of godhood leads to an arrogance which I can't
> > condone. It leads to believing that you know better than the other six
> > billion people kicking around on this planet, and that can direct
> > actions which are not morally supportable; for instance, the attempt
> > to build a guardian - I haven't noticed any step in the plans for such
> > creations, which involves obtaining broad consensus before "flicking
> > the switch".
> >
>
> On average, as a point of fact, there is not a person on this list who
> doesn't "know better" than the vast majority of the world's population
> about a great many things. I am sorry if it breaks local taboos to
> point this out. Intelligence is not distributed evenly.

...and most of those people probably know at least one thing better than
every person on this list. Scott Adams says, in The Dilbert Principle, that
people are idiots. Even the smart people are only smart at certain times and
in limited ways, and are for the most part fools. Maybe that's a stupid
reference; I must be an idiot.

>
> Do you actually ask for broad consensus for your own work, for every
> design and implementation decision? No? Is that only because what you
> are doing doesn't affect many people or is it also because most people
> wouldn't have any idea what they were talking about if they tried to
> advise you on your work?

No, I don't ask for consensus under such conditions, mostly because it
doesn't affect many people, at least not in any way they don't have control
over. If I build some or all of a business website, the algorithm for
importing log files into a database doesn't really affect a whole lot of
people. Such choices might affect people through the time it takes to
implement them; if so, I'll go talk about it.

Eventually this work will affect some people, particularly at a critical
point: when it comes time to make it part of the production system. The main
people it affects will be the system owners, and the effects may be quite
important. So in that case, I certainly do seek consensus.

Below some level of importance (let's call it the "Pointlessness
Threshold"), seeking consensus imposes a greater externality, by forcing
people to pay attention to something inane, than not seeking it does. Each
of us necessarily has to guess where we stand, at any point in time, in
relation to the Pointlessness Threshold. That guess is itself an
externality. Maybe the much-heralded transparent society will allow people
to put the burden on those around them... "Why didn't you tell us you were
going to blow up the world, you bastard? Well, I've been webcasting my
consciousness stream since before I thought of the idea, using tiny
nanobots to read my neural pattern, and an old C64 I had lying around to
decode it into text. Don't blame me if you are too lazy to read."

>
> Should the human race only be allowed to advance in steps that were all
> approved by the broad consensus? Should a poll have been taken before
> we allowed that the sun is the central body of the solar system rather
> than earth? Oh, you say, we don't need to get a consensus for facts.
> But then why do you need a consensus to bring major advances into play
> that 98% of the world's people never will understand well and that the
> majority are singularly unqualified to pass judgement upon?
>

You're right; the existence of this ignorant 98% is obviously anathema to
the advance of humanity as a whole.

> I ask these things to open conversation rather than to say "You are
> wrong." I actually sympathize with some of your concern. But I don't
> see how waiting for consensus is a sign of proper diligence or will
> actually help humanity at all. Call it elitist if it makes you feel
> better, but I believe recognizing the paucity of intelligence is simple
> honesty.
>

People piss me off too. Still, we all live here (in the universe); it's good
when we can get along.

Obviously I don't subscribe to this idea that the alleged 98% bozo factor
should be led by the natural leaders in the top 2% (amongst whom I imagine
you would count yourself). Possibly it's because I am worried that I'm more
borderline, and might not make the cut! I wouldn't like that much, and I can
empathise with others who feel the same way.

> I also believe that failing to acknowledge one's intelligence and
> ability to help choose and produce the future can be a false modesty
> that keeps one from being fully and responsibly engaged. Failing to
> step up and do what you can taking full responsibility is required of a
> great number of us if human beings are to have a viable, much less
> joyously abundant, future.
>

I'm not sending us out to the fields to be farmers.

>
> > This is a time of unparalleled change, and will look like a walk in
> > the park next to the times to come. It is a time for humility in our
> > approach, and special concern for the other beings that inhabit the
> > planet; as it becomes easier for the few to ignore the wishes of the
> > many, it becomes no more tolerable to do so.
>
> It is precisely because of HUGE concern for the needs of all humanity
> that many of us became scientists and technologists and it is out of
> that concern that we dream large dreams and see to what extent they can
> become reality. We would be irresponsible, having been gifted or having
> acquired such ability if we did not use it.

I'm not sending us out to the fields to be farmers.

>
> The cutting edge of any species is the edge. It is not the consensus
> masses. Why condemn the edge for being the edge? It is there that
> advance will happen that lifts the whole.
>

Too cool. I'm all for that. Count me in! I'm not sending us out to the
fields to be farmers.

>
> >
> > Also, we are playing with fire - well, actually fire is a baby's toy
> > compared to the stuff we are messing with now. It's not a good time to
> > get complacent and arrogant - "we are as gods, ha ha ha!". It's time
> > to be more humble than ever, to be open-system, to take in information
> > from our environment. It's been discussed on the list just how
> > dangerous some of the coming technologies are (AI, nanotech, etc), and
> > if you go over the posts, you'll see that most of the danger is
> > attributed to use of that technology by humans infected with the God
> > meme. People who think that they know better than everyone else, who
> > feel justified in producing externalities (like grey goo).
> >
>
> Humility taken so far is for people who will deny their own strength AND
> their huge responsibility. It is not a strategy that helps anyone.
> Seeing the huge potentials for change is not something that makes me the
> least bit complacent. It scares the heebie-jeebies out of me quite
> often. But it is where the power and future of this species lie. Those
> of us who are the forerunners, the intellectual scouts, the builders of
> bridges between today and tomorrow, including those of us who
> cross-check that we are keeping our wits about us, certainly cannot
> afford to be complacent. But that doesn't mean we should stand aside or
> be frightened to look and to attempt to chart a path forward that makes
> the most sense and enables the best outcomes. After all, if we don't
> make the attempt, then can we expect to take care of our default? If
> not us, who?

Humility is not fear. It's probably the opposite.
>
> Yes, we need as many voices and viewpoints as can be fruitfully
> employed. Yes, we are talking about some of the most serious things
> anyone has ever contemplated and our euphoria should be tempered with
> quite a bit of sobriety.
>
> One thing that worries me is that we are quite good at coming up with
> technology. We are not nearly so good at creating a unifying vision (or
> sets of visions) that will more likely shape the use and unfolding of
> the technology for the maximum good. If we don't create a vision or set
> of positive visions to guide us then the technology will more than
> likely greatly magnify all the good and bad tendencies in the world
> today. I doubt that that is survivable.
>

On first reading, that sounded suspiciously like consensus seeking. But I
think what you mean is that someone needs to create a future vision for
people to rally behind, so that we have some framework for moving into the
future at full potential, whilst identifying potential problems and avoiding
traps. I think that's probably the motivation for Transhumanism, in a
nutshell.

I support this. I support transhumanism. I'd even call myself a
transhumanist. "Emlyn, you're a transhumanist. Nyah nyah!"

What I'm talking about, in all these ravings and ramblings about humility,
is the essence of transhumanism. As a vision, it is optimistic,
forward-thinking, and possibly individualistic. Also, it's about people, and
humanity, even as it seeks to leave humanity behind.

I'm not talking about "banning" technologies; who's going to decide what to
ban? I am interested in respecting the rest of humanity when we make
changes.

What does that mean in concrete terms, applied to technology? It means that
while scientific & technological progress is good, we must be careful about
applications of technology.

For instance, nanotech is a good idea, and ought to be developed. Being able
to heal ourselves, repair ourselves, and modify ourselves through nanotech
is also good (excellent!). Being able to modify other people, well, that's
not so good. Releasing self-replicators into the general environment without
safeguards: not so good. Modifying the environment that others inhabit: not
good, without some mechanism for consensus. That doesn't mean we shouldn't
develop the technology. It means that we have to develop and promote a
morality to go with it, about respecting other people.

Also, AI is good; intelligence augmentation is fantastic. Creating a
self-modifying superintelligence is good. Turning it on, and plugging it
into the world's computer systems, without asking everyone else if that'd be
OK by them: that's not good. That's very naughty. That could be called
sociopathic.

Working on GM food is good. Great! Designing crops to feed more people:
excellent. Designing crops with terminator genes: dodgy, but still OK,
maybe. Releasing self-replicators into the general environment without
safeguards: not so good.

I could be very wrong about the technologies I've outlined above. That's the
point. I'm not about to appoint myself the arbiter of human morality for the
21st century (although apparently I do have an opinion). I'd appreciate it
if others afforded the same respect to me.

As an aside, I think it's funny that transhumanism, which purports to be
about moving away from humanity, is actually more about what it is,
fundamentally, to be human than any other belief system/philosophy/vision
that I can think of. It's about us merging more fully with our tools,
believing more strongly in our ability to reshape the universe, believing in
ourselves rather than some unknowable greater force(s). What is more
essentially human than the intimate relationship we have with our tools, and
our desire to reshape our environment using them? To lose that focus, and
that ability, makes us less human; so to strengthen it makes us more so. The
posthumans of our vision, supposedly having left humanity behind, will be,
paradoxically, maximally human. Possibly not organic, but 100% natural,
certainly.

Emlyn


