How can we deal with risks?

From: Holger Wagner (Holger.Wagner@lrz.uni-muenchen.de)
Date: Wed Oct 29 1997 - 17:39:03 MST


I don't want to sound pessimistic in my first posting to this
list (just today I read the article about dynamic optimism, which I
found pretty good), but there's something I have been thinking about
ever since I first found out about Extropy, and so far I haven't found
an answer.

Throughout history, ingenious ideas and inventions have very often
turned out to be rather "dangerous". One example is nuclear weapons; a
better one is CFCs. From what I know, when CFCs were discovered,
everybody thought they were completely harmless (non-toxic, etc.). It
took some years before the actual problems they can cause (destroying
the ozone layer) were discovered. I don't know whether this is as bad
as the media keep telling us, but there are a lot of other examples
(such as cars that simply pollute the air).

Now, I'm not at all a conservative person, but I do see a problem here
that should be solved somehow - actually, a couple of different
problems. If solutions to them are already available, please tell me
where to find them.

1) Today, humans are by no means perfect. They have a certain idea of
what they are doing and what the consequences are, but quite often
something "unpredictable" happens. If you apply changes to the
ecological system, can you really predict what consequences they will
have in the long run - and would you take responsibility for them?

Possible solution: I assume that most scientists are very intelligent,
so they should understand ideas like pancritical rationalism and be
able to apply them to their work. By doing that, they at least improve
the chance of not doing anything that has extremely bad results in the
long run. Usually, it's the innovator who decides whether something
should be invented or not, right?

2) Today, humans are by no means perfect. While I trust scientists to
have at least a vague idea of what they're doing, I do not trust people
in general - for example, people who work for (so-called) governments,
or people who are interested only in profit. Actually, you just have to
walk down the street and talk to people, and you'll discover plenty of
people who simply don't understand a lot of things (for what they want
to do, they don't actually need to). I believe this poses a great risk
of misuse of any technological advancement.

Solution: Educate people accordingly. (Easy to say - but I won't
believe it's possible until I see worldwide results.)

If you want to overcome the fear most people have of technology, you
need to solve these problems and make people understand the solutions.
This is obviously quite an old problem, but I think the risks will keep
increasing until humans themselves are substantially improved (anybody
who has access to advanced, potentially dangerous technology needs to
be capable of understanding what he's dealing with - something that is
definitely not trivial).

The major problem with this is that I can improve myself, but I can't
improve the rest of the world - and a single insane person can do great
harm if he has access to the "right" weapons. Imagine a person like
Adolf Hitler with access to future genetics and nanotechnology. I
assume it will be possible to create both lethal viruses that spread
quickly and nanotech robots that wipe out life (perhaps even filtered
by certain genes?). I recently saw a documentary about Hitler's
personality, and he actually was a poor fool - lonely and psychotic. If
he hadn't had the power he had, history would have been quite
different. But what if the day comes when we have to face an insane
fool who has the technology to wipe out all life on this planet?
Someone who just doesn't understand, or just doesn't care about, the
responsibility?

I've obviously described the "worst scenario one could think of" - but
I think it's quite possible if there is no major change in how most
people think. If all people were Extropians, there would probably be no
problem, but I'm afraid that will take somewhat longer than the
development of nanotechnology (and many other things that can be
grrrrreat - if used in a responsible manner).

Tell me what you think about this - I'm definitely not questioning
technology, but humans.

later,
Holger

-- 
o---------------------------------------------------------------o
| "That's the funny thing about memories, isn't it? We are not  |
| what we remember of ourselves, we are what people say we are. |
| They project upon us their convictions  -  we are nothing but |
| blank screens." ______________________________________________o
o________________/        Trevor Goodchild in "Aeon Flux"       |
                 \__ mailto:Holger.Wagner@lrz.uni-muenchen.de __|
