Below is my letter to NPR, regarding the show tomorrow.
[snip - background info...]
Which brings me to my first point - adopting the Joy "go slow"
or "stop" perspective is literally writing death sentences
for millions of people. It is of the same moral class, in my
opinion, as that of the Catholic Church and many others who
ignored what the Nazis were doing to the Jews in WWII. We
look back at those times now and believe that was inexcusable.
Mr. Joy's position is incomplete unless he clearly justifies
holding back what might be considered "natural evolution"
for the sake of preserving the human species, while at the
same time condemning many members of that species to
unnecessary deaths.
This problem has raised its head in the recent debate on
gene therapy. The risk that the research may cause
accidental deaths must be balanced against the fate of those
individuals who are doomed to premature deaths should the
research not continue as quickly as possible. The medical principle
of "First, do no harm" must be balanced against the problem
of "By not doing, you do a greater harm". I would argue that
the most fundamental right is that of seeking "self-preservation".
To conduct research and test products necessary to achieve
that should be considered a natural extension of that right.
I have little moral obligation to the human "species". I do have
a moral obligation to myself and, indirectly through emotional
ties and commitments, to those with whom I am close.
The perspective that we should slow or stop GNR research
will lead people who believe that the research is necessary
to simply do the research in other countries, or eventually
on other planets. You see this today as many German scientists
who want to conduct genome research leave the inhospitable
climate in Germany for other European countries or the U.S.
You cannot stop this train; you do want to manage it
constructively.
Joy's fear of the loss of humanity is an issue of timing.
Humanity is ultimately doomed anyway, if not by an asteroid,
then by the oceans evaporating, or by the sun becoming a
red giant and incinerating the planet, or by a nearby passing
brown dwarf pulling the Earth out of orbit, or the probability
that, in the long run, all of the protons in the universe
will decay. He seems to be hoping that the "end" of "humanity"
will not come in his lifetime.
His arguments have been proposed before. In the mid-1970s,
the development of biotechnology raised concerns that the
escape of engineered microorganisms could doom humanity.
The solution to that was the Asilomar conference and the
development of laboratory practices that have, for the last
20-25 years, demonstrated that these technologies can be
handled safely. The history with Iraq and the Gulf War
shows that, yes, when people threaten to use these technologies
in aggressive, anti-humanitarian ways, it may be necessary
to exert extreme efforts to correct those situations. But
the people who die in those corrective efforts are few
compared to the numbers that will be saved by having
the technologies available. It is worth noting that the
anti-genetic-engineering people rarely mention the millions
of people a year who die from hunger and starvation, whose
lives will eventually be saved by more productive agriculture.
Mr. Joy may be unaware that the Senior Associates of the
Foresight Institute do have a working document, "A Policy
on Nanotechnology", based in part on the Asilomar principles.
The first principle of that document is that you shall not
release self-replicating nanomachinery into an uncontrolled
environment. Until we have a much better understanding of the
relative risks and can develop *very* reliable methods for failsafes
and self-destructs, any uncontrolled release would be a foolhardy
thing to do. Rockets don't work all the time, but we seem to
launch them with fairly high frequency to the benefit of mankind.
Only occasionally do we have to destroy one and clean up a mess.
The technologies being developed *will* be complex, and they may
sometimes fail. However, movements such as the Open Source
Software initiatives have shown how allowing greater numbers
of human minds to look at designs and find their flaws should
allow highly reliable systems to be engineered. Committees
such as the NIH Recombinant DNA Advisory Committee (RAC), a panel
of experts reviewing the safety aspects of gene-engineering
efforts, combined with approaches like Open Source, seem to provide
potential solutions for the problems Joy proposes.
Asilomar, the RAC, and Open Source offer important counterpoints
as to how we should proceed. They stand in contrast to the "behind
closed doors" development of nuclear weapons, used by Joy to
argue that we must not proceed with GNR.
It is worth noting, that I'm editing this document using a
computer program known as "vi", developed by Bill Joy almost
25 years ago. Its longevity seems to suggest that software,
once debugged, is not as fragile as Joy's Wired article
suggests. Of course we must take prudent precautions to
see that GNR does not develop in the wrong way. One nightmare
that I fear, and that is discussed a great deal by Extropians and
Transhumanists, is the possibility of an *amoral* artificially
intelligent "entity" developing from self-replicating,
self-evolving software (artificial life) and then proceeding to
invent nanotechnology to suit its purposes. Whether that possibility
has a higher or lower probability than a near-earth asteroid
hitting the planet in the next 20 years is an area where attention
should be focused. If that is, however, possible (after all, nature
has produced humans that are almost at that point), then we *will*
need GNR to defend ourselves.
I have spent the last 8 years thinking almost exclusively about the
problems posed by GNR. Rather than be afraid of them, I believe that
they offer an opportunity significantly greater than that promised by
many of the religions of the world ("Everlasting life in Heaven with God").
The technologies being developed seem to allow the possibility for people
to evolve into something greater than what nature has produced thus far,
or, for those who prefer the human path, an indefinite healthy lifespan,
until such time as a rare accident catches up with you or you choose to
end your existence. The exclusive Kurzweil/Moravec path (humans turn into
robots) and the exclusive Joy path (humans turn into pseudo-Luddites
and/or self-handicapped space explorers) are not an either-or situation.
There is enough matter and energy readily available for a Chinese-like
"one solar system, many species" arrangement, in which all can evolve
and coexist.
For humanity to ignore or retreat from these opportunities would,
in my opinion, represent the definitive breaking of the human spirit.
And that would truly doom us to a fate worse than those imagined
by Bill Joy.
If you would like more background information on issues involved
with lifespan extension, I have an essay on the net:
http://www.aeiveos.com/issues.html
Robert
[snip... - signature & disclaimer stuff]