From: Robin Hanson (rhanson@gmu.edu)
Date: Tue Mar 14 2000 - 09:21:40 MST
Hal Finney wrote:
>John Clark, <jonkc@worldnet.att.net>, writes:
> > In the current issue of Wired magazine is an article I did not expect
> > to see: the author expresses great fear over nanotechnology, computers
> > smarter than humans, and of course genetic engineering. After expressing
> > sympathy for the ideas, if not the methods, of Ted Kaczynski, ... pretty
> > standard Luddite stuff, the bizarre thing is that the author is Bill Joy,
> > chief scientist and cofounder of Sun Microsystems. ...
>
>I have had a chance to read it now, and I was unhappy with it, for
>several reasons. For one thing, it is a one-sided polemic. ...
>However, even with this emphasis, the dangers aren't really spelled out
>very clearly. For robotics, we are given only a quote from Moravec about
>how they will inevitably out-compete humans. We have discussed here in
>the past the principle of comparative advantage, ...
>For nanotech it is the standard grey goo scenario, ...
>For genetic engineering it is any of the scary frankenfood scenarios ...
>With this rather sketchy outline of the dangers, Joy fleshes out his
>article with a rambling passage about the early days of the nuclear arms
>race, ... He looks for essentially the same solution ... give up on
>them and stop working on them. .... There is no discussion of the
>dangers of setting up such a universal monitoring system. ... deaths of
>poor people around the world if economic growth is thwarted ...
>In short, it's a powerful but one-sided argument, strong on attack but
>short on defense. ... the ideas seem so outlandish and impossible to
>implement that it seems unlikely to go anywhere. ...
As an academic, what stands out most to me is that it is a sermon, not an
[academic-style] argument. It is not going to convince the people I talk
with most, but it might well influence other sorts of people.
He writes: "But if we are downloaded into our technology, what are the
chances that we will thereafter be ourselves or even human? It seems to me
far more likely that a robotic existence would not be like a human one in
any sense that we understand, that the robots would in no sense be our
children, that on this path our humanity may well be lost."
http://www.wired.com/wired/archive/8.04/joy.html
I think this is his key intuition, and the one that drives most fears of
the future. It is like the proverbial grumpy grandpa, who sees that the
kids today are not like he was, and fears the "world is going to hell in a
handbasket." He knows that the "moral fiber" of his generation is what kept
them from the abyss. And if his no-good grandson exceeds some threshold of
tolerable difference, why, he's just going to disinherit him, ha, and
strike a blow for decency everywhere.
This sort of grumpy grandpa neglects how different his generation is
from the generations that came before him, and how little his ancestors
ten or ten thousand generations back might have approved of how his generation
lived. With a broader historical perspective, such a grumpy grandpa would
probably realize how wide the range of environments is that can make folks
happy and productive, and might then accept his grandkids' world a bit more.
Yes, on a social level Bill "Long Now" Joy seems just such a historically
narrow grumpy grandpa. If his descendants incrementally and voluntarily
choose to become more robotic, why, Bill is horrified, and fears that true
human decency is lost. That's not the way it was when he was a kid, by gum;
why, if God had intended men to fly, he'd have given us wings.
Btw, I find it curious that people are shocked to discover they might agree
with something Kaczynski wrote. Would they be shocked if Kaczynski had
written that the sun will rise tomorrow, or that computers will get cheaper?
Why must a person who does evil things be wrong about everything?
Robin Hanson rhanson@gmu.edu http://hanson.gmu.edu
Asst. Prof. Economics, George Mason University
MSN 1D3, Carow Hall, Fairfax VA 22030
703-993-2326 FAX: 703-993-2323