From: Robin Hanson (hanson@econ.berkeley.edu)
Date: Mon Sep 14 1998 - 11:24:54 MDT

Damien Sullivan writes:
>The Great Market of western civilization is a massive ocean of human minds,
>using language, writing, and prices to exchange thoughts, experience, and
>desires. In it people are capable of forming structures and superorganisms
>large and small to handle a huge diversity of tasks for varying lengths of
>time. ... Much of what we anticipate has already happened, at fast rates,
>and with the creation of sharp dichotomies. I already hear that no one person
>can fully understand a Boeing 747, or MS Excel. We can already produce
>superintelligences capable of producing things our minds aren't big enough to
>grasp. The consciousness isn't superhuman, but a human CEO makes decisions
>based on superhuman levels of prior processing, and with superhuman (in
>complexity, not just gross scale) consequences.
>
>So, what would change if our dreams came true? Everything and not very much.
>Direct neural connections and liquid intelligence at the subhuman level,
>would be revolutionary for individuals, redefining (or throwing away) what it
>means to be human. But above the human level not much would change.
>Efficiency could increase a great deal -- lower training costs, being able to
>get more precisely the thinker you need, perhaps greater reliability and
>honesty, since less individual self-interest. But the type of process would
>be exactly the same as the West has today.

I think this is the right way to think about things, and that it answers Hal's
question: "How can I judge an intelligence which is greater than my own?"
The larger market framework judges new creatures by the supply of labor they
offer, and the demand for products they embody. "Smarter" creatures are those
that can command high rents/salaries for their labor.

In this framework, Vinge's singularity seems to be the claim that economic
growth rates will become very high, because the marginal productivity of
computer labor will be rising rapidly, because investments in improving that
productivity will pay very large dividends. Even when most of the economy is
devoted to developing better computer hardware and software, the marginal
investment will quickly give back twice the "capital" invested. As usual,
"capital" amounts are calibrated so that twice as much capital can produce
twice as much of the stuff buyers demand for its own sake.

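As a purely hypothetical illustration of how such returns would compound (the 2x payback and the 50% reinvestment share below are assumed parameters, not figures from the argument):

```python
# Sketch of compounding growth when each marginal unit of reinvested
# "capital" pays back a fixed multiple each period. All parameter
# values here are illustrative assumptions.

def growth_path(capital, reinvest_share, payback, periods):
    """Track total capital when `reinvest_share` of it is reinvested
    each period and each reinvested unit returns `payback` units."""
    path = [capital]
    for _ in range(periods):
        invested = capital * reinvest_share
        # The consumed share is used up; the invested share multiplies.
        capital = capital * (1 - reinvest_share) + invested * payback
        path.append(capital)
    return path

# With half of output reinvested at a 2x payback, capital grows by a
# factor of 0.5 + 0.5*2 = 1.5 per period -- geometric growth.
path = growth_path(capital=1.0, reinvest_share=0.5, payback=2.0, periods=10)
```

Even a modest reinvestment share, at a 2x payback, produces the kind of sustained doubling-scale growth rates the singularity claim requires; the question the next paragraph raises is why payback on computer-productivity investments should reach and stay at such levels.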
The sort of analysis in favor of singularity that I would find compelling would
place this singularity claim in the context of what we know about the nature
and history of the productivity of investments in computer hardware and
software, or in the "software" of human mental abilities. Such an analysis
would, in this context, identify specifically what is expected to change to
make growth rates so much larger than recent experience, and show how such
changes are likely to induce much larger growth rates.

Robin Hanson
hanson@econ.berkeley.edu http://hanson.berkeley.edu/
RWJF Health Policy Scholar, Sch. of Public Health 510-643-1884
140 Warren Hall, UC Berkeley, CA 94720-7360 FAX: 510-643-8614