From: Robin Hanson (hanson@hss.caltech.edu)
Date: Fri Oct 04 1996 - 13:27:47 MDT
Robin Hanson wrote:
>> The whole issue is about the rates of change. You have been claiming
>> that this technology will produce much faster rates of change and
>> economic growth than we have seen with other technologies,
>
>1) Assumption: the computer component contributes non-trivially to the
>   intelligence of the entity, once the entity comes into existence
>2) Observation: computer hardware and software technology are advancing
> at an empirically-observed exponential rate known as "Moore's Law",
> which we both agreed is unlikely to change in the near future. This
> rate is dramatically faster than the 30-year sustained rates of increase
> of the other technologies you mention.
>3) Assumption: a more-intelligent entity can develop intelligence-augmentation
> faster than a less-intelligent entity.
>4) Conclusion: This is a fast-feedback loop.
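To make the posited loop concrete, here is a toy numeric sketch (the
growth constants and the linear dependence of the improvement rate on
current intelligence are illustrative assumptions, not measured
quantities): without the feedback of premise 3, capability compounds
at a fixed rate, which is ordinary Moore's-Law-style exponential
growth; with it, the rate itself climbs, and growth is
super-exponential.

    # Toy model; the constants and functional form are assumptions.
    def simulate(years, feedback):
        i = 1.0        # entity "intelligence", arbitrary units
        rate = 0.5     # baseline fractional improvement per year
        path = []
        for _ in range(years):
            # With feedback, the improvement rate scales with current
            # intelligence (premise 3); without it, the rate is fixed.
            i *= 1.0 + (rate * i if feedback else rate)
            path.append(round(i, 2))
        return path

    print(simulate(10, feedback=False))  # steady exponential growth
    print(simulate(10, feedback=True))   # explosive, super-exponential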
Various measures of computer cost and speed have had a roughly
constant growth rate over four decades (or more?). During this time
the world economy has doubled (more than once?), the computer industry
has doubled many times, and computers are much more widely available.
Thus this growth rate does not seem especially sensitive to the size
of the world economy or the computer industry, or to computer
availability. Or perhaps it is sensitive, but such effects are just
about canceled out by the computer design problem slowly getting harder.
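Rough arithmetic behind the comparison (the doubling times below are
assumed round figures, for illustration only):

    # Back-of-envelope: assumed round doubling times, not measurements.
    computer_doubling = 2.0    # years per doubling, Moore's-Law-ish
    economy_doubling = 20.0    # years per world-economy doubling
    span = 40.0                # four decades

    print(2 ** (span / computer_doubling))  # ~1e6-fold for computers
    print(2 ** (span / economy_doubling))   # ~4-fold for the economy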
It sounds like you think this growth rate is sensitive to "researcher
intelligence". Note, though, that the growth rate hasn't changed much
even though computer companies now hire many more researchers than
they once did, because the industry is larger. They could instead have
hired the same number of researchers as at the beginning while making
sure those were the IQ cream of the crop. So that degree of IQ
difference doesn't seem to make much difference in the growth rate.
And note that the average researcher today is smarter than the average
one 40 years ago, because humanity knows so much more. And note that
researchers today have access to far more computers than they once
did, and that hasn't made much difference either.
So you posit an effect of researcher IQ on computer growth rates that
cannot be substituted for by hiring more researchers, or by having
them know more, and that is not very strong for small IQ differences.
So at what level of increased IQ do you expect this effect to kick in?
And you posit an effect of more computer access on IQ that we haven't
seen yet. So at what level of computerization do you expect this
effect to kick in? And why should we believe in either of these
not-yet-observed effects?
Robin D. Hanson hanson@hss.caltech.edu http://hss.caltech.edu/~hanson/