From: Dan Clemmensen (dgc@shirenet.com)
Date: Tue Oct 08 1996 - 17:36:07 MDT
Robin Hanson wrote:
>
> Robin Hanson wrote:
> >> The whole issue is about the rates of change. You have been claiming
> >> that this technology will produce much faster rates of change and
> >> economic growth than we have seen with other technologies,
> >
[ Dan Clemmensen wrote:]
> >1) Assumption: the computer component contributes non-trivially to the
> >   intelligence of the entity, once the entity comes into existence
> >2) Observation: computer hardware and software technology are advancing
> > at an empirically-observed exponential rate known as "Moore's Law",
> > which we both agreed is unlikely to change in the near future. This
> > rate is dramatically faster than the 30-year sustained rates of increase
> > of the other technologies you mention.
> >3) Assumption: a more-intelligent entity can develop intelligence-augmentation
> > faster than a less-intelligent entity.
> >4) Conclusion: This is a fast-feedback loop.
>
> Various measures of computer cost and speed have had a roughly
> constant growth rate over four decades (or more?). During this time
> the world economy has doubled (more than once?), the computer industry
> has doubled many times, and computers are far more available. Thus
> this growth rate does not seem especially sensitive to the size of the
> world economy or the computer industry, or to computer availability.
> Or perhaps it is sensitive, but such effects are just about canceled
> out by the computer design problem slowly getting harder.
>
> It sounds like you think this growth rate is sensitive to "researcher
> intelligence". Note, though, that the growth rate hasn't changed much
> even though computer companies, the industry being larger, now hire
> far more researchers than they once did; they could instead have
> hired the same number of researchers as at the beginning while
> making sure they were the IQ cream of the crop. So that degree of
> IQ difference doesn't seem to make much difference in the
> growth rate. And note that the average researcher today is smarter
> than the average one 40 years ago, because humanity knows so much
> more. And note that researchers today have access to lots more
> computers than they once did, and that hasn't made much difference.
>
> So you posit an effect of researcher IQ on computer growth rates that
> cannot be substituted for by hiring more researchers, or by having
> them know more, and that is not very strong for small IQ differences.
> So at what level of increased IQ do you expect this effect to kick in?
> And you posit an effect of more computer access on IQ that we haven't
> seen yet. So at what level of computerization do you expect this
> effect to kick in? And why should we believe in either of these
> not-yet-observed effects?
>
My argument starts with assumption (1) above. That's why it's the
first assumption. The effect will kick in when the very first
human/computer joint entity (or other entity with a computer as
a substantial contributor to its intelligence) comes into
existence. It will then get more intelligent as a result of increases
in the capabilities of its computer part. So far, no feedback is
involved. Then it will get rapidly more intelligent as it applies
this added intelligence to making better use of its computer part.
The major difference here is that the computer contributes directly
to the intelligence of the entity. I claim that this is different in
kind from the contributions to intelligence of the other technologies
you cite.
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 14:35:47 MST