Re: Singularity - Clarifying Timing Claims

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Sep 08 1998 - 12:43:45 MDT


Hal Finney wrote:
>
> Robin Hanson, <hanson@econ.berkeley.edu>, writes:

> > Before this discussion can proceed further, I think we need to get clear
> > on what exactly Vinge is claiming in these two passages. I'd be most
> > interested in how others translate these, but my reading is:
> >
> > "Progress" rates increase with the speed of the processors involved.
>
> > Now it's not clear what "progress" metrics are valid here, but if
> > economists' usual measures are valid, such as growth rate in world
> > product, there are two immediate problems with this theory:
> >
> > 1) Progress rates increased greatly over the last hundred thousand
> > years until a century ago without any change in the cycle speed of
> > the processors involved.
>
> No doubt there are many factors influencing the rate of progress.
> It is still possible that adding powerful computers to the mix will make
> a difference.

I would amplify that: All *else* being equal, progress is "obviously"
proportional to the speed of the processor. (Assuming either that the main
bottleneck is problem-solving, or that manipulatory technologies can operate
on the same timescale.) In other words, speed is simply a matter of scale.
The only thing that changes is that the same subjective time looks shorter,
even instantaneous, to a human observer.

I ought to add that to canonical.zone-barrier as Synchronization: if the
Singularity takes a thousand subjective years and everyone is synchronized, we
might still have time enough for fun. (Very slightly plausible; it requires a
global cooperative effort.)
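
To make the scale point concrete, here's a toy Python sketch; the 1000x
speedup is an illustrative assumption, not a prediction:

    def wall_clock_years(subjective_years, speedup):
        # Objective duration of subjective_years of thought at a given speedup.
        return subjective_years / speedup

    # A thousand-subjective-year Singularity at a 1000x speedup:
    print(wall_clock_years(1000, 1000))   # -> 1.0 wall-clock year
    # The same process at human speed takes the full millennium:
    print(wall_clock_years(1000, 1))      # -> 1000.0 wall-clock years

Same subjective millennium either way; only the wall-clock duration scales.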

> > 2) Computer processor speeds have increased greatly over the last
> > century without much increase in rates of progress.
>
> It could be that the amount of computer power available is still too
> small to make a significant contribution.
>
> One metric sometimes used is total brainpower vs total computer power.
> If we assume that the latter continues to grow as the product of Moore's
> law and economic growth rates then the total human+computer power can
> be expected to be dominated by computers in a few decades. If computers
> can be given problem-solving heuristics comparable in power to those used
> by humans, and if their total computational power becomes thousands,
> then millions, then billions of times greater than that of humanity,
> then it is plausible that problem-solving abilities will increase by a
> similar factor.

That's too abstract a way of putting it; the real events don't depend on the
mathematical proportions. Say instead: "Computer processors aren't
intelligent, so their increase in speed doesn't make a difference." When they
become intelligent, the increase in speed will make a difference - although,
as stated in my trajectory analysis ("Human AI to superhuman"), I think that a
sharp jump in speed will result from internal optimization.

The relative amounts of raw power might still be utterly trivial: a being
with the optimized and linear power of ten brains might be sufficient for fast
infrastructure and a Singularity. Then the human processors become trivial,
but by that point the part of the trajectory we're interested in is already
over.
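
For what it's worth, Hal's crossover arithmetic sketches out in a few lines
of Python; every figure below is an assumed round number, not a measurement:

    # ~10^10 humans at ~10^16 ops/sec each, vs. an installed computer base
    # starting near 10^21 ops/sec total, growing as the product of Moore's
    # law (doubling every 18 months) and ~3%/year growth in machine count.
    HUMAN_BRAINPOWER = 1e10 * 1e16    # total human ops/sec (assumed)
    computer_power = 1e21             # total installed ops/sec (assumed)
    moore = 2 ** (12 / 18)            # per-year gain, 18-month doubling time
    economy = 1.03                    # per-year growth in number of machines

    years = 0
    while computer_power < HUMAN_BRAINPOWER:
        computer_power *= moore * economy
        years += 1
    print(years)                      # ~24 years on these assumed figures

Change the assumptions and the date moves, but not by much; exponentials are
forgiving of bad guesses about the starting point.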

> It may be that economic growth rates won't be the most appropriate
> metric for progress. The large installed base of existing technologies
> and natural human conservatism may put us into a new kind of economy.
> New ideas will be coming out of the labs far faster than they can
> be incorporated into society as a whole. We might see a form of
> "social shear" where some people are pushing forward as fast as they
> can while others are hanging back, and still others are trying to hold
> society together. (Unfortunately shear strain often resolves itself
> catastrophically.)
>
> Hal

"Shear" is what we're seeing right now. In the presence of unstable systems
and positive feedback, things generally go all the way to one side or another.
 Forces aren't unevenly distributed. To be less abstract,
transhuman-controlled nanotechnology (to give an example of fast
infrastructure) will take over all of the world, or none of it, but *not* just
the rich parts.
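
The all-or-nothing claim is just what positive feedback does to an unstable
equilibrium; here's a toy Python sketch, with the dynamics and constants as
illustrative assumptions only:

    def settle(x, gain=1.5, steps=40):
        # x in [-1, 1] is which "side" prevails; x = 0 is the sheared,
        # unstable equilibrium.
        for _ in range(steps):
            x *= gain                      # positive feedback amplifies any lead
            x = max(-1.0, min(1.0, x))     # saturate at the extremes
        return x

    print(settle(+0.001))   # -> 1.0   (a tiny initial edge takes everything)
    print(settle(-0.001))   # -> -1.0  (the opposite edge takes everything)

Any nonzero asymmetry, however small, gets amplified to saturation on one
side or the other; the sheared state at zero is the one place you never stay.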

See "Evaporation of the Human Ontology: The simplicity barrier" in
http://pobox.com/~sentience/sing_analysis.html#simplicity
for a description of how processes can change rapidly and without noticeable
transitional phases.

-- 
        sentience@pobox.com         Eliezer S. Yudkowsky
         http://pobox.com/~sentience/AI_design.temp.html
          http://pobox.com/~sentience/sing_analysis.html
Disclaimer:  Unless otherwise specified, I'm not telling you
everything I think I know.

