Re: >H RE: Present dangers to transhumanism

From: Cynthia (cyn386@flash.net)
Date: Thu Sep 09 1999 - 17:58:14 MDT


Doug Jones wrote:

> This is where Eliezer's doubled doubling breaks down- smart AI's might
> optimize their own code, but faster execution requires faster hardware,
> which is tied to a physical realm where Moore's law is difficult to
> shortcut. An AI singularity can't occur unless the time constant for
> hardware improvement and replication can also be shortened.
>
> (Massively parallel processor systems can grow in processing power only
> linearly. Even if more chip fabs are brought on line, it's the rate of
> creation of chip fabs that limits progress.)

Going parallel is the next step, but beyond that there are more steps, such as
going analog. Analog isn't flexible, but it sure is fast.
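The contrast Doug draws can be made concrete with a toy calculation (my
illustration, not from either post): self-optimizing code that doubles its
effective speed each generation grows exponentially, while bolting on one more
processor per generation grows only linearly, so the gap widens fast.

```python
# Toy sketch (illustrative assumptions): exponential self-improvement,
# Moore's-law style, vs. linearly adding parallel processors.

def exponential_power(start, factor, steps):
    """Effective power multiplies by `factor` each generation."""
    return start * factor ** steps

def linear_power(start, increment, steps):
    """Effective power gains a fixed `increment` each generation."""
    return start + increment * steps

for t in (5, 10, 20):
    print(t, exponential_power(1, 2, t), linear_power(1, 1, t))
```

After 20 generations the doubling path is over a million times the starting
power, while the linear path has merely 21 units; the hardware bottleneck Doug
describes is what keeps the real curve closer to the linear one.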



This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:05:05 MST