From: Dan Fabulich (daniel.fabulich@yale.edu)
Date: Thu Apr 08 1999 - 15:04:54 MDT
At 06:43 PM 4/8/99 +0100, you wrote:
>Dan Fabulich wrote:
>> But Singularity somehow, someway, with a species that may
>> not even be our own? Totally inevitable.
>
>What if a 2n increase in intelligence requires an n^2 increase in
>computer power? What if it requires a 2^n increase? What if intelligence
>involves not-easily-computable elements? What if there's an upper limit
>to complexity? Eh?
>
>Such confidence in the face of such uncertainty!
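To make the quoted worry concrete, here is a minimal sketch (an
illustration added in editing, not part of the original exchange). It
assumes, purely for argument's sake, that raw computer power doubles
every period and that reaching intelligence level i costs either i^2 or
2^i units of power; the function and variable names are mine:

    # Illustrative assumptions: power doubles each period; the cost of
    # intelligence level i is either polynomial (i**2) or exponential (2**i).
    def max_intelligence(power, cost):
        # Largest integer intelligence level whose cost still fits in power.
        i = 0
        while cost(i + 1) <= power:
            i += 1
        return i

    for doublings in (10, 20, 40):
        power = 2 ** doublings  # computer power after this many doublings
        poly = max_intelligence(power, lambda i: i ** 2)
        expo = max_intelligence(power, lambda i: 2 ** i)
        print(f"{doublings:2d} doublings: i={poly} (i^2 cost), i={expo} (2^i cost)")

Under the i^2 cost, intelligence still compounds exponentially over time
(roughly 2^(doublings/2)); under the 2^i cost, exponentially growing
hardware buys only linear intelligence growth, which is the force of the
objection.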
My definition of Singularity is pretty weak. I'd consider any kind of
Borganism a Singularity. Thus, even if there *is* an upper limit to
intelligence, theoretically or practically, we should still expect some
kind of Singularity to result someday.
-Dan