From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Sep 18 1998 - 17:21:13 MDT
Robin Hanson wrote:
>
> If you think that post is a persuasive technical analysis on timing claims,
> rather than suggestive prose, I suspect you don't know what such analysis is,
> and so my requests for it are futile. Does *anyone* reading this other than
> Eliezer think that Eliezer's first post constitutes such a persuasive
> technical analysis?
First of all, my first post is a selection from a summary of a technically
detailed speculation. The summary of the speculation is on the "Singularity
Analysis" web page, and the speculation itself is in "Coding a Transhuman AI",
which, if posted here, would bring the mailing list to its knees.
As for technical analysis, if you'd like the kind of elaborate equations
beloved of economic forecasters, you're simply out of luck. But take comfort
in the fact that such equations have never successfully predicted the course
of the human race and never will, so it is utterly absurd to extend them to
superintelligence.
I suppose I could fake up an elaborate power/intelligence/speed graph, using
EURISKO's runs, past trends in computing power, the gradual drift in
intelligence, and records of economic revolutions. But it'd be a statistical
lie (and as we know, those are even worse than damned lies). I don't intend
to do it, because it wouldn't be honest to my readers. I have enough
confidence in my rough models. I shall not disguise them with impressive
mathematical uniforms they are not entitled to wear.
--
sentience@pobox.com          Eliezer S. Yudkowsky
http://pobox.com/~sentience/AI_design.temp.html
http://pobox.com/~sentience/sing_analysis.html
Disclaimer: Unless otherwise specified, I'm not telling you everything I think I know.