From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Aug 13 2006 - 12:44:54 MDT
Russell Wallace wrote:
>
> Well, I've previously argued against the feasibility of hard-takeoff
> Singularity on technical grounds;
If I recall your argument correctly, you
1) Made the very strong assumption, in your premises, that the AI had no
sensory access to the outside world, but generalized your conclusion to
all possible beliefs about explosive recursive self-improvement;
2) Did not demonstrate the ability to calculate exactly how much sensory
bandwidth would be needed (which of course you can't do), which makes
your argument "semi-technical" at best, according to the classification
I gave in "A Technical Explanation of Technical Explanation";
3) Didn't actually give any argument against it beyond saying: I don't
know how much bandwidth is actually required, but doing it with so
little feels really absurd and ridiculous to me.
If I am incorrect, please post the URLs of your messages to SL4.
--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:57 MDT