From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Jan 17 2001 - 09:11:59 MST
Jim Fehlinger wrote:
>
> However, it was Yudkowsky's Web articles in which I first
> encountered an apparently sane, intelligent, and articulate
> person writing with panache and plausibility (and with a distinct
> disinclination to pull punches for the sake of sparing the
> reader's feelings, perhaps even a positive delight in
> future shock "pour épater la bourgeoisie") that all this
> might (nay will, by Vinge) happen within reach of my own lifetime
> (with a bit of luck -- on my part, that is).
Why, thank you! Not for me the self-conscious deprecation of Ed Regis;
no, I know damn well it's all real. Not that shocking the living
daylights out of some unwary websurfer isn't fun for me as well, of
course.
> First it was to be 2035 (IIRC). That would be a stretch for
> me (though I'm in reasonably good shape, apart from a touch of
> arthritis) -- I'd be 83 that year. Then 2025. A more
> comfortable target, for this baby boomer. Now 2008, you say --
> before I've even retired or paid off my mortgage!
Well, don't rely on not having to pay off your mortgage - the range still
goes up to 2020. (I have to say, though, that 2025, never mind 2035, now
seems so ridiculously far beyond the point of expected Singularity that
you can go ahead and rely on it.)
> Perhaps Eliezer would like to give us a clue as to which basis he's
> counting on for the 2008 runaway -- (1) a software breakthrough on
> somewhat-better-than-today hardware and networking, or (2) a major
> paradigm shift in the hardware. Or something else? My guess for Eliezer is (1),
> because of his writings on seed AIs, but my hope is (2) -- though even
> if strong indications of such a shift have appeared by 2008, I wouldn't
> expect to have reliable and cost-effective manufacturing processes or
> commercial products by then.
Why the shift? Well, mainly:
1) Instead of visualizing the Singularity as a far-off celebration on the
day that the whole human race gets together and networks our computing
power to build the seed AI that is the final culmination of all our
accumulated programming expertise and augmented intelligence, I figured
out how to build the darn thing myself on a Beowulf network.
The other reasons, in no particular order:
2) I figured out that human cognition actually uses all that parallelized
power mainly for performing, in one neural "step", operations that are
intrinsically serial - in other words, the brain needs a massive amount of
power because its "transistors" run at 200 Hz. You'd need a massive
amount of parallel power to run a spreadsheet, too, if it ran on a 200 Hz
CPU. So my estimate of the real computing power required to get
human-equivalent intelligence got stepped down by three or four orders of
magnitude. (A back-of-envelope sketch of this step-down follows after
this list.)
3) Nanotechnology moved waay faster than I expected. So, to a lesser
extent, did cognitive science, Singularity memetics, and even non-SingInst
commercial AI.
4) I stopped thinking in terms of "What is the precise degree of
ludicrousness involved in the thought of my ever getting a cent from
Social Security?" and shifted to "When do I need to do something if I want
to do it before the Singularity?"
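Here's the back-of-envelope sketch of the point-2 step-down, in Python.
The neuron count, synapses-per-neuron, and firing rate are the usual
order-of-magnitude textbook figures, and the redundancy factor is just
the "three or four orders of magnitude" assumption made explicit - none
of these numbers is measured:

    # Back-of-envelope estimate; every figure here is an assumption.
    NEURONS = 1e11      # ~10^11 neurons in a human brain (assumed)
    SYNAPSES = 1e4      # ~10^4 synapses per neuron (assumed)
    FIRE_HZ = 200       # neural "clock speed", roughly 200 Hz

    # Naive estimate: one operation per synapse per firing cycle.
    naive_ops_per_sec = NEURONS * SYNAPSES * FIRE_HZ      # ~2e17

    # If the massive parallelism mostly compensates for the slow
    # clock - each parallel "step" doing work a fast serial CPU
    # could do with far less hardware - the real requirement drops
    # by an assumed redundancy factor of three to four orders of
    # magnitude:
    REDUNDANCY = 1e4    # assumed, per the estimate above
    real_ops_per_sec = naive_ops_per_sec / REDUNDANCY     # ~2e13

    print("naive:        %.1e ops/sec" % naive_ops_per_sec)
    print("stepped down: %.1e ops/sec" % real_ops_per_sec)

Nothing in the sketch depends on the exact figures; the point is only
that the serial-depth view divides the naive parallel estimate rather
than adding to it.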
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence