From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sat Jul 01 2000 - 16:46:39 MDT
A Singularitarian wrote:
>
> Has anyone noted or commented on the article at:
>
> http://www.newscientist.com/features/features_224417.html ?
(Link leads to speculation about the Global Brain and accidental
distributed AI.)
I replied:
"From a Singularitarian perspective, it seems likely that Accidental
Intelligence will be ten to twenty years behind the sophistication of
Artificial Intelligence at any given point. Also, I deleted a written
and edited section on distributed processing from "The Plan to
Singularity" because I decided that distributed AI wasn't workable."
"Such speculations, in my opinion, reflect a fundamental misconception
about the difficulty of AI. Even if a system does have
neuron-equivalents, that doesn't imply a Global Brain. It implies, at
*most*, a Global Earthworm."
"There might be some interesting emergent behaviors in the Global
Earthworm, but that's getting a bit off-topic for the Singularitarian
list (unless there's a way to make a buck, or some worthwhile way to
accelerate the changes). I'll forward an edited version of this message
to the SL4 list and see what turns up there."
--
sentience@pobox.com    Eliezer S. Yudkowsky
http://intelligence.org/beyond.html