Spike Jones wrote:
>
> > "Eliezer S. Yudkowsky" wrote: ...Minsky and
> > Papert killing off the entire field of neural networks...
>
> Eliezer, I'm not up to speed on the state of the art in
> neural nets, but I did notice there was not much said
> about it in recent years after it seemed so promising
> 12 years ago. Could you give us a one-paragraph
> summary on your comment?
No, this was long before then. I think there was a paper in 1967 and a
book in 1969, if I recall correctly. Essentially, Minsky and Papert
proved that a two-layer Perceptron (which was what a neural network was
called back then) couldn't compute the XOR function, which killed off the
entire, then-nascent field of neural networks. Eventually someone noticed
that a three-layer neural network (with a total of five neurons) could
easily compute the XOR function and the entire field came back to life.
It never lived up to its initial hype but is still thriving today,
especially those efforts that strive for greater biological realism in the
networks.
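[For concreteness, here's a minimal sketch of such a three-layer network computing XOR with hand-picked weights; the Python code and the particular weights are illustrative, not from the original post:]

```python
def step(x):
    """Threshold activation: the neuron fires (1) when its input exceeds 0."""
    return 1 if x > 0 else 0

def xor_net(x1, x2):
    # Hidden layer: two neurons with hand-picked weights and thresholds.
    h1 = step(x1 + x2 - 0.5)   # fires if x1 OR x2
    h2 = step(x1 + x2 - 1.5)   # fires if x1 AND x2
    # Output neuron: OR-but-not-AND is exactly XOR.
    return step(h1 - h2 - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))
```

[Counting the two input units, two hidden units, and one output unit gives the five neurons in three layers. The point Minsky and Papert's result misses is that the hidden layer re-represents the inputs so the output neuron's problem becomes linearly separable.]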
I am oversimplifying slightly - what really brought the field back was
backpropagation, which is what enabled the training of multilayer networks
- but the fact still remains that Minsky and Papert's eulogy was
incredibly premature and did a lot of damage. Nothing remotely on the
scale of Margaret Mead, though.
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2b30 : Mon May 28 2001 - 10:00:04 MDT