>>I remember ...in the late '70s, predicting enough computing power
>>for real-time brain simulations by 2025 or something
Hara Ra wrote-
>I did the same calculation, got the same result. I wonder how many
>of us did that in the late 70s.....
Hmm, lemme dust off the back of this envelope...
~1E10 neurons, ~1E13 synapses
say a synapse needs 10 bytes of memory and 1E2 calculations per second.
That's 1E14 bytes and 1E15 ops/sec (not necessarily floating point).
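Spelled out as a quick Python sketch of the same envelope math (the per-synapse figures are the guesses above, not measurements):

```python
# Envelope math: 10 bytes and 1E2 ops/sec per synapse are the
# assumed figures from the text, not data.
SYNAPSES = 1e13
BYTES_PER_SYNAPSE = 10
OPS_PER_SYNAPSE_PER_SEC = 1e2

memory_bytes = SYNAPSES * BYTES_PER_SYNAPSE       # 1e14 bytes
ops_per_sec = SYNAPSES * OPS_PER_SYNAPSE_PER_SEC  # 1e15 ops/sec
print(f"{memory_bytes:.0e} bytes, {ops_per_sec:.0e} ops/sec")
# prints: 1e+14 bytes, 1e+15 ops/sec
```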
To the nearest power of 10, workstation-class computers have
~1E8 bytes and ~1E9 ops/sec.
So both the memory and the speed would have a factor of a million to go.
If speed only increases by a factor of 10 each decade (2 MHz in 1977 vs.
200 MHz today?) that's 60 years!
But if memory goes up by a factor of 1000 each decade, it's there in
about 20 years... 2017.
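The decade arithmetic, sketched out (the millionfold gaps and the per-decade growth factors are the guesses above):

```python
import math

# How long to close a gap, given an assumed growth factor per decade.
def years_to_close(gap, factor_per_decade):
    return 10 * math.log(gap, factor_per_decade)

speed_gap = 1e15 / 1e9   # ops/sec needed vs. a workstation's ~1e9
memory_gap = 1e14 / 1e8  # bytes needed vs. a workstation's ~1e8

print(years_to_close(speed_gap, 10))     # ~60 years at 10x per decade
print(years_to_close(memory_gap, 1000))  # ~20 years at 1000x per decade
```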
If you use multiple computers (say, 1000 PCs, 30 years from now) to
increase the speed, there's a networking issue. Worst case, each
neuron has tentacles going everywhere and each computer has to receive
1E10 x 1E2 Hz, or 1000 GHz. Best case, each neuron transmits locally and to
one other computer (out of 1000), and each would have to receive 1E9 Hz,
just 1 GHz. But 10 GHz (call it ten bits per event) is only 100 to 1000
times (depending whether you count raw bandwidth or throughput) the
speed of current "Fast Ethernet", and if you assume network switches
rather than a bus-like net, that's only a factor of 10 in speed per
decade for the network.
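The two network cases, as a sketch (assuming the 1E2 Hz firing rate and the 1000-machine split above):

```python
# Network-load sketch from the text's assumptions: 1E10 neurons
# firing at ~1E2 Hz, split across 1000 machines.
NEURONS = 1e10
FIRE_RATE_HZ = 1e2
N_MACHINES = 1000

# Worst case: every machine must hear every event in the whole brain.
worst_case_events = NEURONS * FIRE_RATE_HZ         # 1e12 events/sec
# Best case: traffic stays local except to one neighbor machine, so
# each machine hears only ~1/1000 of the total.
best_case_events = worst_case_events / N_MACHINES  # 1e9 events/sec

print(worst_case_events / 1e9, "GHz worst,", best_case_events / 1e9, "GHz best")
# prints: 1000.0 GHz worst, 1.0 GHz best
```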
(Note: according to the memory calculation, you would only need 1/1000
of a computer's worth of real-estate in 30 years--a fraction of a chip!?)
But using components specialized for being brains might slow you
down because it's harder to get chips designed and built than to
program PCs. On the other hand, such brain-chips could already have
become an industry by 2027, used inside insect-robots or spy cameras
starting a decade earlier...
--Steve
-- <sw@tiac.net>Steve Witham Don't dream it, su to it.