>The problem isn't "wiring them together" but having them "do
>something" once wired. Nobody has ever had a 70 million cell
>neural net before. How deep do you make it? How broad do you
>make it? How many inputs & outputs should each neuron have?
Okay, I see what you mean. Just hooking them together doesn't accomplish anything unless they can perform significant functions. (D'oh!) I had skipped ahead to the concept of "self-assembly," which the HP guys demonstrated with nanowires, and applied it to the problem of organizing de Garis' neural net. With the proper encouragement, the neural nodes could connect themselves much the way neurons in the brain grow connections on demand. From what I can gather of the de Garis team's work, the parallel processing evolves algorithmically... so with self-assembling wires, just let the machine decide how to wire itself up! Brains do it. Ecosystems do it. Even marauding picnic ants do it. Let's let a massively parallel processor do it.
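For flavor, here's a toy sketch in Python of what "let the machine decide how to wire itself up" could mean in practice: start from random wiring, mutate it, keep whatever scores better. Everything here is hypothetical -- the names, the fitness function, the scale -- and de Garis' actual CAM-Brain work evolved cellular-automata modules, which this doesn't pretend to reproduce.

    import random

    N_NEURONS = 64        # toy scale; the real proposal is tens of millions
    FAN_OUT = 4           # wires each neuron grows (stand-in for dendrites)

    def random_wiring():
        # every neuron sprouts FAN_OUT wires to randomly chosen targets
        return {n: random.sample(range(N_NEURONS), FAN_OUT)
                for n in range(N_NEURONS)}

    def mutate(wiring):
        # one neuron retracts one wire and regrows it somewhere else
        new = {n: list(targets) for n, targets in wiring.items()}
        n = random.randrange(N_NEURONS)
        new[n][random.randrange(FAN_OUT)] = random.randrange(N_NEURONS)
        return new

    def fitness(wiring):
        # placeholder score: reward wiring that reaches many distinct
        # neurons. A real system would run the net on a task and measure
        # how well it actually performs.
        return len({t for targets in wiring.values() for t in targets})

    best = random_wiring()
    for _ in range(1000):
        candidate = mutate(best)
        if fitness(candidate) >= fitness(best):
            best = candidate    # the machine "decides" to keep this wiring

Nobody dials in the topology by hand; it emerges from the mutate-and-keep loop, which is about as close as a few lines of code can get to the self-assembly idea.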
Or... Perhaps I don't know enough about parallel processing to know that it can't work.
>I'm not sure how flexible the de Garis architecture really is
>but if the depth can be up to 10 deep, and the net has
>1000 inputs and 100 outputs, and each neuron can interact with 500
>others, the possible number of configurations is huge. [I'm not an
>expert on neural nets, so if someone can explain this better
>please do so.]
The number of configurations -- even if limited to unweighted, purely digital values (i.e., AND, OR, NAND, NOR gates), with no dialed-in resistance (since the wires self-assemble), and with allowance for defective connections -- can exceed the number of hosts on the entire Internet by a preposterous margin. Better still, because the connecting wires (analogous to dendrites) can grow in response to directions from the processor nodes, you get analog computing -- with all the advantages of weighted signaling and the fuzzy logic that comes with it.
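To put a rough number on "huge," here's a back-of-the-envelope count in Python -- my arithmetic, not anything from the de Garis papers. Just counting which 500 partners each of 70 million neurons wires to, never mind gate types, weights, or defects:

    from math import comb, log10, exp

    NEURONS = 70_000_000    # the 70-million-cell net from the quoted post
    PARTNERS = 500          # each neuron interacts with 500 others

    # ways a single neuron can pick its 500 partners from all the rest
    per_neuron = comb(NEURONS - 1, PARTNERS)
    print(f"choices per neuron: ~10^{log10(per_neuron):.0f}")   # ~10^2788

    # loose lower bound on whole-net wirings: each neuron picks freely
    # (take log10 rather than building the absurd number itself)
    whole_net = NEURONS * log10(per_neuron)
    print(f"whole-net wirings: ~10^{whole_net:,.0f}")
    # the Internet circa 2000 was on the order of 10^8 hosts -- no contest

    # and once the wires carry analog weights, each neuron's output
    # becomes a graded value instead of a hard 0/1 -- the "fuzzy" part:
    def analog_neuron(inputs, weights, threshold=0.0):
        x = sum(i * w for i, w in zip(inputs, weights)) - threshold
        return 1.0 / (1.0 + exp(-x))   # sigmoid: weighted, not binary

    print(analog_neuron([1.0, 0.3], [0.8, -0.5]))   # ~0.66, not 0 or 1

Roughly 10^2788 wirings per neuron, and something like 10^195,000,000,000 for the whole net -- so "exceeds the Internet" is, if anything, comically understated.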
My (laughably naive) intuition tells me that neural nets (or massively parallel processors) and self-assembling wires want to converge with organic computing (remember those leech neurons that learned to do arithmetic?) into some out-of-control AI.
Probably just a (double-bowl) pipe dream. But the sloppiness of this solution appeals to my lazy Buddha nature.