Re: Volume of Human-Equivalent Intelligence

From: Nick Bostrom (bostrom@ndirect.co.uk)
Date: Tue Jun 22 1999 - 08:33:21 MDT


Some comments on Raymond G. Van De Walker's epistle:

> Let me show you the numbers: The human brain has 10 billion
> neurons.

Most recent estimates put it at 100 billion.

> Now, I thought about ways to reduce this by editing the system,
> but they won't work. Like most real computing systems, the majority of
> the logic (>95%) by weight or volume is I/O. (The cerebrum, cerebellum,
> gyrii, and most of the encephalon) Neural networks are great for I/O:
> they're robust and compact compared to the digital systems they replace.
> You would not want to use anything else to construct the phenomenal
> systems of a robot.

Hmm. One way of estimating the brain's processing power is like this:

The human brain contains about 10^11 neurons. Each neuron has about
5*10^3 synapses, and signals are transmitted along these synapses at
an average frequency of about 10^2 Hz. Each signal contains, say, 5
bits. Multiplying these together gives on the order of 10^17 ops.
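
A back-of-the-envelope version of that multiplication, as a minimal
Python sketch (the figures are just the rough ones above):

    # Rough multiplicative estimate of the brain's processing rate.
    # All numbers are the order-of-magnitude assumptions stated above.
    neurons = 1e11             # ~10^11 neurons
    synapses_per_neuron = 5e3  # ~5*10^3 synapses per neuron
    signal_rate_hz = 1e2       # ~10^2 signals per synapse per second
    bits_per_signal = 5        # say, 5 bits per signal

    ops = neurons * synapses_per_neuron * signal_rate_hz * bits_per_signal
    print(f"{ops:.1e}")        # 2.5e+17, i.e. on the order of 10^17 ops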

Another method, used by Moravec, is to look at one part of the
nervous system whose function we can replicate on computers today
(the retina). Then we multiply the resources needed for this
computation by the factor by which the total brain is larger than the
retina. This gives us the figure of 10^14 ops for the brain, three
orders of magnitude less than the first estimate.
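
Written out the same way (a sketch only; the two input numbers are my
own rough placeholders for the kind of figures Moravec uses, not
values taken from his text):

    # Moravec-style scaling estimate: take the compute needed to
    # replicate the retina's function, then scale by how much larger
    # the whole brain is than the retina. Both inputs are assumptions.
    retina_equivalent_ops = 1e9   # assumed ops to match retinal processing
    brain_to_retina_ratio = 1e5   # assumed brain/retina size ratio

    print(f"{retina_equivalent_ops * brain_to_retina_ratio:.0e}")  # 1e+14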

The second estimate presupposes that we can make some optimizations.
Maybe intelligent design and new computational elements enable us to
do several orders of magnitude better than mother nature with the
same number of ops. Why couldn't the same be possible with regard to
memory requirements? There is no evidence that the brain's memory
system is not highly redundant, so that with highly reliable
artificial (or simulated) neurons one could get away with a lot less
memory. We simply don't know.

Another problem with the multiplicative approach - multiplying
neurons, synapses per neuron, and resources per synapse - is that it
might turn out that a lot of the brain's computing power lies in
higher-order interactions in the dendritic trees. Signals may not
only be added but also multiplied, with interesting time-integration
effects that the brain might exploit. The multiplicative approach
neglects these potential effects, and may thus underestimate the
brain's computing power.
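
As a toy illustration of the counting issue (a hypothetical model,
not a claim about how real dendrites work): compare the number of
terms a purely additive neuron model evaluates per update with one
that also forms a product for every pair of inputs:

    from math import comb

    # Toy comparison of per-update work for a single model neuron.
    # An additive model sums one weighted term per synapse; a model
    # with pairwise multiplicative dendritic interactions also
    # evaluates one product term per pair of synapses. Illustrative only.
    n_synapses = 5000                      # ~5*10^3 synapses, as above

    additive_terms = n_synapses            # 5000
    pairwise_terms = comb(n_synapses, 2)   # 12,497,500

    print(pairwise_terms / additive_terms) # ~2500x more terms per update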

> Using Drexler's estimates for random-access memory
> (20MBytes/cubic micron), we can fit 305 of 64K computers in a cubic
> micron. The computers therefore take roughly 9.8x10^4 cubic microns.

Does this take into account that there might be a need for cooling?

> 11,300 cubic microns is small. It's a cube about 22.5 microns on
> a side, say a quarter-millimeter on a side, about 1/8 the size of a
> crystal of table salt. 17,300 cubic microns (storing synaptic addresses)
> is still small, about 25.9 microns on a side. Even 34,600 cubic microns
> (double everything) is small, maybe 32.6 microns on a side, the size of a
> crystal of table salt.

As an estimate of precisely how small advanced nanotech could make a
human-equivalent computer, I think we have to take this grain of
salt with a grain of salt, so to speak. An uncertainty interval of a
couple of orders of magnitude does not seem unreasonable.
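
That said, the cube-side figures in the quoted passage are roughly
right, as a quick recomputation from the quoted volumes shows:

    # Side of a cube with each of the quoted volumes (cubic microns).
    for volume_um3 in (11_300, 17_300, 34_600):
        print(f"{volume_um3} um^3 -> {volume_um3 ** (1/3):.1f} um per side")
    # 11300 -> 22.4 um, 17300 -> 25.9 um, 34600 -> 32.6 um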

Nick Bostrom
http://www.hedweb.com/nickb n.bostrom@lse.ac.uk
Department of Philosophy, Logic and Scientific Method
London School of Economics


