From: Robin Hanson (rhanson@gmu.edu)
Date: Mon Apr 24 2000 - 06:44:34 MDT
Anders offers some estimates:
> > 1) Neuron Inventory -- The number of distinct types of neurons, and
> > the number of each type in a typical human brain.
>
>... a likely number of functionally different types is perhaps ~1000.
>... 40,000 genes ... likely control gross connectivity ...
>
> > 3) Resolution required -- The spatial/chemical resolution of a scanning
> > technology that would be required to ... distinguish ...
>
>We need to identify synapses, which means on the order of ~1 µm.
>
> > Three parameters are dynamic, about *rates* of progress of these:
> > 1) Neuron types modeled -- The fraction of neuron types for which
> > we have reasonable computational models at *any* level of complexity.
>
>I think this can be estimated using a bibliometric study.
>
> > 2) Average CPU/RAM per neuron -- A weighted average across neuron
> > types of the CPU/RAM requirements for simulating each neuron type.
> > The weights should be according to the fraction of each type of
> > neuron in a human brain.
>
>I think this can be estimated roughly for the detailed models (like
>the Purkinje model or Traub's pyramidal model), although I don't have
>the numbers around. There is likely a multiplicative states x
>compartment demand on memory, ... CPU estimates
>are harder, as there are a lot of clever simplifications ...
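As an aside, that states-times-compartments memory demand is easy to
put into numbers. A back-of-the-envelope sketch in Python, with every
figure invented purely for illustration:

    # Back-of-the-envelope for the states x compartments memory demand.
    # Every figure here is invented for illustration, not a real estimate.
    compartments = 1600           # assumed; roughly detailed-Purkinje scale
    states_per_compartment = 10   # voltage, channel gates, [Ca], ...
    bytes_per_state = 8           # one double-precision float each
    ram_per_neuron = compartments * states_per_compartment * bytes_per_state
    print(ram_per_neuron)         # 128000 bytes, i.e. ~128 KB per neuron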
I had in mind just writing down the actual CPU requirement for each
actual simulation, finding two simulations of the same neuron type at
two different dates, assuming exponential growth, and hence inferring
a rate of improvement in CPU requirements. There is no need to try to
directly estimate all the clever improvements the future will bring.
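To make the two-point fit concrete, a minimal sketch (the simulation
figures below are invented placeholders, not data from real models):

    def annual_factor(req_early, year_early, req_late, year_late):
        # Fit req(t) = req_early * r**(t - year_early); solve for r, the
        # multiplicative change in CPU requirement per year (r < 1 means
        # the requirement is falling).
        return (req_late / req_early) ** (1.0 / (year_late - year_early))

    # Invented example: the same neuron type needing 100 CPU-hours per
    # simulated second in 1990 but only 10 in 2000.
    r = annual_factor(100.0, 1990, 10.0, 2000)
    print(r)  # ~0.794, i.e. requirements falling ~21% per year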
> > 3) Scanning costs -- The time/dollar cost per area scanned of scanning
> > at the resolution required for uploading. ...
>
>Bruce H. McCormick is proposing a [machine] that would be able to
>slice and scan tissues at a very high rate at a quite modest price. ...
His estimate, together with an estimate of current costs, would help
us estimate a time rate of improvement. It might be more conservative,
though, to compare current costs with costs as of, say, ten years ago.
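The arithmetic is the same two-point fit as for the CPU requirements,
e.g. (numbers invented):

    # Same two-point fit, applied to scanning cost per unit of tissue.
    # Invented numbers: $10 per mm^3 in 1990 vs. $1 per mm^3 in 2000.
    r_scan = (1.0 / 10.0) ** (1.0 / (2000 - 1990))
    print(r_scan)  # ~0.794 -- scanning costs falling ~21% per year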
> > The progress estimate of neuron types modeled helps us estimate
> > when it will be possible to upload at any cost, while the other
> > estimates (together with estimates of how fast CPU/RAM costs fall)
> > help us estimate how fast the cost of uploading is falling with time.
> >
> > Anders, tell me if I'm wrong, but it strikes me that we may well have
> > enough experience modeling neurons to start to estimate these parameters.
> > If so, constructing such estimates of when uploading will be feasible
> > and how fast costs will fall seems a very high priority to me.
>
>I think we can do a reasonable estimate for this. In fact, since I am
>thinking of doing a talk at TransVision in London this summer on
>uploading, it seems a good idea to do. I'll see what numbers I
>can dig up.
Well, that would be wonderful, of course.
The first big question we might hope to answer is whether uploading
will be limited by CPU costs or by scanning costs. And I'm very
interested to know whether uploading can be done for $1000 by 2100.
Or 2050.
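In case it helps, here is a minimal sketch of how those answers would
fall out once the static and dynamic parameters are estimated. Every
number in it is a placeholder, chosen only to show the shape of the
calculation:

    # Toy model: when does whole-brain uploading fit a $1000 budget?
    # Every input below is a placeholder, not a real estimate.
    neurons = 1e11                 # neurons per brain (order of magnitude)
    cpu_cost_per_neuron = 1.0      # dollars per neuron today (assumed)
    scan_cost = 1e9                # dollars per whole-brain scan today (assumed)
    r_cpu, r_scan = 0.7, 0.8       # assumed annual cost multipliers (< 1)
    year0, budget = 2000, 1000.0

    def total_cost(year):
        t = year - year0
        return neurons * cpu_cost_per_neuron * r_cpu**t + scan_cost * r_scan**t

    year = year0
    while total_cost(year) > budget:
        year += 1
    print(year)   # first year the combined cost drops below $1000

On these toy inputs the scanning term is the last to fall below the
budget, so scanning would be the binding constraint; plugging real
estimates into the two terms answers the CPU-versus-scanning question
directly.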
Robin Hanson rhanson@gmu.edu http://hanson.gmu.edu
Asst. Prof. Economics, George Mason University
MSN 1D3, Carow Hall, Fairfax VA 22030
703-993-2326 FAX: 703-993-2323