From: Robert J. Bradbury (bradbury@aeiveos.com)
Date: Sun Mar 18 2001 - 11:19:39 MST
On Sun, 18 Mar 2001 CurtAdams@aol.com wrote:
> Yes, but not in the way you want. Simplify less, perhaps; but you will
> still have a bottom level, the kind of thing we don't see.
That is unclear -- is quantum mechanics the bottom, or is string theory?
And what is all the "dark energy" that is so much in vogue in the most
recent papers?
> I think the appeal to unimaginably powerful beings with mysterious
> motivations and an intense need to hide from humanity sounds like certain
> religious apologia. Why 10^27 supercomputers to sim me? Why
> sim a huge universe when our galaxy would be more than adequate?
Curt, you have said this several times in several different ways
in several letters, and I've commented on it once or twice, as have
Eugene and, I believe, a few others. Get this: THE SIMULATION DOES
NOT HAVE TO BE A COMPLETE UNIVERSE. The simulation can be on
any scale from a single brain to that of the universe.
According to Gwen Jacobs at a recent NSF meeting on terascale computing,
it would require 1 month of petaflops computing power to simulate 1
second of brain time (now, I don't believe this is correct -- I think it
is much worse (i.e. the computing requirements are greater) if you are
doing an atomic- or quantum-scale simulation, but that's another
discussion). For our purposes here, we will assume this is correct.
That translates to ~3x10^30 ops to do a real-time simulation of a
single human brain.
Given the 10^42 ops available in an MBrain, that means you can support
almost a trillion simulated human brains (~126 human civilization
equivalents at our current population level [IF you were simulating
the entire civilization!!!]). Conversely, you could be doing nothing
but running a trillion Curt Adams simulations. Now, if 100 human
civilizations will not give you the statistics you need, you simply
build your MBrain around a star of about 50 times the mass of the
sun, and that will get you enough power to run half a billion human
civilization simulations.
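Here is the arithmetic as a quick Python sketch, taking the figures
above at face value (both constants are the numbers quoted in this
post, not independent estimates):

    # Back-of-the-envelope: real-time brain sims supported by an MBrain.
    OPS_PER_BRAIN = 3e30   # ops/sec for one real-time whole-brain sim (figure above)
    MBRAIN_OPS    = 1e42   # ops/sec for an MBrain around a Sun-like star (figure above)
    brains = MBRAIN_OPS / OPS_PER_BRAIN   # ~3e11 simultaneous brain simulations
    print(f"~{brains:.1e} real-time brain sims per MBrain")
    # Divide by your favorite population figure (a few billion people)
    # to get civilization equivalents -- order 10^2 for a Sun-mass MBrain.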
An interesting tidbit falls out of this: it suggests you can't run a
whole-brain simulation in real time on a Drexlerian rod-logic 1 cm^3
nanocomputer. You would have to slow it down by about a factor of a
billion to run it on a single NC (or run a tightly linked net of NCs).
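The slowdown factor is just the ratio of the two rates; the ~10^21
ops/sec figure for a 1 cm^3 rod-logic NC is the estimate I usually
use, so treat it as an assumption here:

    # Slowdown of a real-time whole-brain sim on a single rod-logic NC.
    OPS_PER_BRAIN = 3e30   # ops/sec, from above
    NC_OPS        = 1e21   # ops/sec, assumed for a 1 cm^3 rod-logic nanocomputer
    slowdown = OPS_PER_BRAIN / NC_OPS   # ~3e9, i.e. roughly a factor of a billion
    print(f"slowdown on one NC: ~{slowdown:.0e}x")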
You may also note the efficiency you get if you run "human mind
equivalents" (the figure I typically quote is a trillion trillion HMEs
in our solar system, though it's likely to be more). This is because,
if the worst-case estimates of brain capacity are correct, you get
~10^5-10^7 HMEs per NC. HMEs are where you effectively do what the
brain does, but with a combination of software and hardware that does
it more efficiently (e.g. Moravec's "vision" systems).
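The HMEs-per-NC range falls out of the same kind of division; the
10^14-10^16 ops/sec range per HME is my reading of the Moravec-style
worst-case brain capacity estimates, not something stated above:

    # HMEs per nanocomputer, assuming 1e14-1e16 ops/sec per HME
    # (Moravec-style brain-capacity estimates -- an assumption on my part).
    NC_OPS = 1e21                     # ops/sec, assumed rod-logic NC figure
    HME_LOW, HME_HIGH = 1e14, 1e16    # ops/sec per human mind equivalent
    print(f"{NC_OPS/HME_HIGH:.0e} to {NC_OPS/HME_LOW:.0e} HMEs per NC")  # 1e5 to 1e7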
The net of this (based on the very questionable figure from Gwen
Jacobs) is that you get a factor of a trillion greater efficiency
by migrating your uploads from atomic simulations (much slower
than us) to human mind equivalents (much faster than us).
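Using the same (assumed) HME range against the 3x10^30 figure:

    # Efficiency gain from migrating an upload from a whole-brain sim to an HME.
    OPS_PER_BRAIN = 3e30              # ops/sec, figure from above
    HME_LOW, HME_HIGH = 1e14, 1e16    # ops/sec per HME (assumed, as above)
    print(f"gain: {OPS_PER_BRAIN/HME_HIGH:.0e} to {OPS_PER_BRAIN/HME_LOW:.0e}")
    # With these assumptions the gain is 3e14-3e16; the factor-of-a-trillion
    # figure corresponds to a more pessimistic HME requirement (~3e18 ops/sec).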
Quite interesting. Certainly suggests a fair amount of job security
for future programmers and neuroscientists.
> If the entire universe has been simmed by some super-universe
> creatures as an experiment, they are not our descendants. Also,
> they're simming the whole shebang and not us; we're just an
> epiphenomenon and our actions won't make them turn it off.
You don't have to sim the entire universe. You only have to sim
the I/O and mental functions of the number of non-zombies
you want to put in it!
Paraphrasing some great physicist, Curt: you are not even wrong.
Note that I doubt these differences (if you want to maintain them) can
be resolved in this low bandwidth channel. You may simply have to come
to Extro5 and listen to the discussions there.
Robert