From: Robert J. Bradbury (bradbury@www.aeiveos.com)
Date: Wed Dec 08 1999 - 06:20:09 MST
On Wed, 8 Dec 1999, Kate Riley wrote:
> (Please forgive me if this has been asked and answered previously. I have
> been lurking for a bit, but by no means am I familiar with everything that
> has been said and discussed.)
A high pitched off-stage voice says, "Oh no, a lurker!"
A man in a white coat steps forward and says in a low-pitched,
somewhat menacing voice, "Quick, get a lurker trap; we can use
them for the Homo chlorophyllicus experiments and no one will
know the difference."
>
> Concerning uploaded individuals. Considering that such a being would,
> presumably, be immortal, and considering that such a being would wish to
> retain all of its memories (and probably with more clarity than is afforded
> by our protein brains, which are notorious for storing memories in a
> somewhat fuzzy and incomplete manner), there would seem to be some concern
> about space.
All you people that have been on the list for years, you'd better
look out; some of these lurkers look like very quick studies.
> Certainly my computer could not hold even a significant
> fraction of my memories.
Seriously (can I do that here?), though I can't remember the paper
reference (I do have it, but I have so many papers...), research at
Bell Labs seems to indicate that your computer *could* hold a
large fraction of your memories. The experiments they have done
seem to indicate you store only a few bits a second. Working
this out:
(a generous 8 bits * 60 sec * 60 min * 16 hrs * 365.25 days
* 75+ years) puts you in the range of 1.6 gigabytes. This is
a really conservative estimate; the Bell Labs paper pegged the
number at more like a few hundred megabytes. So you *can* buy
single hard drives now (for a few hundred $) that can store the
brain's recall capacity. If hard drives aren't fast enough for
you, the cost for this much computer DRAM memory is about $7000.
We currently have processors that can address this much memory.
[Look out, while we aren't looking we are becoming obsolete...]
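For the arithmetic-inclined, here is that estimate as a few lines
of Python (the rate and waking hours are the generous assumptions
above, not measured values):

    # Lifetime long-term memory at the Bell-Labs-ish storage rate.
    # Assumptions from above: a generous 8 bits/sec, 16 waking
    # hours/day, 75 years.
    bits_per_sec = 8
    secs_awake_per_day = 16 * 60 * 60            # 57,600
    total_secs = secs_awake_per_day * 365.25 * 75
    total_bytes = bits_per_sec * total_secs / 8
    print(total_bytes / 1e9)                     # ~1.6 (gigabytes)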
As I've mentioned in another note, the question remains open
whether recall capacity and recognition capacity are the same.
As someone (Hal perhaps?) pointed out, there may be a fair
amount of compression occurring. I don't have to remember
walking to school every day; I only have to remember the odd
events that occurred on a few of the days I walked to school,
on top of a general pattern of walking to school.
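A toy sketch of that kind of compression (the dates and events
here are invented, purely for illustration): store the general
pattern once, and only the exceptional days verbatim.

    # Episodic memory as one general pattern plus stored exceptions.
    # All entries are made-up examples.
    general_pattern = "walked to school, nothing unusual"
    exceptions = {
        "1962-03-04": "found a dollar on the sidewalk",
        "1962-11-17": "a moose blocked the road",
    }

    def recall(day):
        # Most days are reconstructed from the pattern; only a
        # handful of odd days are stored in full.
        return exceptions.get(day, general_pattern)

    print(recall("1962-03-05"))   # the general pattern
    print(recall("1962-03-04"))   # the stored exception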
> Now, admittedly, my brain is comparatively small,
> and seems to do alright, but most, if not all, of us have witnessed the loss
> of memories which tends to occur in the later years of a person's life,
Use it or lose it.
> which results partially from the fact that there is a limited space capacity
> for memories.
I don't think this is proven. More likely, as I think Anders has
pointed out, you may get loss of memory because you don't exercise
the synaptic connections and they gradually lose strength relative
to other synaptic connections. You may occasionally "recover" a
memory if you happen to trigger a global pattern that pieces
together a memory from widely separated subfragments across the brain.
> One would want to retain all memories,
Some of us would. Spike seems to be hell bent on forgetting them.
> and therefore would want upgrades, spare hard drives, if you will.
> Is this a concern?
Perhaps; in my mind it depends on how many *really* good stories you
have. I am far, far away from the point where I have to start
dumping good stories due to memory limitations. Even when I
get to that point, I suspect I can rent some of Spike's unused
capacity. I'll probably have to encrypt the stories though, otherwise
he might go using them as his own...
Is this more of a concern if you want to know "everything"?
Yes. But I think most of us are willing to let others be the
"experts" in the things we aren't interested in. I for example, am
*completely* willing to let Robert and Damien be the experts
on "bathos".
> Is it realistic to expect resources to hold out?
It becomes a problem only if we set our memory "valuations" above
the resource base. One of the things that is interesting in science
is that as we refine principles, theories and laws, the details that
lead us to those conclusions become less important. If I have an
abstraction that I know can be used to generate a lot of
content, then it may be unimportant for me to remember the
content itself. Think of all of the pretty pictures that you
can get with a fractal equation and a few initial variables.
The problem with humans as currently structured is that the
memory mechanism is on autopilot with no delete key. We
aren't consciously saying, "Do I really need to remember this?"
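To make the fractal point concrete: a dozen lines of Python will
regenerate an arbitrarily large "memory" (here, a crude ASCII
rendering of the Mandelbrot set) from a handful of stored numbers.

    # The whole picture is implied by the iteration rule z -> z*z + c
    # plus a viewport; only those few numbers need to be "remembered".
    for row in range(24):
        line = ""
        for col in range(72):
            c = complex(-2.1 + 3.0 * col / 71, -1.2 + 2.4 * row / 23)
            z = 0j
            for _ in range(30):
                z = z * z + c
                if abs(z) > 2.0:
                    break
            line += " " if abs(z) > 2.0 else "*"
        print(line)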
> How many people are expected to ultimately be uploaded?
All of them that want to be. My estimate for Matrioshka Brains
(solar-system-sized nanocomputer clusters) is that you get
something like 10^26 (100 trillion trillion) human brain
processor equivalents and something greater than 10^28
in human recall capacity (memory) equivalents.
The problem isn't with capacity for uploads, the problem is
with capacity for copies. I can't speak for other list members,
but I think 10^16 copies of myself (which is what everyone can
have) makes no sense at all (it certainly gets Spike worried).
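A crude power-budget sanity check on those orders of magnitude
(the watts-per-upload figure is my own round-number assumption,
not the derivation from the Matrioshka Brain analysis):

    # The Sun's total output, divided by an assumed power draw for
    # a nanocomputer brain-equivalent, lands near the 10^26 figure;
    # dividing by head count gives the copies-apiece number.
    solar_luminosity_w = 3.8e26   # watts, total solar output
    watts_per_upload = 4.0        # assumed round number for a nano-brain
    population = 1e10             # order-of-magnitude head count

    brain_equivalents = solar_luminosity_w / watts_per_upload  # ~10^26
    copies_each = brain_equivalents / population               # ~10^16
    print(f"{brain_equivalents:.0e}, {copies_each:.0e} copies apiece")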
So, what you have is uploading, followed by a *huge* increase
in personal capacity, probably involving some mental slow-down
as you become physically really large (much larger than planetary
sized), and then the crystal ball gets really cloudy.
It may be quite possible that we all decide to stop growing our
minds at asteroid or moon size because of diminishing returns
or self-management problems. The excess computronium at that
point gets devoted to offspring that we produce by mind division
(copying?) and mating. But since we do those things "consciously"
and the phase space for consciousness is huge, it may take quite some
time to figure out how to use those resources. Some people
on the list would support random evolution of ArtificialBrains
(ABrains), but since we aren't anywhere near a consensus on whether
or not it's moral to unplug these if we want or have to, and whether
or not we can keep them from turning on us, I think the jury is still out
on whether this is a good idea.
Great observations and questions though.
Robert