From: Robert J. Bradbury (bradbury@www.aeiveos.com)
Date: Sun Oct 31 1999 - 10:34:54 MST
On Sun, 31 Oct 1999, Robert Owen wrote:
> phil osborn wrote:
>
> >
> > What no one here seems to understand is that the "mind" is much more
> > than the wiring. I'm talking about hormones, thousands of different
> > neurotransmitter and modifier substances, all of which are released both
> > generally and in specific areas of the brain and are essential to mental
> > focus, motivation and action. The mind is not a logic engine; it is part
> > of a living system.
True. The question comes down to whether the living system is
deterministic, random, or chaotic, and to the "information content"
of the system. I'm fairly partial to the deterministic camp, but
would be willing to tip my hat that some of what goes on in the
mind can be attributed to either random or chaotic effects (at
a low level).
Now as for the "information content", theories for this have been
extensively developed based on the work of Shannon and others.
Whether you are encoding the information in neural impulses
(in the time (frequency) domain, in the amplitude domain, or both),
or in the quantity and effectiveness of neurotransmitters diffusing
across synaptic junctions, or in hormones being received from the blood --
*information* is *information*.
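As a rough illustration of what "information content" means in Shannon's sense, here is a sketch that computes the entropy of a discrete signalling channel. The four firing-rate levels and their probabilities are purely hypothetical numbers, not measurements from any real neuron.

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution (Shannon, 1948).
    Zero-probability symbols contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical example: a neuron signalling with 4 distinguishable
# firing-rate levels, observed with these relative frequencies.
rate_probs = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(rate_probs))  # 1.75 bits per symbol
```

The same calculation applies whether the symbols are firing rates, amplitudes, or neurotransmitter concentrations -- which is the point: the substrate doesn't matter, only the distribution of distinguishable states.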
>
> But Robert's claim that neuron replacement is feasible I find reasonable.
Moravec & Minsky are the authors of these ideas, I'm simply observing
that I find them consistent with my knowledge of physics, chemistry
and biology.
> So do you regard the synthesis of neurotransmitter chemistry impossible?
If you mean the "manufacture" of neurotransmitters, then clearly it
is possible because our brains do it. But on an alternate computing
platform you probably wouldn't want to do that.
> The cybernetic aspect of e.g. signal control requires the operation
> of antagonistic agents -- is it possible to simulate the control
> mechanisms which regulate the release and re-uptake of, say, serotonin
> by monoamine oxidase chemistry? Or do you think these feedback
> processes are so complex that we simply cannot artificially replicate
> them?
All you have to deal with is the information transfer problem.
You do have to know fully and completely the information that is
being transferred. Then you simply encode it in the most effective
way on the simulation machine.
I could envision 3 different levels of uploads:
(a) A molecular level where you run the simulation on a (very)
    big cellular automaton that precisely simulates the
    quantum interactions between *all* of the atoms in the brain.
(b) A biochemical level where you simulate the activities of all
    of the genes & proteins in the cell: genes being turned on and off,
    proteins catalyzing reactions at various molecular concentrations,
    contractions controlled by statistical diffusion or active transport,
    etc.
(c) An information flow level where you simulate simply the information
    being transferred. I can reduce the information to packets:
    a neural net information packet with frequency & amplitude
    information, a hormone packet providing concentration & state
    information, neurotransmitter packets that are triggered
    by the neural net information packets, and reuptake packets
    managed by cell state or net weighting state information.
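To make level (c) concrete, here is a minimal sketch of the packet idea. Every name, field, and the release rule (concentration as a toy function of rate times amplitude) is an illustrative assumption, not an established design.

```python
from dataclasses import dataclass

@dataclass
class NeuralPacket:
    source: str           # identifier of the originating unit
    frequency_hz: float   # time-domain (frequency) information
    amplitude: float      # amplitude-domain information

@dataclass
class NeurotransmitterPacket:
    synapse: str
    concentration: float

def release(neural: NeuralPacket, gain: float = 0.01) -> NeurotransmitterPacket:
    """A neural packet triggers a neurotransmitter packet. The released
    concentration here is a toy function of rate x amplitude x gain."""
    return NeurotransmitterPacket(
        synapse=neural.source,
        concentration=neural.frequency_hz * neural.amplitude * gain,
    )

spike = NeuralPacket(source="n42", frequency_hz=50.0, amplitude=1.2)
print(release(spike))  # concentration = 50.0 * 1.2 * 0.01 = 0.6
```

Hormone and reuptake packets would be further dataclasses of the same shape; the point is that only the information carried matters, not the chemistry that carried it.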
I suspect that you will lose a little of a typical mind as you move
up through each of these levels, but the simulation probably runs faster
and faster because less processing power is required. Given equivalent
levels of computing capacity, (a) probably runs much slower than
real-time, (b) in approximately real time (+/- a few orders of
magnitude), and (c) probably faster than real-time. If it turns
out there are random or chaotic elements that would be accounted
for in (a) but not in (b) or (c), then you need to add these to the
simulation in some way.
You still need the inputs (or the simulation equivalents of those inputs)
that the body is normally going to be giving the brain (O2, glucose,
heart rate, temperature sense, sight, smell, taste, touch, etc.)
You only get things that are potentially really different from
the uploaded mind if you give yourself the ability to edit
the source code. Don't like the fact that nitric oxide makes
men passive and women aggressive? Well, delete that molecule
from (a), or those genes & enzymes from (b), or the NO packets from
(c), and you *certainly* have a different mind. I would envision
that changes within the realm of human variation (of which there
is quite a bit), or those we have explored in laboratory animals
(as is the case with deleting the NO enzymes), will be fairly
acceptable/explored. When you say, "let's delete all the packets
involved in mental processes while you are asleep" and end up
as a memory-less psychotic, then you would be well advised
to have a backup copy on hand.
Interesting morality problem... Should you have the right to kill
yourself (due to poor reprogramming) again and again and again,
given that it really isn't yourself once you have forked a copy?
"Mind ethics" is going to be a bear, as we have pointed out
previously.
I wonder if you can solve the problem by saying "forking" isn't
allowed. You can make a change but you have to save all of
the subsequent state information in such a way that you can
"undo" (editor terminology) or rollback (database terminology)
the changes. If you decide you don't like how things turn
out, you rollback to your original state (presumably with a
message from your former self explaining why it didn't turn
out very well). You have a limited amount of material & energy
with which to keep your rollback information (and the possible
rollback information of everything and everyone you interact with), so
you can only go forward along a path for just so far before you
have to decide whether or not to "commit" to that reality.
Presumably you have auto-rollback "on" unless a conscious commit
occurs, preventing you from staying locked in a psychotic state.
Not as much flexibility, but much simpler from an ethical standpoint.
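The no-forking rule above can be sketched in code: one live state, a saved checkpoint that permits rollback, and a commit that discards the undo information and makes the change final. Class and method names are my assumptions, and the "mind state" is of course just a toy dictionary.

```python
import copy

class RollbackState:
    """One live state with a single checkpoint: edit, then either
    commit (the change becomes permanent) or rollback (restore the
    checkpoint, with a note from your former self)."""
    def __init__(self, state):
        self.state = state
        self._checkpoint = None

    def begin_edit(self):
        # Save the undo information before any change is made.
        self._checkpoint = copy.deepcopy(self.state)

    def commit(self):
        # Conscious commit: discard undo info, this reality is final.
        self._checkpoint = None

    def rollback(self, note=""):
        if self._checkpoint is None:
            raise RuntimeError("already committed; cannot roll back")
        self.state = self._checkpoint
        self.state["note_from_former_self"] = note
        self._checkpoint = None

mind = RollbackState({"sleep_packets": True})
mind.begin_edit()
mind.state["sleep_packets"] = False        # the edit goes badly...
mind.rollback(note="ended up memory-less; don't try this")
print(mind.state["sleep_packets"])  # True: restored to the original state
```

An auto-rollback would simply call `rollback()` on a timer or on a sanity check unless `commit()` has occurred first; no second copy of the mind ever runs, which is what keeps the ethics simple.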
Robert
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:05:39 MST