> 1) Would an individual be uploaded to a centralized 'net' or a private
> one? Would there be a benefit in either? If centralized how would
> 'you' affect 'me'?
I think that will depend on the available technology. At first, the
computing demands of an upload would likely be rather extreme, and
uploads would be run on special- or general-purpose supercomputers
(likely blocks of massively parallel nanocomputers). These would
likely be private, although I could envision an "upload hotel" where
the uploads paid rent for computing resources.
Running an uploaded mind over a net is trickier; as anybody trying
to program distributed applications knows, errors occur all the time,
different parts may run at different speeds, and so on. But it is
also more resilient: you don't need to have a lot of resources in
one place, and it is probably cheaper than having a big computer
somewhere. So in time uploads might be able to run on the net,
like the Copies in Greg Egan's seminal _Permutation City_ (IMHO
the Big Uploading Novel to date).
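As a toy illustration of the speed problem above: in a distributed
emulation, "regions" running on different machines compute at different
rates, so each simulated timestep needs a synchronization barrier to
keep them in lockstep. The region names and delays below are purely
illustrative, a sketch rather than any real emulation scheme:

```python
import threading
import time

# Two "brain regions" run at different speeds, but a barrier forces
# them to finish each simulated timestep together (lockstep).
STEPS = 3
barrier = threading.Barrier(2)
log = []

def region(name, delay):
    for step in range(STEPS):
        time.sleep(delay)          # regions compute at different speeds
        log.append((step, name))
        barrier.wait()             # wait for the slower region each step

fast = threading.Thread(target=region, args=("fast", 0.01))
slow = threading.Thread(target=region, args=("slow", 0.05))
fast.start(); slow.start()
fast.join(); slow.join()

# Despite the speed difference, every step completes in order:
print(sorted(set(step for step, _ in log)))  # -> [0, 1, 2]
```

Remove the barrier and the fast region races ahead, which is exactly
the kind of inconsistency a net-run mind would have to tolerate.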
Interaction between uploads would depend on their user interfaces.
My guess is that they would normally just interact by allowing some
sharing between the virtual worlds they lived in, like allowing the
other to visit with an avatar. Deeper contact may be possible, but
would require some work.
> 2) Have security and backups been researched? Could someone 'pull the
> plug' and delete everyone? Could a virus be introduced to
> control/corrupt/erase an uploaded individual?
Backups are one of the main selling points. Just make a backup (say)
daily, and reboot it if you get killed. Backups may even be hidden
("In the case of my untimely death, run this program").
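The backup-and-reboot scheme can be sketched in a few lines. Everything
here is a stand-in: the state dictionary, the class, and the snapshot
format are hypothetical, just to show the daily-checkpoint idea with an
integrity check before restoring:

```python
import copy
import hashlib

class UploadHost:
    """Toy model of the daily-backup scheme: snapshot the state,
    then restart from the latest snapshot after a failure."""

    def __init__(self, state):
        self.state = state      # the running mind-state (stand-in dict)
        self.snapshots = []     # daily backups, newest last

    def _digest(self, snap):
        return hashlib.sha256(repr(sorted(snap.items())).encode()).hexdigest()

    def backup(self):
        snap = copy.deepcopy(self.state)
        self.snapshots.append((self._digest(snap), snap))

    def restore_latest(self):
        if not self.snapshots:
            raise RuntimeError("no backup to reboot from")
        digest, snap = self.snapshots[-1]
        # verify the backup was not corrupted before rebooting it
        assert self._digest(snap) == digest, "backup corrupted"
        self.state = copy.deepcopy(snap)

host = UploadHost({"memories": 1, "mood": "curious"})
host.backup()
host.state["mood"] = "terminated"   # simulated untimely death
host.restore_latest()               # reboot yesterday's backup
print(host.state["mood"])           # -> curious
```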
Security is essentially computer security, with some extra demands.
You want to make it hard to hack the upload system (which encompasses
more than the mind: the virtual environment, the body simulation,
personal files, backups etc), to make error recovery easy (if half of
the computer memory is suddenly wiped by a disk crash, a RAID system
might restore the upload without it even noticing), and of course to
prevent other forms of misuse (like pirate copies of people).
Tricky, but not fundamentally different from ordinary computer
security.
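The RAID-style recovery mentioned above can be shown with simple XOR
parity, the scheme RAID 5 uses: any one lost block can be reconstructed
from the surviving blocks plus the parity block. The "memory blocks"
below are of course illustrative:

```python
from functools import reduce

def parity(blocks):
    """XOR parity over equal-length byte blocks (as in RAID 5)."""
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*blocks))

# three data blocks of upload memory, plus one parity block
data = [b"mind", b"body", b"file"]
p = parity(data)

# a block is suddenly wiped by a disk crash:
lost_index = 1
survivors = [blk for i, blk in enumerate(data) if i != lost_index]

# XOR the survivors with the parity to reconstruct the lost block
recovered = parity(survivors + [p])
print(recovered)  # -> b'body'
```

This works because XOR is its own inverse: parity of all blocks,
XORed with all but one of them, yields the missing one.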
It is likely hard to control people through a virus, since neural
nets are messy and distributed; it is hard to tell where to strike
to make an upload (say) vote in a certain way or sign over its
money to the corrupter. Of course, if you have access to an
upload, you have all the other possibilities that full access to a
could give you (shudder).
> 3) Since 'thought' (as far as I can remember) is somewhat determined by
> protein deposits at specific neurons (changing the course to the next
> neuron I suppose) wouldn't everyone in a central 'net' then think alike?
Not if they were different emulations. Connecting the neural nets
of different uploads might be possible, but that will likely not make
them identical unless they are very tightly connected (and then
you will get a borganism, a collective mind, rather than a number
of copies).
> 4) Do emotion and the senses determine how one thinks? I know my mood
> affects my thought processes and certain smells or sounds do also. How
> will this be handled in a 'net'?
Yes, emotions are integral to our cognitive processes (see Damasio's
_Descartes' Error_), and they do affect our thinking (smells are especially
powerful, since they interface *directly* with our emotional systems). I
don't see any reason why emulating emotion would be different from
emulating thought; from a computer's perspective, both are neural
interactions.
> 5) If there would be no emotion I wouldn't 'care' if I were thinking or
> not. Would I cease to exist?
If you had no emotions, then your brain would be incomplete. Whether
you would be you is a matter of definition.
-- 
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y