From: Matt Mahoney (matmahoney@yahoo.com)
Date: Tue Aug 21 2007 - 09:26:06 MDT
--- Stathis Papaioannou <stathisp@gmail.com> wrote:
> I'd be certain that an exact biological copy of me has the same
> consciousness. I'd be almost certain that a neuron by neuron computer
> emulation of my brain would have the same consciousness as me (David
> Chalmers's fading qualia argument). However, I couldn't be certain that
> some machine designed to copy my behaviour well enough to pass for me
> would have the same consciousness as me; it might be a p-zombie, or
> more likely it might just have a completely different consciousness. I
> would agree to be destructively uploaded in the first two cases, but
> not the last.
So you argue that consciousness (defined as whatever distinguishes you from
a p-zombie) depends on the implementation of your brain? Does it matter if
the neural emulation is optimized by simulating average firing rates rather
than individual pulses? What about simulating the collective behavior of
groups of similarly weighted neurons with single neurons? Or simulating the
visual cortex with a scanning window filter? What aspect of the computation
results in consciousness?
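
To make the first of these optimizations concrete, here is a minimal sketch
in Python contrasting a spike-level leaky integrate-and-fire neuron with a
rate-based approximation of the same cell. The parameters are illustrative,
not biological, and this is a textbook toy model, not anyone's actual upload
procedure:

import math

# Spike-level model: leaky integrate-and-fire neuron driven by a
# constant input current. All parameters are made up for illustration.
def simulate_spikes(i_in=1.5, v_thresh=1.0, tau=10.0, dt=0.1, t_max=1000.0):
    v, spikes = 0.0, 0
    for _ in range(int(t_max / dt)):
        v += dt * (i_in - v) / tau      # leaky integration toward i_in
        if v >= v_thresh:               # threshold crossing -> spike
            spikes += 1
            v = 0.0                     # reset membrane potential
    return spikes / t_max               # measured rate (spikes per ms)

# Rate-based approximation: never represents an individual pulse.
# The steady-state rate follows analytically from the same dynamics.
def firing_rate(i_in=1.5, v_thresh=1.0, tau=10.0):
    if i_in <= v_thresh:
        return 0.0                      # input too weak to ever spike
    # Time to charge from reset (0) to threshold, then invert.
    t_spike = tau * math.log(i_in / (i_in - v_thresh))
    return 1.0 / t_spike

print(simulate_spikes())   # rate obtained by counting pulses
print(firing_rate())       # same rate, with no pulse ever simulated

Both functions report essentially the same firing rate, but the second one
never computes a single spike. The question above is whether that difference
in implementation could matter for consciousness.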
What if the copy is not exact? You could upgrade your upload with faster
neurons, more memory, additional senses, finer motor control, a wireless
internet connection, and so on. At what point would your consciousness fail
to transfer?
Suppose the destructive upload consists of a nondestructive exact biological
copy, followed by you shooting yourself. Would you pull the trigger?
-- Matt Mahoney, matmahoney@yahoo.com