John Clark wrote:
> Robert Bradbury <bradbury@genebee.msu.su> Wrote:
>
> >there are a variety of external influences ranging from glucose
> >availability to environmental hormones that may result in different
> >brains even when you have the same genes.
>
> All true but you didn't mention the most important influence of all, memory.
> Although it is hardly a law of nature it is nevertheless true that up to now
> different brains always have different memories regardless of genes,
> which makes the claim that a twin is the same as a copy totally bogus.
Good, so we don't need to belabor that point.
> Jeff Allbright <jeff.allbright@usa.net> Wrote:
>
> >Does having the same computer mean having the same software?
>
> Yes, but by "computer" I don't just mean a box with electronic stuff in it.
> From a logical perspective if they're not running the same software then
> they can't be the same computer. That's the great thing about computers,
> they're so easy to change, it just takes one line of code.
Now let's extend the two analogies. At some point in foetal development, there is
functionally no difference between twin embryos. Is that window as short as one
cell division?
Similarly, suppose you mirror the drive of one computer to an empty drive on
another computer, and assume that both computers are running self-aware AI
applications while this is going on. Given quantum uncertainty, the only just
moment to delete the original drive the AI was mirrored from is within one
processor cycle of the copy completing.
Now, since we don't have any human-level AI at this point in time, we can't say
for certain that turning a human-level AI computer off is not tantamount to
killing it, because we don't know how much of its awareness depends on
temporarily buffered information that isn't saved at power-down. So a feasible
alternative is to turn the AI off, move its hard drive over to a new machine,
and turn it back on there; or to copy the drive while the original computer is
off, and only then destroy the inactive original drive.
Now take John's favorite old story of the instant person-copier machine. After
as little as one neural processing cycle in the human mind, the human copy is
its own person.
I don't buy the argument that you can do a destructive scan of a person's brain
to upload it to a new mind and it's still the same person. We don't know enough
at this time to make that determination, and if the only difference between the
original and the copy is whether or not you destroy the original in the process,
then doing so is not the ethical thing, to say the least. I know John will say
I'm being bio-centric, or something, but nobody at this time knows enough to
tell me that all I'm doing with a destructive scan of my brain is giving a gift
of information to someone else at the cost of my own existence.
Mike Lorrey
This archive was generated by hypermail 2b29 : Thu Jul 27 2000 - 14:09:01 MDT