From: Harvey Newstrom (mail@HarveyNewstrom.com)
Date: Fri May 05 2000 - 23:32:09 MDT
"Zero Powers" <zero_powers@hotmail.com> sent a copy of this on Saturday, May
06, 2000 12:41 AM,
> Hmm. I had not considered that. If that is what Eugene had in mind, I
> don't suppose I could argue with that. I find it difficult to imagine a
> replica of me that is exact in every way, yet still not having
> consciousness. But if it could somehow be done, and the copy was
> terminated before the last step of igniting the consciousness, then
> morally I don't suppose I'd have a problem with that. Although I still
> believe that would raise a sticky *ethical* issue.
To keep the record straight, I don't think Eugene agreed with my
interpretation. But in my mind, if the copy is still being artificially
caused to have the same thoughts as the original, instead of being allowed
to directly experience the world and generate its own thoughts, then I would
not consider it conscious yet. I would think that the process of
artificially causing it to keep in synch with the original was part of the
original creation/copying process. As soon as it is disconnected from the
machinery that controls its sensory input to match the original's, and it
starts perceiving the world from its own viewpoint, then it becomes
independently conscious in my opinion.
I believe that Eugene's "copy" must be kept in synch with the original, or
else he no longer considers it to be an identical copy. I don't think he
believes in killing a copy as soon as it has had its first independent
thought. In this sense, I am not sure that different participants really
disagree on when a copy can be killed and when it becomes an independent
being who should not be killed.
-- Harvey Newstrom <http://HarveyNewstrom.com> IBM Certified Senior Security Consultant, Legal Hacker, Engineer, Research Scientist, Author.
This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 15:28:27 MST