From: Matt Mahoney (matmahoney@yahoo.com)
Date: Thu Feb 12 2009 - 18:16:49 MST
--- On Wed, 2/11/09, Johnicholas Hines <johnicholas.hines@gmail.com> wrote:
> The question is: Even if the first-person experience of a destructive
> upload is identical to death, does that make the process
> of destructive upload morally equivalent to murder? Does it mean that
> murder, enslavement and torture of emulated humans is okay?
Nobody can experience death. To experience something, you have to remember it, and death leaves nothing to remember it with. So the wrongness of murder cannot rest on the victim's experience of it; the only reason we consider murder unethical is that cultures holding this belief grew faster than those without it.
> > 5. What test must a program be able to pass to grant it human rights?
> 5. I think there's a smooth continuum of possible programs. Even
> though the middle is difficult to classify, there will be clear cases
> where a program is STRUCTURALLY a model of a human, and experts who
> argue convincingly that the program is a FAITHFUL model. Are you
> arguing that because the middle is difficult to classify, these clear
> cases should not be considered moral equivalents to humans? I think
> wikipedia calls this the "continuum fallacy".
> http://en.wikipedia.org/wiki/Continuum_fallacy
How do you know whether a machine structurally models a human brain? If you don't know how the brain works, then I could claim that anything models it. If you do know how it works, then it's not AI, just an algorithm (neural networks come to mind).
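For what it's worth, here is the sort of thing I mean by "just an algorithm": a toy neural net is nothing but arithmetic on lists of weights. This is purely my own illustrative sketch, not a model of anyone's brain, and the sizes and weights are arbitrary:

    import math, random

    # A toy one-hidden-layer "neural network": weighted sums plus a
    # squashing function. Whether this "structurally models" neurons
    # is exactly the claim you can't check without knowing how the
    # real thing works.

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def forward(inputs, w_hidden, w_output):
        # one hidden layer, then a single output unit
        hidden = [sigmoid(sum(w * x for w, x in zip(row, inputs)))
                  for row in w_hidden]
        return sigmoid(sum(w * h for w, h in zip(w_output, hidden)))

    # arbitrary random weights: 2 inputs, 3 hidden units, 1 output
    random.seed(0)
    w_hidden = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)]
    w_output = [random.uniform(-1, 1) for _ in range(3)]
    print(forward([0.5, -0.2], w_hidden, w_output))

Call it a brain model or call it a few dozen multiplications; the code is the same either way.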
What I think will happen in practice is that we will grant human rights to machines that can pass the Turing test. This, of course, is extremely dangerous. The Turing test does not imply human limitations. It does imply that the machine must understand human ethics, but not that it share them. If a machine is smarter than you, there is no way for you to know whether it is lying to you. It knows what you know, and which lies it can get away with. It can exploit your ethical beliefs to its advantage, but you cannot do likewise.
I suppose you could argue that nobody would ever build a psychopathic AI. And nobody will ever create viruses, trojans, worms, or spyware either.
-- Matt Mahoney, matmahoney@yahoo.com