RE: Revolting AI (was economics of star trek)

From: Dickey, Michael F (michael_f_dickey@groton.pfizer.com)
Date: Wed Mar 06 2002 - 13:40:51 MST


-----Original Message-----
From: Simon McClenahan [mailto:SMcClenahan@ATTBI.com]
Sent: Wednesday, March 06, 2002 2:08 PM
To: extropians@extropy.org
Subject: Re: Revolting AI (was economics of star trek)

"I now have a moral obligation to care for it, as opposed to enslaving it.
The Categorical Imperative http://www.utm.edu/research/iep/c/catimper.htm
states "a given action is morally correct if when performing that action we
do not use people as a means to achieve some further benefit, but instead
treat people as something which is intrinsically valuable." Of course I
substitute "people" with my anthropomorphized computer. This ethic has
worked for me, treating tools with the same respect as people or
animals."

Interestingly enough, Kant never included animals in his categorical
imperative.

> We are so used to computers and robots being programmed to simply
> 'obey'... how are we going to draw a line and say, 'this machine has
> sentient rights and this one doesn't, even though they look exactly the
> same?'

"My personal solution is anthropomorphization, i.e. all the machines have
sentient rights. The decision to enslave an entity carries far too much risk
IMO, so I default to respect with reasoning."

If you attach rights to intelligent machines, would you do the same for
intelligent non-human animals?

Just curious

Michael
