From: John K Clark (johnkc@well.com)
Date: Sat May 02 1998 - 23:34:10 MDT
>>Me:
>>I wouldn't stop the development of hyper intelligent machines even
>>if I could, and of course I can't.
>Dan Fabulich <daniel.fabulich@yale.edu>
>OK. So what DID you mean by this? "I also think that the question of
>whether humans should give rights to machines is moot, the question
>of whether machines will give rights to humans is not."
What part didn't you understand?
>Since we can't stop the creation of hyper-intelligent robots, nor do
>we stand any chance of controlling them once we do
Yes, I think that is almost certainly true.
>this doesn't seem like a very interesting question at all.
What's the question?
>"J. R. Molloy" <jr@shasta.com>
>How do you know that "the subjective experience of the machine is of
>interest to it"?
Just a guess; I'm also guessing that some things besides me have subjective
experiences.
>The term /intelligence/ generally refers to the ability to solve
>problems.
Yes.
>In contrast, sentience means the capability to appreciate, enjoy,
>and otherwise relate to sensate input.
Sensate input is just sensations, so you're saying that sentience means the
ability to have sensations, and that the ability to have sensations means
sentience. I don't think you're going to be able to get much mileage out of
that.
>Robots work mechanically, i.e., within a narrowly constrained area of
>action without regard to the wider aspects of the environment.
Present robots certainly do because present robots are stupid. I see no
reason why that limitation need be permanent.
John K Clark johnkc@well.com