From: Metaqualia (metaqualia@mynichi.com)
Date: Mon Mar 01 2004 - 07:03:08 MST
> The key points are that there's a Universal Morality,
> which has to have certain general characteristics,
> there's a personal morality, which has to have certain
> general characteristics, and all sentient minds are
> defined to be the interaction between UM and PM.
Maybe we use the word differently, but morality to me is all about what
happens to others, regardless of what happens to you; it's about averaging
out suffering rather than sticking it to the least powerful. So I don't
believe such a thing as a personal morality can ever exist.
Morality is universal, or it's not morality, just a goal system.
Further, I cannot agree 100% that any rational being will come to the same
definition of morality.
An AI follows a goal system. If its goal system and architecture do not
allow it to create additional supergoals, it will keep doing what it was
told to do, getting smarter and smarter (gathering more information) but
never trying to do anything other than what it was originally created to do.
Now, I can support your statement with one and only one chain of reasoning:
- any sufficiently intelligent being necessarily has a model of self complex
enough to produce both qualia and the notion of having qualia
- such a sentient would deduce the value of other sentients' qualia, since a
universe with only one qualia stream instanced in one brain is less likely
than a universe with a qualia stream per brain
- the sentient would therefore come to see negative qualia as the ultimate
evil, through personal experience of such phenomena
But I think a bunch of people won't agree with this, so how do you support
your statement?
mq