absolute morality? ha! free will? pnyah!

From: Rob Harris (rob@hbinternet.co.uk)
Date: Fri Nov 26 1999 - 07:57:55 MST


> In that case it seems that there is a risk that SIs will develop their
> own goals (just as we do)

No way do we develop our own goals. If an "intelligent" chess program
devises a move that will result in a piece being taken, it may seem that the
program has spontaneously decided to take the piece. It has not. It is
following its program to win the game; this is its root motivation, and it
is far from free - it was strictly defined by the game's creator and is
therefore completely invariable.
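To make the chess point concrete, here is a minimal, hypothetical sketch (my
illustration, not anything from the post or any real engine): the piece
values and evaluation function stand in for the goal fixed by the program's
creator, and the "decision" to capture falls straight out of it.

    # Hypothetical sketch: the goal is hard-coded by the programmer.
    PIECE_VALUES = {"P": 1, "N": 3, "B": 3, "R": 5, "Q": 9}

    def evaluate(own_pieces, opponent_pieces):
        # Root motivation, fixed by the creator: maximise material advantage.
        return (sum(PIECE_VALUES[p] for p in own_pieces)
                - sum(PIECE_VALUES[p] for p in opponent_pieces))

    def choose_move(candidate_moves):
        # candidate_moves: (move_name, own_pieces_after, opponent_pieces_after)
        # The program invents no goal of its own; it merely searches for the
        # option that scores highest under the invariable goal above.
        return max(candidate_moves, key=lambda m: evaluate(m[1], m[2]))

    moves = [
        ("quiet move",   ["Q", "R", "P"], ["Q", "R", "P"]),
        ("capture pawn", ["Q", "R", "P"], ["Q", "R"]),
    ]
    print(choose_move(moves)[0])  # -> "capture pawn"

The capture looks like a spontaneous choice, but it is derived entirely from
an evaluation the program never chose.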
Our base motivations are also strictly defined and completely invariable.
Any action we devise to fulfil these goals is just that: an intelligent
system devising a method of achieving strictly defined goals.
And as for absolute morality - the concept is utterly meaningless. Just make
an attempt to decompose it. You will fail. It is an abstract symbolic concept
with no counterpart outside of our minds.


