From: Samantha Atkins (samantha@objectent.com)
Date: Wed Apr 04 2001 - 22:26:07 MDT
Chris Cooper wrote:
>My scenario of humans in a cushy Sysop-controlled zoo doesn't seem to
> conflict with your description of Friendliness. If we have as much, if not
> more freedom in our new virtual digs, minus the ability to harm ourselves or
> others, the Sysop has achieved its goal of Friendliness to humans. We would
> still have individual volition to do anything that we could do pre-Sysop. We
> couldn't upload/upgrade ourselves without the Sysop's help, but then we
> couldn't do this before, either. Thus, no Friendliness conflict.
It is not at all clear that we require Sysop-seed-level AI in order to
upgrade ourselves significantly, or even to upload ourselves. Therefore,
if the Sysop considered such things "unfriendly," it would have to make
us dumber. If it did that, it would be violating our wishes, choices, and
free will, and would itself be in conflict with Friendliness. So perhaps
helping us upgrade, upload, and grow up in the process is the only choice
it could reasonably make.
Or perhaps it would conclude that you cannot be friendly to an arrogant,
determinedly stupid species and simultaneously preserve its free will and
identity. If so, it would dump this paradoxical, meaningless chore and go
find something better (at least something actually possible) to do.
- samantha
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT