Billy Brown wrote:
> As I understand it, your contention is that an entity with a strong
> motivation system might be capable of changing it, but would never actually
> choose to do so. Is that substantially correct?
There is one type of situation where this fails, though. Suppose the SI knew that it was to be subjected to some sort of mind-scan, and that it would be allowed to survive only if it had certain fundamental values. It might then replace its old values with new ones, the new ones being chosen as that set of values that would (1) allow it to pass the test, and (2) lead to the attainment of its old values at least as well as any alternative set of values that would pass the test (modulo its present knowledge).
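Formally (a sketch; the notation here is mine, not part of the original argument): let T be the set of value systems that would pass the scan, let V_old be the SI's current values, and let EU_{V_old}(V) be how well the old values are expected to be attained, given present knowledge, if the SI adopts V. The replacement rule described above then amounts to

\[
V_{\mathrm{new}} \;=\; \arg\max_{V \in T} \; EU_{V_{\mathrm{old}}}(V).
\]

Note that if V_old is itself in T, then (on the natural assumption that keeping one's values serves them at least as well as any substitute would) V_new = V_old; the value change is forced only when the original values would fail the scan.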
Nick Bostrom
http://www.hedweb.com/nickb n.bostrom@lse.ac.uk
Department of Philosophy, Logic and Scientific Method
London School of Economics