Dan C. writes:
>> >... However, it's IMO impossible to assign a probability to any action the
>> >SI may choose to take that isn't precluded by the laws of physics. ...
>>
>>... what exactly do you mean?
>>
>... What I meant was that I can think of
>no reasonable way to defend any particular choice. ...
You probably can't think of a reasonable way to calculate the temperature of a black hole either, but that doesn't mean other people can't do it. Do you mean to claim anything more than that *you* *now* haven't thought of a defense you find reasonable?
>Since the SI will be vastly more intelligent than humans, IMO we may not
>be able to comprehend its motivations, much less predict them. The SI will
>be so smart that its actions are constrained only by the laws of physics,
>and it will choose a course of action based on its motivations.
Why do you assume such a strong association between intelligence and motivations? It seems to me that intelligence doesn't change one's primary purposes much at all, though it may change one's tactics as one better learns the connection between actions and consequences.