Robin Hanson wrote:
>
> Dan C. writes:
> >> >... However, it's IMO impossible to assign a probability to any action the
> >> >SI may choose to take that isn't precluded by the laws of physics. ...
> >>
> >>... what exactly do you mean?
> >>
> >... What I meant was that I can think of
> >no reasonable way to defend any particular choice. ...
>
> You probably can't think of a reasonable way to calculate the temperature
> of a black hole either, but that doesn't mean other people can't do it.
> Do you mean to claim more than that *you* *now* haven't thought of
> something you like?
I can in fact think of a reasonable way to calculate the temperature of a black hole: I can consult the literature or I can consult an expert, and after I have done so I can reasonably expect to be able to evaluate the result from first principles.

I cannot consult the literature or an expert on the motivational psychology of SIs, and if I could do so, I doubt that I could evaluate the result from first principles. I cannot reliably evaluate information from books or experts on human motivational psychology, much less SI motivational psychology, and IMO neither can you or anyone else: the field simply is not at the same level of development as the thermodynamics of black holes.
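For concreteness, the sort of first-principles result I mean is Hawking's standard formula, written here in TeX notation (the worked number is a textbook illustration of mine, not something from this thread):

    T_H = \hbar c^3 / (8 \pi G M k_B)  ~  6 x 10^{-8} K for a solar-mass hole

Every symbol there is either a measured constant or the hole's mass, so anyone with a physics text can check the result step by step. There is no analogous expression, and no analogous checking procedure, for the motivations of an SI.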
>
> >Since the SI will be vastly more intelligent than humans, IMO we may not
> >be able to comprehend its motivations, much less predict them. The SI will
> >be so smart that its actions are constrained only by the laws of physics,
> >and it will choose a course of action based on its motivations.
>
> Why do you assume such a strong association between intelligence and
> motivations? It seems to me that intelligence doesn't change one's
> primary purposes much at all, though it may change one's tactics as one
> better learns the connection between actions and consequences.
Is human motivation, then, no more complex than the motivation of ants? The intelligence gap between ants and humans seems to have changed primary purposes a great deal, not merely tactics, and I see no reason to expect the gap between humans and SIs to do less.