Re: The Singularity

From: Eugene Leitl (eugene@liposome.genebee.msu.su)
Date: Fri Jul 17 1998 - 09:13:13 MDT


Robin Hanson writes:
[...]
> You probably can't think of a reasonable way to calculate the temperature
> of a black hole either, but that doesn't mean other people can't do it.
> Do you mean to claim more than that *you* *now* haven't thought of
> something you like?
 
Good point, but afaik there is no consistent theory of human actions
either, and we're a good deal more predictable than an SI. Ergodic
systems are intrinsically unpredictable, and you can't prove the SI is
not occasionally ergodic.
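
The unpredictability point can be illustrated with a toy example (my
own illustration, not anything from the exchange above): the logistic
map at r=4, a standard chaotic system. Two trajectories started an
immeasurably small distance apart diverge exponentially until knowing
the initial state to ten decimal places tells you nothing about the
long-run state.

```python
# Toy illustration (hypothetical, not from the original post): two
# trajectories of the chaotic logistic map, started a tiny distance
# apart, diverge until the initial measurement is useless for prediction.

def logistic(x, r=4.0):
    """One step of the logistic map x -> r*x*(1-x); r=4 is fully chaotic."""
    return r * x * (1.0 - x)

def divergence(x0, eps=1e-10, steps=50):
    """Track the gap between trajectories started at x0 and x0 + eps."""
    a, b = x0, x0 + eps
    gaps = []
    for _ in range(steps):
        a, b = logistic(a), logistic(b)
        gaps.append(abs(a - b))
    return gaps

gaps = divergence(0.3)
print(f"gap after 1 step:   {gaps[0]:.2e}")   # still ~1e-10
print(f"largest gap so far: {max(gaps):.2e}")  # grows to order 1
```

The gap roughly doubles each step, so even a perfect model of the
dynamics buys only a handful of extra steps of prediction per extra
digit of measurement precision.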
 
> >Since the SI will be vastly more intelligent than humans, IMO we may not
> >be able to comprehend its motivations, much less predict them. The SI will
> >be so smart that its actions are constrained only by the laws of physics,
> >and it will choose a course of action based on its motivations.
>
> Why do you assume such a strong association between intelligence and
> motivations? It seems to me that intelligence doesn't change one's
> primary purposes much at all, though it may change one's tactics as one
> better learns the connection between actions and consequences.

I could reason evolutionarily, but I have no idea whether this still applies
to an SI. Analogies from areas of human enterprise are certainly inapplicable.

'gene



This archive was generated by hypermail 2.1.5 : Fri Nov 01 2002 - 14:49:22 MST