>Say you were a super super ... super intelligence (S^NI), modified beyond
>all comparison with the gaussian version of yourself. After a particular new
>modification to jump you up to a new level of intelligence, you find that
>you are so awesomely intelligent that you can predict with 99.99% accuracy
>the outcome of any action that you might consider,
>It could never happen. You may be far more intelligent than a human but the thing
>you're trying to figure out, yourself, is far more complex. You'd be no better off
>than humans are at predicting what you'd do next,
What is your reasoning behind this? You don't offer any trail of reasoning leading to the conclusion above. Because the SI is "far more complex"? Does it really follow that a superintelligence about which we know nothing would certainly be unable to track the variables in its own system - despite the fact that it probably built itself up from, and as, a lower life form in the first place? I think not.