From: Rob Harris Cen-IT (Rob.Harris@bournemouth.gov.uk)
Date: Tue Aug 17 1999 - 09:39:40 MDT
>Say you were a super super ... super intelligence (S^NI), modified beyond
>all comparison with the gaussian version of yourself. After a particular new
>modification to jump you up to a new level of intelligence, you find that
>you are so awesomely intelligent that you can predict with 99.99% accuracy
>the outcome of any action that you might consider,
>It could never happen. You may be far more intelligent than a human but the
>thing you're trying to figure out, yourself, is far more complex. You'd be
>no better off than humans are at predicting what you'd do next,
What is your reasoning behind this? You offer no trail of reasoning leading
to the conclusion above. Because the SI is "far more complex"? Does it follow
that a superintelligence about which we know nothing will certainly be unable
to track the variables in its own system - despite the fact that it probably
built itself up from a lower life form in the first place? I think not.