From: Dan Fabulich (daniel.fabulich@yale.edu)
Date: Fri Dec 04 1998 - 15:48:09 MST
Eliezer S. Yudkowsky wrote:
>You have some people who say, "The choices are all the same; I'll do what
>seems best to me." At a higher level of self-awareness, you have: "I'll
>stick with the evolutionary system I was born in, they're all the same."
>At a higher level of self-awareness, you say: "Which new system you choose
>depends on how your current system evaluates that choice." Choices,
>systems, trajectories... but I want to jump out of the system and choose
>the real answer.

Will a "superintelligence" be able to "jump out of the system and choose
the real answer"? Just because you're superintelligent doesn't mean you're
transcendent.

Similarly, I think there's a strong argument that you can't "jump out of
the system" without ending up somewhere we can't even make probabilistic
guesses about truth and falsehood. Logic is a good example. We take it
that things generally follow logically from one another, but that depends
on certain principles: non-contradiction, rules of inference, and so on.
Jump too far out of such a system and you have no logic at all.

You make an apt point when you note that "which new system you choose
depends on how your current system evaluates that choice." But why do you
think superintelligences wouldn't be bound by such a rule?
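
For concreteness, here's a minimal sketch (Python, hypothetical names, not
anything Eliezer proposed) of the regress: an agent picking a successor
goal system can only rank the candidates with the goal system it already
has.

# Minimal sketch, hypothetical names: choosing a new goal system is
# itself a choice scored by the current goal system.

def choose_successor(current_value, candidates, forecast):
    # Rank candidate goal systems by how well their predicted
    # consequences score under the *current* values; there is no
    # neutral standpoint from which to do the ranking.
    return max(candidates, key=lambda c: current_value(forecast(c)))

def current_value(outcome):
    # Toy value function: cares about knowledge and survival.
    return outcome.get("knowledge", 0) + outcome.get("survival", 0)

def forecast(candidate):
    # Stand-in for the agent's model of what adopting the candidate
    # would lead to.
    return candidate["predicted_outcome"]

candidates = [
    {"name": "keep the evolved system",
     "predicted_outcome": {"knowledge": 5, "survival": 5}},
    {"name": "jump to the 'real answer'",
     "predicted_outcome": {"knowledge": 9, "survival": 2}},
]

print(choose_successor(current_value, candidates, forecast)["name"])

Whatever choose_successor prints, the verdict came from current_value --
which is the point: even a superintelligence weighing "the real answer"
has to weigh it with whatever values it is running at the time.
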
-Dan
-GIVE ME IMMORTALITY OR GIVE ME DEATH-