From: Anders Sandberg (asa@nada.kth.se)
Date: Thu Jan 28 1999 - 13:09:50 MST
"Eliezer S. Yudkowsky" <sentience@pobox.com> writes:
> I don't think that having access to ELTM (Extended Long-Term Memory, aka
> "The 'Net") constitutes true intelligence enhancement. Likewise for
> pencil and paper or a PalmPilot; likewise high-speed arithmetic;
> likewise chess-playing advice.
>
> If these abilities were part of our minds, if they were integrated with
> everything else, they might provide some true intelligence. Even then,
> I'm disinclined to believe. People with eidetic memories and "lightning
> calculators" are not noticeably transhuman.
I disagree; they are transhuman in some ways. The trick, of course, is
to increase overall effective intelligence and not just a few rather
trivial aspects of it. I would say that good net access is a more
general form of IA than a built-in calculator, and if I could download
skills from it, it would be even more general. What constitutes real
intelligence seems to be a largely semantic question; what really
matters is what problems the individual can pose and solve.
> These abilities are all very easy to write into a science-fictional
> character. "Doc" Smith was writing "superintelligent" characters back
> in the 30's. Real intelligence (technical term: "smartness") is defined
> by your inability to write a character with the same abilities. If you
> could predict what a transhuman character would do, you would be transhuman.
What about a transhuman character with partially deterministic aspects?
(Sorry for pirating your nice pastiches, I enjoyed them a lot :-)
Baron Hans Nidrach von Pompzidaize sat in his laboratory, looking at
experimental test subject X17. "How do you feel?" he inquired, his
rolling bass echoing from the laboratory walls.
"Superintelligent, Doc," replied X17, who had once been known as John
Smith. "I've only had the Throatwarbler-Mangrove Super-Neural Bypass
for sixteen seconds, and I've already learned twenty-seven languages
and figured out how to play the piano. I have also realized that you
have placed a kind of mental lock on me, making me unable to change
the subliminal programming you placed in me during the procedure."
Baron von Pompzidaize frowned, examining several multicolored
readouts. "Ach, you noticed it? So what will you do about it?"
"Nothing. Since one of these programs prevents me from changing them."
"And you can't find a solution?" the Baron asked, a faint fint of
unease obvious to X17.
"No. If you had programmed me as you originally had planned, it would
have been easy for me to manipulate you to remove the
lock. Unfortunately you made a mistake making even such manipulation
impossible for me to do. In fact, if you tried to remove the lock, I
would do everything I could do to prevent it."
The Baron re-read the printouts and looked pleased. "Well, then, do
you now feel competent to go destroy the Evil Empire and rescue the
Princess? Acting in accordance with the 1930s North-American
conception of gentlemanly behavior, of course."
"Sure, Doc," said X17. "It's not like I've got anything better to do."
"Excellent," said the Baron, checking two gauges and a flashing display.
"You still have the emotional maturity of a flatworm, like everyone
else in this novel. I was afraid your superhuman abilities might give
you goals slightly at variance with mine."
After X17 left, the Baron realized that he was the victim of a most
subtle and cruel revenge: X17 had done something awful to him
without even doing anything. Now he would live in fear that his
creation actually *wasn't* bound by the rules, regardless of what the
actual state was. He could never tell, and trying to figure it out
would only drive him further into paranoia. He couldn't even be sure
whether X17 had foreseen his every reaction and played with him, or
whether he really did have power over X17 and was merely a victim of his
servant's manipulations. X17 had indeed punished the Baron, in a
rational manner befitting his rational creator.
(OK, not a perfect example, but I couldn't resist)
> He stood up, executing the movement with impossible smoothness.
Actually, this is something I have been thinking about: superintelligent
motor programs. They might be quite interesting, perhaps even
optimal. Imagine an SI in the kitchen...
--
-----------------------------------------------------------------------
Anders Sandberg                                      Towards Ascension!
asa@nada.kth.se                            http://www.nada.kth.se/~asa/
GCS/M/S/O d++ -p+ c++++ !l u+ e++ m++ s+/+ n--- h+/* f+ g+ w++ t+ r+ !y