From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Jun 29 2001 - 21:44:09 MDT
"J. R. Molloy" wrote:
>
> Maybe so, but suspended judgement seems appropriate here.
>
> If GAC helps to develop AI more consistent with your expectations and
> requirements, then all is well and good. But it doesn't hurt to be careful. So
> why not test the beast to see how friendly it is _before_ it even comes close
> to AI? After all, if we can't test a chatbot, how can we ever hope to test a
> full-blown AI?
Molloy, GAC has nothing to do with AI. Zero, zip, nada. It is not on the
track to seed AI or Friendliness. There is no point at all to testing
it. If one person enters the pixel "Is it Friendly to help humans? Yes."
and someone else later enters the pixel "Is it Friendly to turn humans
into sawdust? Yes.", GAC will see no contradiction. GAC is dumber than
Cyc, dumber than Eliza, dumber than an SQL database, dumber than a
tic-tac-toe search tree, and on roughly the same intellectual level as a
sack of nails. GAC defines a new nadir of AI. It is a whole quantum
level dumber than the worst AI programs of the 1950s while sacrificing
nothing in hype.
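[The contradiction described above can be sketched as a flat statement store. This is a hypothetical toy for illustration, not GAC's actual code: it just maps free-text questions to yes/no votes, with no model of meaning that could ever detect a conflict.]

```python
# Toy sketch (hypothetical, not GAC's actual implementation): a flat
# store mapping each "pixel" (a free-text statement) to a yes/no answer.
# Nothing checks a new entry against existing ones for consistency.
pixels = {}

def add_pixel(statement, answer):
    """Store a statement/answer pair with no semantic analysis at all."""
    pixels[statement] = answer

add_pixel("Is it Friendly to help humans?", "Yes")
add_pixel("Is it Friendly to turn humans into sawdust?", "Yes")

# Both contradictory pixels coexist; the store has no representation
# of meaning, so there is no contradiction to "see".
for statement, answer in pixels.items():
    print(statement, answer)
```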
-- -- -- -- --
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Sat Nov 02 2002 - 08:08:22 MST