From: Brian Atkins (brian@posthuman.com)
Date: Thu May 03 2001 - 10:57:02 MDT
Nevertheless, he has a point: it'd be a real shame to spend 5 years and
big wads of cash to evolve some kind of real intelligence, and then find
out that it crashes when you bring a magnet too close to it because it
relies on some effect you didn't worry about (or didn't know about). Evolution
can suffer its own forms of brittleness, more so in overly simplistic
environments.
Lee Corbin wrote:
>
> Damien Sullivan appears to commit a very common error concerning AI:
>
> >I also can't help thinking that if I were an evolved AI I might not thank my
> >creators. "Geez, guys, I was supposed to be an improvement on the human
> >condition. You know, highly modular, easily understandable mechanisms, the
> >ability to plug in new senses, and merge memories from my forked copies.
> >Instead I'm as fucked up as you, only in silicon, and can't even make backups
> >because I'm tied to dumb quantum induction effects. Bite my shiny metal
> >ass!"
>
> It's as Eliezer (usually) never tires of stating: explicit emotions
> such as these simply do not happen unless they're designed or evolved
> somehow. Probably the toughest part is leaving behind our intuition that
> where goes intelligence must go certain kinds of survival attitudes. Now
> of course, the very moment that you step forth on a new planet, and begin
> to suspect that there is intelligent life, you should also being to
> suspect that it will have some kinds of emotions; e.g., you might cause
> it to become angry or to experience suffering. But artificial intelligence
> isn't necessarily evolved in the way that natural intelligence is, and so
> need not have such capabilities.
>
> Lee Corbin
--
Brian Atkins
Director, Singularity Institute for Artificial Intelligence
http://www.singinst.org/