> I agree with this. The big question is what invariants are needed to
> achieve intelligence, and if emotions are part of them, specific just
> to intelligences in a certain kind of environment/evolutionary past or
> just one possibility among many.
I don't think we are really discussing 'intelligence' here. Intelligence
for limited problem domains already exists in machine-chess-playing
systems, expert systems, and many lower animals.
We're discussing the more nebulous aspects of human-like consciousness:
free will, imagination, adaptability, creativity, initiative... And
emotions do play a vital role in those.
> > Well, I *don't want* AIs to resemble humans with their complex emotion-
> > based value systems. We need obedient servants, not competition. So if
Obedient servants we already have. That's not what we need. Unless they
have some capability to come up with original ideas and creations, they
won't be much of a step up.
Hiro