From: Stathis Papaioannou (stathisp@gmail.com)
Date: Wed Jun 11 2008 - 18:58:47 MDT
2008/6/12 John K Clark <johnkclark@fastmail.fm>:
Me:
>> If there is one [a supergoal for humans], it could easily change, but it
>> is not necessary to program a computer this way.
>
> It is if you want the computer to be intelligent. At least that's what I
> think, and evolution agrees with me.
Evolution might not favour foolish goals or fixed goals, but that does
not necessarily have anything to do with intelligence. If a person
wants to kill himself, the significance of his being intelligent is
that he is more likely to overcome obstacles put in his path by less
intelligent people who see this as a bad idea. This could even lead to
a situation in which being less intelligent and suicidal is more
adaptive than being more intelligent and suicidal, if these are the
only two options.
-- Stathis Papaioannou