From: Daniel Fabulich (daniel.fabulich@yale.edu)
Date: Mon Jul 13 1998 - 19:17:40 MDT
On Mon, 13 Jul 1998, Joe Jenkins wrote:
> I might like to reply to your response if I only understood it. Like
> I said, I'm new to the subject of ethics and even worse its been a
> while since I read Axelrod's Evolution of Cooperation. I don't
> remember non iterated games and so I have no idea what "keep silent"
> means. Please afford me some enlightenment. Does it mean don't play
> games that are non-iterated?
Er, no. I was referring to the Prisoner's Dilemma, a classic game from
the perspective of game theory and an interesting situation from the
perspective of rational philosophy.
Suppose you had two prisoners about to go on trial for some atrocious
crimes. Suppose neither prisoner can communicate with the other. Imagine
that the prosecutor offers each a deal: they can either confess to the
crime and incriminate their partner (aka "defect") or keep silent. The
deal is arranged this way: if both prisoners choose to keep silent, then
both will spend a short while in prison but will eventually go free.
However, if one defects while the other keeps silent, then the defector
may go free, but the one who keeps silent is killed. Finally, if they
both defect, they both spend life in prison.
Usually, when we think about this game in the context of game theory,
we give each of the consequences explicit payoffs; i.e., I get 2 points if I
defect and you keep silent, I get 1 point if we both keep silent, I lose 1
point if we both defect, and I lose 2 points if I keep silent while you
defect. Situations like this one arguably happen fairly regularly, though
the number of situations under which the payoffs are so arranged is
probably overstated. However, it seems reasonable that there are a
variety of situations under which you may gain at the expense of others;
egoism would dictate that you defect, because defection is the better choice
for you no matter what the other player does. However, if both players play
this strategy, both lose 1 point (though each player would have lost even
more had they kept silent while the other defected). On the other hand, if
both players keep silent, both gain a point, yet each player could have
gotten more, given the other player's choice, had they defected.
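For concreteness, here's a minimal Python sketch of that payoff scheme.
The point values are the ones given above; the table layout and names are
just my own illustration:

    # Moves: "S" = keep silent, "D" = defect.
    # PAYOFFS[(my move, your move)] = (my points, your points)
    PAYOFFS = {
        ("S", "S"): (1, 1),    # both keep silent: each gains a point
        ("D", "S"): (2, -2),   # I defect, you keep silent: I gain 2, you lose 2
        ("S", "D"): (-2, 2),   # I keep silent, you defect: I lose 2, you gain 2
        ("D", "D"): (-1, -1),  # both defect: each loses a point
    }

Notice that defecting earns more than keeping silent no matter what the
other player does (2 > 1 and -1 > -2), which is exactly why egoism dictates
defection in the one-shot game.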
This game gets interesting when we have to play it more than once (we use
the point system rather than the prison sentences, in this case).
Contests have been held to try to come up with the optimal strategy in
iterated versions of the game; i.e., the best strategy when you must play
against the same player multiple times. The undisputed winner, to the
best of my knowledge, is Tit for Tat: keep silent on the first game, then
do whatever your opponent did last game in further iterations. So if I'm
playing against Tit for Tat, and I keep silent on the first round, Tit for
Tat will keep silent on the second round. If I defect on the second
round, Tit for Tat will defect on the third; etc.
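To make the iterated setup concrete, here is a rough Python sketch of Tit
for Tat in an iterated game, reusing the PAYOFFS table from the sketch
above. The strategy and helper names are my own illustration, not anything
from Axelrod's actual tournament code:

    def tit_for_tat(opponent_moves):
        # Keep silent on the first round, then copy the opponent's last move.
        return "S" if not opponent_moves else opponent_moves[-1]

    def always_defect(opponent_moves):
        return "D"

    def play(strategy_a, strategy_b, rounds=10):
        # Play an iterated game; return each player's total score.
        score_a = score_b = 0
        seen_by_a = []  # moves B has made so far
        seen_by_b = []  # moves A has made so far
        for _ in range(rounds):
            move_a = strategy_a(seen_by_a)
            move_b = strategy_b(seen_by_b)
            points_a, points_b = PAYOFFS[(move_a, move_b)]
            score_a += points_a
            score_b += points_b
            seen_by_a.append(move_b)
            seen_by_b.append(move_a)
        return score_a, score_b

    print(play(tit_for_tat, tit_for_tat))    # (10, 10): cooperation every round
    print(play(always_defect, tit_for_tat))  # (-7, -11): one sucker payoff, then mutual defection

Against itself, Tit for Tat collects the mutual-cooperation payoff every
round, while Always Defect only beats it head to head by the single
first-round sucker payoff.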
The fact that Tit for Tat is the undisputed winner in terms of points
seems to imply that it is the optimal strategy from the egoistic
standpoint; this is contrary to our analysis above of the non-iterated
version. More interesting, IMO, is the fact that it also seems to be
compatible with utilitarianism, particularly when two Tit for Tats are
playing against each other.
Evolutionary stability only applies to the iterated versions of the games
I described above. While "Always Keep Silent" is certainly compatible
with itself, if ever an "Always Defect" player entered the game, it would
clear out the silent players; it would not fare so well against Tit for Tat,
though: while Always Defect would eke out a slightly better score than Tit
for Tat head to head, it does poorly against itself, whereas Tit for Tat
excels against itself. That means Tit for Tat would reproduce a lot more in
an evolutionary scenario, though there would always remain some equilibrium
fraction of "Always Defect"ors in most populations. Interesting, no?
Anyway, as I said, all this means is that in most cases utilitarianism and
egoism will agree on Tit for Tat. The question then becomes: what strategy
should we take for the NON-iterated games, games in which you will not
play against the same person again, or in which you will not know against
whom you are playing? Egoism will say "always defect" in these
situations; utilitarianism will say "keep silent."
If egoism is rational, then it is rational for both players; yet egoism
would demand suboptimal consequences according to its own value system:
the egoistic players find themselves worse off than they would have been
otherwise. If we agree that rationality, at least in part, involves doing
what is necessary in order to get the optimal consequences (where the
"optimal" consequences are determined by one's value system), then egoism
dictates that the way to fulfill the ends of egoism is to reject egoism;
in other words, it is *not* rational to be an egoist, because it leaves
the players worse off than they would have been had they been utilitarians.