“When should a person cooperate,
and when should a person be selfish,
in an ongoing interaction with another person?”
(Flood on Java, Wikipedia image in the Public Domain)
This is the enticing question that opens Robert Axelrod’s book, “The Evolution of Cooperation”.
The Problem of Cooperation is clearly stated in chapter one:
“Under what conditions will cooperation emerge in a world of egoists without central authority?”
and framed in realistic terms:
“We all know that people are not angels,
and that they tend to look after themselves and their own first.
Yet we also know that cooperation does occur and that our civilization is based upon it.”
This problem has been studied by economists, sociologists, psychologists, game theorists, and military strategists for many years. The question has been formalized in the form of the Prisoner’s Dilemma game:
“In the Prisoner’s Dilemma game, there are two players.
Each has two choices, namely cooperate or defect.
Each one must make the choice without knowing what the other will do.
No matter what the other does, defection yields a higher payoff than cooperation.
The Dilemma is that if both defect, both do worse than if both had cooperated.”
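The dilemma can be made concrete with the classic payoff values used in Axelrod’s tournament: T(emptation) = 5, R(eward) = 3, P(unishment) = 1, S(ucker’s payoff) = 0. A minimal Python sketch:

```python
# The dilemma in numbers, using the classic payoffs T=5, R=3, P=1, S=0.
# Keys are (my move, your move); values are (my payoff, your payoff).
PAYOFF = {("C", "C"): (3, 3),   # mutual cooperation: both get R
          ("C", "D"): (0, 5),   # I cooperate, you defect: S for me, T for you
          ("D", "C"): (5, 0),   # I defect, you cooperate: T for me, S for you
          ("D", "D"): (1, 1)}   # mutual defection: both get P

# Whatever the other player does, defection pays more for me...
assert PAYOFF[("D", "C")][0] > PAYOFF[("C", "C")][0]   # 5 > 3
assert PAYOFF[("D", "D")][0] > PAYOFF[("C", "D")][0]   # 1 > 0
# ...and yet mutual defection leaves both worse off than mutual cooperation:
assert PAYOFF[("D", "D")] < PAYOFF[("C", "C")]          # (1, 1) < (3, 3)
```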
The term “Game” might be misleading, so let’s clarify that this is serious business: it illuminates interactions not only between human individuals and within bacterial populations, but also across species, and even between nations, particularly in times of war.
(Shihōnage Aikido technique, Wikipedia image under CC BY SA License)
Here are some properties of the game that are worth highlighting:
- The payoffs of the players need not be comparable at all.
- The payoffs certainly do not have to be symmetric.
- The payoffs of a player do not have to be measured on an absolute scale.
- Cooperation need not be considered desirable from the point of view of the rest of the world.
- There is no need to assume that the players are rational. They need not be trying to maximize their rewards. Their strategies may simply reflect standard operating procedures, rules of thumb, instinct, habits, or imitation.
- The actions that players take are not necessarily even conscious choices. There is no need to assume deliberate choice at all.
Axelrod convened a diverse group of people to develop computer programs to play the Prisoner’s Dilemma in a tournament, in order to study the strategies that could provide an answer to his opening question. The tournament was run twice, and it received entries from specialists in many different fields. The surprising answer is that the best strategy for the Prisoner’s Dilemma is TIT FOR TAT, also known as “Equivalent Retaliation.”
The TIT FOR TAT strategy is summarized as follows:
- In your first move you cooperate.
- In subsequent moves:
- If the other player cooperated in the previous move, you cooperate again in this move.
- If the other player defected in the previous move, you retaliate by defecting in this move.
- Only the actions of the previous move are taken into account.
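The rules above fit in a few lines of code. A minimal sketch (my own illustrative helper, with “C”/“D” standing for cooperate/defect):

```python
# A minimal sketch of TIT FOR TAT: cooperate first, then echo the other
# player's previous move. Only the most recent move is consulted.
def tit_for_tat(their_history):
    if not their_history:        # first move: cooperate
        return "C"
    return their_history[-1]     # otherwise mirror their last move

print(tit_for_tat([]))               # C  -- nice
print(tit_for_tat(["C", "D"]))       # D  -- provocable
print(tit_for_tat(["D", "D", "C"]))  # C  -- forgiving
```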
What accounts for TIT FOR TAT’s robust success as a strategy is its combination of being Nice, Provocable, Forgiving, and Clear:
(Endal, Wikipedia image in the Public Domain)
It is “Nice” because it starts by cooperating, and it reciprocates cooperation with cooperation:
“Its niceness prevents it from getting into unnecessary trouble”.
It is “Provocable” because it retaliates with defection to a previous defection by the other player:
“Its retaliation discourages the other side from persisting whenever defection is tried”
It is “Forgiving” because it only looks at the most recent move by the other player. All the rest of the history of interactions is not considered when deciding the current move:
“Its forgiveness helps restore mutual cooperation.”
It is “Clear” because it has simple rules that are consistent and easy to figure out by the other player after a short set of interactions:
“Its clarity makes it intelligible to the other player, thereby eliciting long term cooperation”.
Notice that all this makes sense under the assumption that the players will interact regularly, and that at their every move they know that there is a significant probability that they will meet and interact again.
This probability of future interaction is called “The Shadow of the Future,” and its size is a key parameter in the emergence of cooperative behaviour.
(Shadow of a Tree from http://www.flickr.com/photos/zest-pk/923930667/ CC BY License)
When we know that it is very likely that we will meet and interact with this player again, our motivation to cooperate increases, and our motivation to defect decreases.
“The great enforcer of morality in commerce is the continuing relationship,
the belief that one will have to do business again with this customer, or this supplier.“
“As long as the interaction is not iterated, cooperation is very difficult. That is why an important way to promote cooperation is to arrange that the same two individuals will meet each other again, be able to recognize each other from the past, and to recall how the other has behaved until now.”
Beyond ensuring that future interactions will happen, it is also important to increase how much those interactions matter; both enlarge the shadow of the future.
“Mutual cooperation can be stable if the future is sufficiently important relative to the present”
Enlarging the “shadow of the future” is essential for promoting cooperation. This can be done by
- Making the interactions more durable.
- Making the interactions more frequent.
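The effect of the shadow of the future can be sketched numerically. Using the discount/continuation parameter (called w in the book) and the classic payoffs T = 5, R = 3, P = 1, S = 0, compare what a player earns against a TIT FOR TAT opponent by always cooperating versus always defecting:

```python
# Sketch of how the "shadow of the future" (the continuation probability w)
# controls whether defection pays, against a TIT FOR TAT opponent.
#   always cooperating earns  R + w*R + w^2*R + ... = R / (1 - w)
#   always defecting earns    T + w*P + w^2*P + ... = T + w*P / (1 - w)
T, R, P, S = 5, 3, 1, 0

def cooperate_forever(w):
    return R / (1 - w)

def defect_forever(w):
    return T + w * P / (1 - w)

for w in (0.2, 0.6, 0.9):   # small vs large shadow of the future
    c, d = cooperate_forever(w), defect_forever(w)
    print(f"w={w}: cooperate={c:.2f}  defect={d:.2f}  "
          f"{'cooperation pays' if c > d else 'defection pays'}")
```

With these payoffs the crossover sits at w = (T − R)/(T − P) = 0.5: below it defection pays, above it sustained cooperation does.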
Beyond TIT FOR TAT’s robustness as a strategy, it also turns out that a small cluster of players who use TIT FOR TAT among themselves can invade and take over a population of non-cooperative “meanies”. In a typical setup:
“It took only 5 percent of one’s interactions to be with like-minded TIT FOR TAT players to make the members of this small cluster [of collaborators] do better than the typical defecting member of the [meanies] population.”
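The arithmetic behind this figure can be reproduced as a sketch, assuming the classic payoffs and a continuation probability w = 0.9 as in the book’s example:

```python
# Cluster-invasion arithmetic, as a sketch: T=5, R=3, P=1, S=0, w=0.9.
# A cluster member has a fraction p of its interactions with fellow
# TIT FOR TAT players, and the rest with always-defecting natives.
T, R, P, S = 5, 3, 1, 0
w = 0.9   # probability that an interaction continues for another round

tft_vs_tft   = R / (1 - w)           # endless mutual cooperation: ~30.0
tft_vs_mean  = S + w * P / (1 - w)   # one sucker payoff, then mutual defection: ~9.0
mean_vs_mean = P / (1 - w)           # what the defecting natives earn: ~10.0

p = 0.05  # only 5% of a cluster member's interactions are with its own kind
cluster_score = p * tft_vs_tft + (1 - p) * tft_vs_mean
print(cluster_score, mean_vs_mean)   # ~10.05 vs ~10.0: the cluster already does better
```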
This is an observation of fundamental importance, since it hints at why open-source communities are naturally fit for success. A small minority of collaborating individuals is sufficient to take over an entire population of uncooperative ones.
In evolutionary terms: near-term selfishness makes you weak and vulnerable.
Axelrod provides advice on how to approach cooperation:
“While foresight is not necessary for the evolution of cooperation, it can certainly be helpful.”
The advice takes the form of four simple suggestions for doing well in a durable, iterated Prisoner’s Dilemma:
- Don’t be envious. The game is not about scoring more than the opponent, but about maximizing your own score. Cooperation is not a zero-sum game, and that makes all the difference. This point is of great importance for Open Source communities, where one must let go of the desire to control how much others are benefiting from our work as free-riders.
- Don’t be the first to defect.
- Reciprocate both cooperation and defection.
- Don’t be too clever. Because that reduces the clarity of your strategy, confuses the other player, and diminishes the willingness for cooperation. “Too much strategy can appear to be total chaos. If you are using a strategy that appears random, then you also appear unresponsive to the other player. If you are unresponsive, then the other player has no incentive to cooperate with you. So being so complex as to be incomprehensible is very dangerous.”
A curious paradox of these rules is that TIT FOR TAT won the tournament without ever scoring better than any of its opponents. The trick is that it did, on average, better than all of the other strategies as the others got into retaliatory troubles among themselves, or were taken advantage of by other bullying strategies. TIT FOR TAT was able to mine for goodies at every opportunity presented for collaboration, and also retaliated with defection at every occasion when the opponent strategy attempted to take advantage of it.
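This paradox is easy to reproduce in a toy round-robin. The sketch below is my own, not Axelrod’s tournament code; ALL-C, ALL-D, and GRUDGER are illustrative stand-ins for the real entries:

```python
# A toy round-robin: each pair of entrants plays a 200-round match with the
# classic payoffs T=5, R=3, P=1, S=0. Strategies receive (own, opponent) history.
ROUNDS = 200
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def all_c(mine, theirs):       return "C"
def all_d(mine, theirs):       return "D"
def tit_for_tat(mine, theirs): return theirs[-1] if theirs else "C"
def grudger(mine, theirs):     # cooperates until the first defection, then never again
    return "D" if "D" in theirs else "C"

def play_match(s1, s2):
    h1, h2, score1, score2 = [], [], 0, 0
    for _ in range(ROUNDS):
        m1, m2 = s1(h1, h2), s2(h2, h1)
        p1, p2 = PAYOFF[(m1, m2)]
        h1.append(m1); h2.append(m2)
        score1 += p1; score2 += p2
    return score1, score2

entrants = {"ALL-C": all_c, "ALL-D": all_d,
            "TFT-1": tit_for_tat, "TFT-2": tit_for_tat, "GRUDGER": grudger}
totals = {name: 0 for name in entrants}
for i, (n1, f1) in enumerate(entrants.items()):
    for n2, f2 in list(entrants.items())[i + 1:]:
        a, b = play_match(f1, f2)
        totals[n1] += a; totals[n2] += b

# ALL-D outscores its partner in every single match, yet finishes last in the
# totals; TIT FOR TAT never wins a match, yet ends at the top of the table.
for name, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(name, total)
```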
Additional techniques for promoting cooperation include:
- Enlarging the “Shadow of the Future,”
- Changing the payoffs.
- Teaching people to care about the welfare of others.
- Teaching the value of reciprocity.
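“Changing the payoffs” can be illustrated with a tiny check: the game is only a dilemma while the ordering T > R > P > S holds, so an institution (a contract penalty, a fine, a law) that lowers the temptation payoff enough simply dissolves the dilemma. The numbers below are illustrative only:

```python
# The defining payoff ordering of the Prisoner's Dilemma.
def is_dilemma(T, R, P, S):
    return T > R > P > S

print(is_dilemma(5, 3, 1, 0))      # True: the classic game is a dilemma
print(is_dilemma(5 - 3, 3, 1, 0))  # False: a fine of 3 on defection removes it
```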
In the concluding chapter, Axelrod eloquently describes the impact of the studies presented in the book:
“Once the word gets out that reciprocity works, it becomes the thing to do.
If you expect others to reciprocate your defections as well as your cooperations,
you will be wise to avoid starting any trouble.
Moreover, you will be wise to defect after someone else defects,
showing that you will not be exploited.
Thus you too will be wise to use a strategy based upon reciprocity.
So will everyone else.
In this manner the appreciation of the value of reciprocity becomes self-reinforcing.
Once it gets going,…”
“…it becomes stronger and stronger.”
(Walking line of penguins, Salisbury Plain, and penguin colony, by Brian Gratwicke, under CC BY License)