The Concise Encyclopedia of Economics

Prisoners' Dilemma

by Avinash Dixit and Barry Nalebuff
The prisoners' dilemma is the best-known game of strategy in social science. It helps us understand what governs the balance between cooperation and competition in business, in politics, and in social settings.

In the traditional version of the game, the police have arrested two suspects and are interrogating them in separate rooms. Each can either confess, thereby implicating the other, or keep silent. No matter what the other suspect does, each can improve his own position by confessing. If the other confesses, then one had better do the same to avoid the especially harsh sentence that awaits a recalcitrant holdout. If the other keeps silent, then one can obtain the favorable treatment accorded a state's witness by confessing. Thus, confession is the dominant strategy (see Game Theory) for each. But when both confess, the outcome is worse for both than when both keep silent. The concept of the prisoners' dilemma was developed by Rand Corporation scientists Merrill Flood and Melvin Dresher and was formalized by a Princeton mathematician, Albert W. Tucker.

The prisoners' dilemma has applications to economics and business. Consider two firms, say Coca-Cola and Pepsi, selling similar products. Each must decide on a pricing strategy. They best exploit their joint market power when both charge a high price; each makes a profit of $10 million per month. If one sets a competitive low price, it wins a lot of customers away from the rival. Suppose its profit rises to $12 million, and that of the rival falls to $7 million. If both set low prices, the profit of each is $9 million. Here, the low-price strategy is akin to the prisoner's confession, and the high-price akin to keeping silent. Call the former cheating, and the latter cooperation. Then cheating is each firm's dominant strategy, but the result when both "cheat" is worse for each than that of both cooperating.
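To see the logic in miniature, the following sketch (in Python, with the price labels chosen here for illustration) encodes the monthly profits above and checks that the low price is dominant for each firm, even though both firms earn less when both choose it:

```python
# Monthly profits (in $ millions) from the Coca-Cola/Pepsi example above.
# Each entry maps (own price, rival's price) to own profit;
# "high" is the cooperative price and "low" is the cheating price.
payoffs = {
    ("high", "high"): 10,
    ("high", "low"):   7,
    ("low",  "high"): 12,
    ("low",  "low"):   9,
}

# "low" is a dominant strategy: it earns more whatever the rival charges.
for rival_price in ("high", "low"):
    assert payoffs[("low", rival_price)] > payoffs[("high", rival_price)]

# Yet the dominant-strategy outcome pays each firm less than joint cooperation.
assert payoffs[("low", "low")] < payoffs[("high", "high")]
```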

Arms races between superpowers or local rival nations offer another important example of the dilemma. Both countries are better off when they cooperate and avoid an arms race. Yet the dominant strategy for each is to arm itself heavily.

On a superficial level the prisoners' dilemma appears to run counter to Adam Smith's idea of the invisible hand. When each person in the game pursues his private interest, he does not promote the collective interest of the group. But often a group's cooperation is not in the interests of society as a whole. Collusion to keep prices high, for example, is not in society's interest because the cost to consumers from collusion is generally more than the increased profit of the firms. Therefore, companies that pursue their own self-interest by cheating on collusive agreements often help the rest of society. Similarly, cooperation among prisoners under interrogation makes convictions more difficult for the police to obtain. One must understand the mechanism of cooperation before one can either promote or defeat it in the pursuit of larger policy interests.

Can "prisoners" extricate themselves from the dilemma and sustain cooperation when each has a powerful incentive to cheat? If so, how? The most common path to cooperation arises from repetitions of the game. In the Coke—Pepsi example, one month's cheating gets the cheater an extra $2 million. But a switch from mutual cooperation to mutual cheating loses $1 million. If one month's cheating is followed by two months' retaliation, therefore, the result is a wash for the cheater. Any stronger punishment of a cheater would be a clear deterrent.

This idea needs some comment and elaboration:

    1. The cheater's reward comes at once, while the loss from punishment lies in the future. If players heavily discount future payoffs, then the loss may be insufficient to deter cheating. Thus, cooperation is harder to sustain among very impatient players (governments, for example).

    2. Punishment won't work unless cheating can be detected and punished. Therefore, companies cooperate more when their actions are more easily detected (setting prices, for example) and less when actions are less easily detected (deciding on nonprice attributes of goods, such as repair warranties). Punishment is usually easier to arrange in smaller and closed groups. Thus, industries with few firms and less threat of new entry are more likely to be collusive.

    3. Punishment can be made automatic by following strategies like "tit for tat," which was popularized by University of Michigan political scientist Robert Axelrod. Here, you cheat if and only if your rival cheated in the previous round (see the sketch after this list). But if rivals' innocent actions can be misinterpreted as cheating, then tit for tat runs the risk of setting off successive rounds of unwarranted retaliation.

    4. A fixed, finite number of repetitions is logically inadequate to yield cooperation. Both or all players know that cheating is the dominant strategy in the last play. Given this, the same goes for the second-last play, then the third-last, and so on. But in practice we see some cooperation in the early rounds of a fixed set of repetitions. The reason may be either that players don't know the number of rounds for sure, or that they can exploit the possibility of "irrational niceness" to their mutual advantage.

    5. Cooperation can also arise if the group has a large leader, who personally stands to lose a lot from outright competition and therefore exercises restraint, even though he knows that other small players will cheat. Saudi Arabia's role of "swing producer" in the OPEC cartel is an instance of this.
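Points 1 and 3 can be made concrete with a small simulation of the repeated pricing game against a tit-for-tat rival. The sketch below is illustrative only; the 24-month horizon and the 0.9 monthly discount factor are assumptions, not figures from the example.

```python
# A sketch of the repeated Coke-Pepsi pricing game against a tit-for-tat rival.
# Profits (in $ millions per month) come from the example above; the horizon
# and discount factor are illustrative assumptions.

PAYOFFS = {  # (own move, rival's move) -> own profit; "C" = high price, "D" = low price
    ("C", "C"): 10, ("C", "D"): 7, ("D", "C"): 12, ("D", "D"): 9,
}

def tit_for_tat(rival_history):
    """Cooperate in the first month, then copy the rival's previous move."""
    return "C" if not rival_history else rival_history[-1]

def discounted_profit(own_moves, rival_moves, delta):
    """Present value of the monthly profit stream with discount factor delta."""
    return sum(delta ** t * PAYOFFS[(own, rival)]
               for t, (own, rival) in enumerate(zip(own_moves, rival_moves)))

def play_against_tit_for_tat(strategy, months=24, delta=0.9):
    """Return the discounted profit of `strategy` when the rival plays tit for tat."""
    own_moves, rival_moves = [], []
    for _ in range(months):
        own_moves.append(strategy(rival_moves))           # sees the rival's past moves
        rival_moves.append(tit_for_tat(own_moves[:-1]))   # rival sees our past moves
    return discounted_profit(own_moves, rival_moves, delta)

always_cooperate = lambda history: "C"
always_cheat = lambda history: "D"

print("cooperate every month:", round(play_against_tit_for_tat(always_cooperate), 1))
print("cheat every month:    ", round(play_against_tit_for_tat(always_cheat), 1))
```

Against tit for tat, a firm that cheats every month pockets the one-time $12 million but then earns the mutual-cheating profit of $9 million thereafter, and with a discount factor of 0.9 its discounted profit falls short of steady cooperation. Lower the discount factor enough, roughly below two-thirds with these payoffs, and the comparison reverses; that is the impatience described in point 1 at work.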

About the Author

Avinash Dixit is the John J. Sherrerd Professor of Economics at Princeton University. Barry Nalebuff is the Milton Steinbach Professor of Management at Yale University's School of Organization and Management.

Further Reading

Introductory

Axelrod, Robert. The Evolution of Cooperation. 1984.

Dixit, Avinash, and Barry Nalebuff. Thinking Strategically: A Competitive Edge in Business, Politics, and Everyday Life. 1991.

Hofstadter, Douglas. "Metamagical Themas." Scientific American (May 1983): 16-26.

Rapoport, Anatol, and A. M. Chammah. Prisoners' Dilemma. 1965.

Advanced

Kreps, David, Paul Milgrom, John Roberts, and Robert Wilson. "Rational Cooperation in the Finitely Repeated Prisoners' Dilemma." Journal of Economic Theory 27, no. 2 (August 1982): 245-52.

Milgrom, Paul. "Axelrod's The Evolution of Cooperation." Rand Journal of Economics 15, no. 2 (Summer 1984): 305-9.
