
In this post I am going to briefly summarize a study recently published in PNAS by Ori Weisel and Shaul Shalvi entitled “The collaborative roots of corruption”. Cooperation has indisputable beneficial effects, but in this particular case the researchers were interested in its negative effects, and to that end they designed the following game: two people, A and B, each alone and in complete isolation, have to throw a die. The procedure and payments are as follows: A privately rolls her die and reports the result; B learns A’s report, then privately rolls his own die and reports in turn. If the two reports coincide (a “double”), each player is paid the reported number in euros; otherwise, nobody earns anything.

Now, the thing is: if these two players are isolated, and we experimental economists never lie to our subjects, so we cannot surreptitiously take a peep into their booths, how are we to tell if and how often they lie? Well, in the end it is no big deal: we can use simple maths to compute the probabilities of the different outcomes and compare them with the reported results. If both tell the truth, the chance of a double (i.e., that both players report the same number) is 1/6: A obtains any number, and then B has a 1/6 chance of throwing the same one. As the experimenters had each couple play 20 rounds, they should report, on average, 20 × 1/6 ≈ 3.33 doubles. On the other hand, as all the numbers are equally likely, the average payment per double should be (1+2+3+4+5+6)/6 = 3.5 €. So much for honesty: the average number of doubles reported was 16.3 (yes, about five times more than what should have been observed randomly), while the average payment was close to 5 €. The next figure compares the actually observed outcomes, collected from all the reports, with simulations of purely random dice throws.
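If you want to check the honest-play baseline yourself, a quick simulation reproduces it (a sketch, not code from the study; the function name and number of simulated couples are my own choices):

```python
import random

def honest_game(rounds=20, trials=50_000, seed=42):
    """Simulate honest play: in each round, A and B each roll a fair
    die independently. A 'double' occurs when the two rolls match, and
    the payment on a double equals the rolled number (in EUR)."""
    rng = random.Random(seed)
    total_doubles = 0
    payments = []
    for _ in range(trials):          # each trial = one couple playing 20 rounds
        for _ in range(rounds):
            a = rng.randint(1, 6)
            b = rng.randint(1, 6)
            if a == b:
                total_doubles += 1
                payments.append(a)
    avg_doubles = total_doubles / trials          # should approach 20 * 1/6 = 3.33
    avg_payment = sum(payments) / len(payments)   # should approach 21/6 = 3.5
    return avg_doubles, avg_payment
```

Running it yields roughly 3.33 doubles per 20-round game and an average payment per double of about 3.5 €, far from the 16.3 doubles and ~5 € actually reported by the subjects.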

As we can see, the observations cluster on the diagonal (i.e., where the doubles are) and towards high numbers (which yield better payoffs), and are obviously different from what we would have found if the players had actually thrown their dice honestly. The clear conclusion is that both players are cheating, hidden in the privacy of their booths, and that they cooperate to lie to us. Indeed, B lies by reporting the same number as A far too often, and A lies by reporting high numbers far too often. In fact, some of them are complete liars: 25% of the A players report a 6 on every throw, while 50% of the B players always report the same number as A.

In other treatments, the researchers tried to gain more insight into the factors influencing this behavior. To that end, they studied what happens when B’s payment is fixed, i.e., he always receives 1 € no matter what he reports. Even then, B still lies in order to help A, although doing so neither benefits nor harms him. Of course, if the benefits to the B players are raised or lowered, their willingness to cheat increases or decreases accordingly. As a matter of fact, what B does is very much conditioned by A, as shown by the fact that when the incentives of A, rather than B, are changed, the changes in the lying pattern are basically the same, which is surprising. B players are so conditioned by A’s behavior that when A is brazen, i.e., always reports a 6, B always reports a double, whereas when A lies less, B is also much less likely to lie. Finally, and this is very interesting, when a single person throws both dice, she also lies, but to a lesser extent than when the game is played collaboratively, as if the presence of another crook made it easier to “cross over to the dark side”…

Thus, the conclusion the experimenters reach is quite depressing:

The current work reveals a dark side of cooperation: corrupt collaboration. A collaborative setting led people to engage in excessive dishonest behavior. The highest levels of corrupt collaboration occurred when the profits of both parties were perfectly aligned, and were reduced when either player’s incentive to lie was decreased or removed. These results suggest that acts of collaboration, especially on equal terms, constitute “moral currencies” in themselves, which can offset the moral costs associated with lying. Paradoxically, the corrupt corporate culture and brazen immoral conduct at the roots of recent financial scandals (34) are possibly driven not only by greed, but also by cooperative tendencies and aligned incentives. In conclusion, when seeking to promote collaboration in our organizations and society, we should take note that in certain circumstances cooperation should be monitored, rather than encouraged unambiguously.

Does this sound familiar? Does it sound like a contractor and a town hall officer colluding to steal some public money together? (Why on earth have I come up with this example? Could it perchance have anything to do with the pre-crisis situation in Spain? What an amazing coincidence that would be!) As Weisel and Shalvi say, those guys may be greedy, true, but on the other hand their incentives are truly aligned, and each of them sees in the other’s lack of morality the perfect excuse to become a crook… On the other hand, the results also shed light on what organizations should do:

(…) even if the absolute incentive to lie and report a double remains unchanged, earning at least something rather than nothing, in case of failure to report a double, can reduce the likelihood of brazen lying and may limit the emergence of corrupt collaboration. From the point of view of an organization seeking to reduce corrupt behavior, assuring a decent base salary that does not depend on performance can reduce the likelihood that its employees engage in brazen lying.

Once again, this kind of advice points to one of the main uses of behavioral science, and of experimental economics in particular: helping us steer people’s behavior in better directions. President Obama himself, following the example of the UK government’s Behavioural Insights Team, has issued an executive order directing all federal agencies to begin using behavioral insights in their programs. Let’s hope we can contribute to cutting down corruption everywhere…
