Rousseau, Hobbes, and Locke all meditated on the development of social contracts that they considered necessary for people to operate in large societies. Game theory gives scientists a chance to test some of these ideas with hard data. By having people play anonymous games with money, researchers found that people from larger societies, ones that are more integrated into the market, are more likely to be fair in anonymous dealings; these same people are more willing to punish others when they are unfair. These findings suggest that fairness and punishment in dealings with strangers are largely learned behaviors, and that we need these norms and institutions to prevent our communities from fragmenting.

Until about ten thousand years ago, localized groups probably had fairly limited contact with more distant human populations. Fast forward a few thousand years, and large, complex, cooperative societies had become prevalent. Scientists have long been uncertain what facilitated the social changes that allowed people to feel comfortable trading with others they hardly knew.

Why should I be nice to you?

Some psychologists thought large societies were a natural outgrowth of the sort of behavior people apply to kin and close friends. Extending this comfort to new acquaintances, adding them to a roster of trusted business partners, could help establish large, loosely connected networks. Another theory was that humans slowly established societal norms and agreements under which they could feel comfortable dealing with strangers—less natural to their psychology, but necessary.

The notion of fairness between two strangers is an interesting behavioral question—what motivates someone to be fair to a person they don't know? In small groups where everyone knows each other, like families, people are motivated to be fair because they expect fairness in return. But in large-scale groups, there are no reputation or reciprocation obligations between strangers.

Punishment can act as a means of enforcing fair behavior. While reputation damage is generally considered a serious enough punishment to keep people from exploiting others in small societies, it doesn't work fast enough in large ones. So it can be equally important to know how we started punishing each other, and how punishment correlated with society size.

Hating the player: fairness and punishment in money games

To study these things, researchers sampled the attitudes of people from many different societies. Some were hunter-gatherers and foragers from less-developed regions, representing small-group social environments, and others were from highly developed, industrialized countries.

Researchers gathered data on community size, the religion of each person (Islam, Christianity, tribal/none, etc.), and their level of market integration—in this case, a measure of how much food was obtained through the market versus what was obtained by the person's own means, such as through farming or fishing. They also took data on wealth, education, and so on to control for confounding variables.
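The market-integration measure described above boils down to a simple ratio. A minimal sketch (the function name and calorie-based framing are illustrative assumptions, not the study's exact formula):

```python
def market_integration(purchased_calories, total_calories):
    """Fraction of a household's food energy obtained through market
    purchase rather than self-provisioning (farming, fishing, foraging).

    Illustrative only: the actual study's measure may differ in detail.
    """
    if total_calories <= 0:
        raise ValueError("total_calories must be positive")
    return purchased_calories / total_calories

# A household buying 1,200 of its 2,000 daily calories would score 0.6,
# i.e., fairly market-integrated.
print(market_integration(1200, 2000))
```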

Participants were tested with three games, all involving divisions of money via anonymous dealings that tested fairness and punishment. The first was the Dictator Game, which measures fairness. One player has to divide a sum of money between himself and another player—the more money he gives the other player, the fairer the offer.

The second was the Ultimatum Game, which added a small element of punishment. Again, the first player had a sum of money to divide between himself and another, but the second player could choose to reject the offer, causing both players to receive no money at all.

The third game, the Third-Party Punishment Game, involves three players. Again, the first player divides money between himself and a second player, but this time a third player receives a separate payment of half the total sum. The third player gets to see how much the first player offers the second, and can choose to spend some of his own money to punish the first player by making him receive less.
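The payoff rules of the three games can be sketched as simple functions. The specific amounts and the 3:1 fine multiplier in the third game are assumptions for illustration (punishment ratios varied across implementations of these games), but the structure matches the descriptions above:

```python
def dictator_game(stake, offer):
    """Player 1 splits the stake; Player 2 has no say.
    Returns (player1_payoff, player2_payoff)."""
    return stake - offer, offer

def ultimatum_game(stake, offer, accepted):
    """Player 2 may reject the offer, leaving both with nothing."""
    if accepted:
        return stake - offer, offer
    return 0, 0

def third_party_punishment_game(stake, offer, punish):
    """Player 3 holds half the stake and may spend part of it to fine
    Player 1. The fine size (10% of stake) and the 3:1 multiplier are
    illustrative assumptions, not the study's exact parameters.
    Returns (player1, player2, player3) payoffs."""
    p1, p3 = stake - offer, stake // 2
    if punish:
        spent = stake // 10           # what punishment costs Player 3
        p1 = max(0, p1 - 3 * spent)   # assumed 3:1 fine multiplier
        p3 -= spent
    return p1, offer, p3
```

Note the incentive structure this creates: punishing is costly to the punisher, so a purely self-interested third player would never do it—which is exactly what makes observed punishment interesting.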

Loving the game: how society affects us

From the game data, scientists found that the type of community a person came from had a large influence on their behavior in these anonymous dealings. A person who was more integrated in a market community offered more money to others, showing that people more familiar with conduct in a large market are more likely to be fair to people they don't know. Participation in religion also corresponded with an increase in fairness.

On the punishment side, scientists measured minimum acceptable offers—how low an offer could get before a player would decide to punish the person making it. They found that the minimum acceptable offer increased with the size of the punishing player's community, though the increase eventually tapered off, following a roughly logarithmic curve. They found no relationship between market integration and punishment, although religious people were much more likely to punish other players.

Researchers think this data supports the notion that, as societies grow, increasingly costly punishments are necessary to sustain them and prevent the group from breaking into smaller populations again. They also point out that people from small, less integrated, and less religious societies are good analogs of our pre-large-society-forming selves, and that they are relatively unconcerned with either being fair to strangers or punishing them.

This calls into question the idea that social interactions in large societies stem entirely from an innate willingness to build up our Rolodexes and treat new strangers like old friends. According to this data, that willingness is heavily mediated by the establishment and internalization of institutions and norms that handle the rules of trade for us. While social contracts are, to a degree, natural (we did come up with them, after all), they are not habitual. The researchers caution that future behavioral studies should be careful about attributing these apparently learned behaviors to human nature.

Science, 2010. DOI: 10.1126/science.1182238 (About DOIs).