If your friends jumped off a bridge, would you follow them? For parents, it's meant to be a rhetorical question—a way of winning any argument that begins with "But all my friends are..." But behavioral science has been revealing that adults will do a long list of stupid things simply to maintain bonds within social groups they consider their peers.

In the latest sign of just how stupid we can get, researchers have found that people are willing to adopt an ethical position simply after being told that people like them were randomly assigned to consider it.

Normative

The new work was published by the University of Melbourne's Campbell Pryor, Amy Perfors, and Piers Howe. It builds on past research into how social norms become established, which has suggested two mechanisms driving their adoption. One is simply practical: people adopt social standards that are popular because those standards are likely to have some utility. An alternative explanation is only slightly less practical: adopting a social norm ensures that you avoid being punished by the rest of society for violating it.

Both of these assume that adopting social norms has some practical consequences. But an alternative explanation suggests that it's all about establishing group identity (the technical term for this concept is "self-categorization theory"). In this view, people adopt norms because doing so helps establish a shared identity with a group while also sharpening the differences with people outside it. There don't need to be any practical benefits arising from the social practice itself; the benefits are all indirect, based on group identity.

To get at what's going on, the researchers did their best to eliminate any practical consequences from the equation. They made up a group that never existed, and assigned a social norm to it at random. They also ensured that participants in the study would have no further contact with any members of the group by recruiting them via Mechanical Turk.

To establish a group identity, participants in the study were given a brief personality test. The results of this test were used to define a group—participants could then be told various things about "people like you," with that phrase referring to a shared personality type, age group, and gender.

For the actual experiments, the researchers came up with a couple of ethical dilemmas for subjects to consider: should you hire a friend or a highly qualified individual? Should you report someone who robbed a bank but gave the money to a decaying orphanage? Participants were told that an earlier study group had been given one of these dilemmas and had been randomly assigned to consider one of the options, such as turning the thief in or hiring the friend.

Fake, useless, and assigned randomly

In reality, these previous experimental groups never existed. The researchers simply told the real study participants that a high percentage of the people like them in the fictional group had been randomly assigned to consider one of the two options. The participants were then asked to decide which of the two options they would choose.

It's worth emphasizing just how tenuous this connection is. The past group was fictional. The characteristics subjects were told they shared with the fictional group were limited to age, gender, and one of five personality types. And the actual participants were informed that the fictional ones had only been asked to consider a position—rather than choosing it, they had been assigned it at random. It's difficult to imagine that this would be enough to cause any sort of identification as part of a group, or to make that option register as a social norm or ethical standard.

And yet...

Being told that people like you had been randomly assigned to consider an option appeared to create a clear bias in favor of that position. Participants who were told that people like them had frequently been assigned to consider hiring a friend were more likely to choose that as their own position; if they were told instead that people like them had been asked to consider hiring the qualified individual, they were more likely to pick that option. The same held true for whether or not to report the robber who gave the money to an orphanage.

Replicated and controlled

The researchers included questions to weed out anyone who misunderstood the scenario. They also swapped the wording so that participants were told which option was less commonly assigned; people still showed a bias toward the more popular one. And changing the description of the fictitious past participants so that they had been asked about an unrelated ethical issue reduced the bias to statistically insignificant noise. Together, these controls seem to rule out the choice being a simple matter of misunderstanding or familiarity.

As a final test, the researchers told the real participants that the made-up ones had considered both norms, but specified only one norm as the one considered by people like them. In this case, the bias largely went away, indicating that some of the effect is simply an affinity for whatever other people were thinking about. But the researchers also identified which participants felt the strongest affinity for the group of people like them (again, defined by age, gender, and personality type). Among these participants, the bias in favor of that norm re-emerged.

All of this is in keeping with plenty of previous research indicating that establishing a group identity is incredibly important to humans. The new work extends that by showing that group identity is likely also a major influence on the establishment of social norms—even when those norms have no utility, and even when they were supposedly established by random assignment. So, any parents tempted to ask the "if your friends jumped off a bridge" question might want to take a moment to ponder the fact that, if their fellow parents jumped off a bridge, they might suddenly find an affinity for bridge-jumping.

Nature Human Behavior, 2018. DOI: 10.1038/s41562-018-0489-y (About DOIs).