What’s normal?

Perhaps the answer seems obvious: What’s normal is what’s typical — what is average.

But in a recent paper in the journal Cognition, we argue that the situation is more complicated than that. After conducting a series of experiments that examined how people decide whether something is normal or not, we found that when people think about what is normal, they combine their sense of what is typical with their sense of what is ideal.

Normal, in other words, turns out to be a blend of statistical and moral notions.

Our key finding can be illustrated with a simple example. Ask yourself, “What is the average number of hours of TV that people watch in a day?” Then ask yourself a question that might seem very similar: “What is the normal number of hours of TV for a person to watch in a day?”

If you are like most of our experimental participants, you will not give the same answer to the second question as you give to the first. Our participants said the “average” number was about four hours and the “normal” number was about three hours. In addition, they said that the “ideal” number was about 2.5 hours. This has an interesting implication. It suggests that people’s conception of the normal deviates from the average in the direction of what they think ought to be so.
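One way to picture this pattern is as a weighted blend of the typical and the ideal. The sketch below is purely illustrative, not a model from the paper: the function `blended_normal` and the weight `w` are hypothetical, chosen so the TV-watching numbers above line up.

```python
# Illustrative sketch only: treat the "normal" judgment as a weighted
# blend of the statistical average and the ideal. The weight w is a
# free parameter invented for this example, not a value from the study.

def blended_normal(average: float, ideal: float, w: float = 0.5) -> float:
    """Return a point between the average and the ideal, weighted by w."""
    return w * average + (1 - w) * ideal

# With the TV-watching numbers reported above (average ~4 hours,
# ideal ~2.5 hours), a weight of 1/3 on the average lands at 3 hours,
# matching the participants' "normal" answer.
print(blended_normal(4.0, 2.5, w=1/3))
```

The point of the sketch is only that any intermediate weight puts "normal" strictly between the typical and the ideal, which is the qualitative pattern the participants showed.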

Our studies found this same pattern in numerous other cases: the normal grandmother, the normal salad, the normal number of students to be bullied in a middle school. Again and again, our participants did not take the normal to be the same as the average. Instead, what people picked out as the “normal thing to do” or a “normal such-and-such” tended to be intermediate between what they thought was typical and what they thought was ideal.