Will Venezuela cut gasoline subsidies? Will the US Federal Reserve raise interest rates before the end of the year? Your guess is as good as mine, unless you happen to be what University of Pennsylvania psychology professor Philip Tetlock has identified as a “superforecaster.”

When we decide to change jobs, make an investment, or launch a business, we make that decision based on what we think the future will hold. The problem is, we’re just not that good at accurately anticipating the future. We’re susceptible to hindsight bias, we’re overconfident about what we really know, and our predictions are often self-serving.

Superforecasters, on the other hand, are able to overcome many of these cognitive hurdles, helping them forecast future global events with surprising accuracy.

“I call them superforecasters because that is what they are. Reliable evidence proves it,” Tetlock writes in a new book, Superforecasting: The Art and Science of Prediction.

To understand what makes superforecasters so good, Tetlock and colleagues Barbara Mellers (University of Pennsylvania) and Don Moore (University of California, Berkeley) recruited thousands of volunteers to compete in a forecasting tournament sponsored by the Intelligence Advanced Research Projects Activity (IARPA), a US government research agency. The results of their study were published in Psychological Science.

The team launched their first prediction contest in 2011 as the Good Judgment Project. An international group of 2,200 to 3,900 forecasters was recruited through professional societies, alumni associations, and word of mouth. Most of these volunteers were just regular people — including a Brooklyn filmmaker, a retired pipe installer, and a former ballroom dancer — but not just anyone could participate.

Forecasters needed to have at least a bachelor’s degree and to complete a battery of psychological and political knowledge tests. Participants tended to be men (83%) and US citizens (74%), and were about 40 years old on average.

Forecasters read questions about various possible events and submitted their predictions of the probability that a given event would take place (0% meaning the event had no chance of happening, 100% meaning it definitely would). Importantly, each question had to be written in such a way that there was no room for argument about how these geopolitical events actually turned out: The events either happened or they didn’t.
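Because each question resolved cleanly as yes or no, probability forecasts could be scored objectively. The Good Judgment Project scored accuracy with Brier scores — the mean squared difference between a forecaster's stated probability and what actually happened. The function below is a minimal sketch of that scoring rule (the field names and example numbers are illustrative, not from the study):

```python
def brier_score(forecasts, outcomes):
    """Mean squared difference between predicted probabilities (0.0-1.0)
    and realized outcomes (1 if the event happened, 0 if it didn't).
    Lower is better: 0.0 is perfect, and always guessing 50% earns 0.25."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Illustrative numbers only: a confident, well-calibrated forecaster
# beats one who hedges everything at 50-50.
confident = brier_score([0.9, 0.8, 0.1], [1, 1, 0])  # close to 0
hedged = brier_score([0.5, 0.5, 0.5], [1, 1, 0])     # 0.25
```

Note that the score rewards decisiveness only when it is justified: saying 90% on an event that fails to happen is penalized far more heavily than saying 55%.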

All forecasters were then ranked according to their performance. The top 60 participants, who had exceptionally high accuracy, were dubbed superforecasters. These individuals were so good at predicting geopolitical events that their answers were, on average, almost 30% more accurate than those of career intelligence analysts.

“Are superforecasters different kinds of people—or simply people who do different kinds of things?” Mellers, Tetlock, and colleagues asked in a study published in Perspectives on Psychological Science. Their data suggest that the answer is a blend of both.

Superforecasters do tend to have distinctive traits. They scored higher on measures of both fluid and crystallized intelligence, as well as on personality-related traits such as competitiveness, the desire to be the best, and willingness to change their minds in response to new evidence.

But it’s not all about innate skill, personality, or intelligence. Three additional factors were identified as driving accurate predictions: training, teaming, and tracking.

Forecasters who received training to help them overcome cognitive biases were better at predicting than those without training. Each training module was interactive, with questions and answers that checked participants’ understanding of probability and of judgmental traps such as overconfidence, confirmation bias, and base-rate neglect.

People were also better forecasters when they worked together in collaborative teams. Importantly, teamwork allowed people the opportunity to discuss the rationales behind their beliefs. Having to provide evidence for their answers helped teams avoid cognitive biases and errors in their predictions.

Finally, the best performers from the first year of the tournament (i.e., the top 2%) were placed together in elite “tracked” teams made up of only superforecasters. These teams significantly outperformed all of the other groups. Notably, individuals on these elite teams were more likely than those in other groups to share relevant news and pose questions to their teammates.

“Superforecasters have achieved a surprising degree of accuracy—and this may be just the beginning of those surprises,” Mellers, Tetlock, and colleagues conclude.

References

Mellers, B., Stone, E., Murray, T., Minster, A., Rohrbaugh, N., Bishop, M., … & Tetlock, P. (2015). Identifying and cultivating superforecasters as a method of improving probabilistic predictions. Perspectives on Psychological Science, 10(3), 267-281. doi: 10.1177/1745691615577794

Mellers, B., Ungar, L., Baron, J., Ramos, J., Gurcay, B., Fincher, K., … & Tetlock, P. E. (2014). Psychological strategies for winning a geopolitical forecasting tournament. Psychological Science, 25(5), 1106-1115. doi: 10.1177/0956797614524255

Tetlock, P. E., & Gardner, D. (2015). Superforecasting: The Art and Science of Prediction. New York: Crown.