Merely learning that experts understand something is enough to make laypeople think that they, too, have a better grasp of the concept, according to a September 26 paper in the journal Psychological Science.

Cognitive scientist Steven Sloman, of Brown University in Providence, Rhode Island, set out to test the “community-of-knowledge” hypothesis, which holds that knowledge is distributed across a group rather than stored in individuals’ minds. The hypothesis suggests that people do not necessarily distinguish their own knowledge from that of the group. If that were so, Sloman reasoned, then being told that scientists understand a new phenomenon should make subjects think that they comprehend it, too.

Sloman’s research assistant and co-author, Nathaniel Rabb, now at Boston University, dreamt up several novel scientific “findings,” such as a rock that glows, and warm ice. Subjects read brief reports of these new discoveries, each of which included a statement that scientists either “fully understand” or “do not yet understand” how such things could be possible. The actual explanation was not provided.

Then the subjects rated their comprehension of how the novel natural phenomena work, on a scale of one to seven, with one indicating little to no understanding and seven a deep, detailed understanding. When scientists were flummoxed, subjects rated their own understanding at an average of 1.79. When the experts did understand, the subjects’ average rose to 2.42. The effect was quite consistent, though Sloman cautions that the change in perceived understanding was small overall. “Saying others understand doesn’t make people feel like they fully understand,” he says.

The difference is “meaningful,” says Aner Tal, a consumer psychologist at Cornell University in Ithaca, New York, who was not involved in the research. “It’s what I would have expected.” He was surprised, however, by Sloman’s next result: the authors found that when they said experts understood the phenomenon, but the military’s Defense Advanced Research Projects Agency (DARPA) was keeping that information secret, the effect of community knowledge on subjects’ understanding scores diminished. “The knowledge has to be accessible to you,” explains Sloman.

Tal noted that the study relied on people’s own subjective assessments of their understanding. He is curious whether the knowledge of experts would affect more practical decisions: for example, would experimental subjects be more interested in trying a new medicine, exercise program, or source of energy if the concepts were thoroughly understood by scientists? Tal suspects expert knowledge would sway those types of decisions as well. Sloman plans to investigate similar questions. “We want to extend it to more familiar domains, instead of these bizarre scientific phenomena that we made up,” he says.

The community-of-knowledge concept affects many aspects of life, says Sloman, whose book on the topic, The Knowledge Illusion, comes out next March. For example, he says, people may think they understand political matters, such as the Affordable Care Act (commonly known as Obamacare), and form opinions accordingly, when really only experts fully understand the details. Tal adds that, in a similar manner, a lack of expert understanding affects people making decisions about travel to places where the Zika virus is a concern. “It’s more scary for people when they feel like it’s not understood,” he says.