It turns out it’s pretty difficult to fully access key sources on any given Wikipedia page. That’s according to a study from researchers at Dartmouth’s Neukom Institute who assessed the 5,000 most-trafficked Wikipedia pages, analyzing them for verifiability. In other words, they didn’t check to see if Wikipedia pages were accurate; they investigated how easily someone could make that determination for themselves.

Based on the presence of markers like International Standard Book Numbers and Digital Object Identifiers, the unique serial codes assigned to books and papers, about 80 percent of book citations and nearly 90 percent of journal references were technically verifiable, meaning you could track down the source material if you wanted to figure out whether a characterization on Wikipedia was right. But practical verifiability was a different story. The source material may exist, and the link that points to it may work, yet actually getting to it can be difficult or impossible. Using Google’s API, the researchers wrote a program to classify the accessibility of Google Books citations, for instance, and found that most books cited on Wikipedia (71 percent) are only partially viewable online, while many others (17 percent) are not viewable online at all. (About 12 percent were fully viewable.)
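The study doesn’t publish its classifier, but the general approach is easy to sketch. The Google Books API does report a per-volume `accessInfo.viewability` field with values like `ALL_PAGES`, `PARTIAL`, and `NO_PAGES`; everything else here, including the function name and the sample records, is illustrative rather than the researchers’ actual code.

```python
# Sketch: bucketing Google Books records into the study's three categories.
# A real run would fetch each record from, e.g.:
#   https://www.googleapis.com/books/v1/volumes?q=isbn:<ISBN>
# and read the accessInfo.viewability field of each returned volume.

def classify_viewability(volume: dict) -> str:
    """Map a Google Books volume record to an accessibility bucket."""
    viewability = volume.get("accessInfo", {}).get("viewability", "UNKNOWN")
    return {
        "ALL_PAGES": "fully viewable",
        "PARTIAL": "partially viewable",
        "NO_PAGES": "not viewable",
    }.get(viewability, "unknown")

# Invented sample records standing in for live API responses:
samples = [
    {"accessInfo": {"viewability": "PARTIAL"}},
    {"accessInfo": {"viewability": "NO_PAGES"}},
    {"accessInfo": {"viewability": "ALL_PAGES"}},
]
print([classify_viewability(v) for v in samples])
# → ['partially viewable', 'not viewable', 'fully viewable']
```

Tallying those buckets across every book citation on the 5,000 pages is what yields the 71/17/12 percent split the researchers report.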

“A lot of references are made available. But when you try to track them down, the main problem you run into is not that they’re fake or erroneous, but you can’t get to them,” said Michael Evans, a research fellow at the Neukom Institute and one of the study’s co-authors. “Typically it’s because of paywalls. Sometimes it’s because of link rot.”

“The point is, basically, that Wikipedia is not bad,” Evans added, “but it needs to meet its own standard for verifiability.”

Evans and his colleagues have an idea for how Wikipedia could begin to do this—and it’s a proposal that, if executed well, could dramatically improve access to information on the Internet. “You could just give some kind of meter about verifiability, actually on the Wikipedia page,” said Dan Rockmore, the director of the Neukom Institute and a co-author of the study. “That could be automated in a fairly simple way.”

He and Evans envision a browser plug-in, for instance, that would run a quick script to assess a Wikipedia page’s citations and then translate its findings into some sort of prominent verifiability score displayed on the page. Such a metric could—perhaps with “smiley and frowny emoticons,” Rockmore offered—warn people about pages with low verifiability ratings, or add credence to easy-to-vet pages. A scoring system like that would incentivize sourcing articles with information that’s easy for people to check online, and it could be used on basically any website that includes lots of citations. (News sites seem like one natural candidate.)
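The meter itself could be as simple as Rockmore suggests. Here’s one hypothetical way the scoring might work once each citation has been labeled accessible or not; the weights, thresholds, and emoticon mapping are all invented for illustration, not part of the proposal.

```python
# Hypothetical sketch of the proposed verifiability "meter": weight each
# citation by how reachable it is, average over the page, and render the
# result as a smiley or frowny indicator. All numbers are illustrative.

def verifiability_score(citations: list[str]) -> float:
    """Rough fraction of a page's citations a reader could actually reach."""
    weights = {
        "fully viewable": 1.0,
        "partially viewable": 0.5,
        "not viewable": 0.0,  # paywalled or link-rotted
    }
    if not citations:
        return 0.0
    return sum(weights.get(c, 0.0) for c in citations) / len(citations)

def meter(score: float) -> str:
    """Translate the score into an at-a-glance page rating."""
    if score >= 0.75:
        return ":)"
    if score >= 0.4:
        return ":|"
    return ":("

page = ["fully viewable", "partially viewable", "not viewable", "fully viewable"]
print(meter(verifiability_score(page)))  # (1.0 + 0.5 + 0.0 + 1.0) / 4 = 0.625
# → :|
```

Because the inputs are just labeled citations, nothing in this scheme is specific to Wikipedia, which is what makes the idea portable to other citation-heavy sites.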