Sunday’s edition of the New York Times Magazine featured a story about a fight over the brain of a patient who, in his life, had been a keystone of memory science. It was an excerpt from Patient H.M., a book about the man who lost his memory after a lobotomy, written by the grandson of the neurosurgeon who performed the operation.

And then the letters started coming in. At issue (in part) was an exchange between author Luke Dittrich and a Massachusetts Institute of Technology neuroscientist named Suzanne Corkin, who died in May. Corkin had conducted decades’ worth of experiments on H.M.—Henry Molaison—and Dittrich wanted to know what would happen to all her files:

Suzanne Corkin: Shredded.

Me: Shredded? Why would they be shredded?

Corkin: Nobody’s gonna look at them.

One of the letters to the Times was from James DiCarlo, the head of the department of brain and cognitive sciences at MIT. He and his colleagues contend, among other things, that the records were not destroyed and are in fact currently housed at MIT. “Journalists are absolutely correct to hold scientists to very high standards,” DiCarlo’s letter concludes. “I—and over 200 scientists who have signed a letter to the editor in support of Professor Corkin—believe she more than achieved those high standards. However, the author (and, implicitly, the Times) has failed to do so.”

But the New York Times Magazine did fact-check the book excerpt itself, Danielle Rhoades Ha, the paper’s vice president of communications, told me. It fact-checks everything, down to poems, according to Robert Liguori, a research editor at the magazine and a fact checker who has worked on books.

Part of the fallout is surely attributable to the collision here between two standards of evidence: scientific and journalistic. Even in the hands of a rigorous reporter, the latter standard is baggier by necessity—nothing would get published if every story were vetted like a peer-reviewed research study—and can sometimes involve taking a source at his or her word about what happened, as Dittrich seems to have done here. To this unavoidable problem, the conventions of journalism and nonfiction publishing add a further complication: There is rarely any transparency about the fact-checking process and certainly no way of knowing whether publishers hold their books to consistent standards.

When I tried to find out if Dittrich’s book had been fact-checked, the assistant director of publicity at Random House said he couldn’t tell me. When I reached out to Dittrich himself, I again got a reply from Random House PR: “We do not discuss the editing process of our books.” The representative agreed to send me the acknowledgments section, which does describe a vetting process: The writer shared drafts with family members involved in the story and “two eagle-eyed and sharp-brained neuroscientists” who read the whole manuscript.* But if anyone who helped with the book acted as a fact checker vetting its journalistic accuracy, Dittrich didn’t say so.

People are often surprised to learn that books, those bulky, fact-rich forever things, frequently receive less scrutiny from an independent fact checker than the stories they skim in magazines before tossing them in the recycling bin. In an ideal world, those books would be vetted in a rigorous and standardized way. “It’s impossible to write 50, 60, 70,000 words and not screw up somewhere,” said Liguori.

I know well, from spending more than two years as a freelance fact checker for many national science publications, that errors are terrifically common in nonfiction. When I started fact-checking as an intern at a magazine, I was surprised to find mistakes of all kinds in the work of authors I’d long admired—misspellings of proper nouns, botched descriptions of experiments, badly turned metaphors for, say, how the brain processes information or how the solar system formed. In the end, everything comes back to primary sources. One of my favorite catches: An author noted that if you were to slice off someone’s head, blood would shoot up out of the hole—a gruesome illustration of how powerfully the body’s arteries pump blood. I called a forensic expert, who pointed me to videos of beheadings. There was no spurting blood.

All this work costs money—and publishers rarely foot the bill. In a highly unscientific Twitter survey, I asked authors of science books to tell me if their publishers had paid for an independent fact checker. Just four out of 38 respondents said they had.

Liguori offered me his own theory about why publishers aren’t springing for fact checkers: If a text has embarrassing or fatal errors, the onus is put on the author. “Does anybody besides anyone in publishing remember who published those books?”

I’ve long wished that fact-checked material would carry some kind of stamp on it noting if it had been independently and thoroughly fact-checked. (Internet articles included—this one wasn’t.) It would be particularly useful for books, the paper copies of which are impossible to change even when errors are inevitably caught or a new angle to the story emerges.

I’m not the only one. “Maybe there should be a warning,” journalist Mac McClelland has said of books that haven’t been checked, “like on a pack of cigarettes.”

And if material has been fact-checked, I’d like to see the checker get a credit, the way that writers, photographers, and increasingly story editors do. They are just as integral to the process of getting the story on the page. Though I’ve been listed on the masthead of one magazine as a researcher, the vast majority of my fact-checking gigs didn’t provide me with a formal credit—even though I showed the stories I checked the same care and diligence I would have if my name had been plastered at the top.

Sometimes writers, with their reputations very much on the line, will hire their own checkers—five respondents to my survey said they had. That was what science journalist and veteran fact checker Brooke Borel did for her forthcoming book The Chicago Guide to Fact Checking, which, I suppose, given the subject matter, isn’t terribly surprising. (Full disclosure: I’ve fact-checked magazine pieces by Borel and consider her a friend. She also interviewed me for her fact-checking guide.) But for her previous book, about the rise of bedbugs, Borel, like many authors, could not afford a fact checker and did the checking herself. That’s not ideal, Borel notes. It’s hard to get out of your own head and spot your own mistakes, especially when you are invested in the story sticking together. I’d imagine it’s doubly hard for a book like Dittrich’s, in which the subject matter at times involved his own family.

And when her bedbug book was excerpted by various publications, no one asked Borel for backup materials. While some publications, like the New York Times Magazine, have a policy of fact-checking excerpts, in my experience that doesn’t always happen with the same rigor as with other articles. When I’ve worked on excerpts in the past, I’ve turned up errors just as surely as I would in most features handed to me. More proof that the fact-checking process is opaque: I asked a couple of research chiefs about their magazines’ policies for excerpts. With the exception of the Times, they declined to comment on record.

For his part, Dittrich wrote in a response on Medium: “My reporting of the shredding was based entirely on [the researcher’s] own words, to me, on tape.” He even uploaded a recording of the conversation. If many books’ budgets do not carve out money for fact checkers, they certainly wouldn’t include any for a full-scale investigation into locating a pile of notebook shreddings. Still, it might have been a good move to call MIT and ask about the status of the records, especially since Dittrich says he had a contentious relationship with Corkin. (Whether I would have done this personally—or whether the researchers at the Times did—I can’t say. I get to speak with the hindsight of knowing that the claim is being hotly and publicly contested.)

In this case, Dittrich has said he hopes he’s wrong. He wrote on Medium, “I hope the data has in fact survived, as that strikes me as the best possible outcome.” For everyone except maybe the fact checkers.

Correction, Aug. 12, 2016: Due to an editing error, this story originally misstated the source who sent the writer the acknowledgments section of Luke Dittrich’s book. It was the assistant director of publicity at Random House, not Dittrich.