Long before Hilda Bastian was a health researcher, she endorsed a practice she believes may have cost lives.

“I think people died because of me,” she said recently. “And I'll spend my whole life trying not to do it again and to make amends.”

In the 1980s, Bastian was skeptical of the medical establishment. As the head of Homebirth Australia, she traveled the country and appeared on TV programs arguing that moms should have their babies outside the cold confines of hospital rooms.

Then she learned babies born at home in Australia faced a higher mortality risk than those born in hospital at that time. The fact disturbs her to this day.

In the decades since, she’s become one of the most prominent thinkers in the world on scientific literacy and evidence-based medicine. She has dedicated her life to figuring out how to reach people with the best available health research and fight back against misinformation.

For her and many other health researchers and doctors, “fake news” and misinformation — problems that suddenly seem dire in light of Donald Trump’s election and the growing influence of sites like Alex Jones’s Infowars — are nothing new. And over the past 30 years, mostly under a movement called “evidence-based medicine,” they’ve come up with tools and techniques to fight back against bunk. They’ve also learned hard lessons on what doesn’t work when it comes to using facts to change people’s minds and behaviors.

Their lessons can help all of us — journalists, policymakers, educators, and even just concerned citizens talking to friends over the dinner table — who care about evidence and want to empower others with it.

Lesson 1: Take time to explain why you believe something — not just what you believe and why your opponent is wrong

So how did Bastian switch over from home birth advocate to home birth critic?

It started with conversations with researchers in Sydney, who were sympathetic to her worldview and generous with their time.

In the 1980s, Bastian went to a workshop at a childbirth education conference and met a researcher, Judith Lumley, who wanted to help her understand medical evidence. Through Lumley, Bastian connected with others in the scientific community who took the time to explain not only the evidence behind home birthing but also how to understand its strengths and limitations. Here’s how Bastian described it on her blog:

I didn’t change deeply held beliefs because someone convinced me in one discussion, or even a few. It was a process over years. The scientists and others who influenced me weren’t cheerleaders for the establishment. They were critical of weak research and arguments, regardless of whose interests it served. And they didn’t just expect people like me to believe them because they were experts. They wanted to increase the expertise of others in scientific thinking, especially community leaders.

In that process, Bastian learned that you can’t simply change minds by telling people that what they believe is wrong and you have the correct information. If those researchers had gone after her and shouted about their beliefs, Bastian probably would have deepened her stance in opposition.

Over time, Bastian said, the researchers convinced her “by being credible and trustworthy,” not just appealing to emotion. They even inspired her to get into science. (By the late 1990s, the “proud high school dropout” was publishing research articles on mortality risks related to home births; she’s now on staff at the National Institutes of Health and working on her PhD — her first degree.)

When you win people over this way, Bastian added, it can take a while — but you’re more likely to bring others from the opposing community along. In her case, she stuck around Homebirth Australia, helping to get the practice regulated and develop national guidelines on safe home birthing.

This process wasn’t easy. Bastian received death threats for her change of heart, and the harassment went on for years. Not everyone in the home birthing community appreciated her push for higher standards.

Through her experience, she thinks there’s a lesson for those trying to counter people who are skeptical of scientific evidence, like the anti-vaccine crusaders.

“[Pro-vaccine advocates] act as though there’s certainty about the effects of vaccines when there isn’t,” she said. “And each time they do that, they let their side down. If you’re on the anti side, you can just drive a bus through the holes in their arguments, and people are doing that.”

Of course, all medical treatments — including vaccines — carry risks and side effects, and vaccine advocates are sometimes too quick to pretend that inconvenient research doesn’t exist. “It’s fighting bias with bias and it doesn’t work,” Bastian says. “It just creates more bias and polarizes people.” Instead, taking time to explain why you believe something — not just what you believe and why your opponent is wrong — can go a long way.

Lesson 2: Make sure your information is reliable and easy to access

In order to talk to others about evidence, you need to sort out which evidence is reliable, and find ways to make it readily accessible and understandable. And the evidence-based medicine movement, which started to catch on in the early 1990s, developed tools to do just that.

Back then, doctors were too often using single or cherry-picked studies, or what they learned in medical school or from their mentors, to inform their decisions about their patients’ best care. These one-off studies and old lesson plans didn’t always represent the totality of the research.

So a group of doctors, researchers, and patients began to organize themselves to solve the problem: to figure out how to sort evidence, and get all the best research digested for doctors so they could use it at the bedside when they needed it, rather than just relying on whatever study they came across that day or what their mentors told them.

These researchers built up a repository of high-quality "systematic reviews," most notably through the Cochrane Collaboration. The reviews used statistical methods to bring together and sort all the best science on specific medical questions, and presented that evidence in a coherent summary.

This effort was revolutionary. Systematic reviews added empirical heft to medicine. They helped doctors more easily access and make sense of a wider selection of data, and they often corrected misconceptions about important health issues — like the advice that it was best to put newborns to sleep on their stomachs, a practice that actually increased babies’ risk of death.

But Bastian — one of the founding Cochrane members — said she realized pretty quickly that the group had to reach beyond doctors and find ways to connect with other communities if they really wanted to have an impact. By 1999, she’d helped get “plain language summaries” added to Cochrane reviews. These summaries appear outside the paywall and articulate, in a few jargon-free sentences, the findings of a systematic review — such as the review on vitamin E supplementation during pregnancy.

Bastian’s objective was clear: “Instead of waiting and hoping journalists would pick something up and write an accurate story, or putting out press releases to go with a piece of research and [hoping journalists would] pay attention to a press release, I thought the answer was to write the finished [summary] yourself and put it inside the research.”

Today, the “plain language summaries” are the most translated and most read parts of the giant Cochrane library, and journalists like me, as well as patients, rely on these reviews to contextualize the research we’re reporting on every day. They made the best research accessible and easy to understand.

Lesson 3: Teach them while they’re young

Just making high-quality evidence more available doesn’t always stop bogus claims from taking off, of course, and many people lack the tools to think critically about the information they’re given.

That’s a problem Andy Oxman, a researcher based in Norway who has studied how to help people make informed health choices for more than 30 years, has become obsessed with. After working with health professionals, journalists, and policymakers over the decades, he noticed that “most adults don’t have time to learn, and they have to unlearn a lot of stuff."

So he started to wonder whether children might be more amenable subjects for learning how to assess evidence and claims. To put this idea to the test, in 2000 he visited his then-10-year-old son’s class.

“I told them that some teenagers had discovered that red M&Ms gave them a good feeling in their body and helped them write and draw more quickly,” Oxman said. “But there also were some bad effects: a little pain in their stomach, and they got dizzy if they stood up quickly.”

He challenged the kids to try to find out if the teens were right. He split the class into small groups and gave each group a bag of M&Ms.

The kids quickly figured out they had to try eating M&Ms of different colors to find out what happens, but that it wouldn’t be a fair test if they could see the color of the M&Ms. In other words, they intuitively understood the concept of “blinding” in a clinical trial. (This is when researchers prevent study participants and doctors from knowing who got what treatment so they’re less likely to be biased about the outcome.)

Within an hour of grappling over how to test the M&Ms, the children seemed to grasp basic concepts about testing health claims. “That convinced me that’s the age to start,” Oxman said.

So he’s been working with other researchers from around the world to develop curricula — cartoon-filled books, podcasts — that instill critical thinking skills in schoolchildren at an early age. He’s tested their impact in a big trial involving 15,000 schoolchildren in Uganda.

We don’t yet know whether this method works because the results haven’t been published — but succeed or fail, the trial will bring us closer to answering an important question: How do you prevent dubious claims from catching on in the first place?

Stanford University professor John Ioannidis also sees the most hope in early childhood education, and agrees children should be empowered with basic skills on critical thinking. He told me that waiting to teach clinicians the standards of evidence-based medicine late in their training doesn’t always work.

“They’ve already been exposed to things that are so un-evidence-based, and the same principle applies to the general public,” he says. “We need to start early on, to make people understand that basing decisions on fair tests, on science, on evidence is important.” He would like to see basic courses on how to seek out high-quality information and appraise it taught alongside math and reading.

Lesson 4: Evidence is necessary but not sufficient

Leonard Syme, considered the father of social epidemiology, helped invent a critical field of health research. But he also looks back and thinks many of his efforts over the years failed because researchers like him were too out of touch with the needs of the people they were trying to influence.

In the early 1970s, he started running a 10-year, $555 million study that involved 350,000 people. The focus: changing participants’ behaviors on three risk factors — high cholesterol, high blood pressure, and smoking — that the scientific community knew increased the risk of disease and death.

“I devoted 10 years of my life to that project,” Syme said. “When the results came up with no change at all — nobody changed behavior! — that was really shattering for me.”

Another study, based in a community in Richmond, California, also focused on different interventions to get people to cut back on smoking. After five years, again, they had made no dent in the smoking rate.

Syme did some soul searching. He reflected on how Richmond’s economy centered on shipyards that sent food and munitions to Europe during World War II. When the war ended, the city was left without jobs and mired in poverty. “There’s air pollution, high crime,” he said. “The city’s devastated.”

“If you ask the people there what problems were on their mind, I promise you smoking would not be on their list. But I didn’t pay attention to that because I was a public health expert.”

It occurred to him that public health experts needed to meet people where they are and better connect to their contexts.

“The cold, hard statistics I trained in just don’t do it,” Syme said. Or, as Benjamin Djulbegovic, a cancer researcher and evidence-based medicine thinker at the University of South Florida, put it: “Evidence is necessary but not sufficient for decision making or changing behaviors.”

Lesson 5: Don’t be afraid to hold misinformation peddlers to account

Sometimes you can’t sway people with research, or compassion, or generosity. Sometimes there are high-profile misinformation peddlers who need to be held accountable. In these cases, try shame.

I heard about this tactic a while back from Ben Goldacre, a British author, physician, and longtime slayer of bad science, when I talked to him about how he decides which quacks to take down in his writing. "Mocking people who misuse science is a really useful gimmick for communicating how science works," he said.

Over the years, Goldacre has taken on everyone from sloppy journalists to pharmaceutical executives, vitamin proprietors, and disingenuous academics. He has illuminated the evidence, and lack thereof, behind detox foot baths, homeopathy, and ear candling. He once got his dead cat the same certificate as a famous British nutritionist just to demonstrate how bogus her credentials were.

Now a professor at the Oxford Centre for Evidence-Based Medicine, Goldacre produces work that has changed policy on clinical trial transparency, among other areas of health.

But he doesn't just go after cranks for the sake of it; he uses their stories to educate people about science. And he shames those in positions of power who give them the credibility to have an impact.

"Going after people who facilitate the cranks is more likely to produce long-term benefits and also more closely reflects where the true source for the problem lies," he explained. "I can tell you who hates having their name in the paper, and that is journalists, editors, broadcasters, and policymakers. They are used to being able to hide in the shadows, anonymously, and if you can call them out by name, I think that changes their behavior quite well."

He has a point. I once criticized an anti-vaccine story in Canada's largest newspaper. My reporting — one voice in a chorus of criticism — pointed out that the paper's editor-in-chief was in denial about its bad coverage, and that he was ridiculing well-meaning critics. (In a memorable turn of phrase, he called me a "bathwater gargler.") The result? A rare retraction of the story.

Brendan Nyhan, a political scientist at Dartmouth, agreed with Goldacre's advice. "Oprah Winfrey should be ashamed of how she helped give Dr. Oz a platform. People who put Dr. Oz on TV should be embarrassed," he said. "I advocate naming and shaming, not just naming and shaming the public figures who mislead people but the institutions that give them a platform.”

Naming and shaming takes strength, and fighting for facts takes time, knowledge, compassion, and patience. But there is hope. Just remember Hilda Bastian.
