Lawmakers and public health advocates are pressuring tech companies to crack down on anti-vaccine content proliferating online, which they fear is contributing to a massive measles outbreak in the United States.

Experts attribute the outbreak to the growing number of "anti-vaxers," people who don't vaccinate their children. And they warn the movement relies heavily on social media to promote its views, for example via YouTube videos and Facebook discussion groups.


Tech giants say they are taking the issue seriously even as they grapple with competing demands between promoting public health and protecting free speech.

Facebook has said it might stop recommending anti-vaccination content and groups to users, while YouTube is looking to change its algorithms to stop promoting videos with misinformation.

A YouTube spokesperson in a statement to The Hill said the video-sharing platform has started to present medically accurate content “for people searching for vaccination-related topics,” and is “beginning to reduce” recommendations of anti-vaccination videos.

“Like many algorithmic changes, these efforts will be gradual and will get more and more accurate over time,” the spokesperson said.

YouTube over the weekend said it will demonetize channels that promote anti-vaccine content, and link to the Wikipedia entry on "vaccine hesitancy" before videos that promote such views.

Pinterest has taken the strongest stance so far. This week, it announced that it would block search results about vaccinations.

The image-sharing network is one of the only major platforms with a specific “health misinformation” policy, put in place in 2017 after it discovered users were posting false cures for illnesses including cancer and incorrect information on vaccines.

The debate also highlights a critical question for tech companies: the amount of responsibility they should assume for information on their platforms.

Tech platforms regularly invoke First Amendment concerns when pressed to better police content.

Section 230 of the Communications Decency Act also prevents the largest platforms from being held legally liable for content posted by their users, leaving tech regulators without much leverage.

Liz Woolery, the deputy director of the Center for Democracy and Technology's Free Expression Project, told The Hill she believes each company should assess how to handle anti-vaccine content based on its own guidelines.

But the recent measles outbreak is adding new urgency to the debate and highlighting the power of social networks.

The outbreak comes 18 years after measles was officially declared eliminated in the U.S. It has already affected almost 350 people since last fall and prompted the declaration of a public health emergency in Washington state last month.

The World Health Organization (WHO) this year listed “vaccine hesitancy” among the ten greatest threats to global health.

Health experts who spoke to The Hill said there is a direct link between anti-vaccine content online and the increase in people failing to get vaccinations.

“Measles is a great case study — a computer virus having real-life effects,” Dr. Haider Warraich, a fellow in heart failure and transplantation at Duke University Medical Center, told The Hill. Warraich has studied and criticized the presence of medical misinformation online.

“It started out as rumors on the internet that coalesced in these social media groups and is now having real-life effects in real communities,” Warraich said. “So, I do think that the internet has a role to play in that this may be the first of many other examples in the future.”

A recent Guardian investigation found that YouTube and Facebook’s algorithms steer users toward anti-vaccination content, driving them away from authoritative medical resources to unscientific assessments.

Facebook and YouTube said they were taking steps to deal with the problem, but shortly afterward BuzzFeed News reported that Google-owned YouTube's algorithms were still recommending anti-vaccine content.

Those reports attracted the attention of lawmakers.

“It’s unconscionable that YouTube’s algorithms continue to push conspiracy videos to users that spread disinformation and, ultimately, harm the public health,” House Energy and Commerce Committee Chairman Frank Pallone, Jr. (D-N.J.) said in a statement to The Hill.

Pallone noted that his committee, which covers both tech and health issues, has a hearing on the measles outbreak on Feb. 27 and vowed to "discuss this with the public health experts who are testifying."

Those witnesses will include Nancy Messonnier, the director of the National Center for Immunization and Respiratory Diseases at the Centers for Disease Control and Prevention (CDC), and Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases at the National Institutes of Health (NIH).

House Intelligence Committee Chairman Adam Schiff (D-Calif.) earlier this month also pressed Google CEO Sundar Pichai and Facebook CEO Mark Zuckerberg over the issue.

Schiff wrote that he is concerned YouTube, Facebook and Instagram are “surfacing and recommending messages that discourage parents from vaccinating their children, a direct threat to public health, and reversing progress made in tackling vaccine-preventable diseases.”

While lawmakers are pushing for action, health advocates worry that federal health agencies have been slow to act.

The CDC and Department of Health and Human Services (HHS) have not launched new campaigns to target the growth in anti-vaccine content online in recent years. The two agencies did not return The Hill’s requests for comment on the role social media plays in spreading anti-vaccine information.

WHO also did not respond to The Hill’s repeated requests for comment on the role social media could play in dissuading people from getting vaccinated.

Peter Hotez, dean of the National School of Tropical Medicine at Baylor College of Medicine in Houston and co-director of the Texas Children’s Hospital Center for Vaccine Development, said the CDC has largely ignored the anti-vaccination movement since its inception years ago.

“[One] potential argument was, ‘Well, this is a fringe group and by calling it out, giving attention to it, you only give it oxygen,’ ” Hotez, who researches vaccine hesitancy, told The Hill. “I think that probably was a good strategy in the early 2000s, but I think there’s a lack of recognition that this has now become a media empire that now needs to be dismantled.”

“They dominate the internet,” Hotez told The Hill. “Not only social media — they also have almost 500 anti-vaccine websites by some accounts. They use social media to amplify those websites.”

Hotez also pointed to another tech company: Amazon.

Many of the top-selling and highest-rated books in the “vaccinations” category on Amazon are skeptical of or outright oppose vaccines. The fifth most popular book promoted by the online retail giant promotes the theory that vaccines cause autism, a claim that has been categorically debunked by scientists.

Amazon declined to comment for this story on the record, pointing The Hill to its bookselling guidelines, which allow the company to “provide ... customers with a variety of viewpoints, including books that some customers may find objectionable.”

“We reserve the right not to sell certain content, such as pornography or other inappropriate content,” Amazon says in the guidelines, without going into specifics.

Dr. Arthur Caplan, the founding head of the division of medical ethics at New York University’s School of Medicine, told The Hill that a lot of anti-vaccine content spreads through “small and medium-sized groups and entities piling onto one another, remessaging one another, retweeting one another.”

“Twitter [and] Facebook tend to be the big ones,” Caplan said. “There are bots out there … that promote or reinforce misinformation. A lot of stuff gets tweeted, retweeted, retweeted, retweeted.”

Twitter does not have a specific policy on medical misinformation.


“Twitter’s open and real-time nature is a powerful antidote to the spreading of all types of false information,” a Twitter spokesperson said in a statement to The Hill. “We, as a company, should not be the arbiter of truth.”

For tech companies, these difficult questions aren't going away.

“Working together is the solution,” said Warraich, who encouraged tech companies to reach out to health officials.

"Technology companies need to be humble and need to realize this is a public health crisis that they can be a solution for."