On Aug 19, 2019, British Prime Minister Boris Johnson outlined plans for a summit of social media firms to discuss how to promote accurate information about vaccination. The announcement accompanied the news that WHO no longer considers the UK to have eliminated measles. Coverage of the second dose of the measles–mumps–rubella (MMR) vaccine in the country has fallen to 87%, lower than the 95% required for herd immunity. “I am afraid people have just been listening to that superstitious mumbo-jumbo on the Internet, all that antivax stuff, and thinking that the MMR vaccine is a bad idea”, commented Johnson on a visit to a hospital in southwest England. “That's wrong. Please get your kids vaccinated.”

On both sides of the Atlantic, measles is resurgent. In 2017, there were 284 cases in England and Wales, increasing to 991 in 2018. In the 53 countries of the WHO European Region, cases of measles leapt from 5273 in 2016 to 83 540 in 2018. In the past year, the caseload in Ukraine alone exceeded 50 000. The USA is likely to have around 2000 cases of measles in 2019, having had 372 in 2018. Wherever cases are spiking, there is a simple explanation: inadequate coverage with both doses of the measles-containing vaccine, which can occur for all kinds of reasons. Ukraine has had shortages in supply, serious problems with its health-care system, and a simmering conflict with Russia that has left 1·6 million people internally displaced. But beyond the issues of access, the Ukrainian population's confidence in vaccines has also wavered.

WHO defines vaccine hesitancy as a “delay in acceptance or refusal of vaccines despite availability of vaccination services”. The phenomenon has existed for as long as vaccination itself. But the advent of social media has offered an unprecedented opportunity to amplify and spread antivaccination messages. “What was previously a fringe opinion is becoming a transnational movement”, said David Broniatowski (School of Engineering and Applied Science, George Washington University, Washington, DC, USA). “The unique thing about social media is that they allow messages to propagate very quickly and for communities to form.”

It is difficult to assess the extent to which exposure to such messages affects people's opinions. It is probable that the posts mostly bounce around online echo chambers. But even a small effect can be meaningful. “If you are talking about herd immunity, all you need is for coverage to drop a few percentage points to go from no epidemic to epidemic”, explains Broniatowski. “It just takes a minority of parents deciding to delay vaccination long enough for their children to be exposed to the pathogen.” In which case, misleading health information on social media might push vaccine hesitancy to the point of disaster. “We have all the underlying issues around access, people not being able to get appointments or not knowing where to go, and then on top you have all that stuff on social media discouraging parents from having their kids vaccinated”, said Heidi Larson (London School of Hygiene and Tropical Medicine, London, UK). “That can be the last straw, the thing that causes the system to snap.”

Vish Viswanath (Harvard TH Chan School of Public Health, Boston, MA, USA) points out that the general public can be split into three groups. There is a minority of staunch opponents of vaccination who are unlikely to shift their opinion. There is a larger group of people who have been persuaded of the importance of vaccines and are just as unlikely to shift their opinion. And then there are those in the middle. “These people are trying to do the right thing but they have doubts and they have questions, and they can be vulnerable to antivaccination messages”, said Viswanath. “That is where you see the damage.”

The traditional media previously served as a moderating force, filtering scientific information and fact-checking, however imperfectly, for their audience. But that model has broken down. “Social media platforms have a responsibility to let users know where the information they are reading comes from”, said Beth Hoffman (Center for Research on Media, Technology, and Health, University of Pittsburgh School of Medicine, Pittsburgh, PA, USA), who has co-authored a study of antivaccination activity on Facebook. “It is very easy for non-credible sources to look like credible sources. As a society, we really need to work on providing people with the media literacy skills and education so that they can figure out which sources of information are reputable.”

A wide range of drivers lie behind vaccine hesitancy, including conspiracy theories, general distrust, belief in alternatives, or concerns about safety. Some social media posts can be classified as misinformation, eg, the Instagram notice that reads “so, a baby can handle 8–9 viruses all at once via vaccination, but cannot handle one single virus when it's wild caught?”. Disproving such assertions is relatively straightforward. Indeed, the majority of comments alongside the Instagram post do just that.

Other messages fall into the category of disinformation, such as posts claiming that babies have died, suffered severe disabilities, or developed autism as a result of being vaccinated. These posts are trickier to counteract. “Disinformation requires an institutional response”, said Viswanath. “We should be able to figure out where the disinformation is coming from and take appropriate action. It has to be conceptualised and treated very differently from misinformation.” Still other messages do not easily fall into either category.

Matters are further complicated by the risk of establishing false equivalence. A study co-authored by Broniatowski found that Russian trolls tweeted both provaccine and antivaccine messages in an effort to foment discord and create the impression that the subject remains a matter of debate. In such circumstances, responding to antivaccination arguments can be counterproductive. “Repeating canards still means you are acknowledging and broadcasting them, and that can leave the impression that the antivaccination perspective is a legitimate one”, Viswanath told The Lancet Digital Health.

Besides, the majority of messages fall by the wayside. A search of Twitter for #vaccinescauseautism reveals a woman (or bot) who tweets things like “so glad I didn't get my beautiful children vaccinated but it looks like the majority have already been brainwashed” and “my baby boy got measles but he's doing great all thanks to his unweakened immune system”. But the account only has six followers—scarcely worth troubling over. More than half of Twitter messages go unshared. Nonetheless, certain posters have a huge profile. Take the man who tweeted in 2014, “healthy young child goes to doctor, gets pumped with massive shot of many vaccines, doesn't feel good and changes - AUTISM. Many such cases!”. He is now president of the USA (Donald Trump has since urged parents to vaccinate their children).

The social media networks have started to take action. In August 2019, Pinterest announced that searches on its site for vaccine-related topics, such as measles or vaccine safety, will only turn up links to reputable public health organisations. The results will not be accompanied by recommendations, comments, or advertisements. The move was welcomed by WHO's director-general Tedros Adhanom Ghebreyesus. “We hope to see other social media platforms around the world following Pinterest's lead”, he stated. More than 300 million people visit the Pinterest website or access its app every month. That is a large number, but nowhere near the reach of Facebook, which has 2·4 billion users every month. Earlier this year, Facebook stated that it would no longer recommend content that included misinformation about vaccines and that it would reject advertisements carrying misinformation.

The platforms have to tread a fine line between censorship and facilitating the dissemination of dangerous misinformation. Instagram has blocked hashtags that make patently false claims, such as #vaccinescauseaids, and has said that it will ban hashtags that become associated with misleading information in the future, although it will not target those who express antivaccination opinions. YouTube has removed advertisements from antivaccination videos, meaning that the posters will not make any money. Twitter ensures that when users search for vaccine-related topics in the UK, the first result is for the National Health Service. In the USA, the same search first turns up a link to the Department of Health and Human Services. But scroll down the page and antivaccination messages abound.

“Social media is not all the same”, adds Broniatowski. “The platform matters a great deal.” Facebook, for example, has administrators and content is largely shared within groups, whereas Twitter is full of bots. So Facebook might be best advised to focus on setting standards for its administrators, whereas Twitter could turn its attention to making it harder for automated accounts to be promoted.

“Some of the issues are transitional”, said Larson. “Social media platforms opened their doors relatively recently; I think we are still deciding where to draw the boundaries.” She believes that the medical community needs to be proactive. “Official websites could be far more responsive; parents complain about not being able to find answers to the questions they have”, said Larson. “A lot of public health officials have been anxious about going into social media, especially the older generation, but that is where the public are to be found these days. We are facing this growing gap between where the scientific and official information lives and where the public is going. That has to change.”