In December, when Facebook launched Messenger Kids, an app for preteens and children as young as 6, the company stressed that it had worked closely with leading experts in order to safeguard younger users. What Facebook didn’t say is that many of those experts had received funding from Facebook.

Equally notable are the experts Facebook did not consult. Although Facebook says it spent 18 months developing the app, Common Sense Media and Campaign for a Commercial Free Childhood, two large nonprofits in the field, say they weren’t informed about it until weeks or days before the app’s debut. “They had reached out to me personally Friday before it launched, when obviously it was a fait accompli,” says Josh Golin, executive director of Campaign for a Commercial Free Childhood. Facebook, he says, is “trying to represent that they have so much more support for this than they actually do.” Academics Sherry Turkle and Jean Twenge, well-known researchers whose work on children and technology is often cited, didn’t know about the app until after it launched.

The omissions quickly came back to bite Facebook. Eight weeks after the Messenger Kids debut, Golin helped organize a group of nearly 100 child-health advocates who asked Facebook CEO Mark Zuckerberg to kill the app because it could undermine healthy child development. That same week, Common Sense Media announced that it would help fund a lobbying effort around the downside of addictive technology, including a curriculum distributed at 55,000 public schools that would highlight concerns, such as a possible link between heavy social media use and depression.

Antigone Davis, Facebook’s global head of safety, says Facebook solicited, and listened to, input from a variety of people before launching the app. “We took much of what we heard and incorporated it into the app,” she says. “For example, we heard from parents and privacy advocates that they did not want ads in the app, and we made the decision not to have ads.”

Fixing Problems

Facebook’s approach to outside voices about Messenger Kids is echoed in its efforts to “fix” other controversial issues, such as fake news and election interference. As pressure mounts, Facebook touts its commitment to solving a difficult problem, often citing partnerships with third-party experts as a sign of its seriousness. Behind the scenes, however, the company sometimes obscures financial ties with experts, ignores high-profile critics, or co-opts outside efforts to address the same concerns.

Last week, for example, frustrated fact-checkers pressured Facebook into a meeting at the company’s headquarters, claiming they had been shut out of vital data necessary to assess whether their efforts to combat fake news were working. Days after social-media analyst Jonathan Albright discovered that Russian propaganda may have been viewed hundreds of millions of times around the time of the presidential election, Facebook called Albright, then scrubbed the data from the internet. Cindy Southworth, one of the experts often cited in support of Facebook’s controversial project to combat revenge porn, belongs to a nonprofit that has received funding from Facebook. After former Google design ethicist Tristan Harris popularized the phrase “time well spent” to warn against the dangers of addictive technology, Zuckerberg adopted the phrase as well. Several times in recent months, he has promised to make sure that “time spent on Facebook is time well spent.” But Harris doesn’t think Facebook is sincere. “It’s too bad to see Facebook co-opt the term without taking its meaning seriously beyond asking what are ‘meaningful interactions,’” he tweeted Monday.
The debate over kids and smartphones is far from settled, including disagreement over the study tying social media use to depression in teens, which was conducted by Twenge. One side argues that kids are already on social media and need guidance to learn how to use it safely. The other side says tech giants have crossed the line by targeting young children and are charging ahead without understanding the effects. The only thing everyone agrees on? The need for more research and better parental controls. In this polarized climate, Facebook initially deflected criticism by presenting Messenger Kids as the result of careful consultation with a range of outside experts, even as it subtly stacked the deck.