The Interface is a daily column and newsletter about the intersection of social media and democracy. Subscribe here.

There are many criticisms of Facebook’s size, power, and business model, but two stand out for the intensity with which they are usually discussed. One is that Facebook is a dystopian panopticon that monitors our every move and uses that information to predict and manipulate our behavior. The other is that Facebook has become such a pillar of modern life that every product decision it makes could reshape the body politic forever.

Today, in an impressive flurry of news-making, Facebook took steps to address both concerns.

First, the company said it was finally releasing its long-delayed “Clear History” tool in three countries. (The United States is not one of them.) I wrote about it at The Verge:

It was nearly a year and a half ago that Facebook CEO Mark Zuckerberg, standing onstage at the company’s annual developer conference, announced that the company would begin letting users sever the connection between their web browsing history and their Facebook accounts. After months of delays, Facebook’s Clear History is now rolling out in Ireland, South Korea, and Spain, with other countries to follow “in coming months,” the company said.

The new tool, which Facebook conceived in the wake of the Cambridge Analytica scandal, is designed to give users more control over their data privacy at the expense of advertisers’ targeting capabilities.

When it arrives in your country, the Clear History tool will be part of a new section of the service called “Off-Facebook activity.” When you open it, you’ll see the apps and websites that are tracking your activity and sending reports back to Facebook for ad targeting purposes. Tapping the “Clear History” button will dissociate that information from your Facebook account.

You can also choose to block companies from reporting their tracking data about you back to Facebook in the future. You’ll have the choice of disconnecting all off-Facebook browsing data, or data for specific apps and websites. Facebook says the product is rolling out slowly “to help ensure it’s working reliably for everyone.”

Some writers, such as Tony Romm here, pointed out that Facebook is not actually deleting your data — which would seem to blunt the impact of a button called “Clear History.” In fact, given that the data link you’re shutting off is primarily relevant to ads you might see later, it feels more like a “Muddle Future” button. Facebook, for its part, has cloaked the entire enterprise in a section of the app opaquely titled “Off-Facebook Activity,” which could more or less mean anything.

I find it hard to get too worked up about any of this, because regardless of whether Facebook is able to take into account your web browsing habits, it’s still going to be sending you plenty of highly targeted ads based on your age, gender, and all the other demographic data that you forked over when you made your profile. Or you could simply turn off ad targeting on Facebook altogether, which is more powerful in this regard than any Clear History tool was ever going to be. (Here’s an account from a person who did this.)

Second, Facebook released the results of its anti-conservative bias audit, in which the company asked former Sen. Jon Kyl and the law firm Covington & Burling to interview 133 conservative lawmakers and interest groups about whether they think Facebook is biased against conservatives.

This project has fascinated me since it was announced, since Facebook had clearly volunteered to play a game it could only lose. As I’ve written here before, the definition of “bias” has expanded to include any time someone has a bad experience online.

On one hand, there’s no evidence of systematic bias against conservatives or any other mainstream political group on Facebook or other platforms. On the other hand, there are endless anecdotes about the lawmaker whose ad purchase was not approved, or who did not appear in search results, or whatever. Stack enough anecdotes on top of one another and you’ve got something that looks a lot like data — certainly enough to convene a bad-faith congressional hearing about platform bias, which Republicans have done repeatedly now.

So here comes Kyl’s “audit,” which appears to have taken roughly the same shape as President Trump’s call for stories of Americans who feel that they have been censored by the big platforms. Kyl’s findings are short on facts and long on feelings. Here’s an excerpt from an op-ed he published today in The Wall Street Journal:

As a result of Facebook’s new, more stringent ad policies, interviewees said the ad-approval process has slowed significantly. Some fear that the new process may be designed to disadvantage conservative ads in the wake of the Trump campaign’s successful use of social media in 2016.

So, some anonymous conservatives believe that Facebook is involved in a conspiracy to prevent conservatives from advertising. That might come as a surprise to, say, President Trump, who is outspending all Democrats on Facebook ads. But the Kyl report has no room for empirical thought. What’s important here is that 133 unnamed people have feelings, and that they spent the better part of two years talking about them in interviews that we can’t read. (Here’s a link to the published report, which clocks in at a very thin eight pages. And here’s a helpful rebuttal from Media Matters, which uses data to illustrate how partisan conservative pages continue to thrive on Facebook.)

Despite the fact that we have no idea who Kyl talked to, or what they said beyond his meager bullet points, the report still had at least some effect on Facebook policymaking. As Sara Fischer reports in Axios, Facebook ads can now show medical tubes connected to the human body, which apparently make for more viscerally compelling anti-abortion ads:

The medical tube policy makes it easier for pro-life ads focused on survival stories of infants born before full-term to be accepted by Facebook’s ad policy. Facebook notes that the policy could also benefit other groups who wish to display medical tubes in ads for cancer research, humanitarian relief and elderly care.

And how are conservatives using the information from today’s audit? If you guessed “as a cudgel to continue beating Facebook with,” you win today’s grand prize. Here’s Brent Bozell: “The Facebook Kyl cover-up is astonishing. 133 groups presented Kyl with evidence of FB’s agenda against conservatives and he dishonestly did FB’s bidding instead.”

And here’s Sen. Josh Hawley (R-MO):

“Facebook should conduct an actual audit by giving a trusted third party access to its algorithm, its key documents, and its content moderation protocols,” Hawley said in a statement. “Then Facebook should release the results to the public.”

I asked Hawley’s people if the senator was aware that Facebook’s content moderation protocols have been public for years, but I never heard back.

Anyway, Facebook wrapped up the day by announcing — in a fantastically bizarre feat of timing — that it would begin to hire human beings to curate your news stories, just as Apple does for Apple News. (Apply for the job here! Let me know if you get it!) This is the right thing to do — our leaky information sphere needs experienced editors with news judgment more than ever — but also one guaranteed to court controversy. One person’s curation is, after all, another person’s “bias.”

The return of human editors to Facebook, on the very day that it publishes its investigation into alleged bias against conservatives, is a real time-is-a-flat-circle moment. After all, it was trumped-up outrage over supposed bias in its last group of human editors that helped to set us down this benighted path to begin with. I want to end on something I wrote last February on this subject:

I’m struck by how, in retrospect, the story that helped to trigger our current anxieties had the problem exactly wrong. The story offered a dire warning that Facebook exerted too much editorial control, in the one narrow section of the site where it actually employed human editors, when in fact the problem underlying our global misinformation crisis is that it exerted too little. Gizmodo’s story further declared that Facebook had become hostile to conservative viewpoints when in fact conservative viewpoints — and conservative hoaxes — were thriving across the platform. Last month, NewsWhip published a list of the most-engaged publishers on Facebook. The no. 1 company posted more than 49,000 times in December alone, earning 21 million likes, comments, and shares. That publisher was Fox News. And the idea that Facebook suppresses the sharing of conservative news now seems very quaint indeed.

Pushback

First, I made a dumb mistake yesterday: you can access Twitter and Facebook without a VPN in Hong Kong, and millions of people do. I said the opposite. Sorry about that.

Second, thanks to everyone who wrote in with their thoughts on the new, slimmer Interface. I heard from about 10 of you who said you missed the expanded section of links and excerpts. In response, I’ve tweaked today’s edition a bit: I’ve upped the number of links slightly, and I’ve included excerpts for the day’s top two stories. Hopefully this strikes a better balance than yesterday’s edition. Keep the feedback coming!

Democracy

⭐ The Justice Department and Federal Trade Commission are growing their budgets and staffing up as they prepare for antitrust battle. Christopher Stern and Ashley Gold report:

The Justice Department has asked Congress for an additional $1.8 million for its antitrust division, and is in the process of making new hires as it seeks to employ several dozen more people. The Federal Trade Commission is requesting $2.4 million for antitrust work.

If the funding requests are even partially met, which is likely, it would mark a turnaround from the first two years of the Trump administration, when staff cuts and flat or shrinking budgets were the norm. The ramp-up in activity has put the country’s biggest tech companies on notice that they face potentially years of intense scrutiny into how they operate.

Even with new funding and staff, it remains to be seen whether the small group of lawyers and economists at the Justice Department and the FTC — two budget-constrained agencies with uneven histories when it comes to standing up to corporations — can make a case against the tech giants that is strong enough to take to court. The agencies’ legal strategy also will have to withstand the political influence exerted by the companies’ powerful lobbyists and legal teams.

China is officially very mad that Facebook and Twitter banned its disinformation accounts. “Foreign Ministry spokesman Geng Shuang declined direct comment on the Twitter and Facebook actions, but defended the right of Chinese people and media to make their voices heard over the Hong Kong protests,” Reuters reports.

The European Union has launched an antitrust investigation into Libra. The European Commission wants to understand how the cryptocurrency could harm competition. (Lydia Beyoud and Aoife White / Bloomberg)

Related: Yi Yuan on China’s soft-power deficit. Also: Angela Chen asks whether Twitter should ban Voice of America.

Act.IL is a smartphone app that encourages users to boost pro-Israel messages on social networks. This analysis by the Digital Forensic Research Lab finds that it has not been particularly effective, but notes that its decentralized approach could make it a model for future influence operations.

Podcast appearances have become an essential part of modern campaigning in the United States. (Makena Kelly / The Verge)

Elsewhere

⭐ YouTube hopes to get regulators off its back by ending targeted ads on videos aimed at children. Mark Bergen reports in Bloomberg:

YouTube has long maintained that its primary site is not for children. (The company says kids should use the YouTube Kids app, which does not use targeted ads.) But nursery rhymes and cartoon videos on the main site have billions of views. The platform’s many issues with children’s content — horrific imagery, problems that led to disabling comments — have troubled its video creators, worried parents and empowered rivals. Getting rid of targeted ads on children’s content could hit Google’s bottom line, but this solution would be far less expensive than other potential remedies that aim to placate regulators.

Related: Meet the LGBT YouTubers suing over demonetization. (Julia Alexander / The Verge)

Gun sellers are sneaking onto Facebook Marketplace by posting a seemingly overpriced “gun case” that actually contains the gun inside. (Parmy Olson and Zusha Elinson / The Wall Street Journal)

The Epoch Times is a news outlet that has spent $1.5 million on 11,000 pro-Trump Facebook ads. It’s associated with Falun Gong and has promoted QAnon conspiracy theories. (Brandy Zadrozny and Ben Collins / NBC)

Scientific American’s new issue is devoted to disinformation. Claire Wardle’s essay here argues that the world is suffering from a new “information disorder.”

Hong Kong protesters have reappropriated Pepe the Frog as a pro-democracy symbol. (Daniel Victor / The New York Times)

A profile of OnlyFans and similar sites, which let influencers post racy and even pornographic content for their followers using a paid subscription model. In my experience, every gay guy knows what OnlyFans is, and almost no straight person does. A good, overdue look at the phenomenon. (Jason Parham / Wired)

I will read any piece whose headline begins with “an influencer is defending.” (Tanya Chen / BuzzFeed)

And finally ...

Outstanding parody here from Adrian Gray on what you might get with a Twitter premium subscription, including swipeable echo chambers and a mystery button that you really shouldn’t touch. This deserves way more retweets:

Wow, 'Twitter Gold' looks like it has some great features! Can't wait to subscribe! pic.twitter.com/vOQyZKe3fR — Adrian Gray (@AdrianRMG) August 20, 2019

Talk to me

Today’s footer comes from reader @apn: “Send me tips, comments, questions, and zany Matrix movie plot ideas: casey@theverge.com.”