Mark Zuckerberg had already been testifying for four hours in the stuffy, wood-paneled room where the House Energy and Commerce Committee held its hearing on Facebook in April when he got a question he seemed wholly unprepared to answer. Over the course of those four hours, his company had been blamed for enabling the opioid crisis, silencing conservative voices, violating users’ rights to privacy, and setting up a modern-day surveillance state akin to J. Edgar Hoover’s COINTELPRO program. It seemed there was no societal ill too grave or too niche for Congress to lay at Facebook’s feet.

Still, Zuckerberg appeared surprised when Georgia representative Buddy Carter asked him about the elephants. “Did you know that there are some conservation groups that assert that there’s so much ivory being sold on Facebook that it’s literally contributing to the extinction of the elephant species?” Carter asked.

“Congressman, I have not heard that,” Zuckerberg replied.

The timing wasn’t entirely random. Peters and Kohn didn’t initially plan on going public with their complaint. But when Zuckerberg reluctantly agreed to testify before two congressional hearings, they couldn’t pass up the chance to get a question in with all the world watching. So they spoke to the Associated Press, which published a news story about the SEC complaint the same week Zuckerberg was set to testify. Carter seemed to be citing Peters’ own words.

Zuckerberg’s answer, or lack thereof, revealed just how far down Facebook’s to-do list wildlife trafficking ranks. Zuckerberg has repeatedly mounted a rose-colored-glasses defense of disturbing content on Facebook, copping to the fact that he focused too much, for too long, on all the good Facebook has done—while all but ignoring the bad. But that argument is tough to reconcile with the fact that Facebook’s own internal guidelines ban so many bad things, from cannibalism to infant abuse. Indeed, Facebook is so accustomed to grotesque content appearing on the site that it offers content moderators counseling to help them deal with the horrors they’ve seen.

Facebook’s leaders aren’t unaware of the ugliness that lurks just beyond every pretty selfie. It’s just that, armed with the immunities enshrined in the United States legal code, the company’s never had to do anything about it. “Mark Zuckerberg goes around saying they’re an idealistic company, well I’m idealistic too. We’re trying to save elephants from extinction,” Peters says. “I want to see this firm put some weight behind the idealism and optimism they talk about.”

That’s starting to happen. When news broke last fall that the company had sold thousands of political ads to Russian propagandists in the run-up to the 2016 election, Facebook announced it would double its content moderation team to 20,000 people, impose strict verification processes for political advertisers, and create massive repositories of political ads, complete with information on how much they cost, who paid for them, and what demographics they reached.

This spring, when the world found out that a data firm called Cambridge Analytica had amassed data on as many as 87 million people without their permission through a silly third-party quiz app, Facebook radically limited app developers’ data access and announced it would be auditing how all apps use Facebook user data, even if it means hiring thousands more people. As Facebook executives recently explained to WIRED, the company has also overhauled its News Feed algorithms to reduce people’s exposure to fake news and outlets it deems untrustworthy. It has also answered accusations of liberal political bias by inviting the conservative Heritage Foundation and others to study the company from the inside out.