In a furious tweetstorm this weekend, Facebook's Chief Security Officer warned interfering politicians and triggered leftists that the fake news problem is far more complicated, and more dangerous to 'solve', than the public thinks.

As a reminder, we noted that Alex Stamos was seemingly pressured into 'finding' Russian evidence after Senator Mark Warner paid the social media company a visit -

A few weeks after the French election, Warner flew out to California to visit Facebook in person. It was an opportunity for the senator to press Stamos directly on whether the Russians had used the company’s tools to disseminate anti-Clinton ads to key districts. Officials said Stamos underlined to Warner the magnitude of the challenge Facebook faced policing political content that looked legitimate. Stamos told Warner that Facebook had found no accounts that used advertising but agreed with the senator that some probably existed. The difficulty for Facebook was finding them.

For months, a team of engineers at Facebook had been searching through accounts, looking for signs that they were set up by operatives working on behalf of the Kremlin. The task was immense. Warner’s visit spurred the company to make some changes in how it conducted its internal investigation. Instead of searching through impossibly large batches of data, Facebook decided to focus on a subset of political ads. Technicians then searched for “indicators” that would link those ads to Russia. To narrow down the search further, Facebook zeroed in on a Russian entity known as the Internet Research Agency, which had been publicly identified as a troll farm.

“They worked backwards,” a U.S. official said of the process at Facebook. The breakthrough moment came just days after a Facebook spokesman on July 20 told CNN that “we have seen no evidence that Russian actors bought ads on Facebook in connection with the election.”

And the rest is history as 3,000 "Russian Ads" were suddenly discovered, enabling Warner to keep the narrative alive - and, more crucially, to demand that social media companies crack down on 'fake news', on politically-sponsored and divisive ads, and on anything they decide does not really fit their identity-politics-based narratives.

However, given this weekend's tweetstorm by Stamos, we suspect he has finally snapped and been forced to shove some common sense down the throats of the vengeful politicians who see no unintended consequences in their demands for big-brother-esque newspeak.

“It’s very difficult to spot fake news and propaganda using just computer programs,” Stamos said in a series of Twitter posts on Saturday. “Nobody of substance at the big companies thinks of algorithms as neutral,” Stamos wrote, adding that the media is simplifying the matter. “Nobody is not aware of the risks.”

For example, lots of journalists have celebrated academics who have made wild claims of how easy it is to spot fake news and propaganda. — Alex Stamos (@alexstamos) October 7, 2017

So if you don't worry about becoming the Ministry of Truth with ML systems trained on your personal biases, then it's easy! — Alex Stamos (@alexstamos) October 7, 2017

My suggestion for journalists is to try to talk to people who have actually had to solve these problems and live with the consequences. — Alex Stamos (@alexstamos) October 7, 2017

As Bloomberg reports, the easy technical solutions would boil down to silencing topics that Facebook is aware are being spread by bots -- which should only be done “if you don’t worry about becoming the Ministry of Truth” with machine learning systems “trained on your personal biases,” he said.
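Stamos's "trained on your personal biases" warning is easy to see in miniature. The toy classifier below (a hypothetical illustration, not anything Facebook actually runs; all data and function names are invented) learns to label posts "fake" or "real" from hand-labeled examples - and if the labeler tagged every post critical of a policy as fake, the model dutifully flags new criticism as fake too, however factual it is:

```python
# A minimal sketch (hypothetical, not Facebook's system) of how a "fake news"
# classifier simply encodes whatever its labelers believed.
from collections import Counter

def train(labeled_posts):
    """Build per-label word counts from hand-labeled (text, label) pairs."""
    counts = {"fake": Counter(), "real": Counter()}
    for text, label in labeled_posts:
        counts[label].update(text.lower().split())
    return counts

def classify(counts, text):
    """Label a post by which class's training vocabulary it overlaps most."""
    words = text.lower().split()
    scores = {label: sum(c[w] for w in words) for label, c in counts.items()}
    return max(scores, key=scores.get)

# The "ground truth" is one labeler's opinions: posts critical of a policy
# were marked fake, supportive ones real.
biased_labels = [
    ("the policy is a disaster and a fraud", "fake"),
    ("critics call the policy a fraud", "fake"),
    ("officials say the policy is working", "real"),
    ("experts praise the policy rollout", "real"),
]
model = train(biased_labels)

# A new, perfectly factual critical post inherits the labeler's bias:
print(classify(model, "new report says the policy is a fraud"))  # -> fake
```

Scale that from four hand-labeled examples to millions, and the bias doesn't disappear - it just becomes harder to see, which is precisely the "Ministry of Truth" risk Stamos describes.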

“A lot of people aren’t thinking hard about the world they are asking [Silicon Valley] to build,” Stamos wrote. “When the gods wish to punish us they answer our prayers.”

Stamos’s comments shed light on why Facebook added 1,000 more people to review its advertising, rather than attempting an automated solution.

The company sent a note to advertisers telling them it would start to manually review ads targeted to people based on politics, religion, ethnicity or social issues. The company is trying to figure out how to monitor use of its system without censoring ideas, after the Russian government used fake accounts to spread political discord in the U.S. ahead of the election.

The silver lining, at least, is that Stamos is aware of what a terrible idea the kind of censorship Democratic politicians (and John McCain) are demanding would be... whether anything will change behind the smoke and mirrors of an actively managed news feed serving up your self-bias-perpetuating perspective is anyone's guess.