Facebook CEO Mark Zuckerberg contradicted himself when explaining how the company thinks about and deals with the spread of misinformation on the platform, highlighting the complicated struggle between Facebook's mission of being a platform for people to express themselves and the very real harm done by people intentionally spreading fake news.

It also suggests that the company's institutional confusion over the differences between journalism, opinion, entertainment, and propaganda starts at the top.

In a lengthy interview, Recode's Kara Swisher asked the 34-year-old CEO to make the case for allowing Infowars to remain on the platform even after the outlet published false stories, such as one suggesting that the Sandy Hook elementary school shooting in Connecticut was staged.

Zuckerberg laid out his thinking like so:

The approach that we’ve taken to false news is not to say, you can’t say something wrong on the internet. I think that that would be too extreme. Everyone gets things wrong, and if we were taking down people’s accounts when they got a few things wrong, then that would be a hard world for giving people a voice and saying that you care about that.

But at the same time, I think that we have a responsibility to, when you look at… if you look at the top hundred things that are going viral or getting distribution on Facebook within any given day, I do think we have a responsibility to make sure that those aren’t hoaxes and blatant misinformation. That’s the approach that we’ve taken. We look at the things that are getting the most distribution. If people have flagged them as potential hoaxes, we send those to fact-checkers who are all well reputable and have followed standard principles for fact checking, and if those fact checkers say that it is provably false, then we will significantly reduce the distribution of that content...

In other words, Facebook will try to stop provably fake news from going viral, but it won't ban it.

Later, when asked why Facebook does not just remove Infowars completely, he cited examples of places where the platform was used to encourage violence, such as Myanmar and Sri Lanka, and offered:

The principles that we have on what we remove from the service are: If it’s going to result in real harm, real physical harm, or if you’re attacking individuals, then that content shouldn’t be on the platform.

That makes sense as a bright line.

But when Zuckerberg takes the example of Infowars publishing misinformation and equates it with saying "something wrong on the Internet," that gets to the core of the problem. To Facebook, the practice of journalism is no different from somebody expressing an opinion on a user forum. If it's wrong, that's just the person's opinion, and Facebook is not in the business of censoring opinions.

A similar misunderstanding also cropped up earlier this year in a Facebook promotional video where a researcher talked about "getting something right on the Internet" and quipped: "I'm sure it'll happen someday."

In fact, getting something right on the Internet is the whole point of journalism.

Certainly, there are people at Facebook who are trying to help journalism: the company has invested in a variety of programs and scholarships, including a program to help U.S. metro newspapers get more digital subscriptions and several programs to improve news literacy.

But Zuckerberg and others at the company do not seem to understand that there is a tangible difference between reporting what is happening in the world with the intention of getting the facts right, which is what legitimate news organizations try to do every day, and saying whatever you want regardless of whether it's factually true or not.

Until Facebook's leadership understands that this difference in intention does exist, and that it is possible to discern the difference between the two — there are thousands of reporters and editors around the world whose job is precisely to do this — the company will keep getting the fake news problem wrong.

Failing to solve the fake news problem might not matter to Facebook's business in the short term. Users keep visiting the site regardless of all the hand-wringing over fake news and misinformation. But it matters to society, and problems that matter to society eventually invite regulation. That is a real risk to Facebook's business.