Facebook may have put new tools in place that allow third-party checkers to flag dubious news stories, but that doesn’t mean that the social network will try to stop people from sharing “fake news.”

In an on-stage interview today during Recode’s Code Media conference in Southern California, Facebook vice president of partnerships Dan Rose said that, “at the end of the day, if people want to share stories that have been flagged with their friends, that’s ultimately their prerogative.”

“We are making a very important point of not putting ourselves in a position of deciding what’s fake and not fake. I don’t think people want us to be the arbiters of truth,” Rose said, echoing what Facebook chief executive Mark Zuckerberg has said himself in shared posts about the “fake news” phenomenon. “There are third parties out there who do this for a living.”

Facebook, for now, has taken a “not it” approach to the fake news problem

Facebook’s “not it” approach to managing the dissemination of dubious or fabricated news stories is not entirely surprising. It’s not always clear which stories are fake at the moment they’re published, and to block them from the feed entirely would subject Facebook to cries of censorship. Facebook also has a vested interest in keeping its two billion users around the globe sharing as much content with each other as possible, even while it’s trying to figure out best practices around false news.

But Facebook, along with Twitter and Google, drives a disproportionate amount of news consumption. According to Pew Research Center, the majority of US adults now get their news from social media sites, and most of them — 64 percent — get their news from one site only: Facebook.

The issue of “fake news” emerged with a vengeance after last year’s US presidential election, when the spread of viral hoaxes may have contributed to Donald Trump’s victory over Hillary Clinton. (Mark Zuckerberg has rejected this idea, calling it crazy.)

Is partnering with fact-checking and media literacy groups enough?

In recent months Facebook has made some attempts to curb the problem: it partnered with fact-checking organizations, like Snopes, Politifact, and FactCheck.org, and created a tool that people can use to flag suspicious news content. If a fact-checker confirms the story is a hoax, Rose said, a prominent label is slapped on the story, and anyone who goes to share it on Facebook will then also see that the story has been disputed.

Facebook is also making efforts around media literacy, having recently supported a PSA by The News Literacy Project designed to help people become more skeptical news consumers. Like Apple CEO Tim Cook’s recent remarks on the need to educate the modern media consumer, Facebook’s Rose thinks that media literacy is “a skill that people are going to need to have, and it’s a skill that we’re committed to.”

In other words: for now, fake news spreading on the site isn’t Facebook’s problem to solve; it’s something consumers are supposed to be able to spot on their own.

Not it, indeed.