Facebook's newest tactic for fighting fake news takes the internet out of the equation, instead reaching people through print advertisements to deliver tips on how to determine fact from fiction online.

The advertisements are currently running in the United Kingdom ahead of the country's snap general election in June. The full-page ads appear in some top British publications and include the same tips Facebook has previously shared online, guiding people on how to suss out whether a story is the real deal.

Facebook is running full-page ads in various British publications, teaching people how to spot fake news. NBC

Those tips include being skeptical of headlines, looking closely at the URL to see whether it is phony or a look-alike of a legitimate news site, checking what other outlets are reporting, and asking yourself, "Is this a joke?"

"It’s a way to show the critics they are sincere and trying to take measures to distance themselves from the perception Facebook is basically a channel that can be misused very easily," Niklas Myhr, an assistant professor of marketing at Chapman University, told NBC News.

Facebook has spent the past few months trying to clean up its fake news mess by employing technological tools, streamlining its reporting process, and disrupting the economic incentives (advertising money) that purveyors of fake news were collecting.

While a print campaign is just "one vehicle" for combating the fake news scourge, Myhr said it's a step in the right direction toward improving consumer behavior.

"As long as people are likely to click on headlines that are shocking and curiosity invoking, it favors much of the fake news stories," he said. Facebook's algorithm determines what you see where in your news feed. The way it's set up, it's going to rely on Facebook to "change consumer clicking behavior" to stop false stories from being given more credence.

Facebook is already making progress on that front.

Last month, weeks before the French presidential election, Facebook shut down 30,000 fake accounts that had been spreading hoax stories.

Facebook said in a statement that its technology allows it to identify "inauthentic accounts more easily by identifying patterns of activity — without assessing the content itself." These signals include "repeated posting of the same content, or an increase in messages sent," according to the statement.

With close to 2 billion users worldwide, Facebook said it's going after the fake accounts with the "largest footprint" and broadest reach.

Tensions reached a boiling point after the U.S. election, with Facebook blamed by some for influencing the outcome by allowing misinformation to spread so widely across the platform, virtually unchecked.

Days after Donald Trump was elected, Mark Zuckerberg called it a "pretty crazy idea" that Facebook could influence an election.

However, he quickly pivoted on the issue, promising to help tackle the spread of misinformation online, but also noting the fine line Facebook would have to walk to ensure it wasn't censoring content.

In December, Facebook announced that a "disputed news" flag would be added to stories that have been debunked by third-party groups.

After a story is marked, a group of Facebook researchers sorts through flagged stories and determines which ones to send to fact-checking organizations, which include Snopes, Politifact, and Factcheck.org.

If it's determined to be fake, the story will still remain on Facebook, but it will be flagged as disputed and include a link explaining why. One recent example flagged a false story with the headline: "Trump's Unsecured Android Device Source of Recent White House Leaks," explaining that it had been disputed by Snopes.com and Politifact.

While this story, and others like it, can still be shared, you'll be warned before you do and they'll be more likely to appear lower in News Feed, according to Facebook.