Jessica Guynn

USA TODAY

SAN FRANCISCO — Amid growing criticism over misinformation in search results, Google is taking a harder look at potentially "upsetting" or "offensive" content, tapping humans to help its computer algorithms deliver more factually accurate and less inflammatory results.

The humans are Google's 10,000 independent contractors who work as what Google calls quality raters. They score the results of searches based on real queries, following guidelines provided by Google.

On Tuesday they were handed a new one: to hunt for "Upsetting-Offensive" content such as hate or violence against a group of people, racial slurs or offensive terminology, graphic violence including animal cruelty or child abuse, or explicit information about harmful activities such as human trafficking, according to guidelines posted by Google.

The goal: to steer people with queries such as "did the Holocaust happen" to trustworthy websites and not to websites that engage in falsehoods or hate speech.


The Internet giant is using data from quality raters to spot "demonstrably inaccurate information," Paul Haahr, a Google senior engineer involved with search quality, said in an interview with industry blog Search Engine Land. Haahr told Search Engine Land that Google is avoiding the term "fake news" because it is too vague.

How it works: Google, for example, advises its quality raters that a search result from white supremacist website Stormfront that denies the Holocaust happened should be flagged as upsetting or offensive content while a result from the History Channel describing what happened during the Holocaust should not.


Quality raters don't have the ability to change how search results are ranked, but their feedback is used by engineers and machine learning systems to improve search results, according to Google. The company declined to comment on the new guideline.

Danny Sullivan, founding editor of Search Engine Land, says results for the query "did the Holocaust happen" have improved, due in part to ranking changes and in part to the public outrage over the earlier results. It's unclear how much of an impact the new mandate for quality raters will have on the search results people see.

Google has also come under fire for the snippets it puts at the top of search results. A number of those have been inaccurate.

"We will see how some of this works out. I'll be honest. We're learning as we go," Haahr told Search Engine Land.