It's the kind of headline that grabs attention in this heated U.S. election season: "Did Google Manipulate Search for Hillary?"

That's the question SourceFed, a YouTube channel owned by the same parent company as the Discovery Channel, asked in a seven-minute video that has right-wing news outlets accusing the world's most powerful technology company of secretly supporting the presumptive Democratic nominee.

In the video, SourceFed host Matt Lieberman accuses Google of altering search results to play down stories or pages about Hillary Clinton's legal troubles.


"There is an inherent trust that when you Google something, you're seeing the actual factual answer to your query or question based at least in part on the results of what other people are searching for," Mr. Lieberman says in the video, which had received almost 220,000 views at the time of publication. "In the case of Hillary Clinton, we know for a fact that is not the case."

The suspicion that Google is up to something nefarious centres on Alphabet Inc. executive chairman Eric Schmidt's long association with Democratic politics and the Obama administration. Alphabet is the parent company of Google.

Several reports have said that a Schmidt-backed startup, The Groundwork, provides data analytics and engineering talent to the Clinton campaign. Although Google says Mr. Schmidt has not had an active role at the company since 2011, that hasn't stopped critics such as WikiLeaks founder Julian Assange from claiming "Google is directly engaged with Hillary Clinton's campaign."

Coming after Facebook Inc. was accused of being biased against conservative news topics and sources, supporters of Donald Trump, the presumptive Republican nominee, gleefully claimed the SourceFed video was the smoking gun that proves Silicon Valley is against them.

But it is less than clear that the evidence presented in the SourceFed video actually proves that accusation against Google.

The controversy surrounds the Google search bar's autocomplete function, which helpfully suggests words to save typing time: when a user types a familiar phrase such as "The Globe and", Google will fill in "Mail".

SourceFed demonstrates that typing "Hillary Clinton cri" does not autocomplete to "crime," even though many news stories suggest the former secretary of state's e-mail scandal may have a criminal dimension. By way of contrast, "Donald Trump rac" does autocomplete to "racist."


Meanwhile, rival search engine Bing, which has about 10 per cent of the U.S. search market, does autocomplete to "criminal." Is it a conspiracy? Google vehemently denies it.

In a written statement, the company said: "Google autocomplete does not favour any candidate or cause. Claims to the contrary simply misunderstand how autocomplete works. Our autocomplete algorithm will not show a predicted query that is offensive or disparaging when displayed in conjunction with a person's name. More generally, our autocomplete predictions are produced based on a number of factors including the popularity of search terms."

At issue are the words "crime" and "criminal" following a person's name. Those are two of the "offensive" terms Google screens out, in the belief that it's not up to Google's autocomplete function to suggest someone has been convicted.

If a user searches for "Al Capone cri" or infamous Canadian child molester "Gordon Stuckless cri" in Google, it will not autocomplete to crime or criminal following the person's name. It's unclear why Google's autocomplete does not filter out "racist," although it will filter out "the N-word" and certain other slurs if they come after a person's name.
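The screening policy described above can be pictured with a toy sketch. Everything in it — the blocklist, the completion index and the helper names — is invented for illustration; Google's actual system is not public, and this reflects only the behaviour the article reports.

```python
# Toy sketch of the reported policy: suppress completions that pair a
# disparaging term with a person's name. All data here is invented.

DISPARAGING_TERMS = {"crime", "criminal"}  # hypothetical blocklist

# Hypothetical completion index: prefix -> candidates, most popular first.
COMPLETIONS = {
    "al capone cri": ["al capone crime", "al capone criminal empire"],
    "donald trump rac": ["donald trump racist"],
}

def is_person_query(prefix: str) -> bool:
    """Toy assumption: every prefix in this sketch starts with a name."""
    return True

def suggest(prefix: str) -> list[str]:
    """Return completions, dropping disparaging terms after a person's name."""
    candidates = COMPLETIONS.get(prefix.lower(), [])
    if not is_person_query(prefix):
        return candidates
    return [
        c for c in candidates
        if not any(term in c.split() for term in DISPARAGING_TERMS)
    ]

print(suggest("Al Capone cri"))     # disparaging completions filtered out
print(suggest("Donald Trump rac"))  # "racist" is not on the toy blocklist
```

Under this sketch, "Al Capone cri" yields nothing while "Donald Trump rac" completes normally — mirroring the asymmetry SourceFed observed, with no candidate-specific logic anywhere in the filter.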

SourceFed did not immediately respond to requests for comment.

Search engine optimization expert Clayburn Griffin speculated at length about why this policy exists, but in part he thinks it boils down to Google's troubles with EU authorities: "Google has been experiencing lots of legal issues over its search query suggestions, particularly overseas. Negative suggestions open them up to defamation cases."


That "criminal" restriction may not have been widely known. Search engine expert Danny Sullivan of Search Engine Land wrote on June 4 that, as recently as five years ago, Google's autocomplete censoring focused on five areas:

Hate or violence-related suggestions

Personally identifiable information in suggestions

Porn and adult content-related suggestions

Legally mandated removals

Piracy-related suggestions

In the video, Mr. Lieberman credited the idea for the video to his editor Spencer Reed, who previously worked as an assistant to television writer David Crane, creator of the Showtime comedy series Episodes (the show featured a subplot of a terminally clueless assistant). A graduate of Orange County's Chapman University film school, Mr. Reed is responsible for making "a daily comedy-driven news video" as well as other editing duties on the channel. He wrote on Twitter of the video: "A story I'm INCREDIBLY proud of; How #Google is apparently using it's platform to influence voters."

This conspiracy theory post is not typical fare for comedy-focused SourceFed to offer its 1.7 million subscribers, and in a strange twist, SourceFed was set up with Google money back in 2012, when the company spent $100-million (U.S.) to attract higher-quality content to YouTube. It was bought by the Revision3 video blog network in 2013 (Revision3 is in turn owned by Discovery Communications, whose major shareholders are Condé Nast owners the Newhouse family and cable mogul John Malone).

What about the other piece of evidence Mr. Reed and Mr. Lieberman present, the comparison of search terms on Google Trends? Mr. Lieberman showed off charts suggesting that "Hillary Clinton crime" queries are suspiciously low. Google employee Matt Cutts, who was head of the webspam team but is currently on leave, took to Twitter to slam the video, sharing a number of links, including blogger Joey Youngblood's stab at explaining SourceFed's error in interpreting Trends: the way Trends works can skew charts if you're not comparing similarly popular topics. A popular search set against an unpopular one will flatline the unpopular half of the chart, whereas comparing apples to apples shows more nuanced results. Trends is also merely a public-facing sample of Google searches, not the entire fire hose of its web traffic poured into real-time comparisons, so it should not be viewed as a definitive reflection of search data.
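The flatlining effect Mr. Youngblood describes can be illustrated with a toy calculation. The query counts below are invented; only the scaling idea — each series is expressed relative to the single highest value among all compared terms, on a 0-to-100 scale — reflects the mechanism described above.

```python
# Toy illustration of Trends-style relative scaling. Raw counts are invented.

def trends_scale(series_by_term):
    """Scale every series so 100 = the single largest value across all
    compared terms; everything else is expressed relative to that peak."""
    peak = max(max(values) for values in series_by_term.values())
    return {
        term: [round(100 * x / peak) for x in values]
        for term, values in series_by_term.items()
    }

# Hypothetical weekly query counts for a popular and an unpopular term.
raw = {
    "hillary clinton":       [900_000, 950_000, 1_000_000],
    "hillary clinton crime": [4_000, 5_000, 6_000],
}

print(trends_scale(raw))
```

Scaled against the popular term's million-query peak, the unpopular series rounds to almost all zeros — it "flatlines" — even though its raw counts vary; on its own, the same series would show a clear upward trend.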

"This is a super-technical area. Why make a long video of these claims without doing deeper research? It's just not true," Mr. Cutts wrote on Friday.