UPDATED 8:55 p.m. PT with Google blog post defending autocomplete.

Did Google manipulate search for presumptive Democratic nominee Hillary Clinton? Is this a conspiracy?!

Not likely. And no.

We're asking these questions because online viral video site SourceFed believes it's stumbled onto the biggest search and political scandal of this election season.

In a video posted Thursday, SourceFed's Matt Lieberman reports that "Google has been actively altering search recommendations in favor of Hillary Clinton’s campaign."

The allegation comes from SourceFed's video editor Spencer Reed (who does not appear in the video) and is based on disparities they found in Google's autocomplete function. Autocomplete starts guessing at your search query as you type. Bing and Yahoo do it as well, but Reed and Lieberman contend that Google does it differently, in a way that protects Clinton and her presidential aspirations.

As evidence, the video shows that when you start typing "Hillary Clinton cri," Google does not complete it as "Hillary Clinton criminal investigation." Bing does autocomplete with "crimes" and "criminal investigation." By contrast, a search of "Donald Trump ra" does fill in with "racist" on Google, as well as on Bing.

On the other hand, a Google autocomplete search of "Donald Trump indi" brings up a search for Indiana and India, while it returns "indiscretions" as the second option in Bing.

If SourceFed is right, then shouldn't Google at least be consistent? When we searched "Hillary Clinton ema" the second autocomplete result was "Hillary Clinton email charges."

When asked by Mashable about the autocomplete charges, a Google spokesperson emailed this statement:

Google Autocomplete does not favor any candidate or cause. Claims to the contrary simply misunderstand how Autocomplete works. Our Autocomplete algorithm will not show a predicted query that is offensive or disparaging when displayed in conjunction with a person's name. More generally, our autocomplete predictions are produced based on a number of factors including the popularity of search terms.
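Google's statement describes two ingredients: predictions ranked by factors like popularity, with a filter that suppresses offensive or disparaging terms when they would appear next to a person's name. As a rough illustration only (the names, popularity scores, and blocklist below are invented for the example, not Google's data or code), that kind of policy might be sketched like this:

```python
# Toy sketch of a popularity-ranked autocomplete with a name/disparagement
# filter. All data here is made up for illustration.
PERSON_NAMES = {"hillary clinton", "donald trump"}
DISPARAGING = {"crimes", "criminal", "racist", "indiscretions"}

# Hypothetical (query, popularity) counts
POPULARITY = {
    "hillary clinton crimes": 900,
    "hillary clinton cricket": 40,
    "donald trump racist": 800,
    "donald trump rally": 700,
}

def mentions_person(query):
    # True if the query contains a known person's name
    return any(name in query for name in PERSON_NAMES)

def is_disparaging(query):
    # True if any word in the query is on the blocklist
    return any(word in DISPARAGING for word in query.split())

def autocomplete(prefix, limit=4):
    """Return matching queries ranked by popularity, dropping any
    prediction that pairs a person's name with a disparaging term."""
    candidates = [
        (q, score) for q, score in POPULARITY.items()
        if q.startswith(prefix)
        and not (mentions_person(q) and is_disparaging(q))
    ]
    candidates.sort(key=lambda qs: qs[1], reverse=True)
    return [q for q, _ in candidates[:limit]]

print(autocomplete("hillary clinton cri"))  # the "crimes" prediction is filtered out
print(autocomplete("donald trump ra"))      # likewise "racist" under this toy rule
```

Under this toy rule, "hillary clinton crimes" is suppressed even though it is the most popular match, which is the behavior SourceFed observed. Notably, real Google results don't apply the filter this cleanly: "Donald Trump racist" does surface, a point taken up below.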

Google even lets users report offensive predictions.

SourceFed seems to be confusing actual search results with these recommendations, which are essentially Google's automated guesswork.

"Autocomplete predictions aren’t search results and don’t limit what you can search for," Google's VP of product management for search, Tamar Yehoshua, writes in a blog post Friday. "You can still perform whatever search you want to, and of course, regardless of what you search for, we always strive to deliver the most relevant results from across the web."

Are predictions consistent? No. Certainly, "racist" is an offensive term and yet Trump still gets stuck with it. This may have something to do with Google's algorithm looking for more context. The "Donald Trump racist snl" recommendation, for example, actually takes you to CNN's video discussion of the SNL Trump campaign ad parody.

There's a reason why autocomplete may not seem consistent, Yehoshua says.

Autocomplete isn’t an exact science, and the output of the prediction algorithms changes frequently. Predictions are produced based on a number of factors including the popularity and freshness of search terms. Given that search activity varies, the terms that appear in Autocomplete for you may change over time.

Sources tell Mashable that Google's focus is on delivering useful and trustworthy information, and that the company doesn't use people to manually rank results or even these recommendations. Algorithms handle that, governed by rules like the one that apparently suppresses some "offensive terms" when they appear alongside people's names.

The company is, sources tell us, always working on improving autocomplete. That could mean that Clinton will soon find her name married to "crimes," or that Donald Trump will finally be able to shed the autocomplete term "racist."

Or it may mean nothing at all.
