The Wired website has the results of a study by two scientists at the American Institute for Behavioral Research and Technology showing that the algorithm Google uses in its search engine could accidentally determine the outcome of a close presidential race.

Specifically, the ranking of negative and positive stories about a particular candidate strongly influences how individual voters decide whom to vote for.

IMAGINE AN ELECTION—A close one. You’re undecided. So you type the name of one of the candidates into your search engine of choice. (Actually, let’s not be coy here. In most of the world, one search engine dominates; in Europe and North America, it’s Google.) And Google coughs up, in fractions of a second, articles and facts about that candidate. Great! Now you are an informed voter, right? But a study published this week says that the order of those results, the ranking of positive or negative stories on the screen, can have an enormous influence on the way you vote. And if the election is close enough, the effect could be profound enough to change the outcome. In other words: Google’s ranking algorithm for search results could accidentally steal the presidency.

“We estimate, based on win margins in national elections around the world,” says Robert Epstein, a psychologist at the American Institute for Behavioral Research and Technology and one of the study’s authors, “that Google could determine the outcome of upwards of 25 percent of all national elections.”

Epstein’s paper combines a few years’ worth of experiments in which Epstein and his colleague Ronald Robertson gave people access to information about the race for prime minister in Australia in 2010, two years prior, and then let the mock voters learn about the candidates via a simulated search engine that displayed real articles. One group saw positive articles about one candidate first; the other saw positive articles about the other candidate. (A control group saw a random assortment.) The result: whichever candidate the positive results favored, people were more likely to vote for—by more than 48 percent. The team calls that number the “vote manipulation power,” or VMP. The effect held—strengthened, even—when the researchers swapped a single negative story into the number-four and number-three spots. Apparently it made the results seem even more neutral and therefore more trustworthy.
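The excerpt does not spell out the study's exact statistical definition of VMP, but one plausible reading is the relative increase in support for the candidate the rankings favored. A minimal sketch of that reading, with hypothetical numbers chosen only to land near the 48 percent figure quoted above:

```python
def vote_manipulation_power(pre_share: float, post_share: float) -> float:
    """Relative increase in support for the favored candidate.

    This is one plausible reading of the study's VMP metric; the
    excerpt does not give the exact formula, so treat this as an
    illustrative assumption, not the paper's definition.
    """
    return (post_share - pre_share) / pre_share

# Hypothetical numbers: 31% favored the candidate before searching,
# 46% after seeing positively ranked results about that candidate first.
vmp = vote_manipulation_power(0.31, 0.46)
print(f"VMP = {vmp:.0%}")  # about 48%
```

The point of the sketch is only that a modest absolute shift in preferences can read as a very large relative "manipulation power."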

Google’s algorithm is proprietary, so no outsider is going to examine it to determine the cause of this effect. But it would be interesting to see whether one party or the other is usually or always hurt by the ranking of search results.

The rankings of positive and negative stories are a by-product of the algorithm, not the intent of Google's managers. But could Google, or a campaign, actually game the system to produce a desired result?

What they call the “search engine manipulation effect,” though, works on undecided voters, swing voters. It’s a method of persuasion. Again, though, it doesn’t require a conspiracy. It’s possible that, as Epstein says, “if executives at Google had decided to study the things we’re studying, they could easily have been flipping elections to their liking with no one having any idea.” But simultaneously more likely and more science-fiction-y is the possibility that this—oh, let’s call it “googlemandering,” why don’t we?—is happening without any human intervention at all. “These numbers are so large that Google executives are irrelevant to the issue,” Epstein says. “If Google’s search algorithm, just through what they call ‘organic processes,’ ends up favoring one candidate over another, that’s enough. In a country like India, that could send millions of votes to one candidate.”

Conservatives have been claiming for years that Google has an anti-conservative bias, even though Google has in recent years contributed to conservative organizations like the Heritage Foundation and the Federalist Society. Of course, those contributions don't prove much either way, but they raise questions about whether Google's bias against conservatives and conservative issues actually translates into a deliberate effort to build an algorithm that penalizes the right when it comes to elections.

I don’t even know if that’s possible. People use Google to search for everything from baby clothes to candidates’ positions on the issues. Could Google's engineers actually write a program that would always rank negative stories about conservative candidates first?
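Mechanically, the question amounts to asking whether a re-ranker could sort results by sentiment for a targeted set of candidates. It could, and trivially so, which is a deliberately simplistic sketch, not a claim about how Google actually ranks: the candidate names, relevance scores, and sentiment labels below are all hypothetical, and real systems would have to score sentiment automatically, which is far harder than the sorting itself.

```python
def biased_rerank(results, targeted_candidate):
    """Re-rank results so negative stories about one candidate rise.

    `results` is a list of dicts with 'candidate', 'score' (relevance,
    higher is better), and 'sentiment' (-1 negative, +1 positive).
    This is a toy schema invented for illustration, not a real API.
    """
    def key(r):
        # Penalize only the targeted candidate's positive coverage;
        # everyone else is ordered purely by relevance.
        penalty = r["sentiment"] if r["candidate"] == targeted_candidate else 0
        # Lower tuples sort first: sentiment penalty beats relevance.
        return (penalty, -r["score"])
    return sorted(results, key=key)

results = [
    {"candidate": "A", "score": 0.9, "sentiment": +1},
    {"candidate": "A", "score": 0.5, "sentiment": -1},
    {"candidate": "B", "score": 0.7, "sentiment": -1},
]
# Candidate A's negative story jumps ahead of the more relevant
# positive one.
print([r["sentiment"] for r in biased_rerank(results, "A")])
```

The sorting is the easy part; the hard part, as the paragraph above suggests, is that the same ranking pipeline serves baby clothes and ballot questions alike, so a blanket rule like this would be both conspicuous and costly to search quality.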

There’s no doubt Google, the company, has a liberal bias. But whether it could, or would, consciously use its search engine to advance that agenda can’t be proved, and doing so systematically would seem to be impossible.