A Guardian investigation has prompted Google to change some of its autocomplete suggestions. For instance, "evil" will no longer be suggested when users type "are Jews," and the same suggestion has been removed for "are women."

Google said it made the change after a Guardian story called out the search giant. However, the company acknowledged that it didn't fix everything the Guardian found; for instance, it did not remove "bad" as a suggestion when "are Muslims" was typed into the search field.


"Our search results are a reflection of the content across the Web. This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what search results appear for a given query," the company told the Guardian. "These results don't reflect Google's own opinions or beliefs—as a company, we strongly value a diversity of perspectives, ideas, and cultures."

Google continued:

Autocomplete predictions are algorithmically generated based on users' search activity and interests. Users search for such a wide range of material on the Web—15 percent of searches we see every day are new. Because of this, terms that appear in autocomplete may be unexpected or unpleasant. We do our best to prevent offensive terms, like porn and hate speech, from appearing, but we acknowledge that autocomplete isn't an exact science and we're always working to improve our algorithms.
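
Google's statement sketches the broad mechanics: completions are mined from past queries, ranked by popularity, and run through a filter meant to suppress offensive terms. As a rough illustration of why that filtering is a moving target, here is a minimal, hypothetical sketch in Python. The query log, blocklist, and suggest function are invented for this example; Google's actual pipeline is far larger and is not public.

```python
from collections import Counter

# Hypothetical query log and blocklist; Google's real data and rules are not public.
QUERY_LOG = [
    "are jews a race",
    "are jews white",
    "are women funny",
    "are women stronger than men",
]
BLOCKLIST = {"evil", "bad"}  # terms the system refuses to complete toward

def suggest(prefix, log=QUERY_LOG, blocklist=BLOCKLIST, k=3):
    """Return up to k past queries extending the prefix, most frequent first,
    skipping any completion whose added words appear on the blocklist."""
    prefix = prefix.lower().strip()
    # Rank matching queries by how often they appear in the log.
    counts = Counter(q for q in log if q.startswith(prefix))
    ranked = [q for q, _ in counts.most_common()]
    # Drop completions whose new words intersect the blocklist.
    safe = [q for q in ranked
            if not (set(q[len(prefix):].split()) & blocklist)]
    return safe[:k]

print(suggest("are women"))  # ['are women funny', 'are women stronger than men']
```

Note what the sketch makes plain: the ranking simply mirrors whatever users have searched for, so suppressing an offensive completion is an editorial decision about the blocklist rather than a change to the underlying data.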

A year ago, Google Photos auto-tagged a photo of two black teenagers as "gorillas." And searches for "n***** house" and "n***** king" in Google Maps returned the White House.

After the Guardian story was published, some academics weighed in. Among them, author and researcher Cathy O'Neil said that Google "simply can't go on pretending that it has no editorial responsibilities when it is delivering these kinds of results."