Jessica Guynn

USA TODAY

SAN FRANCISCO — Google image searches for "three black teenagers" and "three white teenagers" get very different results, raising troubling questions about how racial bias in society and the media is reflected online.

Kabir Alli, an 18-year-old graduating senior from Clover Hill High School in Midlothian, Va., posted a video clip on Twitter this week of a Google image search for "three black teenagers" that turned up an array of police mugshots. He and friends then searched for "three white teenagers" and found groups of smiling young people.

"I had actually heard about this search from one of my friends and just wanted to see everything for myself. I didn't think it would actually be true," Alli told USA TODAY. "When I saw the results I was nothing short of shocked."

The Twitter post has been retweeted nearly 65,000 times since Tuesday, and Twitter users are using the hashtag #threeblackteenagers to discuss the implications of the video. The conversation about online racism comes as people express anger that the photo of Brock Turner, a white former Stanford University student convicted in a high-profile sexual assault case, was from his high school yearbook, not his police mugshot.

"I understand it's all just an algorithm based on most visited pages but Google should be able to have more control over something like that," Alli said.

People have been flagging racial bias in the results of search engines for years. Google says it's merely reflecting back the biases that exist in society and that show up in what and how people search online.

In an emailed statement, Google said its image search results are a reflection of what's on the Web, including the frequency with which certain types of images appear and how they are described.

"This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what image search results appear for a given query," the statement read. "These results don’t reflect Google’s own opinions or beliefs — as a company, we strongly value a diversity of perspectives, ideas and cultures."

Some people agreed. "Loooool. Not Google's fault, though. Just means black people need to work on stock online presentation and presence," one person tweeted.

Alli says he, too, does not believe Google is racist. Since he tweeted about the results, they have changed somewhat, largely as a result of his online experiment.

"The fact all these mugshots came up is wrong, but black males making poor choices also plays a major role. If we don't want that image we need to work to change it," he said. "It shouldn't be so difficult to find normal non-offensive pictures of three black teenagers. That search sort of portrays us as a whole and those pictures are not us. We have a lot to offer and that search does not do us any kind of justice."

UCLA information studies and African American studies professor Safiya Umoja Noble says society should not let Google off the hook so easily.

"Google has had many, many incidents of racial bias appearing in its algorithm. It consistently issues a statement that it's not responsible for the output of its algorithm. And yet we have to ask ourselves: If Google is not responsible for its algorithm, who is?" said Noble, who is writing a book that compiles and analyzes the social consequences of racially-biased online searches.

Basically, the algorithm Google employs to show search results learns the biases of searchers and then reinforces them by showing those results more often. That is evident in top search results for white grandma (Grandma VanDoren's White Bread Recipe and Grandma's Bakery in White Bear Lake, Minn.) and black grandma (porn videos). Not all algorithms work the same. Search results on Bing.com and Yahoo search are different, surfacing a public television documentary, Mexico & Peru: The Black Grandma in the Closet, and black granny boots. Google is far more influential, handling at least two trillion searches a year.
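The reinforcement loop described above can be sketched in miniature. This is a toy model, not Google's actual ranking system: it simply orders results by accumulated clicks and feeds new clicks back into the ranking, which is enough to show how an early skew in user behavior becomes self-amplifying.

```python
# Toy sketch of click-feedback ranking (an illustration only, not
# Google's real algorithm). Items that have been clicked more rank
# higher; higher-ranked items get seen and clicked more; so an initial
# bias in clicks is reinforced rather than corrected.
from collections import Counter

class ClickFeedbackRanker:
    def __init__(self):
        self.clicks = Counter()  # result id -> accumulated click count

    def rank(self, candidates):
        # Most-clicked results first; unclicked items keep their order.
        return sorted(candidates, key=lambda item: -self.clicks[item])

    def record_click(self, item):
        self.clicks[item] += 1

ranker = ClickFeedbackRanker()

# Seed with a skewed click history (hypothetical file names).
for _ in range(3):
    ranker.record_click("mugshot.jpg")
ranker.record_click("yearbook.jpg")

results = ranker.rank(["yearbook.jpg", "mugshot.jpg"])
# The historically most-clicked image now leads every future results
# page, so it keeps collecting clicks: the skew compounds over time.
```

Nothing in the loop asks whether the top result is representative or fair; it only measures past behavior, which is why critics argue the output inherits whatever biases the searchers bring.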

Some say search engines should do more, not just to address the bias surfaced by algorithms, but to use algorithms to combat it.

With cuts to public education and increased reliance on technology to deliver answers, search engines wield more power than ever before in deciding what information is seen and what information is important. People view Google as an unassailable source of credible and reliable information. Yet search results that are not curated by a thoughtful hand, say a librarian's or a teacher's, often lack awareness of gender stereotypes and racial biases.

According to the Pew Research Center, 91% of search engine users say they always or nearly always find the information they are seeking when using a search engine and 73% of them say that most or all of the information they find is accurate and trustworthy.

"And they are when you are looking for the hours for the local Starbucks," said Noble. "But when you are looking for information, ideas or concepts, the algorithm often fails us deeply and yet society does not really see the algorithm as a failure except when these kinds of egregious moments happen."

Longtime Google observer Danny Sullivan says Google is reflecting what's happening on the Web and "the problems of society as a whole." But, says the founding editor of Search Engine Land, "Google could perhaps find appropriate ways to adjust."

For generic image searches for groups of people such as "black teenagers," Google could, for example, return more positive images, Sullivan said.

"If someone is going to search for mugshots, we are not looking for Google to give back positive-looking mugshots. It does not mean you won't find negative content if you go looking for it," Sullivan said. "I do think Google needs to spend more time looking at what they are doing and asking themselves how they balance being a reflection of what's across the Web with also making sure that they are not reinforcing things and that they are making sure people are fairly treated."

Christian Sandvig, professor of information at the University of Michigan, proposes a "Consumer Reports" approach to algorithms.

"We need a systematic approach to this issue that independently monitors these systems from the outside," Sandvig said.

"These companies are happy to assume responsibility for blocking content that makes their sites unpleasant to use and might eat into advertising revenue," he said. "But when they are asked about other kinds of content — content that represents a more subtle problem or doesn't threaten the user experience of their product — we get the argument that objective algorithms made these decisions based on our own behavior. This implies: 'Since these decisions were made by a computer program, they are inhuman, and therefore we aren't responsible for them.' This begs the question: Aren't programmers human? Don't humans work at Google?"

This is not the first time that Google search results have been called out.

Last year Google apologized — and suspended people's ability to submit edits to Google Maps — after searches using a racial epithet directed users to the White House.

Graphic designer Johanna Burai set up World White Web after noticing Google searches for photos of hands returned photos of white hands. The site, WorldWhiteWeb.net, encourages people to link to images of hands that are not white.

Burai told BuzzFeed News in April that the project, which showed similar results for other body parts and search terms such as man, woman and child, illustrated societal biases.

In a similar query, student Bonnie Kamona searched "unprofessional hairstyles for work" and says the results featured African-American women. For "professional hairstyles for work," the results featured blonde white women.

The search, as The Guardian pointed out, surfaced images of black women from blogs and articles that were "explicitly discussing and protesting against racist attitudes to hair."

Last July, Google apologized after its Photos app automatically labeled black people as "gorillas." Programmer Jacky Alciné tweeted a screenshot of photos he had uploaded in which the app had labeled Alciné and a friend, both African American, "gorillas."

A month earlier, Yahoo's Flickr service rolled out new technology to help tag photos. It identified a black man and a white woman as apes on two occasions.

A 2013 study by Latanya Sweeney, a professor at Harvard University, found "statistically significant discrimination" in online advertising results. Names associated with black people "generated ads suggestive of an arrest in 81 to 86% of name searches on one web site and 92 to 95% on the other," the study found, while names associated with white people generated more neutral results.

"These companies are not doing the user test keyword searches that African Americans are doing. This is one of the reasons we consistently see this happening," Noble said.

One of the culprits: the chronic lack of diversity inside Silicon Valley technology companies, Noble says.

At Google, seven out of 10 employees are men. Most employees are white (60%) or Asian (31%). Latinos make up just 3% of the workforce and African Americans just 2%, a far cry from reflecting the racial and ethnic diversity of its users in the U.S. and around the world.

"We are talking about fixing the culture in companies and fixing the woefully under-educated work force in Silicon Valley where people are not trained in ethnic studies and they are not trained in women's studies," said Noble. "They have no idea what the import of the work is having in the social dimension."