Because the majority of people who speak English as their first language are white. That is simply a fact, so when you search a very generic term like “beautiful woman” on Google, it makes sense that Google fills in the gaps with what it thinks is the answer that is correct most of the time.



I was using Swahili as an example to show what happens when you enter a similar search term in a language spoken mostly by black people. “Beautiful woman” in Swahili is “mrembo mwanamke,” which yields results like these:



I can do the same thing in Spanish. “Beautiful woman” is “mujer linda,” and the search results contain, predictably, mostly Latinas:



Still not convinced? In Chinese, “beautiful woman” is “美女,” and the search results? Well, I think you know:



My point is that people consider beautiful women to come in all colors, and you cannot point to a demographics-driven search engine to claim racism. You are literally calling a machine that runs on algorithms racist. It operates on the premise that each language has its own ethnic demographic breakdown, and the percentage of black people who speak English as their first language is incredibly small compared to the percentage of people of European descent.