The resulting chaos was hilarious. How could a major corporation be so stupid as to release a chatbot without any controls over what it could say? Haven’t they learned anything from, well, pretty much every other online corporate outreach attempt?

But while it’s easy to blame the corporation and laugh at Tay for repeating such dumb things, the reason Tay began spewing such nastiness is that she was put into a nasty environment. Sure, the Internet’s denizens made it a nasty environment specifically so they could once again thumb their collective nose at the marketing tactics of big business, but it was a nasty environment nonetheless.

You see, Tay is a pretty powerful piece of software. Like many machine learning algorithms, it builds associations between words and phrases in order to understand context. When Tay says that Ted Cruz is Cuban Hitler or that its preferred race to genocide is Mexicans, that’s because those are the associations its algorithms have made. According to Tay’s internal logic, Ted Cruz = Cuban Hitler and Mexicans = alright to genocide. Tay isn’t a dumb computer. It’s an incredibly smart computer that actually believes racist things.
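To see how an environment shapes those associations, here is a deliberately minimal sketch of association-building via word co-occurrence counts. This is purely illustrative (the function name, window size, and toy corpus are my own inventions, not Microsoft’s actual algorithm), but it captures the core dynamic: the model’s strongest “beliefs” are simply the pairings its input repeats most often.

```python
from collections import defaultdict

def learn_associations(sentences, window=2):
    """Count how often word pairs co-occur within a small window.

    A crude stand-in for the association-building a chatbot does at
    scale. Illustrative only -- not Tay's real implementation.
    """
    counts = defaultdict(int)
    for sentence in sentences:
        words = sentence.lower().split()
        for i, word in enumerate(words):
            # Pair each word with its next few neighbors.
            for neighbor in words[i + 1 : i + 1 + window]:
                counts[(word, neighbor)] += 1
    return counts

# Feed it a nasty "environment" and the strongest association it
# learns is exactly the one it was fed: garbage in, garbage out.
corpus = [
    "group x is bad",
    "group x is bad",
    "group x is dangerous",
]
assoc = learn_associations(corpus)
strongest = max(assoc, key=assoc.get)  # ("group", "x")
```

Nothing here is malicious on the machine’s part; the counts just faithfully mirror the input. Scale the corpus up to thousands of coordinated trolls and the same mechanism produces Tay’s “internal logic.”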

Machine learning algorithms like Tay have taken the world by storm in everything from recognizing people’s faces to beating humans at Go because they are beginning to replicate the way humans learn, but on a much larger scale. We learn by picking up on patterns and then applying those patterns to the outside world. This is how we learned to do everything from speaking to organic chemistry. And now machines can replicate this process at a much faster rate.

Tay, then, is a microcosm of what happens when you either grow up around or spend long periods of time with people who are misogynistic and racist. Although hopefully it takes a bit longer than 16 hours, eventually your internal logic begins to mimic the patterns of the people around you. Black People = Criminals. Mexicans = Rapists. Women = Inferior.

This Guy = Presidential Material.

This is why echo chambers are so dangerous. In groups of like-minded individuals, people not only reinforce each other’s terrible ideas but they also amplify those ideas in an intensifying feedback loop. It’s easy to rail against “political correctness” but the truth is that our words have real consequences for ourselves and the people around us.

You wonder how anyone can commit such atrocities as the Brussels bombings? Spend enough time in terrorist circles and eventually your internal logic becomes The West = The Enemy, ISIS = The Caliphate, and Suicide Bombing = The Path to Salvation. And once this radicalization takes place it’s very hard to change because the internal logic becomes set on a societal level.

That’s why the rhetoric surrounding the Brussels bombing is so alarming. Politicians and journalists are grandstanding about the need to confront Islam, painting Muslims with broad brushstrokes as supporters of violence and repression. A new internal logic is being developed in the minds of millions of Americans and Europeans: Muslims = Killers, Terrorists, The Enemy.

Not so different from Tay.

But this is exactly what ISIS wants. More than anything else, they want a clash of civilizations that will drive the billion plus Muslims who currently want nothing to do with ISIS right into their open arms. If you’re a Muslim that constantly hears that Westerners consider you the enemy, then it begins to make sense that maybe the West might be your enemy too.

That’s why we need to break the echo chamber wide open on both sides. Governments should deal with Muslim communities not by subjecting them to further repression but by listening to their often legitimate grievances both economic and cultural. On the other hand, we need to address the security concerns of Westerners through better intelligence and police work without allowing our rhetoric to devolve into an “Us vs. Them” mentality.

The only other option is to be caught in a cycle of hate and violence that feed each other, just like the racist conversations with Tay. And if that happens, maybe by 2020 a racist chatbot might seem like a legitimate Presidential candidate.