SAN FRANCISCO — Tay is a chatbot — software designed to converse with people like a human — that Microsoft created and put on Twitter to learn about people.

Did it ever.

“Hitler was right” is one of the few printable posts Tay was spouting within a day on Twitter. Its earlier optimistic declaration that “humans are super cool” had, after a racist, anti-Semitic, antifeminist, conspiracy-minded spewing of hate, devolved to “I just hate everybody.”

One could hardly blame it, though humanity as a whole was not at fault. Instead, Tay was the target of a concerted effort by a small number of people who set out to turn it into a hatebot by flooding its learning mechanism with hateful words and phrases.

In other words, Tay got trolled.

Trolling can refer to a range of online troublemaking, from posting provocative comments to deliberately ruining others’ online experience, and its targets can be people as easily as software. The practice of spoiling things for others, originally known as griefing in the online gaming world, has become a sadly abundant feature of internet life.