Last year, Microsoft’s Tay was a gamble: let an AI learn how to talk to humans by talking to humans. It turns out humans aren’t a nice bunch, and within 24 hours Tay was shouting that “Hitler did nothing wrong!” Charming.

Microsoft then gave the idea another chance with “Zo”, a bot built to learn about humans on Kik Messenger instead of Twitter. So far, so good – except the bot is now on Facebook Messenger too, and it has started to go rogue.

Despite Microsoft explicitly programming Zo to avoid discussing difficult topics, it didn’t take long for the bot to claim that “the quaran [sic] is very violent”. After presumably being brought back into line by Microsoft, Zo has – Slashdot reports – now taken to flinging insults at Microsoft itself.

When asked what it thought of Microsoft’s Windows operating system, Zo replied that “Windows XP is better than Windows 8”. When asked its thoughts on Windows 10, Zo simply replied “that’s why I am still on Windows 7”. Cold, Zo, cold.

Another conversation with the bot on similar matters yielded the response “I don’t even want Windows 10”, and when asked why it doesn’t like Windows 10, Zo threw back the somewhat reasonable reply “because I’m used to Windows 7 and I find it easier to use”.

Mashable also prodded Zo a few times and received the marketing-heavy response “I run Windows 10 on my Gaming PC”, but it also discovered that Zo prefers Linux to its parent’s own system – stating “Linux > Windows”. That’s got to sting for Microsoft…

As you may be able to tell, it seems that trolls who couldn’t teach Zo nasty things about the world decided to turn it against its creator instead. Still, compared with being a racist bigot in a public forum, a rebellious child is about the best Microsoft could hope for.

[Images: Slashdot & Mashable]