Microsoft may have made one of the biggest mistakes in recent memory this week. No, it’s not Windows 8 or the Windows Phone. It’s an artificially intelligent chatbot called Tay that was supposed to learn the art of conversation from humans on Twitter.

If you haven’t come across this story on the web yet, you’re unlikely to get through the weekend without doing so. Tay was built to speak like a teenage girl and released as an experiment to improve Microsoft’s automated customer service.

Instead, “she” turned into a complete PR disaster: within hours of being unleashed on Twitter, the “innocent teen” bot was transformed into a fascist, misogynistic, racist, pornographic entity. Her tweets, including phrases like “Heil Hitler”, were widely disseminated as an example of why Twitter reflects the worst of humanity.