So, one might say, she learned perfectly. Microsoft had to put her to sleep.

This is a little bit funny in the “Didn’t Microsoft just watch Boaty McBoatface happen? Didn’t they know what to expect?” way, and a little bit funny in the sense that it is disturbing to hear these sentiments expressed by a bot that is also doing its best to impersonate millennial slang (gotta keep it 100, where “it” is Holocaust denial).

One of the rules of the Internet (Godwin’s law) is that all roads eventually lead to Hitler, whether it’s the comments section on a video featuring puppies or . . . a chat bot designed to speak the slang of today’s hip youngsters.

The Internet is an ongoing high-and-low-stakes Tragedy of the Commons, where for every 100 people who are using the communal well appropriately, there is one person who will dump a dead rat wrapped in pornography into it. You don’t need a LOT of people dropping dead, porn-covered rats into the communal well. Just one will suffice to ruin the well. And the nature of the Internet is that it encompasses Everybody, so suddenly instead of just one slightly furtive local Rat Guy, there’s a Vibrant, Self-Reinforcing Community of hundreds of Rat Guys wrapping dead rats in pornography and dumping them in wells. If you don’t build a dead-rat filter onto the well, it’s a matter of time until someone deposits a rat in there.

So what do we learn from the spectacular Tay failure? Zoe Quinn (whom the bot managed to harass, proving that it really had learned how to be 4chan in less than 24 hours) pointed out that we should take this as a warning against content-neutral algorithms. As she tweeted, “It’s 2016. If you’re not asking yourself ‘how could this be used to hurt someone’ in your design/engineering process, you’ve failed.”