Last Friday, a storm erupted when someone noticed Amazon were selling t-shirts bearing offensive slogans like “Keep Calm and Hit Her” and “Keep Calm and Rape a Lot.”

This discovery provoked a strong reaction, leading to outrage on Twitter and critical articles from CNN, The Guardian, The Daily Mail and more. The slogans were condemned in the strongest possible terms, with criticism directed at both Amazon for selling the shirts and at the US firm Solid Gold Bomb for creating them.

The following day, a blogger named Pete Ashton argued that the slogans were likely generated by computer and “nobody made, or approved, the design.” He claimed a combination of the Amazon Marketplace, a print-on-demand service, and a simple piece of software could result in the offending t-shirts appearing online without any human approval.

After reading this article, my wife asked me how likely Ashton’s explanation was. Could a product really go on sale that no-one had ever seen?

I’ve been a web developer for fifteen years, working with many of the technologies required. The chain of events seems plausible enough to me. It would be trivial to write a program that took words like “drink” or “carry” and combined them with words like “on” or “beer” to produce thousands of t-shirt slogans. It would be trivial to use something like ImageMagick to create images of what the t-shirts might look like and upload them to Amazon. I could probably do it in an afternoon. So the explanation makes sense, but is there any evidence that it’s true?
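To give a sense of just how trivial, here is a minimal sketch of the slogan-generation step in Python. The word lists are illustrative stand-ins, not SGB’s actual lists:

```python
from itertools import product

# Illustrative word lists -- the real ones were far longer.
verbs = ["drink", "carry", "dance"]
terminators = ["on", "beer", "a lot"]

# Every verb/terminator pairing becomes a slogan.
slogans = [f"Keep Calm and {v.title()} {t.title()}"
           for v, t in product(verbs, terminators)]

print(len(slogans))  # 3 verbs x 3 terminators = 9 slogans
```

Scale the input lists up to hundreds of words and the same three lines of logic emit thousands of slogans, none of which a human need ever read.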

If the “computer did it” hypothesis is correct, I reasoned, I should be able to analyse the products still on-sale and calculate the original words used to create them. I can generate a list of slogans with those words and check if they appear on an SGB product. If every possible slogan is on sale, that supports the theory that this was an unsupervised computer program. If some are missing, it could indicate a human editor.

I quickly wrote a program that fetched any SGB product featuring the words “Keep Calm and”. It picked apart the description and recorded which verb had been used and which words terminated the sentence. Within minutes, I had a list of 759 verbs and eleven terminators.
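The parsing step can be sketched like this. I’m assuming product titles follow the pattern “Keep Calm and <Verb> <Terminator>”; the sample titles here are invented for illustration:

```python
import re

# Assumed title pattern: "Keep Calm and <Verb> <Terminator>"
PATTERN = re.compile(r"Keep Calm and (\w+) (.+)", re.IGNORECASE)

def parse_slogan(title):
    """Split a product title into its verb and terminator, if it matches."""
    match = PATTERN.search(title)
    if not match:
        return None
    verb, terminator = match.groups()
    return verb.lower(), terminator.lower()

verbs, terminators = set(), set()
for title in ["Keep Calm and Drink On", "Keep Calm and Dance A Lot"]:
    parsed = parse_slogan(title)
    if parsed:
        verbs.add(parsed[0])
        terminators.add(parsed[1])
```

Run over the real product listings, this kind of extraction is what produced the 759 verbs and eleven terminators.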

A second program took every combination of those words, a whopping 8,349 (759 × 11), and cross-referenced them against SGB’s product list. With a few exceptions, every combination was on sale. The missing slogans were the ones apparently withdrawn after the scandal broke. It seemed pretty likely that the shirts were uploaded by an automatic process.
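The cross-check itself is just a set difference. A sketch, again with tiny stand-in word lists:

```python
# Generate every verb/terminator combination, then see which
# slogans are absent from the scraped product list.
verbs = {"drink", "carry"}
terminators = {"on", "a lot", "her"}

expected = {f"keep calm and {v} {t}" for v in verbs for t in terminators}

# Pretend the scraper found everything except one withdrawn shirt.
on_sale = expected - {"keep calm and carry her"}

missing = expected - on_sale
print(sorted(missing))
```

If `missing` is empty, every possible slogan was on sale; if not, the leftovers are the candidates for human editing or withdrawal.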

I was still left with the question of where the original list came from. Who wrote it? Who decided to include “rape” and “grope”?

As I now had my own copy of the list, I wanted to see if I could recreate it from online sources. A little googling led to a blog post by marketeer Marcy Tanniru, titled “1600 Action Verbs & A Quick Note On Bullet Points”. She was talking about the use of bullet points to help keep your writing concise and included a handy spreadsheet of verbs.

Taking this as my starting point, I wrote a program to filter it down. Verb phrases like “zoom out” were trimmed to a single word. Words that were too long or too short were discarded. After applying those simple rules, I was left with a set almost identical to my reconstituted list – a difference of only thirteen words. Twelve appeared in Tanniru’s list and not mine; one appeared in mine and not Tanniru’s. And yes, her list contained “rape”, “hit”, “choke” and the other verbs at the heart of the controversy.
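The filtering rules can be sketched in a few lines. The length cut-offs here are my own guesses for illustration, not values recovered from the actual lists:

```python
# Illustrative raw entries, as they might appear in a verb spreadsheet.
raw = ["zoom out", "drink", "go", "procrastinate", "carry on"]

MIN_LEN, MAX_LEN = 3, 12  # assumed bounds, chosen for this example

filtered = set()
for entry in raw:
    word = entry.split()[0]              # "zoom out" -> "zoom"
    if MIN_LEN <= len(word) <= MAX_LEN:  # drop too-short / too-long words
        filtered.add(word)
```

Mechanical rules like these have no idea what the words mean, which is exactly how “rape” can sail through a filter that happily discards “go”.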

I have no idea if Tanniru’s list actually was used to generate the slogans. SGB claim that their list was 200,000 strong before it was whittled down to the final 700 or so. Maybe Tanniru’s list was in there, maybe it wasn’t. But my experiment had shown how easy it would be for this t-shirt scandal to happen accidentally.

Should SGB have taken the time to read their word list before uploading the shirts? Absolutely. Failing to do that was simply lazy. 759 is not that many words; it would be the work of an hour to look it over in detail.

Does that make them pro-domestic violence or pro-rape? I don’t believe so. That it never occurred to them to check the list shows a startling lack of awareness. It makes them insensitive, but not necessarily misogynistic.

However, that only addresses the verb list; there’s still the list of terminator words to consider. Being shorter, it’s far more likely this list was compiled by hand rather than pinched from the web. The words I found were: ‘on’, ‘off’, ‘in’, ‘out’, ‘them’, ‘us’, ‘not’, ‘a lot’, ‘it’, ‘me’ and ‘her.’

Notably absent is ‘him.’

The SGB website currently carries a lengthy apology, confirming Pete Ashton’s theory and claiming the “product data was derived simply from the product name and the 16 word combinations like ‘On’ and ‘Off’ and ‘Him’ or ‘Her’.”

But there were no shirts reading “Keep Calm and Rape Him.” It is possible, through error, to have a word in the list which doesn’t make it into a product. A badly initialised loop counter, for example. But that’s reaching. I saw no evidence that there was ever a “him” on that list and that is very concerning.

However, it does appear that there was no evil misogynist, sitting in a room, thinking “Keep Calm and Rape a Lot” would make a great t-shirt. The truth is far more mundane.

A list of words was taken from the Internet. This list was not checked properly, or perhaps at all. It was used to generate 8,349 t-shirt slogans that no person ever read. Thumbnails were generated automatically and uploaded to Amazon. They were listed for sale without anyone ever checking.

None of this excuses SGB. It was their product, it was their responsibility. It is absolutely correct to criticise them for what they have done, but only for that. It’s fair to say they were lazy and even sexist too, with the apparent exclusion of “him” from their word list.

But we cannot condemn them as pro-rape or pro-domestic violence, as the evidence suggests we don’t have a case.

This article is an adaptation of a longer piece originally recorded for episode 93 of Skeptics with a K.