An analysis of online misogyny reveals that many thousands of women are willing to engage in discourses using language and launching attacks that are, fundamentally, derogatory to their own gender


Caroline Criado-Perez, Stella Creasy, Mary Beard, Hadley Freeman, Grace Dent, Amanda Hess, Catherine Mayer. The roll call of prominent women being viciously attacked on Twitter continues to grow. As many of these women have themselves pointed out, much of the abuse they receive is not directed at what they have said or done, but at their gender. Sprinkled in amongst the general abuse and bomb threats is a recurring motif: misogynistic, sexualised language that often carries that vilest way of expressing power and dominance -- rape.


The internet has sadly always been a tough place for women.

Decades ago, Ellen Spertus was writing about the harassment of women on Usenet forums and LISTSERVs. Nine years ago, the University of Maryland sent fake male and female online personas into chat rooms to measure the difference in the sexually explicit or harassing messages they received. The 'male' accounts (for the record) received an average of 3.7 per day; the 'female' accounts, 100.

Online misogyny is an old problem that has suddenly increased in scale. In contrast to the 1990s, the internet is now one of the major thoroughfares of public life. Far more of us now use it, indeed rely on it, for everything from doing our jobs to finding love. The Metropolitan Police now receives thousands of complaints of online bullying and harassment every year.


Last week, Demos published a report that attempts to measure the amount and nature of misogyny we face today on Twitter. We were interested in the various contexts and occasions in which three words -- 'rape', 'slut' and 'whore' -- were used by UK-based Twitter accounts. The volume we found was enormous. Between 26 December 2013 and 9 February 2014, UK Twitter accounts sent just over 100,000 tweets containing the word 'rape', 48,000 containing 'whore' and 85,000 containing 'slut'.

But how were these words actually used? We built algorithms (using technology developed at the University of Sussex) and looked at the data closely ourselves to map out the broad contexts in which each word was deployed. As you would imagine, it was varied. A significant number of the tweets were sober, serious and non-offensive. Forty percent of the tweets containing 'rape' were reporting on news stories involving rape, or were engaged in explicit activism against the misuse of the term. Likewise, about 10 percent of the uses of 'slut' and 'whore' were clearly non-offensive, often actually trying to do something about the problem of online misogyny.
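To give a flavour of what this kind of classification involves, here is a deliberately minimal sketch, and not the Demos system itself: the cue phrases, category names and example tweets below are illustrative assumptions, standing in for the far richer natural language processing the report actually used.

```python
# Toy illustration of bucketing tweets that contain a target word into
# coarse context categories using simple cue phrases. Real systems (like
# the one built with the University of Sussex) use statistical NLP, not
# hand-written keyword lists like these.

NEWS_CUES = {"reported", "charged", "police", "court", "arrested"}
ACTIVISM_CUES = {"awareness", "victim-blaming", "stop using", "not a joke"}
THREAT_CUES = {"i will", "i'll", "you deserve", "going to find you"}

def classify_context(tweet: str) -> str:
    """Assign a tweet to one coarse context bucket based on cue phrases."""
    text = tweet.lower()
    if any(cue in text for cue in THREAT_CUES):
        return "threat/insult"
    if any(cue in text for cue in NEWS_CUES):
        return "news/reportage"
    if any(cue in text for cue in ACTIVISM_CUES):
        return "activism"
    return "casual/other"

sample = [
    "Man charged after police report rape case",
    "i'll find you, you deserve it",
    "barcelona is going to rape celtic next week",
]
print([classify_context(t) for t in sample])
# → ['news/reportage', 'threat/insult', 'casual/other']
```

Even this crude version shows why the work is hard: the same word lands in very different buckets depending entirely on its surrounding context.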

Underneath this serious and responsible layer of news reportage and activism, we found a more problematic, and very large, grey zone of use. These tweets used 'rape', 'slut' and 'whore' in a normalised, apparently (in the eyes of the sender) casual way. 29 percent of tweets using rape did so non-literally, as in "Barcelona is going to rape Celtic next week" and 35 percent of uses of 'slut' and 'whore' were colloquial, conversational or apparently off-handed: "If I was pretty and skinny would be such a whore".


Linguistics research like this is often the art of drawing lines in the sand, and at the most serious end of this wide category were the 18 percent of tweets containing 'slut' and 'whore' that were more obviously misogynistic, as in: "why take photos lookin like a slut and then moan when people say bad things?"


Most worrying were those tweets -- 12 percent that contained 'rape', and 20 percent that contained 'slut' or 'whore' -- that seemed to be a direct threat or insult. These were cases where the words were being most clearly drawn on as linguistic weapons with which to hurt and demean, to menace or harass. Out of context it is hard to tell how serious they really were, but they appeared to be beyond the casual, touching on something darker, more threatening and more serious.

[Chart: Use of the words 'slut' and 'whore' on Twitter, by gender. Source: Demos]

Most surprising, however, was who we found using these words.


Women are almost as likely to use 'slut', 'whore' and 'rape', both casually and offensively, as men. Judging by our automated analysis, accounts with male names used one of the words 116,530 times, and accounts with female names 94,546.
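Attributing an account to a gender by its display name can be sketched very simply. The name lists, tweet data and function below are hypothetical illustrations, not the actual Demos methodology, which would need far larger name dictionaries and careful handling of ambiguous cases.

```python
# Illustrative sketch: attribute tweets to a gender by matching the
# account's first name against lists of conventionally gendered names.
# These tiny name sets are placeholders for a real name dictionary.

MALE_NAMES = {"james", "david", "tom"}
FEMALE_NAMES = {"sarah", "emma", "lucy"}

def infer_gender(display_name: str) -> str:
    """Guess a gender label from the first token of a display name."""
    parts = display_name.split()
    first = parts[0].lower() if parts else ""
    if first in MALE_NAMES:
        return "male"
    if first in FEMALE_NAMES:
        return "female"
    return "unknown"

tweets = [("Sarah Jones", "..."), ("Tom Smith", "..."), ("xX_gamer_Xx", "...")]
counts = {"male": 0, "female": 0, "unknown": 0}
for name, _ in tweets:
    counts[infer_gender(name)] += 1
print(counts)
# → {'male': 1, 'female': 1, 'unknown': 1}
```

Note the large "unknown" bucket any such approach produces: pseudonymous handles simply cannot be attributed, which is one reason automated gender counts like ours should be read as indicative rather than exact.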

[Chart: Use of the word 'rape' on Twitter, by gender. Source: Demos]

These results expose, I think, two important layers of social complexity that we need to understand about misogynistic trolling:

First, online misogyny is not a male-female binary. Many thousands of women are willing to engage in discourses using language and launching attacks that are, fundamentally, derogatory to their own gender. Misogyny online, as with misogyny offline, is an insidious thing: it makes women hate other women and, perhaps, even themselves.

Second, online misogyny means more than just violent verbal attacks. Underneath the specific use of Twitter to reach, offend, insult, threaten and otherwise demean a victim is another problem.


Words like 'rape', 'slut' and 'whore' are part of a new internet vocabulary. They have taken on a huge variety of non-literal, colloquial and metaphorical -- indeed conversational -- meanings that are (I checked) well outside our formal dictionary definitions. For the Oxford neuroscientist Susan Greenfield, this is a world of 'yuck and wow', where we increasingly produce content that is more and more shocking, more and more taboo-laden. This is certainly desensitising the words themselves, and may be creating a more permissive cultural backdrop against which rape threats are made.

Both of these point to the need for solutions beyond either shutting down content or legally going after the perpetrators themselves. There is an important body of moral norms and understandings about how to treat each other decently online that we are missing and desperately need to promote. Online trolling is clearly a complex, multi-layered phenomenon, starting with how people learn to use the internet at home and at school. Until we get better at understanding its drivers and contours, we have little hope of building a digital world that we actually want to live in.

This is a guest post from Carl Miller, co-founder and Research Director of the Centre for the Analysis of Social Media (CASM) at Demos. You can follow him on Twitter: @carljackmiller