By Shivangi Narayan

“Assam doesn’t need it, as much as we need an NRC [National Register of Citizens] type of exercise in Delhi”, a policeman from the Delhi Police told me. “It is the illegal immigrants and the ‘Muslims areas’ who are responsible for the crime, you see how things change when we survey them out. Why don’t you help? You are a research scholar, why don’t you put up a proposal [with the Police administration] for a survey like that?”[1]

A crime mapping platform established by the Delhi Police to algorithmically predict crime in the city works in an institutional setting where policemen believe that certain communities in specific spaces are the reason for the rising crime in the city.

During our interaction, the policeman I spoke with brought out the crime maps of the previous days, in which, as I pointed out, the so-called crime-prone ‘Muslim areas’ looked absolutely crime free. “What would they have to steal in their own areas? They go out and commit crimes in other rich areas. Also, they don’t want the police spotlight in their own areas, that’s why they go out and do what they have to do.”

The process of othering is a stepping stone in inflicting violence. Only when a community, or a people, is othered by being called vermin[2], anti-national, ‘tukde tukde gang’[3] or downright criminal, does violence on them become possible. Othering is a way of dehumanizing the subject so that the violence does not impact the conscience of the perpetrator.

While data-led policing could be considered a scientific, and hence neutral and objective, approach that would debunk myths about the inherent criminality of certain people, a close look at both the data collection methods and the subsequent mapping practices indicates otherwise. An oft-overlooked aspect of data-led policing is the institutional setting within which these systems are deployed.

In the Delhi Police, for example, the mapping is based on data sourced from calls to the police emergency number 100 and from First Information Reports (FIRs) lodged at police stations across the city. These data are mediated by police officers’ own interpretations of the crime, the criminal and the victim. This has been amply demonstrated in sexual harassment cases, where police officers have been shown to blame the victim rather than the perpetrator (and thus refuse to file the complaint). Patriarchal notions that a woman’s place is at home rather than outside inform these refusals, as was evident when some Delhi Police officers described women’s clothes, especially in places thronged by young people such as Khan Market or Connaught Place, as “invitations to rape.”[4]

Policemen told me that they “know” that people inflate their problems when they call the emergency number 100. “There is never just a wallet lost, it is always a wallet with a lot of cash in it,” one of them told me. This, in his opinion, was a response to the notion that the police do not look into ordinary complaints. Aurora Wallace[5] (2009) in fact claims that the police are concerned mainly with crimes that impact property prices in urban centers, and that nuisance or law-and-order crimes are therefore high on their list of priorities. Criminals are thus constructed out of people who disturb the functional or aesthetic order of the city rather than out of everyone who commits a crime. As Cathy O’Neil explains, for example, the police do not have the “expertise”[6] to deal with white-collar crimes. She argues that the police did not arrest or charge anyone responsible for the mortgage fraud that led to the 2008 meltdown because everything about policing, from “their training to bullet proof vests, is adapted for the mean streets.”[7] O’Neil describes the American police specifically, but the argument can be extended to the Indian police as well.

Historically, crime statistics have been considered doubtful and incomplete because they are based only on reported and recorded crimes. According to Tim May[8], domestic violence crimes are rarely reported and therefore do not appear in police statistics, but this obviously does not mean that these crimes do not take place. It would therefore be incorrect to identify a community or a region as criminal on the basis of reported police data. The picture becomes even more partisan if we consider, as mentioned before, that the recording of crime data depends on the individual police officer’s interpretation and understanding of both the crime and the victim.

Call takers in the Dial 100 call center take down the primary details of the calls and are responsible for categorizing them into specific crime events. Some of those I spoke to remarked that the “posh areas” do not call the number 100, only the poor ones do. According to them, people in these poor areas are always fighting with each other and asking the police to intervene. Even the sexual harassment cases, they said, are more about women teaching men a lesson than about actual harassment: “When we reach the crime spot, they ask to rafa dafa[9] the matter.” With such preconceptions about people from slums and immigrant colonies, the call takers can miscategorize calls from these areas, leading to faulty data for the final hotspot mapping. They also work with a substandard location database that disproportionately impacts residents whose addresses are not standardized, particularly those living in temporary settlements and colonies.

Vidushi Marda and I, in our paper ‘Data in New Delhi’s Predictive Policing System’, explain how in the Delhi Police the data collection and creation process, as well as any decision based on it, disproportionately impacts those from vulnerable groups. We list three kinds of bias: historical, representational and measurement bias. These biases explain how the belief in the inherent criminality of the poor becomes embedded in crime data, and why these statistics feature more poor than rich inhabitants of the city.

Hotspot mapping, in which crime data are plotted on a Geographic Information System (GIS) map of a region to spatially analyze and deduce crime patterns, is considered one of the most common tools of predictive/algorithmic policing. Beyond the bias in the crime data itself, as I will argue further, the maps represent poor and rich areas with the same color palette and icons, erasing context specificity and ground realities. The map reader sees only abstracted crime numbers as dots on the map, not any of the underlying reasons for them.
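The abstraction at the heart of hotspot mapping can be illustrated with a minimal sketch: incidents are snapped to grid cells and counted, and a cell whose count crosses a threshold is flagged as a hotspot. The coordinates, cell size and threshold below are hypothetical illustrations, not details of the Delhi Police platform; the point is that only counts survive the process, with no trace of why reports cluster in a place.

```python
from collections import Counter

# Hypothetical incident records as (latitude, longitude) pairs.
# In a real system these would come from FIRs and Dial 100 calls.
incidents = [
    (28.6129, 77.2295), (28.6135, 77.2290),  # two clustered reports
    (28.7041, 77.1025),                      # one isolated report
]

CELL = 0.01  # grid cell size in degrees (~1 km); an arbitrary choice

def cell_of(lat, lon):
    """Snap a point to its grid cell, discarding all other context."""
    return (round(lat / CELL), round(lon / CELL))

# Count incidents per cell -- the map reader sees only these numbers.
counts = Counter(cell_of(lat, lon) for lat, lon in incidents)

# A cell becomes a "hotspot" once its count crosses a threshold.
hotspots = [cell for cell, n in counts.items() if n >= 2]
```

Note that nothing in `counts` records who reported, how the call was categorized, or whether the address was standardized; every upstream bias is compressed into a bare number per cell.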

Going back to the aforementioned conversations with Delhi Police officers: even when the data break with commonsensical notions of criminals (i.e. that they come from specific communities such as Dalits or Muslims, or from the relatively crime-free, so-called ‘Muslim areas’), this is read as an aberration rather than the norm. Ironically, however, even a small increase in the number of ‘crime’ incidents in these very areas, or in slums or immigrant colonies, confirms the suspicion that these spaces, and hence the people residing in them, are ‘inherently criminal’. In our interactions with the police and the company that provides the Delhi Police with its mapping software, it emerged that both believe socioeconomic inequalities to be the primary cause of crime in the city, and they therefore urge policemen to collect such data for efficient crime analysis.

The need for such data stems from the prejudicial belief that the poor are criminals and that the spaces where they reside are festering hotbeds of crime. This belief gets legitimized through repeated discussion and agreement among the general public. D. Asher Ghertner[10] calls this ‘nuisance talk’: something that takes on a life of its own in defining who is aesthetically, and thus functionally, fit to be in the city. ‘Nuisance talk’ is a way for the norm-setting middle class to standardize disgust towards unwanted spaces in the city; Ghertner calls it a way of producing cities by expunging the unwanted from the social order. The spaces inhabited by poor communities are constructed as ‘unwanted’, and thus a ‘nuisance’, as they are seen as not only dirty but also unsafe because of the inherent criminal tendencies of their residents.


A corollary to Ghertner’s arguments can be seen in formulations on caste, where the erstwhile untouchables were termed “dirty, drunkard, devoid of merit, beast of burden and not to be trusted,” marking them as criminals for being poor, uncouth and dirty. Police officers admit to closely watching the residents of their ‘beat’ (the area assigned to each officer for patrol and administration). If they find something that does not add up, for example a boy from a humble background with a new bike or fancy clothes, they watch him for potential relations with gangs or drug dealers. More often than not, it is one or the other, a policeman told me. Nothing beats local knowledge, he swore. From the constables’ responses, I could make out that such cases are frequently met with disproportionate force.

This kind of surveillance is mostly administered on the poor of the city, as is evident from the police registers where details of routine offenders are kept. Men from Dalit or Muslim communities who live in slums are overrepresented in categories such as ‘habitual offenders’, ‘bad characters’, ‘ruffians’ or ‘rowdys’ (Khanikar, 2018).

Technological solutions, praised for their clarity and objectivity, are applied in institutional settings where caste hierarchies and religious discrimination are profoundly rooted. In this framework, data collection and elaboration flatten the context and confirm the bias of the majority, leading to the branding of specific spaces as inherently criminal. This echoes the colonial-era ‘criminal tribe’ classification system, which described certain tribes as criminal by birth because they did not conform to the ideas of settled civility imposed by the British administration.[11] With the impunity that tech solutions enjoy, predictive policing programs regurgitate commonsensical notions of crime and criminal spaces in the city and reify existing biases against slum dwellers, immigrants, Muslims, Dalits and the poor.

This makes for a fertile ground for more violence, whether in the slums or in the thanas (local police stations); a violence that in the long run becomes normalized, culminating in the “these people need to be treated this way only” rhetoric that the police so often use to justify the disproportionate use of force on certain sections of the population.

Shivangi Narayan is a PhD student at the Centre For Study of Social Systems, School of Social Sciences, Jawaharlal Nehru University, New Delhi. Her interests include digital identification systems such as Aadhaar, and Big Data and Artificial Intelligence bias, specifically in policing. Prior to pursuing her PhD, Shivangi was a senior correspondent reporting on technology and e-governance for the fortnightly magazine Governance Now.

[1] Personal interactions with policemen in the Delhi Police during my fieldwork in 2017. The Delhi Police’s own annual report, released in January 2019, put the blame for rising crime on migrants and ‘youth’s frustrations.’

[2] During the Holocaust, Jews were referred to as vermin before they were sent to concentration camps.

[3] This literally means the “breaking gang” and refers to the label given by the national media to some protestors at JNU who allegedly raised the slogan “Bharat tere tukde honge, Inshallah, Inshallah” (India you will be broken up, Allah willing).

[4] Personal interactions.

[5] Aurora Wallace, “Mapping City Crime and the New Aesthetic of Danger,” Journal of Visual Culture, Vol. 8(1), 2009.

[6] Cathy O’Neil, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. London: Penguin Random House, 2016, p. 9.

[7] Ibid.

[8] Tim May, Social Research: Issues, Research and Methods. New Delhi: Sage Publications, 2011.

[9] This Hindi expression means to bury, forget about the matter at hand. Personal interactions with policemen. Parts of data points mentioned in this part of the article also appeared in Vidushi Marda and Shivangi Narayan, “Data in New Delhi’s Predictive Policing System.” In FAT* ’20: Proceedings of ACM Conference on Fairness, Accountability, and Transparency, January 27–30, 2020, Barcelona, Spain. ACM, New York, 2020.

[10] Asher D. Ghertner, Rule by Aesthetics: World Class City Making in Delhi. Oxford University Press: New Delhi, 2015.

[11] See Radhika Singha, A Despotism of Law: Crime and Justice in Early Colonial India. Oxford University Press: New Delhi, 1998.