On 25 July 2018, Unofficial PMO India, a Facebook page which then had 2,51,000 followers and regularly uploaded memes against the current NDA government, ran afoul of the social media platform’s community standards—a set of rules that “outline what is and what is not allowed on Facebook…and apply around the world to all types of content.” The page’s troubles began after it uploaded a collage juxtaposing a picture of Prime Minister Narendra Modi pinching the ears of a child with an image of Adolf Hitler in a similar photo op. Facebook removed the image within hours of it being uploaded, and the page was taken down soon after. The page is run by Farza, a friend of mine based in Mumbai, and three other people. Just after the page was pulled down, Farza called me in a panic. He was worried that with the image and page gone, his personal account would be next in line. A few hours later, the personal profiles of all the page administrators, including Farza, were suspended for 30 days. “What to do?” Farza asked me, in another emergency call from miles away. I did not have a clear answer.

For years now, I have received similar panic calls every week from Facebook users whose accounts were suspended for violating unspecified aspects of the community standards. Every such call has come from users who posted content against the Narendra Modi government and its ideological associates. Since 2015, I have documented every call I received, to understand whether these are all mere coincidences on account of faulty algorithms or whether there is an agenda behind Facebook’s suspensions. The duration of the suspensions ranged from one to thirty days, and in a handful of cases, even longer. There was no clarity on what content attracted what duration of suspension.

After three years of tabulating this data, a pattern has emerged. All the suspended profiles I dealt with were barred on the pretext of enforcing Facebook’s community standards. As I noticed recurring patterns, I grouped a majority of the suspended profiles into three categories of Facebook activity: uploading memes against Narendra Modi; protesting the ruling government’s policies; and sharing content that fell under the first two categories. In no case was a clear reason ever provided as to which post broke which community rule, or which community guideline had not been adhered to. These clarifications were not provided even after the profiles were reinstated.

With 270 million active users in India and 2.27 billion monthly active users worldwide, Facebook, with its multifaceted social-media platforms, has colossal social and economic impact. In this Facebook country, what constituted the community standards was a mystery until April 2018, when the company finally opened up the guidelines to the public. Speaking on the occasion, Monika Bickert, the head of product policy and counterterrorism at Facebook, told Reuters, “The Community Standards guidelines are not a static set of policies. The rules change often.” Bickert said that every two weeks she leads a content-standards forum, where senior executives meet to review the social network’s policies for taking down objectionable content. The group, which comprises executives from the company’s policy division, also receives inputs from more than a hundred external organisations and experts on areas of concern, such as child exploitation and terrorism.

But how these community policies are formed or changed is still not open to audit. A changing community policy is like a changing constitution: hard to stabilise. In the cracks of these changes, Facebook thrives as a banana republic of the digital world. Digital-rights groups, pointing to the caustic social impact of Facebook’s community-standards algorithms, have been requesting to audit them for years, to no avail.