As the clock ticked towards 9pm on Friday 11 August 2017, more than 200 men snaked down a long, dark expanse of grass in Charlottesville, Virginia, called Nameless Field. The assembled group was overwhelmingly white, and almost uniformly dressed in pressed khakis and polo shirts. Each man grasped a wooden torch filled with kerosene.

They formed a column, lined up two by two. They lit their torches. Organisers, wearing earpieces, paced up and down the line issuing directions, amplified by electric bullhorn. “Now! Now! Go!” the bullhorns ordered. The men marched, and began to chant. “Blood and soil!” they yelled, echoing Nazi ideology. “Jews will not replace us! Jews will not replace us!”

Journalists had been tipped off to the organised white nationalist march. They knew the location, Nameless Field, and the timing, the night before the “Unite the Right” rally. They didn’t know it would be this overtly anti-black and antisemitic. The torches, the night-time setting, the clash with student counterprotesters that ensued at the base of the field’s central statue – Thomas Jefferson, who’d founded the University of Virginia – foreshadowed that this would be a weekend of terror.

This was a physical manifestation of many factions of the alt-right, whose identities were so often masked online on forums such as 4chan or Reddit, or tucked away in private chat groups. Now they were attempting to prove they were more than an internet meme machine: basement-dwelling trolls distributing hate-steeped messages in a rapidly evolving visual and verbal vernacular. To outsiders, much of the memetic content was indecipherable. But one gateway to understanding it had become the cartoon frog known as Pepe, which had been purposefully infused with antisemitic meaning by 4channers, declared a hate symbol by the Anti-Defamation League, and tweeted out by Donald Trump in the lead-up to his election as US president.

Various factions of the alt-right had plotted entirely different optics for the rally, down to their fitted collared shirts. “We want to look slick and sexy,” wrote Andrew Anglin on the Daily Stormer, an alt-right website. The lack of memetic symbolism and swastikas at the rally was deliberate. “Pepe banners are a non-starter,” Anglin wrote. Supporters were instructed to stay at home if they were obese or looked like dishevelled trolls.

The weekend indeed brought terror. That Saturday in Charlottesville, alt-right-aligned protesters and hundreds of counterdemonstrators flooded the streets. Demonstrators hurled water bottles and sprayed chemical gases at one another. Journalists were assaulted with urine. “We are starting to slowly unveil a little bit of our power level,” Robert “Azzmador” Ray, a neo-Nazi writer for the Daily Stormer, told Vice News. “You ain’t seen nothin’ yet.”

Far-right groups bearing torches march across the University of Virginia campus in Charlottesville on 11 August 2017. Photograph: Mykal McEldowney/AP

Late on Saturday morning, the police ordered all protesters to disperse, deeming the assembly unlawful. Virginia’s governor declared a state of emergency. Then a car driven by a 20-year-old who’d been photographed hours earlier carrying a hate-group emblem ploughed into a crowd of demonstrators, killing Heather Heyer, a 32-year-old local paralegal, and injuring at least 19 others.

Steve Huffman, the chief executive of Reddit, who back in 2005 had hand-coded the site’s original infrastructure within weeks of graduating from the University of Virginia in Charlottesville, watched this overt display of terror unfold from a small screen on a plane flying into San Francisco, and then at home in the city’s Mission District. It left him incensed. He was angry about the whole rally and racist display, but also that Charlottesville, one of his favourite places on the planet, which he thought of as a progressive college town, would be perceived by others as a haven for racist vigilantes. The starting point for the march, the main University of Virginia quadrangle, was a field Huffman and his former roommate, Reddit co-founder Alexis Ohanian, had traversed often as undergrads. It was a less than five-minute stroll from the UVA dormitory in which they first met.

Seeing the display of physical intimidation unfold, and the violence that followed, Huffman grew more infuriated at the alt-right’s factions than he ever had been merely watching their attempts to spread hate-fuelled ideologies on Reddit. “I was like, fuck all these people. Ban them all!”

Other Silicon Valley companies, such as Airbnb and Uber, had already taken swift action to ban white nationalists from their platforms. While it was nearly impossible to track individuals on Reddit due to pseudonymous usernames, forums espousing white nationalist, racist, xenophobic, misogynistic and other strains of hate speech were not difficult to locate, monitor and cull – though Reddit’s complex, decade-long history of grappling with its own founding free-speech ideology had in the past made doing so fraught.

Fuming over the news, Huffman visited a Slack messaging channel used by Reddit’s trust and safety team. Members had already pinpointed where on Reddit the rally and the deadly crash were being applauded. One subreddit, r/Physical_Removal, which advocated the elimination of liberals from the US, frequently posted memes featuring an alt-right image dubbed “Pinochet’s Helicopter” (a veiled reference to the reputation of the former Chilean dictator’s regime for throwing communists out of aircraft and into the ocean). The forum was known for bashing communism and comparing American leftists to Isis, and on the Saturday of the march featured a large “Unite the Right” banner and several upvoted discussions of the crash. One moderator posted that the murder of a counterprotester was “ethical”.

Trust and safety was already discussing banning r/Physical_Removal. “They were not talking about if, but when,” Huffman said. By the end of the following week, they had secured approval from general counsel Melissa Tidwell’s legal department to zap the entire community of r/Physical_Removal from Reddit.

The removal was clean, straightforward and – unlike past actions by Reddit that changed policy or attempted to silence even the most noxious corners of the site – did not trigger a massive revolt by the site’s 250 million users.

Snuffing out noxious content was not an entirely novel thing for Reddit, but its conscious quest to detoxify itself – and to define its ability to do so – was fresh. A little over two years earlier, under the leadership of Ellen Pao, the interim Reddit CEO and venture capitalist who had become a feminist hero of Silicon Valley for suing venture capital firm Kleiner Perkins for discrimination, Reddit made its first substantive move to clean up. It banned five subreddits that deployed hate speech and imagery against people of colour, overweight people and transgender people. The bans were met with broad retaliation by redditors against Pao and her staffers, who were threatened, harassed, and had personally identifying information about them posted, known as being “doxed”. Messages to Pao’s Reddit inbox threatened to harm her young daughter, and posts on Reddit compared her to dictators. “Threats carry a bit more weight when the people threatening you also publish your address,” Pao wrote later.

Ellen Pao resigned as interim CEO of Reddit after a user backlash against her efforts to address hate speech and imagery. Photograph: Jeff Chiu/AP

Not long thereafter, on the morning of 2 July 2015, a Reddit staffer beloved by the site’s community was abruptly dismissed from her position over a video call conducted by Ohanian. Word of Victoria Taylor’s dismissal spread fast on Reddit. Far-flung moderators of subreddits of all sizes – from the large, polished “default” subreddits such as r/books to tiny niche ones – were confused and enraged. By afternoon, 300 subreddits, from r/science to r/skrillex, had made their sections private and inaccessible in protest. Twelve hundred subreddits were essentially turned off. The term “AMAgeddon” was coined to describe the event, and the press took to it. The Daily Dot, a website that reported frequently on the machinations of Reddit, wrote: “The great Reddit meltdown has begun.”

Within days, it turned from a nonviolent protest of silence by hundreds of subreddits into a furious, hate-fuelled witch-hunt. A change.org petition called for Pao to step down as the chief executive of Reddit, stating that she had ushered in a “new era of censorship” on the site. As posts on Reddit encouraged more community members to join the revolt, signatures swelled. It picked up 100,000 in a single day. Again, Pao and her team were subjected to widespread harassment. Threads on a new subreddit dedicated to discussing the revolt, r/Blackout2015, associated every four-letter word with Pao’s name; the posts were sexist and racist. Her inbox was again jammed with insults, slurs, and overt threats. A week later, Ellen Pao resigned as chief executive.

How did Reddit grow into a community with hundreds of millions of users that could snuff itself out in a single day? How could a single employee’s dismissal lead an entire company valued at half a billion dollars to almost implode?

A fraction of the answer can be traced back to Reddit’s founding philosophy. When Huffman and Ohanian set out to build a site of links to the best content from around the web, they had thought of it as “the front page of the internet”. Only, no editor in a tower would determine what could be seen or disseminated; instead, an algorithm would help content voted up by users to rocket to the top of the page. One of the first subsections Huffman opened up on Reddit was r/NSFW, or not safe for work. The site, even in its earliest days, was populist, messy, and harboured a distinctly anti-establishment vibe.
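That ranking mechanism has since been open-sourced. A simplified sketch of Reddit’s “hot” formula – in which vote totals count logarithmically and recency supplies a steady bonus – is below; the epoch and constants here are illustrative stand-ins, not the production values:

```python
from datetime import datetime, timezone
from math import log10

# Illustrative epoch; Reddit's open-sourced code anchors to its own launch-era date.
EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def hot(ups: int, downs: int, posted: datetime) -> float:
    """Score a post: vote magnitude counts logarithmically, so the first
    10 votes matter as much as the next 90, while roughly every 12.5 hours
    of recency is worth a tenfold vote advantage."""
    score = ups - downs
    order = log10(max(abs(score), 1))
    sign = 1 if score > 0 else -1 if score < 0 else 0
    seconds = (posted - EPOCH).total_seconds()
    return round(sign * order + seconds / 45000, 7)
```

Because the time term grows regardless of votes, fresh posts with modest scores eventually overtake stale posts with enormous ones – which is how user votes, rather than an editor in a tower, decide the front page.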

The site grew into a home for thousands of forums for everything under the sun, dubbed subreddits, which any user could create and manage themselves. Reddit’s little sections were curated by an army of thousands of volunteer moderators, who, in exchange for upholding Reddit’s content policy (no spamming, no doxing, and the like), were allowed to run their own little corners of Reddit with other rules of their choosing.

Subreddits were wide-ranging: r/LifeProTips, where users would post often funny life hacks, and r/todayilearned, which was filled with fascinating “aha” moments or strange historical coincidences. On r/personalfinance, individuals asked for help getting out of debt, and strangers chimed in with worthwhile tips. R/DIY, r/Frugal, and r/HomeImprovement captured a scrappy, self-starter vibe, spreading advice and praise for completed projects. There was an emerging porn ecosystem, but there was also r/spaceporn, with beautiful images of outer space, and its sister sites, r/EarthPorn and r/FoodPorn. (Of course, some clever pornographer had to take that to its logical backlash-to-the-backlash place, and also create r/EarthPornPorn, in which users could view images of humans au naturel amid nature.)

For all the whimsical and delightfully wry content volunteer moderators ushered into the world with the help of their subreddits’ followers, these seemingly benevolent redditors had also cost people their jobs, harassed a sexual assault victim, and launched dozens of witch-hunts against demonstrably innocent people. It was a community left to, essentially, police itself. Even the moderators, in charge of making rules for their subreddits, and detecting or banning rule-breakers, were sometimes themselves culprits.

The chief executive prior to Pao, Yishan Wong, whose leadership also flamed out under bizarre circumstances, had referred to Reddit’s vast community as a “hivemind”, likening redditors to fickle bees whose attention and whims require nurturing – because they could flee or turn on their keepers. He hadn’t been entirely wrong. He had reason to fear the whims of the community, but the trepidation with which he approached Reddit’s community ended up undercutting it. He granted the hive too much power to perceive its own reality. Users regarded Reddit as a playground for pushing and seeking the limits of free speech, a place where pseudonymous users could be the masters of their own strange online domains. The set of rules that governed Reddit was considered one thing – but executed as another. The belief system was incongruent. The centre could not hold.

Chris Slowe, Reddit’s first employee, later deployed a different – and perhaps more deft – metaphor to examine this gap in perception. He told New York magazine: “Things that should be treated as case law started getting turned into the constitution.” What he meant was that standalone decisions to allow certain behaviour or content – decisions that made sense in the moment, but not long-term – ended up being elevated to govern everything that happened on Reddit. That incoherent and disjointed constitution then dictated that when completely immoral content – say, a photo of Jennifer Lawrence naked, which invaded her privacy and had been stolen from her personal iCloud account – showed up on the site, its dissemination there was considered not strictly “unconstitutional”, the way child abuse images would be. So, Reddit would defend it.

Never was this tension more apparent than when, in 2015, Huffman stepped back into the CEO role. Huffman wasn’t just the new CEO, he was Reddit’s father of the constitution, its very own James Madison. As Reddit’s “creator”, the author of most of the site’s initial codebase, he was imbued with the power to rewrite – or even abolish – its perceived constitution.

He wasted no time in taking the great risks previous leaders had not. On his fifth day as CEO, he posted an announcement to the company’s public blog that Reddit was re-evaluating its policy on the most “offensive and obscene content”. “Neither Alexis nor I created Reddit to be a bastion of free speech,” he wrote, “but rather as a place where open and honest discussion can happen.” Commenters went wild, crying censorship and hypocrisy, giddily noting that as recently as 2012, Ohanian had indeed referred to Reddit as a “bastion of free speech”.

Past remarks came back to haunt Alexis Ohanian – seen here in Austin, Texas, in 2014 – when fellow co-founder Steve Huffman set about redefining Reddit’s constitution. Photograph: Travis P Ball/Getty Images North America

From his personal computer, Wong watched thousands of comments come back at Huffman, and chimed in on the thread: “AYYYYYY LMAO. How’s everyone doing? This is AWESOME!”

Despite the criticism, Huffman didn’t waver. On his seventh day as CEO, Huffman posted again on Reddit. “Let’s talk content. AMA,” he wrote – AMA being shorthand for “ask me anything”, Reddit’s question-and-answer format – and laid out a brief history of free speech on Reddit, with himself at the centre of it. “As we grew, I became increasingly uncomfortable projecting my worldview on others. More practically, I didn’t have time to pass judgment on everything, so I decided to judge nothing.” He dubbed the ensuing era “Don’t Ask, Don’t Tell,” and said it didn’t go well. It led to inconsistent reasoning and policy stagnation.

On that day, 16 July 2015, Huffman announced a new set of “additional restrictions”, which would ban content on Reddit’s public pages that was deemed to incite harm or violence against an individual or group, as well as anything that harassed, bullied, or harmed an individual or group, or that included sexually suggestive content featuring minors.

This new set of restrictions essentially gave Huffman and his teams the power to cut off a handful of the worst offenders, spanning several purviews. Bans on individuals grew into a tiered warning system that could be automatically instituted for one, three, or seven days. Truly egregious policy violations resulted in permanent username bans. Similarly, Huffman introduced a new quarantine system under which legal but generally offensive subreddits could only be viewed by subscribers who had specifically requested to sign up for them – and those subscribers needed to do that through a verified email address. Quarantining a subreddit in this fashion also meant that advertising would be removed from it, and its posts wouldn’t show up through search or on r/popular. The move allowed certain problematic subreddits, such as r/SexWithDogs (which contained precisely what its title suggests) to go on existing for now, while Reddit stunted their growth and shielded them from Google, advertiser offence, and the eyes of unsuspecting newbies.
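The tiered warning system can be pictured as a simple escalation ladder. The following is a hypothetical sketch, assuming escalating suspensions of one, three, then seven days and a permanent ban for egregious violations; the names, data structures, and thresholds are illustrative, not Reddit’s actual implementation:

```python
from dataclasses import dataclass

# Assumed escalation ladder: 1-, 3-, then 7-day suspensions.
SUSPENSION_LADDER = [1, 3, 7]  # days

@dataclass
class UserRecord:
    username: str
    strikes: int = 0
    permanently_banned: bool = False

def apply_violation(user: UserRecord, egregious: bool = False) -> str:
    """Escalate through the ladder; egregious violations (or running out
    of ladder rungs) trigger a permanent username ban."""
    if egregious or user.strikes >= len(SUSPENSION_LADDER):
        user.permanently_banned = True
        return "permanent ban"
    days = SUSPENSION_LADDER[user.strikes]
    user.strikes += 1
    return f"{days}-day suspension"
```

The design point is that repeat offenders are handled mechanically, without a fresh judgment call each time – the kind of consistency the earlier “Don’t Ask, Don’t Tell” era lacked.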

Within Reddit’s office, enacting the changes was a slog at first. Huffman said he and the community staff decided to quarantine the overtly racist subreddit r/CoonTown and then debated its existence for weeks. Not long after, he described the back-and-forth: “This is free speech. This is racist. This is bad for the community, this and that. There’s a lot of ways you can argue it.” During the debate, Reddit continued to be criticised in the press for not taking action – it was said the site was essentially subsidising white supremacist behaviour by continuing to host r/CoonTown, with its estimated 200,000 page views a day.

Just weeks later, in August 2015, Reddit finally banned the subreddit. Not for being overtly racist or harbouring a group of white supremacists that rivalled the traffic of Stormfront, the decades-old and notorious white supremacist website. Instead, Reddit said it did so only as part of a group of communities that, Huffman wrote, “exist solely to annoy other redditors, prevent us from improving Reddit, and generally make Reddit worse for everyone else”. Also included in the list of bans that day were communities dedicated to animated child abuse images and a subreddit called r/WatchN---ersDie. Even if he wouldn’t admit it at the time, in that tiptoe of a public justification for the ban, Huffman had amended the constitution.

A few months later, Huffman was happy with the result of the bans and the new subreddit quarantine system. “We quarantined a couple nasty ones yesterday and nobody even noticed. No press picked up on it,” he said. One had been an attempted clone of r/CoonTown.

Over the following year, the act of cutting off noxious communities and users would become an actual process, a streamlined system: user-reported comments and threads would reach moderators and the community team; the team dubbed “trust and safety” would perform an analysis; legal would sign off on the ban or quarantine. The subreddit’s moderator team would have been warned multiple times before the ban, either by a member of the community team, whose identities are public, or by a username from the trust and safety team, whose identities are entirely shielded.
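That pipeline amounts to a short decision chain. A minimal sketch, assuming the stages named above – a trust-and-safety finding, prior moderator warnings, and legal sign-off – with the function name, thresholds, and return strings invented for illustration:

```python
def review_report(violates_policy: bool, legal_signoff: bool,
                  prior_warnings: int, min_warnings: int = 2) -> str:
    """Route a user-reported subreddit through the moderation pipeline."""
    if not violates_policy:
        return "no action"          # trust and safety finds no violation
    if prior_warnings < min_warnings:
        return "warn moderators"    # moderators are warned before any ban
    if not legal_signoff:
        return "escalate to legal"  # legal must approve a ban or quarantine
    return "ban or quarantine"
```

Each gate maps to a team in the text, so no single employee’s anger – not even the CEO’s – turns directly into a ban.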

Still, despite a cleaner line being drawn, Reddit continued to rest on claims that a community had violated the site’s policies, rather than, say, violating human decency, morals, or ethics.

By 2017, following the election of Donald Trump, when Reddit pinpointed and banned alt-right subreddits that were behaving more aggressively, Huffman described the action by his team, and its ho-hum aftermath, as straightforward. “Like, by the book, no drama,” Huffman said. “Cool.”

To Huffman, the days following the Unite the Right rally were a turning point for Reddit. He said staff realised that if they wanted to have a hand in stopping this from happening again, they’d need to be more proactive. It wouldn’t be easy. It would mean evolving as a company, hiring up, and developing more capacity to both make and execute significant decisions like this.

Just a year earlier, Huffman considered his greatest challenges regarding the Reddit community to be differentiating the good trolls from the bad ones, and trying to navigate the line between free expression and hate. Drawing some lines wasn’t possible, he had said. But a year and a half into his tenure, his thinking had evolved – as had that of the community team. It, along with trust and safety and legal, would widen the umbrella of prohibited behaviour on Reddit – primarily content that could fall under the heading of “inciting violence”, a definition they expanded to include “content that encourages, glorifies, incites, or calls for violence or physical harm against an individual or group of people… or encourages the abuse of animals”. The wider definition was a step toward cutting off bullying, harassment, and bestiality. “That’s a big change in the philosophy that I think has made the company better,” Huffman said.

It opened the door for Reddit to ban another extensive list of subreddits. These were mostly far-right-leaning and Nazi-sympathising forums and those that glorified harm or death, such as r/selfharmpics and r/PicsOfDeadKids. Most were small forums, with fewer than a thousand subscribers each, but the list was extensive. It included 70 different communities that violated the updated rule.

If Reddit had made such a move at any time in its past, the results could have been catastrophic. This was a mere blip. The press made note of it, but neither users nor moderators revolted. Huffman was relieved by the lack of attention. “This was kind of the last of the big wave of communities that caused me angst,” he said, leaning back in satisfaction.

A few moments later he amended that thought: “Oh, gun sales. Never mind, we probably will still look into gun sales.”

• This is an edited extract from We Are the Nerds by Christine Lagorio-Chafkin, published by Little, Brown on 2 October.