It's a windowless room packed with about two dozen desks, a half-dozen screens showing TV news and Twitter feeds, and even more monitors lining the walls tracking trends in Facebook user behavior. This is Facebook's first-ever "war room," designed to prevent election manipulation by improving data-sharing across the company and enabling quick decision-making.

The roughly 900-square-foot room, which Facebook recently showed to journalists, is a visual representation of the company's commitment to dramatically improving communication and security ahead of the Nov. 6 U.S. midterms. The demonstration of Facebook's internal efforts comes after a long string of security breaches and privacy scandals, going back to Russian manipulation of the 2016 presidential election. Since the Cambridge Analytica privacy scandal came to light in March, Facebook shares have fallen 14 percent. Now, the social-media giant is pulling out all the stops to prevent another debacle and more negative headlines.

With less than three weeks before the U.S. election, and even less time ahead of the Oct. 28 runoff in Brazil's presidential election, the room is the hub for Facebook's work to identify the spread of fake news and quickly shut it down. The company says its current combination of technology and 20,000 employees focused on safety and security would have blocked the Russian manipulation of the 2016 election.

"We've essentially done a lot of scenario planning and 'war games' internally within the war room to plan out different types of problems that we may see," said Samidh Chakrabarti, who oversees Facebook's elections and civic engagement team. "We've practiced and we've done drills to see how we can detect that, how we can come to quick decisions, and how we can take quick action."

The war room is staffed from 4 a.m. until midnight and, as of next week, will be buzzing 24/7 with representatives from every corner of the company. WhatsApp, Instagram, operations, software engineering, data science, research operations, legal, policy, communications — they're all represented in the room. Charts of user behavior on Facebook and its other apps fill monitors around the room. Facebook uses machine learning and artificial intelligence to monitor for spikes that could point to hate speech, fake news going viral or efforts at voter suppression.

Nathaniel Gleicher, Facebook's head of cybersecurity, said the company's goal is that the election be fair and that "debate around the election be authentic. ... The biggest concern is any type of effort to manipulate that."

Ahead of the Brazilian vote, the company identified an effort to suppress turnout and was able to shut it down quickly, thanks in part to the proximity of so many teams in a single room. "Content that was telling people that due to protests, that the election would be delayed a day," said Chakrabarti. "This was not true, completely false. So we were able to detect that using AI and machine learning. The war room was alerted to it. Our data scientists looked into what was behind it and then they passed it to our engineers and operations specialists to be able to remove this at scale from our platform before it could go viral."

Facebook is combining its teams focused on the U.S. and Brazilian elections because fighting what the company calls "bad actors" is a global problem that never ends.
The idea is that these teams can share information about the latest tactics they're seeing and share best practices for blocking them. Gleicher warned that Facebook is seeing growing efforts to manipulate public debate as the midterms approach.