Information war, meet the war room. Facebook on Wednesday briefed reporters on its latest efforts to uncover nefarious uses of the platform during election season, offering a tour of the large conference room where dozens of employees are monitoring events around the clock.

“We know when it comes to an election, every moment counts,” said Samidh Chakrabarti, head of civic engagement at Facebook, who oversees the war room. “So if there are late-breaking issues we see on the platform, we need to be able to detect and respond to them in real time, as quickly as possible.”

In one sense, the war room is just one of many conference rooms in MPK 20, the company’s Menlo Park, California headquarters. But it’s larger than average, and it has been stuffed with people and electronic equipment. There are desks for 24 people, and the room is ringed with 17 screens, each of which highlights a different stream of information Facebook is monitoring.

Employees look for suspicious spikes in spam and hate speech, in some cases using custom software built for the purpose. They look for efforts at voter suppression, such as falsely telling people that lines are long or that the election has been delayed. (The team recently uncovered one such hoax claiming that the Brazilian election date had been delayed a day due to protests, and swiftly removed the offending posts.)

Other employees use CrowdTangle, a company acquired by Facebook that monitors the viral spread of articles across platforms, to detect which articles are trending on Facebook, Instagram, Twitter, and Reddit.

All told, representatives from 20 teams have people in the war room, representing 20,000 global employees working on safety and security. The teams include threat intelligence, data science, engineering, research, operations, legal, policy, communications, and representatives from Facebook-owned WhatsApp and Instagram.

If anyone finds a problem, they escalate it to a specialist, who can then route it to the appropriate decision-maker. Facebook has also given state attorneys general and other elected officials a way to reach the war room quickly to report voter suppression and other suspicious activity.

“There’s no substitute for face to face, in-room interaction.”

If the room is new, the work is not: Facebook teams have been working to fight misinformation in a coordinated way since 2016. The decision to bring them all together into a single room was made in the hopes that it would let them make decisions faster.

“When every moment counts, then decision-making needs to be fast,” Chakrabarti said. “There’s no substitute for face-to-face, in-room interaction. So we felt [the need for] this physical war room to be able to coordinate teams.”

On the day we visited, the room was decorated with a mix of American and Brazilian flags, in a nod to the next two elections that are being watched in the war room. It’s a fraught time: last week, Facebook removed 559 pages and 251 accounts in the United States for using fake identities and coordinating information campaigns. Meanwhile, Brazilian researchers today published the results of a study that examined the most-shared images in popular public WhatsApp chats and found that many of them contained misinformation:

From a sample of more than 100,000 political images that circulated in those 347 groups, we selected the 50 most widely shared. They were reviewed by Agência Lupa, which is Brazil’s leading fact-checking platform. Eight of those 50 photos and images were considered completely false; 16 were real pictures but used out of their original context or related to distorted data; four were unsubstantiated claims, not based on a trustworthy public source. This means that 56 percent of the most-shared images were misleading. Only 8 percent of the 50 most widely shared images were considered fully truthful.

Facebook wouldn’t commit to keeping the war room open forever. But with multiple elections taking place around the world each year, the company plans to keep it open for the foreseeable future. “This is going to be a constant arms race,” said Katie Harbath, Facebook’s global politics and government outreach director. “This is our new normal. Bad actors are going to get more sophisticated in what they’re doing, and we’re going to have to get more sophisticated in trying to catch them.”