A bipartisan group of political data firms is drafting a set of industry standards that its members hope will prevent voter data from being misused as it was in 2016. The guidelines cover transparency, foreign influence in elections, responsible data sourcing and storage, and other measures meant to root out bad actors in the industry and help fend off security threats.

The conversations, which are being organized by Georgetown University's Institute of Politics and Public Service, come at a time when data collection more broadly faces increased scrutiny from lawmakers and consumers. Ever since news broke this spring that the political firm Cambridge Analytica used an app to hoover up data on tens of millions of Americans and exploited it for political purposes, Facebook and other Silicon Valley tech giants have had to answer to Congress and their customers about their mass data-collection operations. But the Georgetown group focuses specifically on the responsibilities of the companies that undergird some of the country's biggest political campaigns. Among the firms participating in these discussions are Republican shops like DeepRoot Analytics, WPA Intelligence, and Targeted Victory, as well as Democratic firms such as Bully Pulpit Interactive, NGP VAN, and DSPolitical.

"These are the firms that power all of the elections in America, and so my hope was if you can get them in a room and get them to understand the importance of the data they’re using and to self-regulate, you could achieve a dramatic improvement on behalf of voters," says Tim Sparapani, a fellow at the Georgetown Institute who is overseeing the group.

Sparapani served as Facebook's first director of public policy from 2009 until 2011, after spending several years at the American Civil Liberties Union. A self-proclaimed privacy advocate, he has warned for years about the need for stricter oversight of data brokers, companies that collect, store, and analyze data about consumers for a variety of purposes. In the political world, that data can include basic information about how many times a person has voted, their party registration, and their donation record, but it can also include social media and commercial data that can help campaigns better understand who a given person is and target them with political advertising.

The data broker industry remains largely unregulated, both inside and outside politics. The Federal Trade Commission has urged Congress to regulate data brokers since at least 2012, but nothing has come of it so far. In June, Vermont became the first state to pass a data broker law, which goes into effect in January.


The Georgetown group first met last fall, months before Cambridge Analytica began making headlines. At the time, the industry's primary concern was the risk of a data breach or a hack at the hands of a foreign threat: In the summer of 2017, a cybersecurity firm discovered that DeepRoot Analytics' entire trove of 198 million voter records was exposed in a misconfigured database, constituting the largest known voter data leak in history. Brent McGoldrick, CEO of DeepRoot, says the leak was a shock to the system.

"You just have a different mindset coming out of something like that, where you start to think differently about everything from security to privacy to the data you have and the perceptions of it," he says.

Coupled with the intelligence community's warnings about continued attacks on the American electoral system by Russia and other foreign actors, McGoldrick says, it seemed well past time for his company and its competitors on both sides of the aisle to talk about protecting themselves and the people whose data they hold.

McGoldrick brought up the idea with Mo Elleithee, a former Democratic National Committee spokesperson who founded Georgetown's Institute of Politics and Public Service in 2015. Together, they tapped Sparapani to oversee the effort. "We understand that in order to move the ball forward on privacy and security issues, we’re going to have to hear from people who, maybe we don't like hearing what they have to say," McGoldrick says. When the Cambridge Analytica story broke months later, he says, it only underscored the need for this kind of work.