A year ago, The Intercept published a story about a Trump campaign affiliate that was circulating personality tests to collect Americans’ personal information. The company, called Cambridge Analytica, had already been exposed by the Guardian in a chilling report that detailed its voter-targeting operation. There was every reason to be concerned. These revelations arrived in the midst of a year in which aggressive political campaigning, concerns over fake news, and the rise of bots that spread propaganda gave us reason to question the kinds of information we were handing over to platforms like Facebook and the third-party applications built on them, and how this free-flowing data deluge might come back to bite us in the ass.

But this awareness of Cambridge Analytica and its covert manipulation of our data didn’t coalesce into rage until late Friday night, when the words of a pink-haired, gay, vegan Canadian hit a cultural nerve. At 28, Christopher Wylie agreed to talk, he told The Guardian, out of a sense of guilt. Four years earlier, Wylie says, he came up with the idea of harnessing big data and social media to fuel a form of information warfare, an idea that led to the creation of Cambridge Analytica. Coming forward meant breaking a nondisclosure agreement, yet Wylie did it, he explained, because he felt morally conflicted. “I assumed it was entirely legal and above board,” he told The Guardian. But he had helped to create a weapon, and he was ready, as best he could, to participate in its dismantling.

Judging by content alone, Wylie’s reckoning doesn’t make for a huge news moment; the details he reveals about the inner workings of Cambridge Analytica have, for the most part, already been disclosed by investigative reporters. But Wylie triggered something that countless news stories couldn’t: a latent rage that may lay the groundwork for a movement that demands accountability from Facebook.

The unchecked power of companies that harvest our data is a grave problem, but it’s hard to get angry about an idea that’s so nebulous. Like climate change, the reaping of our data is a problem of psychology as much as business. We know that the accumulation of massive power in so few hands is bad, but it’s hard to anticipate what terrible results might come of it. And even if we could envision them, those consequences remain abstract, hypothetical, set somewhere in the future. The whole thing feels so oppressively intractable that it’s hard to summon the will to act.

Even if we could act, the options aren’t great. Except for the very rich, or the extraordinarily poor, participating in the economy requires leaving a digital footprint. Most of us scroll through privacy terms on the sites we use without reading them, and accept updates without noticing or understanding the consequences. We all know we’ve been compromised already.

In a flash, Wylie’s story made the idea of misused big data concrete, and urgent. Unlike, say, Philip Morris, which sold a product that directly caused people to get cancer, the problems of big tech are abstract enough that they need people to illustrate their impact. Wylie is just one in a small but growing cadre of digital whistleblowers who came of age in the early decades of the Internet, helped tech companies and government institutions harness the power of the data those decades produced, and now regret their roles. Former CIA employee and government contractor Edward Snowden leaked classified information from the National Security Agency in 2013 because, he said, he was concerned about global surveillance techniques. Tristan Harris rose to become a design ethicist at Google before leaving in 2016; concerned that technology companies design addictive software, he began a campaign to produce technology that is good for people. Former Facebook product manager (and current Wired columnist) Antonio Garcia Martinez helped develop advertising at Facebook; now he speaks out, after writing a book about his experience. Guillaume Chaslot, a former YouTube engineer, detailed his concerns about the platform’s recommendation algorithm to the Guardian earlier this year.