When there is wrongdoing in fields that are both complex and opaque, it often takes a whistle-blower to inform the public. That’s exactly what former quant trader turned social activist Cathy O’Neil has become for the world of Big Data. A Harvard-trained mathematician, O’Neil spent the last several years teaching at Barnard, working for D.E. Shaw, one of the world’s leading hedge funds, and launching a technology start-up designed to deliver targeted advertising. Her key takeaway from the last two experiences—that Big Data is increasing inequality and threatening democracy—is the subject of her important new book, Weapons of Math Destruction, out on September 6.

Unlike the WMDs that were never found in Iraq, data-driven algorithms are all around us. Already, many of our bosses use them to grade our performance. Our children’s teachers are hired and fired by them. They decide who gets access to credit and who pays higher insurance premiums, as well as who will receive online advertising for luxury handbags versus who’ll be targeted by predatory ads for for-profit universities.

In fact, it was that last example that prompted O’Neil, who’s also a member of the Occupy Movement, to write her book. While working at the start-up, she heard a presentation from an investor lauding the fact that the company’s new technology would mean that he would “never have to see another ad for the University of Phoenix,” but would be automatically funneled more offers for “vacations in Aruba and jet skis.” “I realized that far from doing anything good, this technology was actually siloing people into online gated communities where they no longer had to even acknowledge the existence of the poor,” she says.

O’Neil sees plenty of parallels between the use of Big Data today and the predatory lending practices of the subprime crisis. In both cases, the effects are hard to track, even for insiders. Like the dark financial arts employed in the run-up to the 2008 financial crisis, the Big Data algorithms that sort us into piles of “worthy” and “unworthy” are mostly opaque and unregulated, not to mention generated (and used) by large multinational firms with huge lobbying power to keep it that way. “The discriminatory and even predatory way in which algorithms are being used in everything from our school system to the criminal justice system is really a silent financial crisis,” says O’Neil.

The effects are just as pernicious. Drawing on her deep technical understanding of modeling, she shows how the algorithms used to, say, rank teacher performance are based on exactly the sort of shallow and volatile data sets that informed the faulty mortgage models in the run-up to 2008. Her work makes particularly disturbing points about how being on the wrong side of an algorithmic decision can snowball in incredibly destructive ways. A young black man, for example, who lives in an area targeted by crime-fighting algorithms—which send more police to his neighborhood because of higher violent-crime rates—will necessarily be more likely to be stopped for any petty violation, which adds to a digital profile that could subsequently limit his credit, his job prospects, and so on. Yet neighborhoods where white-collar crime is more prevalent aren’t targeted in this way.

In higher education, the use of algorithmic models that rank colleges has led to an educational arms race in which schools offer more and more merit-based rather than need-based aid to students who’ll make their numbers (and thus their rankings) look better. At the same time, for-profit universities can trawl for data on economically or socially vulnerable would-be students and find their “pain points,” as a recruiting manual for one for-profit university, Vatterott, describes it, in any number of online questionnaires or surveys those students may have unwittingly filled out. The schools can then use this information to funnel ads to welfare mothers, the recently divorced or unemployed, those who’ve been incarcerated, or even those who’ve suffered an injury or a death in the family.

Indeed, O’Neil writes that WMDs punish the poor especially, since “they are engineered to evaluate large numbers of people. They specialize in bulk. They are cheap. That’s part of their appeal.” Whereas the poor engage more with faceless educators and employers, “the wealthy, by contrast, often benefit from personal input. A white-shoe law firm or an exclusive prep school will lean far more on recommendations and face-to-face interviews than a fast-food chain or a cash-strapped urban school district. The privileged… are processed more by people, the masses by machines.”

That’s a particularly disturbing idea, given that WMDs are proliferating at the same time that so many other forces are pushing toward greater inequality. How do we fix things? O’Neil has proposed a Hippocratic Oath for mathematicians. She and others also suggest much deeper regulation of the burgeoning field, perhaps via random algorithmic “audits” by regulators, and deeper analysis of how such algorithms work (Princeton recently launched the Web Transparency and Accountability Project, which aims to study algorithmic racism and discrimination). O’Neil also proposes updating existing civil rights legislation to make clear that it encompasses computerized algorithms. One thing her book has already made quite clear: far from being coolly scientific, Big Data comes with all the biases of its creators. It’s time to stop pretending that the people wielding the most numbers necessarily have the right answers.

Contact us at letters@time.com.