Both Undercover Colors and Good2Go have been subject to a number of well-founded and well-articulated critiques: For their clumsy, unrealistic fit with the social realities of sexual violence, for implying that women (not men) should be responsible for preventing rape, and (in Good2Go’s case) for the possibility that personal information collected by the app might be sold or subpoenaed. But “rape solutions” like these also reveal a deeper and far thornier issue: the tendency to address sexual violence as a data problem.

Undercover Colors and Good2Go are technological tattletales. Both are designed to tell the truth about an encounter, with the objectivity and dispassion of a database or a chemical reaction. Tattletale solutions make sense only if we see rape, fundamentally, as a problem of bad data. But thinking about rape this way implies that what we’re most worried about is men being wrongly accused of sexual assault—that the reports women provide aren’t reliable and should be replaced by something “objective.” These technologies prioritize the creation of that data over any attempt to empower women or to change the norms around sexual violence; they’re rape culture with a technological veneer.

Even after the fact—once the act has happened—our tendency is to view sexual violence through the lens of data rather than human experience. Take the footage of Janay Palmer’s battery, subject to tremendous public scrutiny and comment. Across the board, that film has been framed almost exclusively through the lens of data and proof: Who saw the video, and when? What does it reveal about the NFL brass’s veracity and institutional policies? Whose account does it confirm, and on whose does it cast doubt? The main characters in this story are Ray Rice, Roger Goodell, and TMZ; Janay Palmer is just a woman’s face at the receiving end of a fist. Somehow the assault isn’t about her at all.

By looking at sexual assault through a data lens, technologies like these collapse complex experiences into discrete yes-or-no data points. But sex is not a singular act, and consent is an ongoing conversation: Some acts are agreed upon but others are not, and participants are allowed to change their minds at any point. Focusing on data production drives us to think of sexual violence in black-and-white terms—a dangerous oversimplification of a far messier and more nuanced reality.

Silicon Valley might see this issue as a catch-22. When technologists make silly apps, they are slammed for ignoring real social problems. When they take on real social problems, they’re criticized for treating them like silly apps. It’s encouraging to see techies trying to address knotty social issues like sexual violence. But if technology is going to intervene for good, it needs to adopt a more nuanced approach—one that appreciates that not every problem can be treated as a data problem. Laundry delivery is a data problem; rape is not.