Photo: Getty Images

Online dating is a hellscape, but the U.S. House Oversight and Reform subcommittee is fed up with just how shitty apps like Tinder, Bumble, and Grindr have been with regard to their users’ safety and privacy. Yesterday, it launched a new investigation into dating apps for doing an inadequate job of screening for minors, and inappropriately selling or sharing personal data.


The subcommittee sent out four letters to Match Group, Inc., The Meet Group, Inc., New Grindr LLC, and Bumble Trading Inc. All four contain the same language and outline concerns that underage users can sneak their way onto dating apps by simply lying about their age. The letters also take issue with the fact that screening policies for sex offenders aren’t uniformly enforced, even within the same app.



“Our concern about the underage use of dating apps is heightened by reports that many popular free dating apps permit registered sex offenders to use them, while the paid versions of these same apps screen out registered sex offenders,” writes Rep. Raja Krishnamoorthi, chairman of the subcommittee on economic and consumer policy. “Protection from sexual predators should not be a luxury confined to paying customers.”


In particular, the letters reference the case of Joseph Meili, who pled guilty to third-degree molestation after being charged with sodomizing, raping, and kidnapping an 11-year-old girl he found through a dating app. They also reference a UK report that found several dating apps, including Tinder and Grindr, failed to prevent child sexual exploitation due to weak age verification. The report, which was the result of a public records request, also found multiple cases of child grooming as a result. Another joint report by ProPublica, BuzzFeed News, and Columbia Journalism Investigations found that while Match Group screens for sex offenders on its Match service, it did not apply the same policies to Tinder, OkCupid, or PlentyofFish.

While most of these sites claim to have policies prohibiting minors from using their services, few have strong mechanisms in place to stop minors from simply lying about their age. Most of the services instead rely on users reporting underage profiles they come across. Tinder has previously told Gizmodo that it uses a combination of manual and automated moderation and review to prevent minors from joining the platform, saying it spends “millions of dollars annually” on the effort.

Aside from inadequate screening measures, the subcommittee is also taking issue with reports that consumers “may not receive adequate notification of the commercial use of their sensitive personal information” such as sexual preferences, gender, employment, drug use, and politics. For instance, Gizmodo recently discovered Tinder’s new panic button shares user data with ad-tech companies.


As part of the investigation, the subcommittee is requiring the four companies to provide a mountain of documents pertaining to the apps’ monthly users, how much users pay for the service, age distribution, privacy policies, policies pertaining to sex offender screening, and customer complaints regarding minors on the service. It’s also requesting details on what data is collected, who the data is shared with, and whether data collection is required to join the service. On top of all that, the subcommittee is demanding companies disclose whether private messages between users are reviewed, as well as all communications with law enforcement agencies. All companies involved have until February 13, 2020, to comply.

Gizmodo reached out to the four companies involved for comment; we’ll update this post when we receive a reply.


Update, 01/31/2020, 1:15pm: A Match Group spokesperson emailed Gizmodo the following statement.

“We don’t want minors and bad actors on our apps, and we use every tool possible to keep them off. But, this is a broader internet problem and everyone needs to do their part, which is why we implore third-party App Stores like Apple and Google who know exactly who is using these products to stop distributing them to minors and registered sex offenders. Furthermore, the registered sex offender database needs to be updated so that a perpetrator’s digital footprint can be tracked and blocked by our industry and all social media companies - particularly the ones that freely allow underage users on their platforms. We will continue to invest in technology to keep our users safe, but we call on all parties that have a role to play in keeping our children safe, to do their part as well.”


Update, 02/05/2020, 11:30am: A Grindr spokesperson has also sent Gizmodo the following statement.



“Any illegal use of our app – including by those who are underage – is deeply troubling to us, as well as a clear violation of our terms of service. We continue to take steps to address this important issue, including by promoting online safety and working collaboratively towards industry-wide solutions and transparency reporting. In addition, Grindr recently launched an enhanced in-app reporting tool. When reviewing reports, our team follows specific protocols that provide for the banning of offending accounts and, where we identify child sexual exploitation activity, we work with the National Center for Missing and Exploited Children. In addition, Grindr cooperates with law enforcement agencies in support of their investigations into cybercrimes. We are also constantly working to improve our digital and human screening tools, including to prevent and remove improper underage use of our app.”