If an app on Facebook behaved the way Facebook has been behaving, Facebook would probably have shut it down by now.

Tuesday’s scathing TechCrunch investigation all but guarantees it. The report found that Facebook has been paying people as young as 13 years old to download an app that grants Facebook access to users’ entire phone and web history, including encrypted activity and private messages and emails. The app, called Research, allows Facebook to see how people’s friends, who have not consented to having their data collected, interact with those users, too.

Facebook says the app was purely for market research. Explained another way: The app allowed Facebook to spot competitive threats on the horizon to help it retain its unprecedented power. Facebook has used another app, called Onavo, to collect similar information; for example, data from Onavo alerted Facebook to the growing popularity of the messaging app WhatsApp before the company acquired it in 2014.

"I think it speaks to the growth-at-any-cost mentality of the company," says Ashkan Soltani, who served as chief technologist to the Federal Trade Commission during its 2011 investigation of Facebook.

Facebook didn't respond to WIRED's request for comment.

At a time when Facebook is under the microscope for violating its users' privacy, such techniques are bold enough. But what makes the operation even more brazen is that Facebook continued running the program, which launched in 2016 and was sometimes called Atlas, even after Apple banned Onavo from the App Store less than six months ago. Apple said it would no longer allow developers to collect information from other third-party apps.

Apparently undeterred, Facebook created a workaround for the Research app. It circumvented Apple’s vetting process using a technical loophole that is only intended for apps Facebook distributes to its own employees. That allowed Facebook to ingest everything users, including minors, did on their phones. While kids under the age of 17 had to receive parental consent to participate, the disclosure form analyzed by TechCrunch minimized the extent of what could be done with all that data. “There are no known risks associated with the project,” it read. Facebook told TechCrunch only 5 percent of the app's users were teens.

Still, even the solicitations adults would have received about the app weren't entirely forthcoming. When users referred their friends to the app, for which they could also get paid, the email they received encouraged them to “Install it and forget it,” making the act of giving away unlimited access to their private communications sound as harmless as setting up a Ronco Rotisserie.

"I think it speaks to the growth-at-any-cost mentality of the company." Ashkan Soltani

The Research app is just the latest example of Facebook’s doublespeak. In public and even under oath, executives like Mark Zuckerberg and Sheryl Sandberg have spent at least a year, if not their entire careers, promising to do better by their users. But in private, evidence abounds that the company continues to flout every rule and attempt at oversight placed before it. They've promised to protect user privacy by cutting off developer access to data, while continuing to give it away to corporate giants and major advertisers. They've vowed to investigate foreign interference in elections, all while withholding information about the extent of that interference on Facebook. They've launched efforts to make their ads more transparent, while crippling external efforts by organizations like ProPublica to pull back the curtain even further.

Even as privacy hounds and antitrust watchdogs at the FTC and on Capitol Hill sniff and scratch at Facebook’s door, the social media giant, apparently high on hubris, just keeps tossing them red meat. If Facebook has learned anything from the last two years of public and regulatory scrutiny, it has a funny way of showing it.