An estimated 130 people die from opioid-related drug overdoses each day in the United States, and 2 million people had an opioid use disorder in 2018. This public health crisis has left officials scrambling for ways to cut down on illegal sales of these controlled substances, including online sales.

Now the National Institute on Drug Abuse, which is part of the US Department of Health and Human Services, is investing in an artificial intelligence-based tool to track how “digital drug dealers” and illegal internet pharmacies market and sell opioids (though online transactions are likely not a large share of overall illegal sales).

New AI-based approaches to clamping down on illegal opioid sales demonstrate how publicly available social media and internet data — even the stuff you post — can be used to find illegal transactions initiated online. And such tools could track just about anything else, too: The researcher commissioned by NIDA to build this tool, UC San Diego professor Timothy Mackey, told Recode the same approach could be used to find online transactions associated with illegal wildlife traffickers, vaping products, counterfeit luxury products, and gun sales.

As with most technical innovations, these tools raise concerns of their own. Drug policy experts caution that, depending on how law enforcement ultimately uses them, AI tools for tracking online opioid sales risk enabling the over-criminalization of low-level drug sellers. These experts also emphasize that such tools won’t ultimately help reduce the demand for these substances.

It’s more complicated than a keyword search

Coming up with a way to find illegal drug sales online is no easy task. You can’t just search “opioid” and expect to only find accounts illegally selling these medications. Think about it: Tons of people have written about opioids online. Maybe you’ve shared your thoughts about a Vox article on this topic, or posted about a loved one who passed away from an overdose. Neither of those is an illegal transaction.

In fact, only a small percentage of online posts that mention opioids are related to illegal selling or marketing, says Mackey. In one study of more than 600,000 tweets containing the names of several prescription opioids, he found that fewer than 2,000 were actually marketing those substances.

Another challenge: People selling these substances online don’t always use obvious keywords, and they change their strategies and quickly remove their posts. For instance, Mackey has noticed that some accounts that appeared to be online drug sellers have included pictures of exercise equipment in their posts. He says another common behavior is misspelling the names of drugs. That’s because Instagram, for instance, has blocked searching by some drugs’ exact names.
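This evasion tactic helps explain why exact-keyword blocking falls short. A minimal sketch of the idea, using only string similarity from Python's standard library, shows how a near-misspelling like the ones Mackey describes could still be matched to a known drug name (the hashtags and the drug list here are invented for illustration; a real detection system would need far more signals than this):

```python
import difflib

# Illustrative list of drug names to watch for (not a real blocklist).
KNOWN_DRUG_NAMES = ["percocet", "oxycodone", "fentanyl"]

def likely_misspelling(hashtag, threshold=0.85):
    """Return the closest known drug name if the hashtag is a near-match.

    A crude sketch: it flags hashtags that are *almost* a known drug name,
    the kind of slight misspelling exact-keyword blocking misses.
    """
    tag = hashtag.lstrip("#").lower()
    for name in KNOWN_DRUG_NAMES:
        similarity = difflib.SequenceMatcher(None, tag, name).ratio()
        if tag != name and similarity >= threshold:
            return name
    return None

print(likely_misspelling("#percocert"))  # a near-match to "percocet"
print(likely_misspelling("#fitness"))    # unrelated tag, no match
```

The threshold is a judgment call: set it too low and ordinary words get flagged; too high and sellers only need one extra letter to slip through.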

In line with earlier findings by BuzzFeed News, a Recode search for “#percocet” on Instagram returned no results, but a search for the slightly misspelled “#percocert,” a term suggested by Mackey, revealed thousands of posts, some with comments that appeared to be related to drug-selling. (Facebook and Instagram community standards ban this type of content. The company says it encourages users to report such posts and uses automated systems to catch them preemptively.)

“For a platform like Instagram where we see a lot of drug dealers, it’s a number of hashtags associated with different opioid communities, and then it’s usually information about how to contact the drug dealer and buy from them,” Mackey says. He explains that there are also sellers who illegally represent themselves as internet pharmacies, which can advertise on social media or internet sites and then direct potential customers to some form of e-commerce platform. The FDA has repeatedly tried to clamp down on these sites.

That’s where data — lots of it — and artificial intelligence come in. Last year, Mackey and his team used a type of artificial intelligence called deep learning to track down illegal drug sellers on Instagram. This type of AI focuses on recognizing patterns in data, and in this case, in Instagram posts. The idea is to get an AI-based system to recognize what drug-selling content looks like so that it can automatically find new sale-related posts within a much larger set of internet content.
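Mackey's deep learning pipeline isn't public, but the basic pattern-recognition idea can be sketched with a much simpler stand-in: train a model on posts labeled as sale-related or not, then use it to flag new posts. Everything below — the example posts, the labels, the vocabulary — is invented for illustration, and a simple logistic-regression classifier substitutes for the deep networks his team used:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hand-labeled examples (all invented): 1 = sale-related, 0 = not.
posts = [
    "rip to my cousin, opioids took him too soon",
    "great article on the opioid crisis worth reading",
    "oxy for sale dm for menu fast discreet shipping",
    "pills available no rx needed message me to buy",
]
labels = [0, 0, 1, 1]

# Vectorize the text, then fit a classifier on the labeled examples.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

# Score a new, unseen post: shared vocabulary with the sale-related
# training examples ("sale", "dm", "discreet", "shipping") tips the label.
new_post = ["percs for sale dm me discreet shipping"]
print(model.predict(new_post))
```

In practice the labeled dataset would be vastly larger, and the model would also need to learn the non-text signals Mackey describes, like exercise-equipment photos standing in for drug imagery.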

Mackey and his team have also used an AI-based approach called topic-modeling. Here’s how that works: You expose an AI system to a bunch of words and phrases from a larger set of information, like a database full of tweets that include the word “fentanyl” (a type of opioid). Then you let the AI system loose to figure out what words and phrases appear to be related to keywords like fentanyl. It’s a complicated form of sorting and matching that ultimately finds conversations, or “topics,” within the broader discussions of fentanyl.

The hope is that one of the “topics” the AI finds is related to suspected sales, potentially revealing relevant keywords or information you otherwise would not have known about. Such a method helped Mackey whittle down a set of nearly 30,000 tweets about fentanyl to fewer than 10 unique tweets that appeared to be marketing fentanyl and included links to external sites. Mackey has also used this method to sort through online conversations about other types of opioids, like oxycodone and oxycontin.

To fight online opioid sales, there’s growing interest in AI

It’s important to keep in mind that most illegal opioid sales probably don’t occur online. But it’s still a problem that the Food and Drug Administration (FDA), members of Congress, and the National Association of Boards of Pharmacy (NABP) are worried about. In 2018, FDA commissioner Scott Gottlieb called out social media companies, among other internet companies, for not being “proactive enough in rooting out these illegal offers to distribute opioids.” Soon afterward, Facebook CEO Mark Zuckerberg was hammered about illegal online pharmacies promoting opioids on his platform while testifying before Congress. He, too, pointed to AI as part of the solution.

That represents a broader trend. A Food and Drug Administration spokesperson told Recode that the agency’s criminal investigations office often gathers intelligence from public tips, the internet, the dark web, and social media, “oftentimes using a number of various AI-enabled applications to correlate and understand information from multiple sources.” Last year, the government budgeted $20 million for the FDA to create a “data warehouse” meant to be mined, in part, by machine learning algorithms, which would be used to identify and address emerging trends in the opioid crisis. (The Drug Enforcement Administration told Recode it would not comment on investigative techniques.)

Meanwhile, Reddit, YouTube, Twitter, and Facebook all told Recode they’re now using automated or AI-based technology to flag or investigate content that violates their policies, including illegal opioid sales.

Mackey said he’s only been in limited talks with Facebook and Twitter, which were spurred by an FDA summit in 2018 focused on cracking down on illegal sales online. But Mackey says a pilot study he ran for Google led to the removal of some opioid sale-related content on YouTube (mostly in the comments section of videos about opioids).

Now, as part of his work for NIDA, Mackey is developing a prototype based on his research, which he soon plans to commercialize (essentially, NIDA’s funding helped him launch a small company). He says that, for now, only the government is funding his work on tracking illegal opioid sales. Mackey’s hope is that the tool could ultimately be used by regulators, social media platforms, pharmaceutical companies, and even law enforcement agencies, like the DEA and the Federal Bureau of Investigation.

AI tools could help analyze broader trends in the opioid supply chain, but they come with risks

Mackey says his tool is needed because, even if proposed drug policy reforms succeed, we’ll still need to keep the opioid crisis from spreading further online. While research shows that’s already happened to some extent, two drug policy researchers told Recode that online sales are probably a small share of overall illegal opioid purchases. The DEA told Recode that its investigations into retail-level sales are still dominated by traditional, not online, sales.

“We still don’t know to what extent this is a problem or what’s the size of it, relative to other kinds of traditional drug supply avenues,” said Bryce Pardo, a drug policy researcher at Rand Corporation, a nonpartisan think tank. He said it’s possible a tool like Mackey’s could help find specific populations that tend to sell these substances online. But he cautioned that such a tool would be most applicable to finding sellers at the bottom of the supply chain, not large-scale importers that illegally bring massive amounts of these controlled substances into the US.

“When we try to target sellers, it becomes a game of whack-a-mole,” cautions Sheila Vakharia of the nonprofit Drug Policy Alliance. “Even when a seller is taken off the streets, taken off the web, or taken off a username or an account, there’s very little in place that’s going to prevent the next one from popping up.”

Mackey says it’s true that some of the accounts his system flags will be relatively low-level, but he emphasizes that their sales volume varies, and that the tool could also be used to find where drugs are being sold on other parts of the web. Ultimately, he says, the AI can help law enforcement link investigations they’re conducting online to those they’re pursuing offline, and gain a better sense of the entire supply chain. He says the information gained from this system could ultimately help them prosecute an existing case, target larger actors, issue a subpoena, or even conduct a “test-buy.”

Vakharia agreed that some applications of data-mining and AI might be useful. For instance, these tools could help those seeking opioids — or those who are already at risk of an overdose — access rehabilitative resources.

More broadly, AI is also being used to study how people talk about their drug abuse and recovery online. And researchers at the New Jersey Institute of Technology are working on a tool somewhat similar to Mackey’s — called DrugTracker — that uses social media and geospatial data to mine drug-related slang and detect risks of drug activity. The idea, professor Hai Phan explains, is to keep local institutions informed of the risk of drug abuse in their areas. “Drug abuse uptick is really, really fast, especially when we have a new drug,” he told Recode.

Still, Vakharia said there’s a risk it could be used to crack down on low-level sellers in an ineffective way that would ultimately exacerbate a failing war on drugs.

“This is that fine line that we’re going to continue to walk for some time,” Vakharia said. “If we know that the internet is a big place where people are engaging in these transactions but also looking for information, it would be really great to be able to target messaging for them. But I think that assuming the best of intentions for all players who are going to get access to this information is naive.”

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.