

AI Opens Up New Ways To Fight Illegal Opioid Sales And Other Cybercrime


The US Department of Health and Human Services (HHS) and the National Institute on Drug Abuse (NIDA) are investing in the use of AI to curb the illegal sale of opioids and, hopefully, reduce drug abuse. As Vox reported, NIDA’s AI tool will endeavor to track illegal internet pharmaceutical markets, but the approaches it uses could easily be applied to other forms of cybercrime.

One of the researchers responsible for developing the tool, Timothy Mackey, recently spoke to Vox, explaining that the AI algorithms used to track the illegal sale of opioids could also be used to detect other forms of illegal sales, such as counterfeit products and illegal wildlife trafficking.

NIDA’s AI tool must be able to distinguish between general discussion of opioids and attempts to negotiate a sale. According to Mackey, only a small fraction of tweets referencing opioids actually relate to illegal sales: out of approximately 600,000 tweets referencing one of several different opioids, only about 2,000 marketed those drugs in any way. The AI tool must also be robust enough to keep up with changes in the language used to illegally market opioids. Sellers frequently rely on coded language and non-obvious keywords, and they change strategies quickly. Mackey explained that misspelled aliases for drug names are common and that images of things other than the drugs in question are often used to create listings on platforms like Instagram.
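To put those numbers in perspective, they imply a severe class imbalance: roughly 2,000 marketing-related tweets out of some 600,000 opioid mentions. Below is a minimal sketch of how a classifier might weight that imbalance; the inverse-frequency heuristic is a standard one and an assumption on our part, not NIDA's documented method.

```python
# Sketch only: inverse-frequency class weights for the imbalance described above
# (~2,000 marketing tweets out of ~600,000 opioid mentions). A common heuristic,
# not a description of NIDA's actual pipeline.
total_tweets = 600_000
marketing = 2_000
other = total_tweets - marketing

class_weight = {
    0: total_tweets / (2 * other),      # non-marketing tweets, weight ~0.5
    1: total_tweets / (2 * marketing),  # marketing tweets, weight 150.0
}
print(class_weight)
```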

While Instagram and Facebook ban the marketing of drugs and encourage users to report instances of abuse, the illegal content can be very difficult to catch, precisely because drug sellers tend to change strategies and code words quickly. Mackey explained that these coded posts and hashtags on Instagram typically contain information about how to contact the dealer and purchase illegal drugs from them. He also noted that some illegal sellers represent themselves as legitimate pharmaceutical companies and link to e-commerce platforms. While the FDA has often tried to crack down on these sites, they remain an issue.

In designing AI tools to detect illegal drug marketing, Mackey and the rest of the research team combined deep learning and topic modeling. They built a deep learning model based on a Long Short-Term Memory (LSTM) network trained on the text of Instagram posts, with the goal of creating a text classifier that could automatically flag posts potentially related to illegal drug sales. The team also used topic modeling, letting the model discern keywords associated with opioids like fentanyl and Percocet. Topic modeling makes the system more robust and sophisticated, allowing it to match topics and conversations rather than just single words. It helped the research team reduce a dataset of around 30,000 tweets regarding fentanyl to just a handful that appeared to be marketing the drug.
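As a rough illustration of the approach described above, the sketch below pairs a small LSTM text classifier with an off-the-shelf topic model. The layer sizes, vocabulary limits, library choices, and example posts are all assumptions made for illustration; the article does not disclose the actual models, features, or data used by Mackey's team.

```python
# Hedged sketch: an LSTM-based post classifier (Keras) plus LDA topic modeling
# (scikit-learn) as stand-ins for the two components described above.
import numpy as np
from tensorflow.keras import layers, models
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy example posts; 1 = possible drug marketing, 0 = general discussion.
posts = [
    "dm for percs, overnight shipping",
    "new research on fentanyl overdose prevention",
    "patches available, message me for prices",
    "op-ed: how the opioid crisis reshaped policy",
]
labels = np.array([1, 0, 1, 0])

# --- Deep learning side: an LSTM text classifier ---
vectorize = layers.TextVectorization(max_tokens=20_000, output_sequence_length=40)
vectorize.adapt(posts)

classifier = models.Sequential([
    vectorize,                              # raw text -> integer token sequences
    layers.Embedding(20_000, 128),          # learned word embeddings
    layers.LSTM(64),                        # Long Short-Term Memory layer
    layers.Dense(1, activation="sigmoid"),  # probability the post markets drugs
])
classifier.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# With a real labeled dataset: classifier.fit(np.array(posts), labels, epochs=5)

# --- Topic modeling side: surface opioid-related keyword clusters ---
counts = CountVectorizer(stop_words="english")
X = counts.fit_transform(posts)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
terms = counts.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    print(f"topic {i}:", [terms[j] for j in topic.argsort()[-4:]])
```

In practice, a classifier like this would be trained on a labeled corpus of posts, while the topic model would be used to narrow large keyword-matched datasets, much as the team winnowed roughly 30,000 fentanyl tweets down to a handful, before any manual review.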

Mackey and the rest of the research team may have developed their AI application for use by NIDA, but social media companies like Facebook, Twitter, Reddit, and YouTube are also investing heavily in AI to flag content that violates their policies. According to Mackey, he has been in talks with Twitter and Facebook about such applications before, but right now the focus is on creating a commercially available application based on his research for NIDA, one he hopes could be used by social media platforms, regulators, and more.

Mackey explained that the approach developed for the NIDA research could be generalized to fight other forms of cybercrime, such as the trafficking of animals or the illegal sale of firearms. Instagram has had problems with illegal animal trafficking before, and banned the advertising of all animal sales in 2017 in response. The company also tries to remove posts related to animal trafficking as soon as they appear, but despite this there is a continued black market for exotic pets, and advertisements for them still show up in Instagram searches.

There are some ethical issues that will have to be negotiated if the NIDA tool is to be implemented. Drug policy experts warn that it could enable the over-criminalization of low-level drug sellers and give the false impression that the problem is being solved, even though such AI tools may not reduce the overall demand for the substances. Nonetheless, if properly used, the AI tools could help law enforcement agencies establish links between online sellers and offline supply chains, helping them quantify the scope of the problem. In addition, techniques similar to those used by NIDA could be employed to help combat opioid addiction, directing people toward rehabilitative resources when relevant searches are made. As with any innovation, there are both risks and opportunities.