By Andrew Zirkle | SEATTLE

In today’s algorithm-driven world, no company is more reliant on predictive math than Amazon, where one wrong variable could mean hundreds of thousands of dollars in lost profits. Amazon’s product suggestion system frequently drives meaningful customer purchases and helps the company garner more revenue, but the UK broadcaster Channel 4 uncovered that the algorithm was, in some cases, suggesting to users chemical combinations that could be used to make incendiary devices.

According to the Channel 4 report, searches for a certain type of chemical would yield black powder and thermite in the “Frequently Bought Together” section. In another reported example, the algorithm linked three chemicals that, when mixed and ignited, could be used to create a large incendiary bomb. The report also stated that throughout the investigation, listings for steel ball bearings, push-button switches, battery connectors, and cables all appeared in the “Frequently Bought Together” section.
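The behavior Channel 4 describes is characteristic of item-to-item collaborative filtering: the system surfaces whatever products are statistically co-purchased, with no awareness of what the combination could be used for. Below is a minimal, hypothetical sketch of such a recommender based on simple pairwise co-occurrence counting; the item names are invented placeholders, and Amazon’s actual system is proprietary and far more sophisticated.

```python
from collections import Counter, defaultdict
from itertools import combinations


def build_copurchase_counts(baskets):
    """Count how often each pair of items appears in the same order."""
    counts = defaultdict(Counter)
    for basket in baskets:
        # Deduplicate the basket, then tally every unordered item pair.
        for a, b in combinations(set(basket), 2):
            counts[a][b] += 1
            counts[b][a] += 1
    return counts


def frequently_bought_together(counts, item, n=3):
    """Return the top-n items most often co-purchased with `item`."""
    return [other for other, _ in counts[item].most_common(n)]


# Hypothetical order history: the recommender has no notion of what
# the items are, only that they appear in the same baskets.
baskets = [
    ["chemical_a", "black_powder", "ball_bearings"],
    ["chemical_a", "black_powder"],
    ["chemical_a", "thermite"],
]
counts = build_copurchase_counts(baskets)
print(frequently_bought_together(counts, "chemical_a"))
```

The point of the sketch is that nothing in the pipeline inspects the *meaning* of a pairing; dangerous combinations surface for exactly the same reason phone cases surface next to phones.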

The investigation also found that by building a “shopping basket” on Amazon, customers in the UK could buy 45kg of black powder and have it shipped right to their door, 450 times the UK’s current legal purchasing limit of 100g.

Amazon has since released a statement saying that all products sold on its platform must adhere to its selling guidelines and all UK laws. The company also stated that it will work closely with police and law enforcement agencies should its assistance be needed in investigations.

Although it is not clear whether Amazon has fixed the issue with its algorithm, there have been no reported incidents thus far of customers exploiting these recommendations to build IEDs.

Andrew Zirkle is a Reporter for 71 Republic.