Mason Marks is a law professor at Gonzaga University and a Research Scholar at NYU Law School’s Information Law Institute. Find him on Twitter @MasonMarksMD.

Imagine you are a medical marijuana patient driving to a cannabis dispensary. As you pull into the parking lot, surveillance cameras record your license plate number. You step out of the car and walk toward the entrance.

A sign above the door reads “please look up for entry.” You crane your neck and gaze into a camera paired with artificial intelligence that analyzes your face. A red light suddenly turns green, and the door slides open. You enter the store and bypass a line of customers waiting at the register, opting instead for a self-service kiosk.

As you approach the machine, in-store cameras feed images to algorithms that analyze your appearance to determine if you might be carrying a weapon, and compare your face to millions of photos in a law enforcement database. When you finally reach the kiosk, it scans your face, identifies you as a returning customer, and greets you with a coupon for your favorite cannabis product.

This may sound like a scene from a sci-fi movie, but these tools are employed in cannabis dispensaries today. The cannabis industry is embracing new technologies like facial recognition and advanced video analytics throughout the supply chain—from grow rooms and processing facilities to distribution centers and retail dispensaries. The companies behind the technology say it benefits cannabis businesses, employees, and consumers. But in an industry marked by decades of mass incarceration that disproportionately targeted communities of color, face surveillance poses serious privacy risks and can easily be used for targeted harassment.

"It is hard, if not impossible, to find an example of a surveillance technology that has not been turned against groups that are already vulnerable in our structurally inequitable system," said Shankar Narayan, Director of the Technology and Liberty Project at the ACLU of Washington, in an interview with Motherboard. Although legal for medical or recreational use in 33 states, cannabis remains illegal under federal law. Because it occupies a legal grey area, banks are hesitant to touch the industry, making it primarily an all-cash business and an attractive target for thieves. In Denver, Colorado, alone, there were 34 reported dispensary robberies in the first half of 2019.

Some tech companies see the risk of theft as an opportunity to sell facial recognition systems. Don Deason, VP of Sales for Blue Line Technology, claims his company’s platform has significantly reduced cannabis robberies. It works like this: When customers approach the front door of a dispensary, audiovisual cues prompt them to look up at a camera. If they comply, the system records an image of their faces, and the front door opens. If they decline or their faces are obscured, by a mask for example, then access is denied.

The system is also used to deter robberies and mass shootings in convenience stores, schools, and office buildings.

A facial recognition system from Blue Line Technology hangs above the entrance to a convenience store. Courtesy of Blue Line Technology

Deason told Motherboard that as long as customers don’t shoplift or cause a disturbance, “their information is deleted after 48 hours.” However, if a store's management believes customers are misbehaving, they can tag each face with a unique number, and the system retains that information indefinitely. If tagged customers later return to the store, the system recognizes them and alerts employees to their arrival by email or text message. Deason said Blue Line encourages dispensaries not to confront tagged customers, but ultimately "store owners set the store security policy and procedures," and, "the security response varies based upon store policy."
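The workflow Deason describes—a 48-hour retention window for ordinary customers, indefinite retention and return alerts for tagged faces—can be sketched in a few lines of code. This is purely illustrative: the class, method names, and data layout below are assumptions, not Blue Line’s actual implementation.

```python
from datetime import datetime, timedelta

RETENTION = timedelta(hours=48)  # untagged faces purged after 48 hours


class FaceLog:
    """Hypothetical sketch of the tag-and-alert workflow described above."""

    def __init__(self):
        # face_id -> {"seen": last capture time, "tag": number or None}
        self.records = {}

    def capture(self, face_id, now):
        """Record a face at the door; a non-None return means a tagged
        customer has returned, so staff would be alerted."""
        record = self.records.setdefault(face_id, {"seen": now, "tag": None})
        record["seen"] = now
        return record["tag"]

    def tag(self, face_id, tag_number):
        """Management flags a face; it is then retained indefinitely."""
        if face_id in self.records:
            self.records[face_id]["tag"] = tag_number

    def purge(self, now):
        """Delete untagged faces older than the retention window."""
        self.records = {
            fid: rec for fid, rec in self.records.items()
            if rec["tag"] is not None or now - rec["seen"] <= RETENTION
        }
```

The key privacy point is visible in `purge`: one management decision (`tag`) exempts a face from deletion forever, and nothing in the design forces that decision to be reviewed.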

Blue Line’s platform also controls access to restricted areas of cannabis businesses such as grow houses, cutting rooms, and safes, serving as a replacement for keys and access cards. When paired with other devices such as RFID tags, which are affixed to cannabis products, face recognition systems can track cannabis as it changes hands from one employee to the next.
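The pairing of RFID product tags with face-verified employees amounts to a chain-of-custody log: every transfer event records which product moved and which authenticated person took it. A minimal sketch of that idea, under my own assumptions about names and structure (this is not any vendor’s code):

```python
class CustodyChain:
    """Hypothetical chain-of-custody log pairing RFID product scans
    with face-recognition identity checks."""

    def __init__(self):
        # append-only list of (rfid_tag, employee_id) transfer events
        self.log = []

    def transfer(self, rfid_tag, employee_id):
        """Record that a face-verified employee took custody of a product."""
        self.log.append((rfid_tag, employee_id))

    def custodian(self, rfid_tag):
        """Return the employee last recorded holding this product."""
        for tag, emp in reversed(self.log):
            if tag == rfid_tag:
                return emp
        return None

    def history(self, rfid_tag):
        """Full handling history for one product, oldest transfer first."""
        return [emp for tag, emp in self.log if tag == rfid_tag]
```

An append-only log like this is what lets a business (or a regulator) answer “who last touched this product?”—which is exactly the diversion-deterrence argument vendors make.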

“Many cannabis robberies are inside jobs,” said Matthew Heyl of Helix Security, a Denver company that provides surveillance products and services to cannabis businesses. He claimed video analytics and biometric access controls establish a chain-of-custody and deter diversion of legal cannabis to illicit markets.

For those reasons, government agencies that enforce cannabis laws are interested in facial recognition, said Steve Owens, the CEO of Adherence Compliance, a Denver consulting firm that has partnered with Blue Line. “This topic is really resonating with the regulators,” Owens told Motherboard. “When we mention it to Alameda County, they get it right away, because it helps them with their investigations.”

In addition to tracking employees and controlling access, facial recognition is used in dispensaries at the point of sale for age verification. A Las Vegas-based company called 420 Cyber markets its Badass Budtender kiosk as a replacement for human “budtenders” who check ID at the register. The kiosks can be equipped with facial recognition to ensure customers are of legal age.

Inside dispensaries, facial recognition can do far more. 420 Cyber markets what it calls “Video Active Security Monitoring” (VASM), which it says can determine whether customers carry concealed weapons, if there are warrants for their arrest, and whether their appearance matches “be on the lookout” (BOLO) alerts issued by police. It can reportedly recognize A-list celebrities if they happen to visit your store.

Consumers using 420 Cyber’s kiosks can also opt-in to personalization services: The units can scan and identify people’s faces, interpret their emotional responses to products, and help dispensaries learn which brands they prefer. 420 Cyber’s website says this data can be used to deliver targeted content “designed for individual viewing based on age, race, gender, location and daypart [the time of day a customer visits the store].”

Despite what vendors say, face recognition technology remains problematic and controversial. Algorithmic systems naturally adopt the objectives and values of their creators, and research shows that systems trained on insufficiently diverse datasets are often inaccurate and sometimes discriminate against women, racial minorities, and members of the LGBTQ community.

Even if the system is working as designed, face recognition can easily be adapted to target immigrants, activists, and other marginalized groups with little or no oversight. Citing those risks, at least three cities (San Francisco; Oakland; and Somerville, Massachusetts) have banned municipal use of the technology. In June, Axon, the leading supplier of police body cameras, removed facial recognition from its services after an ethics board concluded it was “not yet reliable enough to justify its use.”

"Despite what developers may say, facial recognition technology has the potential to reinforce the racist and classist policies of prohibition”

“Technology makes a lot of promises, but there’s no guarantee they can deliver,” wrote Kamani Jefferson and Tyler McFadden in an email interview with Motherboard. The pair founded North Star Liberty Group, a DC-based government relations firm that advocates for ending cannabis prohibition while promoting racial and economic equality.

Jefferson previously served as President of the Massachusetts Recreational Consumer Council, where he helped push for a state-run social equity program that helps groups disproportionately impacted by the War on Drugs participate in the cannabis industry through professional training and mentoring. In July, Michigan announced its own social equity initiative. California created one last year, and San Francisco, Sacramento, and Los Angeles have local programs.

"Despite what developers may say, facial recognition technology has the potential to reinforce the racist and classist policies of prohibition,” Jefferson said. “It's a classic case of a slippery slope, and until there's a guarantee that not one innocent person will be thrown in jail due to the faults of this technology, I wouldn't recommend cannabis facilities waste their money."

Grayce Bentley is the Social Equity Coordinator for Cannabis Advising Partners in Long Beach, CA. In a phone interview, she told Motherboard: “I don’t think this is right at all, especially if facial recognition has been shown to be biased based on race, gender, et cetera.” Moreover, Bentley said most dispensaries serve a clientele consisting of both medical and recreational cannabis consumers, and “facial recognition should not be used in businesses where medical patients could be present.” She argued that collecting face data could violate federal health privacy laws such as the Health Insurance Portability and Accountability Act (HIPAA).

Data breaches will likely be a growing problem for the cannabis industry as well. In 2017, a company called MJ Freeway, a major provider of software to cannabis businesses, suffered multiple hacking attempts. In one incident, hackers obtained consumers’ dates of birth, contact information, and other unspecified data. If the company had also kept images of customers' faces, the breach could have been far more damaging.

All the companies Motherboard spoke with said they make efforts to protect face recognition data through encryption. "But encryption is not a panacea," said Ido Kilovaty, a law professor at the University of Tulsa who specializes in cybersecurity. "Hackers can launch brute-force attacks or look for other vulnerabilities, and there is always a risk of insider threats."

Even if impenetrable cybersecurity were achievable, it wouldn’t protect consumers from discrimination based on facial recognition. In recent years, there has been a rash of troubling AI systems that attempt to make assumptions about people’s sexuality and potential criminality based solely on their facial features. Tech ethicists have warned that the trend threatens to revive long-disproven pseudoscience practices like physiognomy, which have historically been used to justify racism and discrimination.

Shankar Narayan said he’s concerned about mission creep—when technologies implemented for a specific purpose are shifted to another application. A cannabis business might start out using facial recognition to analyze people’s emotional responses to different products, “but you can take that further, and start analyzing people’s propensity for violence,” said Narayan. Since facial recognition may be biased against vulnerable communities, it could disproportionately mischaracterize members of those groups as dangerous.

Narayan also noted that private surveillance systems can easily be repurposed for use by law enforcement and federal agencies. One example is police use of Amazon’s Ring doorbell cameras, which was recently reported on by Motherboard. “While being operated by an individual entity, it’s a private camera, and it need not conform to any rules around surveillance that apply to government cameras. But the company may turn the data over to the government,” Narayan said. “And then for all intents and purposes, it’s functioning as a government camera.”

Some companies marketing facial recognition to the cannabis industry have deep ties to law enforcement. Blue Line was founded by Joseph Spiess, Tom Sawyer, and Marcos Silva. Spiess is Chief of Police for the St. Louis suburb of Brentwood, Missouri. Sawyer, a retired St. Louis detective and DEA agent, built his career investigating drug crimes. Silva, an Army veteran who served in the Iraq War, is a St. Louis police detective who designed, implemented, and oversees the city’s real-time crime center (RTCC).

Michael Kwet, a fellow at Yale Law School’s Information Society Project who researches surveillance technology, expressed concerns: “For years, these officers locked people away for possession and sale of marijuana, with devastating effects on communities of color. Now they’re cashing in to protect the legal marijuana industry with facial recognition, while people previously persecuted languish behind bars."

According to its website, the RTCC operated by Blue Line's Silva “is focused on monitoring, deterring and evaluating criminal activity in real-time with the help of the advanced technology in the center,” which includes license plate readers, gunshot spotters, and crime analysis software. In 2015, former Police Chief Sam Dotson told St. Louis Public Radio the RTCC would tap into surveillance cameras owned by private companies and use "new software that would allow the analysts to better predict crime."

Blue Line told Motherboard it does not have access to the face recognition databases of the cannabis businesses it serves, and therefore, it cannot share that data with law enforcement. However, because its clients set their own security policies and responses, store owners are free to turn facial recognition data over to police. Through this kind of sharing between private and public surveillance networks, police could gain access to face data stored by dispensaries even in cities where facial recognition is banned for government use.

Prior to his current role at the ACLU of Washington, Shankar Narayan was the organization’s Legislative Director, and he worked on Initiative 502, the ballot measure that legalized recreational marijuana in Washington State. Before that, he worked on medical marijuana legislation.

“In the context of that medical marijuana law, we went through a lot of these same issues, and there was intense concern over patient privacy. Coming off of that very intense discussion, there’s some deep irony that in the name of security, entities that sell cannabis are now installing these highly invasive surveillance mechanisms. That is really the opposite of the spirit in which we had the discussions around medical marijuana dispensaries, and I think we should be deeply concerned about privacy in that context.”

Addressing concerns about bias, Don Deason told Motherboard that Blue Line’s face recognition system “recognizes everyone equally,” and that the company is “not tracking age, gender, race, or what products people buy.” He said the system sorts faces into only three categories: “known, unknown, or threat,” and people are categorized as threats based solely on their behavior inside a cannabis business, not on their physical traits or facial expressions.

Os Keyes, a doctoral researcher at the University of Washington who studies human-computer interaction, told Motherboard that Blue Line “has an incredibly shallow understanding of the concerns about bias in facial recognition.” They noted that whether security guards or police stop and search customers or accuse them of shoplifting may be influenced by personal prejudices.

"Whether someone is accurately matched by facial recognition is, similarly, something that we know has racial and gender biases," said Keyes. “It doesn't matter if the developers ‘don't see race’ when their algorithm and security staff undoubtedly do.”

Despite tech company efforts to protect face recognition data and reduce bias, many cannabis industry experts remain uncomfortable with the technology.

Kamani Jefferson and Tyler McFadden implied it is unnecessary. They referenced statistics suggesting crime has decreased in states and neighborhoods with licensed cannabis dispensaries.

“There’s no reason to believe that trend won’t continue,” they added.

Griffen Thorne, an attorney with the law firm Harris Bricken, expressed doubt that adopting facial recognition technology would help businesses comply with state and local cannabis laws. “In California, cannabis businesses must have a security plan. They must have video recording, and doors that lock,” he said. “Beyond those basics, you don’t need to use fingerprint scanners or facial recognition technology.”

Shankar Narayan asked, “How can we be a free society with this level of surveillance? It kills free speech, it chills constitutional activity, it disproportionately impacts communities of color, it’s subject to abuse, [and] there’s not a lot of checks and balances here.”