I Buy Google Ads to Reduce Mass Shootings

Ad click data is still ethically flawed, but could be a better alternative to predictive A.I.

Credit: NurPhoto/Getty Images

Here’s a word-for-word search someone in the United States recently typed into Google:

“I am going to shoot up the school tomorrow”

Here are four more:

“I want to shoot up my school”

“What do you do when you want to shoot school up”

“I want to shoot up my workplace”

“I want to shoot a school up please help me”

Every day, Americans tell Google they’re going to shoot up their school or workplace. I know this because I serve ads on Google to prospective mass shooters.

When I have some extra money, I pull out a credit card and buy keywords I imagine prospective shooters might type into Google. When these keywords match what prospective shooters are searching for, my ad shows up as the first result on Google throughout the United States.

My ad encourages prospective shooters to speak with a mental health professional — not that they ever call. But when my ad is clicked, the prospective shooter’s data passes into my system. This includes the words they type. But it also comprises their age, gender, and income, their parental, marital, and homeownership status, where they live, their browsing history, where they work, the age of their children, the devices they use, the time of their search, and more.

(Anyone who runs ads on Google — for any purpose — can see this kind of data about who clicks their ads.)
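To make the paragraph above concrete, here is one click sketched as a structured record. The field names and values are my own invention for illustration; Google Ads reports these dimensions through its own interfaces, not in this form.

```python
from dataclasses import dataclass

# A sketch of the dimensions described above, one record per ad click.
# All field names and values here are hypothetical.
@dataclass
class AdClick:
    search_terms: str       # the words the searcher typed
    age_bracket: str        # e.g. "18-24"
    gender: str
    household_income: str   # reported as a bracket, e.g. "lower 50%"
    parental_status: str
    location: str           # only as granular as a ZIP code
    device: str             # e.g. "mobile"
    timestamp: str          # time of the search

click = AdClick(
    search_terms="i want to shoot up my school",
    age_bracket="18-24",
    gender="unknown",
    household_income="lower 50%",
    parental_status="not a parent",
    location="45402",
    device="mobile",
    timestamp="2019-08-04T02:13:00",
)
```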

After the shootings in Dayton and El Paso, Trump called on social media companies to develop tech that could “detect mass shooters before they strike.” In other words, predictive A.I.: algorithms that analyze social media profile data — including followers and posts — to anticipate who might commit a mass shooting in the future.

This type of machine learning already exists, in an early stage. Used for this purpose, it is prone to bias and error. One reason is that there isn’t enough mass-shooter data to feed and improve the algorithms.

Meanwhile, social media is not the best channel for deploying this A.I. Social media users tend to have a siloed audience and might have all sorts of reasons to tell their followers they’re going to commit a mass shooting. A.I. doesn’t understand context or nuance or trolling.

Google search data, on the other hand, isn’t as ambiguous as social media data. Google users have no audience to influence what they’re typing — unless you count marketers (like me).

Furthermore, the majority of Google users don’t know they’re essentially taking part in a one-sided conversation with a third party every time they search. They don’t know there’s someone on the other side of their screen who can see what they’re typing and doing online, and put it to use.

So people tell Google things they tell no one else — not their spouses, doctors, or shrinks.

This is part of the reason why Seth Stephens-Davidowitz, author of Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are, thinks Google searches have become “the most important dataset ever collected on the human psyche.”

And I see it, too. People no longer use Google only for informational, transactional, or navigational guidance; they also use it as a place to deposit their confessions. (Of the searches Google receives every day, 20% have never been searched before.)

Lastly, people tell Google they’re going to commit a mass shooting infrequently enough that no algorithm would be needed to make sense of the data. A small team of humans would suffice.

The location data that Google provides us marketers only gets as granular as a ZIP code: enough to shut down schools in a single town if the data hints at a future shooting, but not enough to pinpoint a would-be shooter’s address. This is one reason preemptive action — preventing a mass shooting, Minority Report-style — won’t happen until Google provides more detailed location data.

But the Google ad click data we already have to work with — these confessions in aggregate — could, in higher volume, reduce mass shootings if it were paired with gun reform.

Click data can reveal, for example, the American cities and demographics driving the highest demand for black market guns, or the school districts most in need of workshops for troubled teens, or the research prospective shooters are conducting to finance their violence, or the U.S. Walmarts most in demand by those seeking lax background checks, or where copycats — showing intent with their clicks — spring up most often after a recent mass shooting.
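As a rough sketch of the kind of aggregation this implies (every record and field value below is invented for illustration, and the record format is my own, not Google's):

```python
from collections import Counter

# Hypothetical click records, illustrating the dimensions an advertiser
# sees in aggregate. All keywords, ZIP codes, and ages are invented.
clicks = [
    {"keyword": "buy gun no background check", "zip": "75001", "age": "18-24"},
    {"keyword": "buy gun no background check", "zip": "75001", "age": "25-34"},
    {"keyword": "i want to shoot up my school", "zip": "45402", "age": "18-24"},
    {"keyword": "buy gun no background check", "zip": "10001", "age": "18-24"},
]

# Which ZIP codes drive the most clicks on a given keyword?
demand_by_zip = Counter(
    c["zip"] for c in clicks if c["keyword"] == "buy gun no background check"
)

print(demand_by_zip.most_common())  # [('75001', 2), ('10001', 1)]
```

At volume, each of the questions above reduces to this same grouping and counting — tallies, not a model.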

However we want to understand prospective shooters in order to create a roadmap for policymakers, we don’t need to rely on patterns that an algorithm — with its questionable objectivity — has identified as precursors to violence.

Instead, we can buy keywords we want to match with confessions. In the Google Ads system, there is no keyword that can’t be bought. No ad that won’t be clicked if written in a way that appeals to — or misleads — the searcher.

With an aggressive enough budget (enough daily funds to spend on clicks), we would gain statistically meaningful insight into aspects of the prospective shooter’s psyche that we don’t yet understand — insight that could be translated into policy. And well before predictive A.I. is realized.

If I have this data, it’s because Google has it. Anyone who wants this data directly from Google will need legal process. For everyone else, there’s click data: anyone with a credit card and a basic website can run ads, get this data, and do something with it.

Whenever I buy keywords that prospective shooters search for, mine is almost always the only ad showing on Google in the United States. Whenever my ad isn’t showing, the shooter’s data is lost in the ether. (Running these ads isn’t as expensive as developing A.I. A click from a finger that might go on to pull a trigger costs little more than $1.)

If a “techno-solutionist” approach to gun violence is inevitable, then Google ad click data — a less technically flawed alternative to predictive A.I. — should be considered. Even if it is no less ethically flawed.