Can Google succeed where politicians, imams, journalists, police officers, friends, and family have failed in dissuading young people from joining ISIS or other extremist groups? Jigsaw, a Google technology incubator, has joined forces with U.K.-based startup Moonshot CVE to do just that, as revealed at an event hosted in early September by the Brookings Institution.

The method, called Redirect, is straightforward. Find out what people want to see when they search for ISIS material, and place advertisements alongside the results that point to countervailing content, thereby cracking the walls of the echo chambers that they are (perhaps unwittingly) building around themselves.

Let’s say straight off that this is not a silly idea. In our recent book The Devil’s Long Tail, David Stevens and I argued that much anti-extremist policy is premised on a flawed contagion model that assumes that people who join extremist groups are Dumb And Malleable, consuming extremist content and falling under its spell. We call it the DAM thesis.

It isn’t like this. Some people, alienated from their society for whatever reason, wish to live in a state of tension with it. Extremist groups bring such people together by providing a strong sense of identification and belonging; the details of ideology are far less important. There are no free riders. Everyone does his or her share and shoulders the risks. It isn’t brainwashing, and it’s not an irrational way to seek the human solidarity that most of the rest of us take for granted.

Many extremists don’t feel they belong in more conventional social networks. Some are drifters and petty criminals; others have mental health issues. But the most important group comprises second-generation migrants, who are made to feel welcome neither where they live nor where their parents were brought up. To get a sense of how devastating this can be, it is worth looking at the literature of the desolate Parisian banlieues, many of whose residents are young, unemployed, and of North African origin; check out Mehdi Charef’s Tea in the Harem or Faïza Guène’s Kiffe Kiffe Tomorrow.

Indeed, Action for Happiness (whose patron is no less than the Dalai Lama) recently set out 10 keys to happier living. Joining ISIS would get you well on your way on their account, by connecting with people (key No. 2), learning and doing new things (No. 5), working for others (No. 1), working toward a goal (No. 6), being part of something bigger (No. 10), becoming resilient (No. 7), and so on. A budding extremist could probably hit seven or eight of the 10 keys in Raqqa, Syria, compared with a measly two or three in the crummier parts of Paris or Brussels or Cairo.

Tackling extremism involves empathy: understanding the goods that extremist groups provide for their members. The Redirect initiative does this, taking seriously the search terms used by people who already hold a positive view of ISIS—for instance, drawing keywords from ISIS slogans or narratives, such as its motto “remaining and expanding.” It also recognizes that you can’t counter radicalism simply by putting in links to CNN or BBC stories. The ads point extremists to content that they might find credible—citizen journalism, say.

Eventually, no doubt Redirect could get pretty good at steering people away from ISIS’s narrative and toward the counternarrative. During the pilot phase, 320,000 people came under its beady eye, and together they watched 500,000 minutes of video to which they were Redirected. I don’t know whether that’s a good or bad result—it’s about a minute-and-a-half of video per person—but no doubt it can be improved incrementally. Will this have an effect on ISIS? Probably. Even a small reduction in the flow of recruits is better than none, and better data will improve effectiveness.

In that case, why am I skeptical? Broadly, for two reasons. The first is that success here and now with some whizzy new method does not always equate to success in a future where all the actors have adapted to the new landscape. We assume when we search that the results are the output of a neutral algorithm, and that the ads are placed by people who want us to click on them for their own commercial, religious, or ideological purposes. But Redirect subverts this assumption: The ad isn’t there for the advertiser’s benefit but for mine. It intends to make me a better person (on the advertiser’s definition of “better”). In a future world where such techniques were known about and understood, wouldn’t we just stop clicking on ads, especially when seeking edgy or transgressive content? Won’t this method undermine itself over time as awareness grows?

Furthermore, those seeking solidarity in extremism are looking for something. However wise it is to steer them away from finding it in ISIS or similar groups, Redirect isn’t going to help them reach their goal. They will keep looking. To be fair, the Jigsaw/Moonshot researchers are aware of this issue, and they say that in later phases they hope to include therapeutic resources such as online counseling.

But even if that happens, we have to grapple with a second, and bigger, problem: Even though putting people off ISIS is clearly a good thing, can it be done without undermining other important tenets of how we live together and organize ourselves, including free expression and access to information?

Redirect’s techniques are privatized and reproducible. Anyone with the money could buy ads to steer people away from Trump, or Clinton, or porn, or climate science, or worries about high crime or immigration. In one sense, this is an unobjectionable part of free expression—no content is removed from the web, and people are oriented toward existing content that they may also trust. It’s less Orwellian than redolent of the covert shaping of preferences in Aldous Huxley’s Brave New World. Does this cross what Eric Schmidt once creepily called “the creepy line”? We are being influenced, but not told how or by whom. Without transparency in this area, can we really consider ourselves autonomous individuals, masters of our fates?

In any case, much of the work is in tracing and understanding the pattern of people’s searches. Redirect affects only the links between content. But, although no one accuses Jigsaw of this, it is not a big step to use the methods to censor the content searched for (it is already routine to do this for child porn, and Google is similarly trying to suppress revenge porn), or to classify the searcher (on highly uncertain evidence that will inevitably produce a lot of false positives) as a potential terrorist. If we could stop this at ISIS, fine. We can’t.

We are starting to live, as many have argued, in a post-truth world where feelings are more important than facts, where the internet forces information to compete on level terms with misinformation, and where trust in authority and the traditional gatekeepers of content is at a historic low. Many see scholarship, science, evidence, peer review, experience, and expertise not as reliable guarantors of truth but rather as incontrovertible proof of an establishment conspiracy. Redirect is a noble and typically clever techie initiative to help divert people away from truly appalling and vile apocalyptic nihilism, but it does nothing in itself to re-establish the credibility of mainstream views.

To that extent, it is an example of traditional authorities’ adaptation to the post-truth environment. By implicitly placing themselves alongside extreme actors such as ISIS and positioning themselves as equal (if better resourced) players, such authorities may undermine their own myths of uniqueness and superiority. Let’s hope they are not storing up even more trouble in the longer term.

This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, follow us on Twitter and sign up for our weekly newsletter.