One minute you are contemplating your next vacation, the next you click on Facebook and see adverts for hotels and cheap flights. Coincidence? Surely not. But what about that smart speaker in your kitchen? It doesn’t just play music and provide an on-demand weather forecast. It hears about the groceries that need replenishing, the cat’s trip to the vet, and, of course, your vacation plans.

Now imagine what such a listening device could do in the workplace. The whispered conversation about difficulties with your latest project is automatically reported to your manager. The half hour spent discussing child care arrangements is sent to Human Resources. And what about the things you don’t say: the colleague you don’t greet in the morning, the co-worker you never ask to join you for coffee? These non-conversations are likewise noted and filed away to be discussed at your next appraisal.

If researchers at Northeastern University have their way, one day soon we will go to work and have not only our words but our thoughts and feelings monitored and analyzed by a listening gadget just like the smart speaker in our kitchen. Associate professors Christoph Riedl and Brooke Foucault Welles are in receipt of a $1.5 million grant from the U.S. Army Research Laboratory to come up with just such a device. They will spend the next three years studying how teams interact with each other and with smart devices “using a combination of social science theories, machine learning, and audio-visual and physiological sensors.” A key aim for their final product will be to ensure “the equal inclusion of all team members.”

This yet-to-be-invented machine is already being heralded for its potential to revolutionize equality and diversity in the workplace by alerting users to instances of implicit bias. It will record verbal and nonverbal cues, as well as the “physiological signals” shared between members of a team. Then, having noted and analyzed all these tiny interactions and non-interactions, the speaker will make recommendations for improving inclusivity and productivity.

To any sane person, this is a truly terrifying prospect—not because we arrive at the office each morning desperate to dole out racist and sexist abuse to our colleagues, but because of the opposite: we want to get on with our jobs and get on with our co-workers. We know that a spying machine, watching, listening, monitoring, and advising, is far more likely to interrupt our work and fuel dissent than it is to increase productivity.

The Northeastern researchers want their device to play a role in tackling “implicit bias,” which they define as “the automatic, and often unintentional, associations people have in their minds about groups of people.” Implicit bias is, in other words, the content of our subconscious, the unarticulated and perhaps even unformed thoughts and feelings that apparently shape our interactions with each other. There’s a lot to unpack here. If our subconscious thoughts are not formulated, then how can any machine or online test purport to access them? And even if our innermost thoughts and feelings can be accurately measured—who cares? Aren’t our actions and the words we actually say out loud more significant when it comes to assessing discrimination than something we may or may not think?

Workplaces are following where universities lead. Over 100 colleges now have Bias Response Teams that aim to provide “advocacy and support to anyone on campus who has experienced, or been a witness of, an incident of bias or discrimination.” Without an ability to read minds, Bias Response Teams resort to rooting out microaggressions. Ask your classmate where they are from, assume someone’s gender, ask an Asian student for help with math, and the Bias Response Team will swoop in to protect the victim and re-educate the perpetrator.

The drive towards office surveillance suggests that the politics of the campus has entered the workplace. We are all students now. Rather than colleagues with interests in common, we are to see the workplace as divided, not between a business owner intent on making a profit and employees scraping by, but between oppressors—let’s be blunt: straight white men—and the oppressed—everyone else. These machines will be grievance incubators, sowing dissent where none was previously apparent.

The perceived need for such a speaker suggests that colleagues cannot resolve issues between themselves. The researchers ask: “But what if a smart device, similar to the Amazon Alexa, could tell when your boss inadvertently left a female colleague out of an important decision, or made her feel that her perspective wasn’t valued?” Put to one side for a moment the impossibility of a machine knowing how a member of staff feels. Why should we assume a woman is not able to speak up and make her perspective known? The example of the victimized female colleague suggests the Northeastern inventors may have some implicit biases of their own.

Relying on machines to monitor our eye contact with colleagues and note what we say and do not say is infantilizing and incapacitating. It reduces us all to the level of schoolchildren in need of constant supervision. This is good neither for running a business nor for workers who just want to get on with their jobs. Let’s hope the grievance incubator spying device never sees the light of day, or at least has an off switch.

Joanna Williams is the director of the UK-based think tank Cieo.