There has been extensive public discussion of how automation may fundamentally change the job market, eliminating so many positions that some sort of universal income will be required. While the universal income discussion may be new, anxieties about machinery and automation replacing human work go back over a century, with worries growing with the advent of robotics and machine-learning algorithms.

People clearly worry about jobs lost to automation, but those concerns tend to blend into worries about job losses in general. To find out whether job losses due to automation produce a distinct set of worries, researchers did an extensive study of how people respond to job loss. It turns out they are concerned when other people are displaced by automation. But when it comes to themselves, they'd rather be displaced by an algorithm than by a person.

Social concerns vs. social status

The researchers involved in the new work—Armin Granulo, Christoph Fuchs, and Stefano Puntoni—cite an extensive survey of European residents, which showed that they tend to view robots as displacing human employment. This worry held even for students and management employees, who aren't currently at risk of being replaced themselves. The researchers suggest there are two ways to interpret this. One is that the worry is personal: people fear they may end up in a job that's vulnerable to automation. The other is that it reflects a more general pro-social view, driven by concern for other people losing their jobs.

While there's good information on general worries about automation, there's far less data out there on how people feel about automation affecting them personally. So the three researchers decided to rectify that. They figured there were two possibilities. One was that people would view losing a job to a person and losing it to a robot as equally bad. Alternately, there might be a social status issue: losing your job to a person would mean that someone else was judged superior to you in some way, which might be worse than losing it to a hyper-efficient robot.

To get a better sense of how people feel, the researchers performed a few experiments with a variety of participants. Some of these were your garden-variety college students, both from the US and Europe. But other experiments brought in people from online labor markets, currently employed managers, and the recently fired. Collectively, these experiments provided some indication that the results could be replicated and weren't specific to a single country (though they were limited to industrialized societies).

In the simplest test, people were split into four groups. Two were told that other people were going to lose their jobs; one of these groups was told that the workers would be replaced by other people, the other that the job was being taken over by a robot (or, in some experiments, software). The remaining two groups were told to imagine that their own job was about to be lost, either to a person or a robot. By a two-to-one margin, when the job was someone else's, the participants preferred to see the replacement be another human. But when the job being lost was their own, only 40% of the participants wanted to see it go to another human. This was about the same as the percentage who preferred to see a robot take over (the rest didn't care or didn't provide an answer).

Strong feelings

Different groups of participants were asked to consider a similar set of scenarios but rate how they felt about them. When the job being lost was someone else's, replacement by a robot induced stronger negative emotions. When the job being lost belonged to the person being asked, the situation reversed, and human replacements produced stronger negative emotions. Another study swapped out robots for software and produced similar results, leading to the rather unusual statement that "Participants displayed a strong and significant preference for being replaced by software."

The same pattern held true when the researchers zeroed in on factory workers who had indicated that they were concerned that their jobs would be replaced by automation, as well as people who had recently lost their jobs, suggesting it has real-world relevance.

In one of the more intriguing scenarios, the researchers had people consider being replaced by AI software, as well as being replaced by a human who used that AI software. Participants rated these as roughly equivalent (and both were considered better than simply being replaced by another human). So giving a human a technological supplement was enough to reduce the emotional hit of having another human being judged a superior fit for your job.

This isn't to say that people weren't worried about the phenomenon of replacement-by-robot. When specifically asked about future economic concerns, there was clear evidence that automation was a worry. It's just that, when confronted with immediate replacement of any sort, the threat to self-image from a human replacement turned out to be stronger—the researchers estimated its effect to be four times as strong.

Overall, the results indicate that our feelings toward automation taking our jobs are complex. When it comes to the phenomenon in general, our concerns about our fellow humans win out, and we're against seeing them lose their jobs to machines. But when it comes to losing our own jobs, things get more complicated. There are obviously negative feelings about job loss, but those are actually moderated somewhat when we're not losing the job to a fellow human. "When one’s own job is at risk," the researchers note, "social comparison processes become more relevant and overshadow prosocial feelings."

In addition to providing a somewhat unusual window into the human psyche, the authors note that the finding has implications for everything from automation and employment policy discussions to how to structure retraining for people who have lost their jobs. In science, the border between "that's strange" and "that's useful" can often be nonexistent.

Nature Human Behaviour, 2019. DOI: 10.1038/s41562-019-0670-y (About DOIs).