In 1979, executives at the Xerox Corporation realised there was a problem with the company’s new photocopier, the 8200. Its selling point was simplicity; a big green button had been added to the control panel so that users knew exactly what to press. But according to customers, it was bafflingly complicated, and they were having trouble making it work. The executives threw the problem to the boffins at Xerox’s research and development arm, the Palo Alto Research Center (PARC) in California.

In the 1970s, PARC was an intellectual playground where brilliant scientists from different fields were given free rein to invent the future of the office. PARC played a pivotal role in creating some of the machines that now populate our workplaces, including the personal computer, the digital mouse and the laser printer.

The 8200’s usability problem found its way to a PhD student on a work placement. Lucy Suchman was an anomaly at PARC, in more ways than one. First, she was a woman. Second, she had never used a computer – she wasn’t a programmer or an engineer, but an anthropologist. Suchman’s PhD was on how people interact in corporate environments. An acquaintance had arranged for her to spend time in one, and it happened to be PARC.

Nevertheless, Suchman quickly became fascinated by the photocopier problem. She usually studied humans interacting with humans; what could we learn from studying humans interacting with machines? Suchman persuaded Xerox to install one of its photocopiers in a room at PARC, and rigged the room with video cameras.

Later, in a presentation to Xerox executives, she showed a tape of two men in shirt sleeves getting increasingly exasperated by their failure to make double-sided copies. The executives scoffed at the technological illiteracy of the men until Suchman pointed out that the unlucky pair were computer scientists, and that one of them was Allen Newell, winner of a Turing Award (the computer science equivalent of a Nobel Prize). If these guys couldn’t use the 8200, what hope was there for anyone else? Suchman’s message to the executives was that no matter how clever a machine is, it has to be designed to meet the needs of people. And people are not like machines, no matter how much engineers might wish them to be. They don’t have the patience to follow complicated instructions. They don’t carry out plans logically and predictably. They muddle through, responding intuitively.

They also bring certain prejudices to their choices. Barton Friedland, a scholar of Suchman’s work, is now a digital strategist for Deutsche Bank, but back in the early 1980s he was a salesperson for Apple. He spent a lot of time with lawyers, selling them on the benefits of having a desktop computer. They mostly agreed with him, which made it hard to understand why they were so reluctant to buy. One day, sitting with a lawyer in his office, Friedland had an epiphany. “I was telling him about all the wonderful things he could do if he had his own computer. Finally, he said, ‘Why would I want one of those on my desk? That’s what they have.’ And he pointed towards the secretary pool.” Technology, Friedland realised, is never just about technology.

Lucy Suchman went on to lead a team of anthropologists at PARC tasked with understanding how people incorporate technology into their lives. Today, user experience, or “UX”, is taken very seriously by the tech business. Companies such as Apple and Airbnb employ anthropologists and psychologists to understand the role that non-silicon agents (also known as “humans”) play in their systems. This is far from saying, however, that they put people first.

After her time at PARC, Suchman resumed her academic career, taking a post at Lancaster University in the UK. These days, Professor Suchman is preoccupied by the politics of automation. She believes the debate is too often framed in a way that implies computers can simply substitute for people. (Roboticists often note that their creations, mind-blowing in other ways, struggle to master the kind of tasks a toddler can do.) “I want to recover the differences between humans and machines,” says Suchman. She would like us to take a step back and ask whether we are building a world fit for humans, or for robots.

For example, machines struggle to navigate complex, unpredictable environments – such as the home. Suchman is interested in how hard it has proved to automate housework. After the early successes of the washing machine and the dishwasher – which, of course, still have to be loaded by someone – engineers have had a hard time realising the perennial dream of a “smart home”. Perhaps the labour involved in running a household is of a higher-level, more sophisticated nature than (mostly male) roboticists have assumed?

One answer, of course, may be to build homes that accommodate technology first and people second. Suchman highlights a recurring pattern: when machines cannot navigate human environments, we end up engineering environments to fit machines. A factory assembly line is a home for robots. If self-driving cars cannot navigate country roads, we must build new highways.

Today, humans themselves are increasingly required to get with the program. Employees are subject to rigid, algorithmically driven work regimes, whether in call centres, the driving seats of ride-sharing taxis, or warehouses where workers have their every movement dictated to them by an electronic voice in their ear.

Our most urgent problem is not that robots are supplanting humans, but that we are forcing humans to be more like robots. That’s why I think Suchman’s insistence on the difference between the two is so important. Neither should be a copy of the other.