For a long time, highly skilled and vaunted professions like medicine were viewed as fundamentally human, and largely impervious to the sweeping specter of automation. In health care, this immunity was granted by the high levels of cognitive and emotional skill presumed to be required for effective medical care.

But that’s changing, fast. Advances in AI and doctors’ increased use of digital tools have made it realistic that machines could come for health care jobs, particularly those composed largely of repetitive cognitive tasks, such as diagnostics and scans. Not only are machines approaching expert skill levels in these areas, they are already surpassing them in some cases. As machine-learning technology develops and improves (and as more data are gathered), AI will become even more pervasive in the health system.

Except medicine requires more than just technical expertise: It needs empathy, too.

Empathy has been noticeably lacking in medicine as of late. In the past few decades, doctors have developed a reputation for being cold and aloof, for treating patients as numbers and objects, not human beings with valid lived experiences and unique histories. One of the most common complaints among patients today is the “clinical” attitude of their attending physicians. That word has become synonymous with detached, unempathetic, and impersonal treatment—everything many of us would much rather our attending physician not be.

Zack Rosebrugh for Quartz

One of the excuses given is that doctors simply don’t have time to be empathetic. When overrun with patients, basic human empathy is sidelined in favor of the objective aspects of care—better an unhappy patient who lives than a happy one who dies.

“Quite easily in medicine, time pressure becomes just another excuse to avoid difficult conversations,” says Chris Lovejoy, a physician-in-training at St. George’s Hospital in London. And on the patient side, when we receive detached, impersonal treatment, we justify our doctors’ lack of empathy as a necessary consequence of better technical expertise. It’s easy to understand why. As long as they do their jobs, diagnose us correctly, and help us get well, why should we complain about our doctors’ poor bedside manner, their lack of sympathetic smiles or genuine human connection?

Except empathy is just as vital for medicine as technical proficiency. More empathic care demonstrably improves patient satisfaction, leads to better patient outcomes, and lowers the risk of errors and malpractice suits. It also helps reduce the risk of physician burnout.

“Empathy is almost sacred in medicine,” says Matthew Fenech, a former clinical fellow in diabetes and obesity at the University of East Anglia in Norwich, England, who has since shifted his attention to AI policy. “It’s one of the reasons why people trust doctors. Medicine has always been an art and a science, and that human aspect is becoming more important now that the science part is becoming easier.”

Lovejoy agrees. “Empathy is, and always has been, a core part of medical practice,” he says. “I remember a psychiatrist at medical school who would always say ‘the first thing you prescribe to a patient is yourself.’ No matter how advanced technology gets, the doctor will still play a key role.”

Care is not just restricted to doctors, says Anne Cooper, the former chief nurse for NHS Digital, the England National Health Service’s digital arm. “It extends far beyond medicine, and it’s so much more than just the administration of technical interventions,” says Cooper. “Care will always be a human as well as a technical interaction.”

So could AI help bring medicine back to its humanistic roots? Supporters of artificial intelligence say that, beyond the obvious advances in technical care and access, the technology will alleviate doctor burnout and give physicians back time to focus on the diminishing human aspect of medicine. By liberating physicians, at least in part, from the weighty burden of maintaining technical expertise and undertaking complex technical tasks, AI allows them to get back to being what they are: human beings.

This is as important for those delivering care as it is for patients. Fenech says there were a number of reasons he decided to leave medical practice for policy work, but one big one was “that I found it hard to give patients the time that they needed. Ten minute appointments and moving on to the next one on the list again and again. It’s frustrating, for us and for patients, and I know many other [health care providers] find it frustrating too.”

Giving doctors back the time they need to be empathetic is the hope AI offers. However, pessimists believe it’s just as likely AI exacerbates the situation medicine finds itself in today. Anecdotally at least, technology doesn’t seem to have helped us so far: As doctors’ offices and hospitals level up technologically, empathy appears to go down. “It’s not straightforward,” says Cooper. “Adding AI alone will be unlikely to have a direct improvement on empathy.” We’ll need something more.


At the same time, the challenges medicine will face in the future will be different from those it faced in the past. Globally, populations are growing and ageing, developing increasingly complicated health problems that necessitate an ever-more-expensive arsenal of medicines. That means medical budgets around the world are continually squeezed and increasingly subject to political scrutiny. Physicians are an expensive resource, and advances in AI could be used to justify budget cuts. If smart machines are freeing up all this time for doctors to be empathetic, why not just hire fewer doctors and save money? Given the tendency to see empathy as a nice-to-have, it’s easy to see who—or what—may win when hard decisions must be made. “These are complex chronic medical conditions requiring intensive and nuanced input from many different professionals,” says Fenech. “I wouldn’t be surprised if, on some measure, demands are increasing and that that sort of environment would risk a decrease in empathy.”

The result might be a shift from today’s physician-dominated primary care to care run by other professionals with a presiding algorithm. This may still deliver a good standard of “clinical” care. After all, AI is excelling on the technical side of things, and other professionals, such as nurses, will probably still be around in a similar capacity. Nurses typically spend much more time with patients while administering care, and may be expected to make up for the lack of empathy doctors display today. In fact, although many perceive it as a nice-to-have, being human is one of the areas most resistant to the onslaught of automation. Jobs requiring more emotional and empathic skills, such as nursing, are consistently ranked as relatively safe from automation in the near future.

That still requires health care providers to embrace a role geared more towards human connection. “In theory automation will free up doctors to empathize more,” says Lovejoy, “but what I’m worried about is the fact that so many doctors will have become used to this style of working that even when the pressure is removed they will think of it as ‘free time’ rather than extra time to spend with patients.”

A failure to properly prepare and train future (or retrain current) doctors in empathy could place those around them under even greater strain. Nursing today is already known for burnout, increasing workplace pressure, and unmanageable numbers of patients.

Some say human doctors who don’t already display empathy are simply incapable of empathizing with those under their care. “You can’t learn ethics or compassion. You either have it or you don’t,” wrote one physician-teacher in a 2009 New York Times op-ed. This view remains controversial, but common. “Certainly medical schools focus on empathy,” says Fenech, “but it’s certainly not enough.”

But many others maintain that empathy can, at least in part, be taught, through programs like Oncotalk. These courses force students to work on acting in specific ways—pulling up a chair next to a patient, making eye contact, not standing over a patient in a bed—and ingrain certain procedural changes, such as making more time when delivering difficult diagnoses and not answering statements of feeling with statements of fact.


The same people who support programs like Oncotalk note that decades of entrance exams that deprioritized compassion, such as the Medical College Admission Test (MCAT), the standardized test used to assess applicants to US medical schools, may have produced the generations of doctors currently in practice who are not naturally empathetic. In fact, while medical schools claim to teach empathy and good bedside manner, studies suggest doctors graduate from medical school with noticeably diminished empathy compared to when they started. This, they say, is likely a side effect of the heavy emphasis schools place on the technical aspects of medicine. In a future where AI handles much of the technical side of health care, medical schools will need to improve and expand the instruction they dedicate to fostering empathy.

It seems some of the people most invested in the medical profession are beginning to realize that this is the best course of action—both for patients, and to save doctors from the pending AI inundation. For example, recent changes to the MCAT have added questions on human behavior, neuroscience, and psychology. These changes were made to recognize that “being a good doctor is about more than scientific knowledge,” according to Darrell Kirch, the head of the Association of American Medical Colleges. “It also requires an understanding of people.”

Medicine is, and has always been, an endeavor of two parts. Recently, the cold, hard, scientific part seems to have been prioritized at the expense of the emotional, human part. But the cold, hard science is what machines seem best positioned to take over. The human element of medicine, for the foreseeable future at least, can still only be delivered by another human, and to many professionals it always should be. Doctors should be doing their best to reclaim their humanistic heritage. In the not-so-distant future, it may well be all there is for them to claim.

This story is one in a series of articles on the impact of artificial intelligence on health care and medicine.