Usually, talk of the troubles that doctors have with electronic health records revolves around frustration and burnout, and the drag on the doctor-patient relationship when a screen sits between them.

(Not for nothing was "Death By A Thousand Clicks: Leading Doctors Decry Electronic Medical Records" our most viral post last year, followed by live discussions of how to improve the system.)

Now, a paper published in the journal JAMA adds to those concerns: It looked at how "usability issues" — design problems ranging from data entry and display to defaults and drug orders — can and do hurt patients.

The federally funded study looked at more than 1.7 million reports of safety issues, mainly in Pennsylvania, and found 1,956, or 0.11 percent, that mentioned a top-five health record system as a cause of patient harm. Some 557 (0.03 percent) of the reports had "language explicitly suggesting EHR usability contributed to possible patient harm"; among those, 80 caused temporary harm, seven may have caused permanent harm and two may have been fatal.

I spoke with the study's senior author, Raj Ratwani, director of the National Center for Human Factors in Healthcare, part of MedStar Health, a 10-hospital system in the mid-Atlantic. Our exchange, edited:

What prompted you to undertake this study?

Electronic health records have been adopted very rapidly across the country since 2009. Nearly every large health care system in our country purchased an electronic health record to replace the majority of the paper-based work that they'd been doing. And because our center focuses on human factors — which is studying how people interact with technology — that became a natural area of focus for us.

And what did you look at?

The use of electronic health records has been incredibly beneficial to our health care system. It's made it so that we can now reduce generally the number of medication errors that happen. It's allowed us to collect information more easily, to transmit information more easily.

But like any kind of adoption of new technology, there are going to be some challenges. And so we started looking at what some of those challenges are, and to do that, we looked at what are called 'patient safety event' reports.

Most health care systems in the country have a way for their doctors and nurses to type up a report when they identify that there can be an unsafe condition or when there's something that actually harms a patient or 'reaches' the patient. Health care systems collect these reports all the time, and it should be noted that collecting those reports is a very positive thing. We want to know what's happening so that we can fix it.

Our analysis focused on those reports primarily from the state of Pennsylvania and also a health care system outside of Pennsylvania. And we analyzed those descriptions to see which named one of the top five electronic health record companies' products, and which ones related to usability.

Was this study a first?

We and others have done a lot of work analyzing patient safety event reports in the past. But to do it with this kind of scale, focused on health IT, is unique — and we've learned a lot from it. We looked at a total database of over 1.7 million reports and those reports came from over 500 different health care facilities.

And you found?

The primary finding here is really that the usability of the electronic health record is associated with patient harm events.

And when we say usability, what we're talking about is essentially how easy and intuitive it is for our clinicians to interact with and use electronic health records. And the finding is essentially that the poor usability of some of these electronic health records can actually impact the patient and can do so negatively. So we're seeing patients being harmed potentially because of the design and use of these systems.

If you extrapolated your numbers out nationally, how many cases of patient harm might we be seeing?

It's a great question. It's really hard to peg an estimate on that, and so I wouldn't want to throw out a number that is grossly inaccurate. What I will say is that we focused, in the grand scheme of things, on a relatively small number of reports. I know I just described how big the data set is, but the data came primarily from one state. And so we can imagine how this would grow if we think about all 50 states. We also focused only on the top five electronic health record vendor companies and products.

These reports are also typically under-representative of what's actually happening in the health care system. It's recognized that many times these reports don't get filed. So I would say that I believe these represent only a small fraction of the total. I think in some ways it's the tip of the iceberg in terms of the total number of events that are actually happening.

I will just add that, thankfully, we have very, very well-trained clinicians in our hospitals and in our care environments. And so, while these events are probably happening fairly often, they are often caught by our bright and intelligent caregivers.

Could you give me some examples of how a serious incident caused by an EHR system could happen?

• The first is in the pediatric setting. In the U.S., we generally measure in pounds, and so our scales are in pounds. So if the nurse enters weight information for a child into the electronic health record, the display may take the weight in pounds, or it may have a field for kilograms, and oftentimes it can be confusing as to whether it's in pounds or kilograms because of the design. So, hypothetically, the patient is 20 pounds. If that gets entered in kilograms, you've more than doubled the patient's weight on record. Now, any medications given to that patient are based on their weight. And so you can have an unbelievable overdose because of that weight difference.
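
The arithmetic behind this example can be sketched in a few lines. This is a hypothetical illustration, not any real EHR's logic; the drug and the 10 mg/kg dose rate are made-up values for demonstration.

```python
# Hypothetical illustration of the pounds/kilograms mix-up described above.
# Weight-based pediatric dosing: dose = weight_kg * mg_per_kg.
LB_PER_KG = 2.20462

def dose_mg(weight_kg: float, mg_per_kg: float) -> float:
    """Compute a weight-based dose in milligrams."""
    return weight_kg * mg_per_kg

true_weight_lb = 20.0
true_weight_kg = true_weight_lb / LB_PER_KG   # about 9.07 kg

# Correct dose, using the true weight (assumed rate of 10 mg per kg):
correct = dose_mg(true_weight_kg, mg_per_kg=10.0)

# If the pounds value is typed into a kilograms field, the record now
# says the child weighs 20 kg -- more than double the true weight:
mistaken = dose_mg(20.0, mg_per_kg=10.0)

print(f"correct dose:  {correct:.1f} mg")
print(f"mistaken dose: {mistaken:.1f} mg ({mistaken / correct:.2f}x overdose)")
```

Because one kilogram is about 2.2 pounds, the mix-up inflates every weight-based dose by a factor of roughly 2.2 — which is why an ambiguous weight field is so dangerous.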

• Another example: In an emergency department, they will often order lab tests, and those lab tests come in different 'panels' — the panel is just a different number of things that they're testing for, so you can think of it kind of like a menu. Let's say a panel has 11 different elements — 11 things that the doctor is ordering. The laboratory then runs the test and sends the results back to the electronic health record for the physician to look at.

What the lab often does is they send the results back as soon as they get them. They don't wait for all 11 results to come back — and that can be a good thing, because as soon as something's back you want the doctor to know it. But if there are 11 items, and only 10 come back, a display may not show that the 11th one is pending. It doesn't have an open box that says 'waiting for results.' It just doesn't show up.

So doctors in the emergency department are not sitting there counting each one to say, 'OK, I can see that I got 10 back, and I need to wait for one more.' They're quickly looking at it, and they may miss the fact that the 11th item did not come back because visually, on the screen, there's no placeholder.

So if they see that all those 10 results that came back are normal, they may discharge the patient. And then later on, the 11th result comes back, and it's abnormal, and that can lead to very, very serious harm because you just discharged a patient that had an abnormal value. And now you have to even get back in touch with that patient. And of course, through that time the patient may be harmed because of their health condition.
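
The design flaw in this example comes down to what the display iterates over. This hypothetical sketch (the test names and panel size are invented for illustration) contrasts a display driven by results received with one driven by tests ordered:

```python
# Hypothetical sketch of the lab-panel display problem described above.
# A panel ordered 11 tests; the lab has returned only 10 results so far.
ordered = [f"test_{i}" for i in range(1, 12)]              # 11 ordered tests
resulted = {f"test_{i}": "normal" for i in range(1, 11)}   # only 10 are back

# Flawed display: iterates over results received, so a test that has
# not come back simply never appears -- there is no visual placeholder.
flawed_rows = [f"{name}: {value}" for name, value in resulted.items()]

# Safer display: iterates over what was ORDERED, so anything still
# outstanding shows up as an explicit "PENDING" row.
safer_rows = [f"{name}: {resulted.get(name, 'PENDING')}" for name in ordered]

print(len(flawed_rows))   # 10 rows; the 11th test is invisible
print(safer_rows[-1])     # the safer view shows "test_11: PENDING"
```

In the flawed view, a busy clinician sees ten normal rows and nothing hinting that an eleventh result is still outstanding — exactly the scenario described above.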

• One other example: This has to do with what's called a pending order. When a physician or a nurse is putting in orders for patients — it could be an order for a particular medication, it could be an order to have a lab result done — they can put these orders in to plan them, because a lot of these orders can be very complex for patients that have serious conditions. They may put in several different orders: for five medications, for two labs that should be done. And they can pre-plan those, but then they have to push a button to actually activate them.

So it's sort of like when you're online shopping: You put five or six things in your basket, and then you may forget to actually order them. So they just sit in your basket. The same thing happens with the electronic health record. Because of the way the interface is designed, they may actually forget to activate them.
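
The shopping-basket analogy can be sketched as a two-stage order queue. This is a hypothetical illustration of the workflow, not any vendor's actual order model; the class and order names are invented:

```python
# Hypothetical sketch of the "pending order" problem: orders can be
# planned (like items in a shopping basket) but take effect only
# after an explicit activation step.
from dataclasses import dataclass, field

@dataclass
class OrderBasket:
    planned: list = field(default_factory=list)
    active: list = field(default_factory=list)

    def plan(self, order: str) -> None:
        """Stage an order without actually sending it anywhere."""
        self.planned.append(order)

    def activate_all(self) -> None:
        """The extra click that is easy to forget."""
        self.active.extend(self.planned)
        self.planned.clear()

basket = OrderBasket()
for order in ["medication A", "medication B", "lab panel"]:
    basket.plan(order)

# Without the activation step, nothing actually happens:
print(len(basket.active))   # 0 -- all three orders still sit in "planned"

basket.activate_all()
print(len(basket.active))   # 3 -- only now do the orders take effect
```

The safety risk lives in the gap between the two stages: the interface gives little indication that planned orders are inert until that final button is pushed.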

In the two deaths that you documented, what happened?

We're not able to talk at that level of detail on these particular results. But what I can tell you is that we're often seeing these challenges with the weight issue that I described in the pediatric patients. What we would hope in that instance is that these systems are built to have alerting.

I've been describing how complex health care is, and how complex electronic health records are. But it should also be recognized that there are really great usability and design principles to solve these challenges.

Is there a mechanism for that?

The health care systems will communicate some of these challenges to the electronic health record vendor companies, and the companies then try to make changes — and do make changes — to the systems when they're notified. The challenge is that these can be happening across many different health care systems, across many different EHR vendor products, and not all of these may be documented and communicated to the vendor.

And that space gets complex, because health care systems may be reluctant to share some of this with the vendor for liability reasons, and vendors may be reluctant to be very transparent about the kinds of challenges that are happening. So I think one area where we certainly need to focus is getting our vendors and our providers to work together to solve this problem.

We can't get into the blame game. This space is a complex one, and to get it right, we need to bring all stakeholders together. The 21st Century Cures Act, the last piece of legislation pushed by the Obama administration with bipartisan support, includes language around EHR usability, transparency and testing. That's something the Office of the National Coordinator, the agency within the Department of Health and Human Services that is really behind electronic health records, is going to work through defining over the course of the next several months.

And the important piece is that we ensure transparency on the usability of these electronic health records: It should be clear which products are usable and which products are safe.

There's also work under way on creating what's called a Health Information Technology Collaborative, or a safety center, and that is going to be critically important to improving these systems. The study we did here is a proof of concept of the kinds of things we can learn if we get access to the right data.

And meanwhile, is there anything the patient can or should do?

Absolutely. We should all be active in our care. Studies have shown that the more engaged the patients are, the likelier they are to pick up a discrepancy or a safety challenge. And patients should actually look at their records. Patients should do this because they can then check basic information — height, weight, medications — and they can start engaging very concretely to catch any potential challenges.

And we should have patients involved in these national policy conversations. This should not be something that's just driven by the federal government and by providers. All of us as patients are going to be impacted.

And what about deeply frustrated doctors?

Part of this is, if we can bring transparency around usability, that creates some competition. That creates an environment where EHR vendors and others will strive to create the most usable systems possible, and that'll bring frustration levels down. And if you want to get to a highly usable product, you have to engage your end users. So that means studying how physicians and nurses and others do their work, and engaging them in the design and development process and in the process of actually implementing the systems.

Note: Dr. Ratwani is a member of the 21st Century Cures Act Health Information Technology Advisory Committee, but the study and his views are independent of it.