
“Many problems” in this case

Yes, it can. We’ll get to how big data or semantic meaning can help in a moment.

First, a few observations. As Prof. Redlener, the NYT’s expert on disaster preparedness, put it, “There are many, many problems that have been revealed by this single case.” This is a polite way of saying that what happened at the Dallas hospital was a major screw-up, one that needlessly put lives in danger and forced additional people into a 21-day quarantine.

We can talk about using semantic text technologies to prevent these kinds of hospital errors, or big data to improve traveler screening processes. At the end of the day, however, this is a type of error that the billing department at the Dallas hospital should have been able to catch.

EbolaCare(TM) insurance

“Where do we send this bill?” “Hmmm, address in Liberia. Looks like he has EbolaCare(TM), the national health plan of Liberia.” (The Ebola patient, of course, didn’t actually have insurance.) “EbolaCare(TM), huh. Socialized medicine, those pinkos in Liberia. Well, he’s out of network, since we don’t take EbolaCare(TM). So we’ll soak him at 4–10x the in-network insurance rate. EbolaCare(TM) insurance…. Say, since he has EbolaCare(TM) insurance from Liberia, you don’t think that he could have Ebola, do you?” “Naw. Just some mild fever and viral complaint.”

Now, that particular fictitious and somewhat unprofessional conversation in the Dallas hospital billing department apparently never occurred. But, it’s a valid point because, in a democracy, everyone needs to hold everyone else accountable for the system to work (as people on Wall Street managing investments will be the first to tell you). Specifically, under existing regulations, county health departments get statistics and reports on certain illnesses showing up at an acute care facility. In the wake of this particular train wreck at the Dallas hospital, these rules have been strengthened (or clarified) so that any travelers from West Africa with Ebola-like symptoms appearing in an acute care facility now need to be reported to the county health authorities (and, presumably, from there to the CDC).

So, even if there was a problem in the electronic record system, it did capture that the patient was a recent traveler to West Africa showing a fever. Under the new rules, if not before, this would need to be reported to county health authorities. That’s not quite the billing department, but large institutions (such as the network running the Dallas hospital) typically have highly paid IT people on staff to update their business logic rules to handle emergency regulatory changes like these (and there always are emergency changes like these). Given all the outreach efforts made by the CDC prior to this incident, rules like this should have been in place before last week. So, even if the patient’s history wasn’t properly communicated to the doctors, the error should have been caught administratively, in the form of administrators (or computer systems) being required to submit a report on the patient to the authorities.
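To make the point concrete, the business-logic rule in question is trivial to encode. The following is a hypothetical sketch, not any vendor’s actual rules engine; the field names, country list, and fever threshold are invented for illustration:

```python
# Hypothetical sketch of the kind of reporting rule a hospital IT
# department could add to its records system. Field names, country
# list, and the fever threshold are invented for illustration.

EBOLA_AFFECTED_COUNTRIES = {"Liberia", "Guinea", "Sierra Leone"}
FEVER_THRESHOLD_F = 100.4

def must_report_to_county_health(patient):
    """Flag any recent West Africa traveler presenting with a fever."""
    recent_travel = set(patient.get("countries_visited_past_21_days", []))
    has_fever = patient.get("temperature_f", 0) >= FEVER_THRESHOLD_F
    return has_fever and bool(recent_travel & EBOLA_AFFECTED_COUNTRIES)

patient = {
    "countries_visited_past_21_days": ["Liberia"],
    "temperature_f": 101.2,
}
print(must_report_to_county_health(patient))  # True: generate the report
```

A few lines of conditional logic like this, attached to the intake workflow, is exactly the sort of emergency rule update that staff IT people are paid to push out quickly.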

That’s a check-and-balance on incompetence (or callousness) on the part of the hospital. If that rule wasn’t in place prior to this week, it’s the fault of the authorities (either the CDC or county health). If those rules were in place, but no report to the authorities was generated due to negligence on the part of the hospital, then the hospital needs to be penalized. So it’s not quite the billing department where this should have been caught (as a last resort), but it’s close.

And now we can turn our attention to the ER nurse who entered this into the electronic records system but didn’t manually alert physicians. This is where we can bring up the topic of semantic text systems. In our earlier article from April, we mentioned a “bear in the woods” scenario. The idea there is that structured data, such as the forms used in hospital electronic records systems, is fragile.

Ebola: There’s a bear in the woods, and some thought it was tame.

The example we used was learning about a bear in the woods and believing that the bear isn’t dangerous (topical at the time, given events in Europe). Since the bear wasn’t perceived as dangerous, it was too expensive to structure most of the information on bears in the woods, so that information is mostly unstructured. Suddenly, years later, you learn that the bear in the woods was, in fact, dangerous. If your system is limited to structured data, you are in trouble. If you can handle unstructured data, something computers have only very recently become good at, you can reprocess all of your earlier information on bears in light of this new knowledge. Our earlier blog article talked at length about why humans evolved to be so good at handling unstructured data.

This hospital case with travel histories is exactly the bear-in-the-woods scenario. Travel histories were not presented to physicians because, in the past, they weren’t very important. Nurses were the only ones who needed to deal with travel histories, so they were hidden from physicians by default. To change this, the workflow needed to be restructured, the defaults changed, and new rules added to specifically highlight risks from Ebola (which the hospital, under prompting by the CDC, should have implemented in its systems long ago).

Structured data systems are fragile. New code must be written to add rules for Ebola. Unstructured data doesn’t have this problem. If there was a free-form field in the patient record, the nurse could simply have written “possible Ebola case.” Systems that are very good at handling unstructured text (such as a human doctor, or the new semantic meaning systems from a variety of vendors) would have picked up on the new significance of this term from the new training corpora on Ebola, and would have flagged the record appropriately. No new programming is required, beyond the constant feeding in of new medical research that is happening with these systems anyway.
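By contrast, flagging free text requires no schema change at all. A crude sketch of the idea follows; real semantic systems learn term significance from training corpora rather than using a hand-maintained keyword list, and the terms and note text here are invented for illustration:

```python
# Crude sketch of flagging free-text clinical notes against a watch
# list that can be updated as data, without touching the record
# schema. A stand-in for what semantic systems learn from corpora.

WATCH_TERMS = {"ebola", "liberia", "hemorrhagic fever"}

def flag_note(note_text):
    """Return the watch-list terms found in a free-form note."""
    text = note_text.lower()
    return sorted(term for term in WATCH_TERMS if term in text)

note = "Pt recently returned from Liberia; fever 101.2F. Possible Ebola case?"
print(flag_note(note))  # ['ebola', 'liberia']
```

The key design point is that `WATCH_TERMS` is data, not code: when a new threat emerges, the list (or, in a real system, the training corpus) is updated, and every old and new note can be re-scanned with no reprogramming.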

However, the ER nurse should also have been very good at handling unstructured data. She should have gone over to the physicians and said, “There’s a bear in the woods, and it’s dangerous” (or, as commentators in the major media have put it, she should have gotten up from her computer, walked over to a senior attending physician, and said, “possible Ebola case”).

At a minimum, the erroneous discharge should have been caught the next day. You can imagine the ER nurse hovering around the water cooler, saying to colleagues, “We may have had an Ebola case come through here yesterday.” Totally justifiable water-cooler gossip; Ebola has frequently taken the lives of healthcare workers. It should have been caught at the water cooler.

King’s Cross management burn-out

And this water-cooler point isn’t speculative theory. According to the public inquiry, the 1987 fire at King’s Cross underground station in London killed 31 people in part because ticket attendants had been trained never to talk to another department directly. If the station is on fire, they are to continue selling tickets. Fire is the safety department’s responsibility, not the ticket agents’. If ticket agents wish to communicate something to another department, they need to go through their supervisor. If their supervisor isn’t around, and the station happens to be on fire, best of luck, and hope your family has a good life insurance policy on you.

Advanced Soviet business management practices

Every reputable business leader has said that it’s extremely bad management practice to forbid your employees from talking with other departments. Steve Jobs, in particular, believed that all the creative innovation happened when different departments interacted. He famously designed the Pixar campus (and influenced plans for Apple’s) to force people to walk to other parts of the building (e.g., to the restrooms), so as to encourage random water-cooler interactions. (The only management theory we could find that thought forbidding communication was a good idea was that of the Communist Party of the Soviet Union. Don’t talk to your fellow traveler in the cubicle next to you, comrade; he may be a counter-revolutionary CIA spy. Seriously.)

Yet one still finds dysfunctional organizations where managers expressly forbid their employees from talking across departments, or across cubicles. According to one business school professor with a best-selling book on corporate politics, this is done to make managers more powerful (and more secure in their jobs) at the expense of both their underlings and the overall company (which becomes less likely to IPO). It tends to happen in extremely dysfunctional, extremely political corporations or departments. There is such intense rivalry, sabotage, and back-stabbing between competing department heads that they seek to prevent friendships from developing across departments that could turn into political alliances against them. This, according to the public inquiry, is the sort of political environment that proved deadly when King’s Cross station burned.

(And, yes, big data analyses have shown these sorts of management practices are extremely bad, if more common than we’d like to think. There have been some very good studies, as well as data sets, on different corporate cultures and their impacts on outcomes like IPOs or corporate growth. Unfortunately, as one business school professor wrote, the senior managers who permit these sorts of corporate cultures aren’t always as interested in corporate growth or IPOs as one might think. Often, they’re more worried about keeping their jobs in between various corporate crises. For a CEO to suggest, after several years, that he or she messed up and now needs to fire senior managers to change the corporate culture is a good way for that CEO to stop being CEO. And, you guessed it, analytics and corporate management is a good topic for a future blog post here. We’ve already touched on the implications of some of these dysfunctional management practices at the national or governmental level, and how they can lead to international conflict.)

Nurses even more robotic than IBM’s medical computer?

Did a political environment like this at the Dallas hospital also result in unnecessary walking corpses? (We might not know for a few days, or even ever, whether sooner admission might have saved lives.) Had this nurse been turned into such a robot, told never to leave her station or do anything other than enter records into her computer? Was she told under no circumstances to engage in “idle chat” with the doctors? They are competent, they know what they are doing, they will make the decisions from the electronic patient records without input from you. Do not waste their time or interrupt their (or your) focus.

This gets into biomedical informatics and medical decision making, which might be a good topic for another blog post. There is a body of academic research showing that medical teams make better decisions when all members of the team (patients, nurses, doctors, and technicians familiar with the patient’s condition, and not just the doctors) are consulted on appropriate treatments. Often, the doctor is allowed to override the team’s group decision, but studies show that errors are much more likely in the cases where the doctor overrides the opinions of expert staff. Things are probably too rushed in an ER for this kind of team effort (we aren’t physicians), but it is still a strong argument against turning an ER nurse into a complete robot.

Americans’ bad geography responsible?

(Yet a final possibility is that the nurse was unaware of the geography (despite Liberia being in the national news every day recently due to Ebola), and the computer systems hadn’t yet been flagged to treat Liberia as an Ebola country. This is where things get interesting. Normally, if you walk into a hospital with an apparent viral illness, as the Ebola patient was originally diagnosed with (despite the antibiotic treatment), the doctor will ask about your history. “Anyone sick at work?” And, if so, do you know what they had? At that point Mr. Duncan, the Ebola patient, should have volunteered to the doctor, for perhaps the 3rd or 4th time, that he had just come from Liberia. At that point the doctor’s failure to diagnose would become inexcusable, lack of geographical knowledge and all. This isn’t an especially credible scenario, but it does point to at least a lack of assertiveness on the part of the patient. This is probably why the government is now placing posters in all acute care environments informing West Africa travelers that, never mind the ER queue, they need to immediately speak up and underline their status as possible Ebola cases.) Somehow, even the “Catch Me if You Can” fake-diploma doctor would have channeled Dr. Kildare and figured out this was a possible Ebola case, right?

Parkland Memorial Hospital Conspiracy Theories (and legal visiting aliens, too?)

So, of course, there is now an Internet conspiracy theory around all of this. It is said there is a gentlemen’s agreement between Dallas-area hospitals to dump all uninsured and Medicaid patients at Parkland Memorial Hospital. Parkland Memorial is somehow connected with nearly every good Internet conspiracy theory. Parkland would be extremely good at treating uninsured patients, as they are still treating JFK’s ghost there after all these years, and there’s not a ghost of a chance he’s got private insurance. It is made very clear to uninsured and Medicaid patients, like poor Mr. Duncan, the Ebola victim, that they will not receive good care anywhere except Parkland.

Dr. J.R. Ewing to surgery

This being Dallas, one can just imagine a J.R. Ewing-type ruthless hospital administrator or attending physician, perhaps with a touch of wishful thinking that this would not turn out to be Ebola and international news, summarily dismissing the Ebola patient with a “take two antibiotics and call me in the morning.” After all, something like a dozen or more other uninsured West African travelers have been placed in isolation throughout the country as a precaution, and all tested negative for Ebola. So, at least in this conspiracy theory, the odds would have been in the fictitious Dr. Ewing’s favor.

Anyone going into an ER in the United States knows insurance is one of the first things hospital administrators will ask for, sometimes several times. (“Are you sure you still have that insurance?”) God forbid the patient is rolled into the ER unconscious and unable to complete the insurance forms. Some hospitals have automated systems that will contact family members in this eventuality, seemingly before the doctors have even established a treatment plan. An immigrant from Asia, unfamiliar with our healthcare system, once remarked, “Well, you can always negotiate prices with the hospital afterwards if you don’t have insurance.” Unconscious in an ER isn’t exactly negotiating from strength, and anyone who thinks they can negotiate a fait accompli after the fact doesn’t know the first thing about negotiations. So it’s no wonder these kinds of Internet conspiracy theories crop up, however unrealistic.
