Policies adopted in response to a public outcry can prove counterproductive in practice. Poor design of incentives for the actors concerned can deliver perverse outcomes.

Prof Canice Prendergast of the University of Chicago, and formerly of the ESRI, has done important research on incentives and accountability, which is worth revisiting.

A case in point was the response of the Los Angeles police after the Rodney King affair, where some police officers had been caught on camera beating up an innocent African-American.

Part of the subsequent reform was a new ruling that, whenever an individual made a complaint about a police officer, that officer would automatically be suspended pending an investigation. However, LA criminals soon realised they could use this rule to hobble the police force – all they needed to do was lodge a complaint and officers would be suspended.

After the rule’s introduction, a “drive and wave” policy evolved, where the LAPD drove through crime ghettos and just waved at criminals, but did not attempt arrests. Prendergast’s research shows that arrests fell by 35 per cent and the murder rate trebled.*

Learning from mistakes

A key aspect of progress in any field, but particularly in medicine, is a culture of learning from experience, including mistakes, and of regularly reviewing what could have been done differently. Well-managed hospitals conduct regular reviews of difficult cases, with the emphasis on learning from experience how similar cases could be handled better in future.

For this to work, doctors have to be strongly encouraged to put cases with suboptimal outcomes forward for review, especially where they themselves may not have got everything right. However, our litigious society, where all adverse outcomes are assumed to be someone's fault, has created huge disincentives to reporting and discussing cases where errors, even minor ones, have been made, as well as those where judgment calls were finely balanced. That culture has led to outcomes where a failure to recognise mistakes in time can have disastrous results.

Over recent decades, a large body of labour law has been enacted to protect the rights of workers. A key feature is the right of employees to due process and to have their cases properly heard. Most adverse findings against employers centre on failures of due process.

While there has been public criticism of how senior bankers and others were sheltered from accountability by such arrangements, the pendulum seems to have swung in the opposite direction. Over recent years a “heads should roll” approach to public accountability has taken precedence both over a culture of reflective examination of error, and even over due process to examine evidence of wrongdoing before someone is driven from their job.

While in some cases the resignation has proven to be warranted, in others a proper hearing and evaluation of evidence could have resulted in a different outcome.

Many dangers

This “shoot first” approach to accountability poses many dangers, and makes it harder to recruit the right people to take on the most responsible and challenging jobs.

We have temporary stand-ins among our hospital consultants, as Garda Commissioner, as secretary-general of the Department of Justice, and now as head of the HSE. All of these are complex jobs demanding high levels of skill and dedication. However, they are also jobs where adverse occurrences can and will happen. Are such occurrences more likely to be averted by fostering a culture of disclosure and reflection on mistakes, or by insisting that accountability means heads will roll?

Of course, women whose cervical smear examinations wrongly failed to detect pre-cancerous signs should have been told. It’s a pretty basic principle that people, not their doctors, own their own health information.

Doing look-back audits is best practice. Not all countries do it but it is an essential quality check on screening programmes that have inherent margins of error. It would be a huge mistake if the current furore were to send a strong message to clinicians that researching the effectiveness of screening programmes is to be avoided as it puts them in the firing line.

It is the failure to screen, and the failure to continuously quality-check the screening process, that costs lives, not the audit process, however poorly communicated.

If the public service is to work well, it is vitally important to recognise that public service workers – be they doctors, gardaí, or administrators – are human. Mistakes happen, but the system needs to incentivise all those involved to learn from their mistakes without fearing or waiting for a big stick.