The New Yorker published an article describing how prosecutors of a high-profile defendant in New York made a wrong decision about a key piece of evidence. Instead of firing the lawyers responsible, as expected, the District Attorney decided to inquire into the organisational errors that had led to the mistake.
She knew the lawyers were skilled professionals. She knew that they had not intended to make the error. 'What factors, she wondered, had caused competent people to make bad choices?'
The DA introduced a procedure well known in the health care and air transport industries, in which objective searches for the causes of error take precedence over blame and personal liability.
What emerged was a 'complicated web of events and conditions'. It was 'a classic organisational error: a series of small slip-ups that cascaded into an important mistake'.
The DA concluded that 'even in a busy office like hers, she needed to create a step in which everyone could pause during certain complex or high-profile cases and have someone else take a fresh look at the evidence.'
Mistakes are as inevitable in decision making as successes, and so there needs to be the capacity for dealing with them, and learning from them, in a blame-free environment.
Another study of errors in prosecutions culminated in several jurisdictions agreeing that each would conduct a systems analysis of a high-profile criminal justice failure.
'In every case, the horrendous legal accident turned out to have multiple causes embedded in the legal system. There was no single bad actor.'
One case convened a group of more than thirty people representing every agency that had made contact with a repeat offender. It was discovered that 'in almost every incident, the people who made decisions about the boy had not seen his larger pattern of violent behavior because they did not have access to his complete records, or did not see them.'
In another case, involving a police officer who had committed multiple acts of professional misconduct, the review was able to 'identify seemingly minor perturbations—poor performance evaluations, excessive medical leaves, discourtesy complaints—as warning signs for early intervention.'
One participant in the studies said that 'the idea is to create a culture of learning from error—to look at what went wrong, what factored in the cases, and how to change the system so that doesn’t keep happening.'
As an expert adviser from air transport safety stated:
'I stressed the fact that, although it’s perfectly reasonable to be angry at a staff member who makes a mistake, you’re deluding yourself if you think simply firing someone gets to the underlying cause of the error in the first place.'