Title: Black Box Thinking – Why most people never learn from their mistakes but some do
Author: Matthew Syed
For me, this book stands out among all the other books I have read on success. When we think about high-performing individuals or businesses, we don't often associate them with high failure rates. But in fact, some of the world's most successful people have used their failures as learning opportunities to grow and develop. Black Box Thinking explores concepts such as cognitive dissonance, marginal gains, and the closed-loop archetype to pinpoint where success lies. The book draws on a broad range of examples, from the aviation industry to the healthcare industry, and key figures such as David Beckham and James Dyson.
Syed talks about the inextricable connection between failure and success. The book opens with a detailed comparison of two safety-critical industries: aviation and healthcare. The two differ greatly in culture, institutional structure, and psychology, but most importantly in their approach towards failure.
The aviation industry is built around the black box. Every aircraft carries two almost indestructible black boxes: one records the instructions sent to the onboard electronic systems, the other records the conversations and sounds in the cockpit. So whatever happens, whether a mistake or a crash, investigators have the entire data with them. They open the recorded data, analyze it, and learn from the mistake. Because of this, crashes are very rare: there is roughly one accident per 2.5 million flights, which is a very impressive safety record. In the healthcare industry, things are quite different. A 2013 study estimated that premature deaths associated with preventable harm exceeded 400,000 per year in the US. Researchers at Johns Hopkins University School of Medicine pointed out that a large share of these deaths and injuries could have been prevented: misdiagnosis, dispensing the wrong drugs, injuring the patient during surgery, operating on the wrong part of the body, improper transfusions. All of these are avoidable mistakes, yet they harm enormous numbers of people. In scale, it is the equivalent of two 747s crashing every day, or a 9/11 occurring every month. A study of acute care hospitals found that one in every ten patients is killed or injured as a consequence of medical error or institutional shortcomings. Why do so many mistakes happen? The problems are complex: resources are scarce, doctors are overworked, hospitals are stretched and need money, and doctors have to make very quick decisions at critical moments. In serious cases there is rarely sufficient time to consider all the alternative treatments, and sometimes procrastination ends up being one of the biggest mistakes.
All of these errors follow a particular trajectory: subtle but predictable patterns, what accident investigators call "signatures". Yet in healthcare they are not reported openly. Doctors won't tell the patient exactly what happened, what was done, or which alternative procedures might have been attempted and why they weren't. Because none of this comes out, the reasons why people die go unrecorded; no exact report is made. The reporting is not open and the evaluation is not honest, so even when these errors are spotted, how are you supposed to stop them from happening again? How do you stop errors from recurring in healthcare the way aviation does?
Going further, we read about a concept called the closed loop, which the book explains with an example. From around the second century AD, many medical treatments were ineffective and even highly damaging, yet were believed to be the best available. One such treatment was bloodletting: if a patient was very ill, blood was removed from the patient's system, on the belief that draining the germs or poisons in the blood would make the person better. If the patient recovered, the doctor would claim that bloodletting had cured him. But if the patient died, the doctor would say he was probably too ill to recover, even after going through the magical cure of bloodletting. This is an archetypal closed loop: no outcome can count as evidence against the treatment.
In the medical profession, clinicians learn a vocabulary for talking around error: a mistake morphs into a "complication", a "technical error", an "unanticipated outcome". They learn not to tell the patient, they resist disclosure, and they go to great lengths to justify the habit of non-disclosure. These phrases contain elements of truth, but they do not provide the whole truth. The problem is not just the consequences of failure; it is also the attitude towards failure. In healthcare, competence is often equated with clinical perfection, so if you make a mistake you are considered inept. The very idea of failing is unwelcome.
A small example is given from the aviation industry. In the 1940s, the Boeing B-17 bomber was involved in a large number of runway accidents. The US Army Air Corps called in a psychologist from Yale to investigate. He studied the crashes, the chronology, the dynamics, the psychological elements, and identified poor cockpit design as the contributing factor. There were two switches, one controlling the flaps of the B-17 and the other controlling the landing gear, and they were identical and placed side by side. Telling the switches apart was not a problem when the pilot was relaxed and flying conditions were perfect, but under the pressure of a difficult landing, pilots were pulling the wrong lever: instead of retracting the flaps to reduce speed, they were retracting the wheels, with catastrophic results on the runway. His solution was to change the shape of the levers so that each represented the equipment it was linked to. A small rubber wheel was added to the landing-gear switch, so that even under pressure a pilot could easily identify both switches. Thanks to this small change, the accidents disappeared almost overnight.
This method of learning from mistakes has been applied to commercial aviation for many decades now with remarkable results.
The first chapter, then, addresses how industries, businesses, and even individuals deal with failure, comparing the cultures of failure in the airline and healthcare industries. If a doctor has made an error, given you the wrong medicine, or couldn't decide which procedure to carry out at the last minute, and it resulted in the patient dying, the response tends to be evasive: it happened, it was a one-off. Healthcare professionals avoid admitting failure, so there is no data. The aviation industry, by contrast, learns from its failures. It is data-driven: there is a wealth of data through which investigators can identify what went wrong every single time, and what has to be done to avoid such things happening again.
The book then turns to the discipline of science, which has also learned from failure. Many scientific theories were once believed to be correct but were later disproved by other scientists; people learned what went wrong and accepted the new theory. Good scientific theories make predictions that can be tested, and that is why they could always be superseded by better ones. The failed theories were stepping stones to the successful theories we see today. This book is ideal for anyone looking to change their mindset towards failure and to understand how each of us can do our part to lift the stigma society attaches to it. Syed argues that businesses need a system for capturing and investigating all mistakes, and we can apply this in every area of our lives, from work to personal life: keep your own little black box to record your mistakes, take time to engage with them, and investigate why exactly they happened and what can be done.