A diagnostic error, defined as an incorrect diagnosis or a correct diagnosis made only after avoidable delay, can have devastating consequences for patients and their families.
We have seen this in the recently reported case of an infant boy whose meningitis was not diagnosed early enough to prevent significant brain damage. Diagnostic errors can also have tremendous impact on the personal and professional lives of the medical practitioners involved.
In the wake of significant errors, however, there is often a rush to examine and fix flawed medical systems while neglecting what may have gone wrong in the thinking of the clinicians caring for the patient. Errors in clinical reasoning are called cognitive errors, and they are more common than anyone would like. Though such errors are difficult to define and count, the medical literature suggests that roughly 15 of every 100 diagnoses a doctor makes will be wrong.
An excellent doctor once told me early in my training that if I couldn't recall a major mistake, I hadn't been practising long enough. The non-medical person, however, may well wonder how it is possible, in such a technologically advanced age, for doctors to make mistakes as often as we apparently do. The answer is complicated but important to understand.
Firstly, despite all the protocols, guidelines and decision-support tools that exist to "aid" the clinician in diagnosis, the interaction between health provider and patient is still a fundamentally human encounter and will remain so for the foreseeable future. There is as yet no computer system that can improve on what happens in the mind of a doctor as he or she takes a medical history, performs a physical examination, interprets lab tests or x-rays, and then puts it all together to make a diagnosis. But the process is not a perfect one.
Secondly, doctors are trained to recognise patterns of illness. Much of this recognition happens below the level of consciousness, in the same way many of us drive to work each morning without thinking consciously about it. After enough experience, we develop a visual and spatial memory of what it looks like to drive to work, and though we may not be able to name any of the streets en route, we get there just fine. When doctors hear a story and examine a patient, they are often sifting through patterns of illness called "illness scripts", learned through training and experience, to find the best fit.
This kind of thinking, while efficient and usually accurate, is prone to a surprising number of recognised problems, or "biases", which are innate to the way our brains work. A common example is "confirmation bias", which refers to the way the brain filters any new information it receives after having "solved" a problem: it gives extra weight to information that supports the solution and discounts information that conflicts with it.
Another common bias is "availability bias", which refers to the way the brain thinks first of those things it has seen most commonly or most recently. These biases, along with many others, can contribute to diagnostic error. Combine them with the stress and fatigue of a busy emergency department or clinic, and the mixture can be far worse than either alone.
The best way to avoid the problems inherent in subconscious thinking is to force oneself to switch to a less efficient but more accurate analytical mode. In the medical world this entails developing a list of possible diagnoses for a particular problem called a differential diagnosis, prioritising that list, and working through it logically and systematically.
Why is it important for patients to have some idea of what is going on in the minds of their health providers? In his excellent book How Doctors Think, Boston-based physician Jerome Groopman advocates empowering patients to ask one question that forces their doctors out of the intuitive mode: "Doctor, what else could this be?" This seemingly simple question requires the doctor to develop a differential diagnosis and can help avoid cognitive errors.
In addition to patients, we clinicians need to better understand how we think and the errors to which our intuitive thinking is prone, and to develop the skills to "de-bias" our thinking.
Diagnostic errors will never be eliminated entirely. To minimise their occurrence, however, we need to examine cognitive problems as part of our analysis when errors occur. Armed with a better understanding of the basics of clinical reasoning, patients and care providers can work in a truly collaborative way to help minimise potentially catastrophic errors.
Art Nahill is a general physician and clinical teacher at Auckland Hospital.