It has been a quiet month on the blog as I have started paternity leave and have been redressing the work-life balance scales – very much in the family direction.
Not doing any clinical work has given me time to pause and ponder a few of the more abstract ideas that we often brush over in the daily grind of clinical medicine. So this month I am going to explore a few of these.
The themes are all interrelated and quite broad – from understanding risk and dealing with uncertainty to shared decision-making, communication about risk and consent.
So what has prompted me to ponder such ideas? Well, a few things really. I have been reading some wonderful books by Gerd Gigerenzer – and his ideas have really resonated with my beliefs and how I practice. I have also had a few interesting interactions with medicine on the “patient side” – and seen things from a patient perspective first-hand – and this has left me feeling somewhat dissatisfied. Now – where to start this extended rant….
In the last few years the work of the Nobel laureate Dr Daniel Kahneman has become very sexy in the world of medical education and thinking about doctors’ psychology. His book Thinking, Fast and Slow has become a staple of the medical metacognitive types [me included]. Chris Nickson gave a great talk on this at SMACC 2013 [All Doctors are Jackasses].
Kahneman explores the biases and cognitive errors that creep into our practice. His basic premise is that heuristics [simple ‘rules-of-thumb’] can lead us to incorrect decisions and error. He asserts that our human brains are not well equipped to deal with probability, and as such we often make errors when dealing with relatively simple statistical calculations. Some true experts with many hours (perhaps >10,000) of experience may develop useful intuition in practice…. but mostly our “fast-thinking” brain gets it wrong a lot of the time. The solution is to adopt a more analytical approach: try to recognise these traps, slow down and do some deliberate calculation and weighing in order to get a more realistic assessment of the situation before making a call, decision, cut, burr hole etc.
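To make the “simple statistical calculations” point concrete, here is a minimal sketch of the kind of deliberate, slow-brain calculation in question: working out how likely a positive test result is to be a true positive, rather than trusting gut feel. The numbers below (a 1-in-100 condition, a test with 90% sensitivity and 90% specificity) are invented for illustration, not drawn from any particular study.

```python
# Base-rate example: intuition says a "90% accurate" test that comes back
# positive means a ~90% chance of disease. Bayes' theorem says otherwise
# when the condition is rare. All numbers are illustrative only.

def positive_predictive_value(prevalence, sensitivity, specificity):
    """P(disease | positive test) via Bayes' theorem."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

# A condition affecting 1 in 100 people, tested with 90% sens / 90% spec:
ppv = positive_predictive_value(prevalence=0.01, sensitivity=0.9, specificity=0.9)
print(f"Chance a positive result is a true positive: {ppv:.0%}")  # → 8%
```

Roughly one positive in twelve is real here – a long way from the intuitive 90%, which is exactly the sort of error the deliberate calculation catches.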
When I first read Thinking, Fast and Slow it made a lot of sense. I could certainly recall serious errors that I had made where I was following biased, rapid-fire thought processes. But… I never really thought it explained the whole picture. As a GP I work in a world of uncertainty. There are few patients with a statistically describable problem. There are a lot of unknown unknowns. So the cold analytical “slow brain” approach didn’t really seem to be a viable option either. I even wrote quite a long essay on this entitled “On Evidence, Education, Errors, Ego and Expert intuition“. I felt that there was a need to balance careful analysis where viable against simple, dumb rules where there was no clear, simple and safe path forward.
So when I started reading Gerd Gigerenzer’s Risk Savvy I felt a strong resonance with what I had learned the hard way over the last dozen years. Gigerenzer disagrees with Kahneman to some extent. They both conclude that we are basically innumerate when it comes to simple risk calculations, but Gigerenzer feels that these skills are eminently teachable. The main way Gigerenzer diverges from Kahneman is that he distinguishes “risk” from “uncertainty” – and suggests that we need different toolboxes to deal with each of these.
So – what is the difference between “risk” and “uncertainty”?
Gigerenzer describes risk as the way to describe systems where the numbers are known. For example, the lottery – we know the risk or chances of winning. They are low but calculable from the data available. Sure, humans are terrible with statistics – and we are often biased towards irrational optimism. This is the basis of the modern lottery, casino, poker machine etc. When it comes to dealing with “risk” we need to adopt a solid, carefully analytical mindset and not fall victim to our inherent biases. Simple heuristics may lead us astray as they may not be subtle or sophisticated enough to cope with the situation.
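The lottery really is the textbook case of calculable risk. As a quick illustration (using a 6-from-45 draw as an assumed format – the post doesn’t name a specific lottery), the odds fall straight out of a one-line combinatorial calculation:

```python
# Known risk: the chance of winning a pick-6-from-45 lottery.
# The 6/45 format is an illustrative assumption, not from the post.
from math import comb

combinations = comb(45, 6)       # ways to choose 6 numbers from 45
print(f"1 in {combinations:,}")  # → 1 in 8,145,060
```

The numbers are fully known in advance – which is precisely what makes this “risk” rather than “uncertainty”.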
Now here is the problem… modern medicine is far from statistically describable. It is just plain dirty. We read a lot of papers with fancy statistics. Clinical decision aids are all the rage – but these are really just crude approximations. They are population-based tools which are tough to apply to the patient sitting in front of us. The overwhelming majority of what we do falls clearly into the realm of Uncertainty. The numbers are not known, and tend to move with alarming frequency!
So Gigerenzer would argue that in this situation, an uncertain world, we ought to use heuristics – smart ones. In fact he has described an “adaptive toolbox”, a repertoire of heuristics which we have at our disposal. He argues that the trick is to choose the right heuristic (rule) to use in the right situation. Attempting to “calculate” risk in an unknowable environment is both slow and often inferior to following a simple heuristic. He and his colleague Daniel Goldstein have described a series of “smart heuristics” which have worked pretty well in the right context – often outperforming complex models based on available data.
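One of Gigerenzer and Goldstein’s best-known smart heuristics is “take-the-best”: compare two options cue by cue, with cues ordered from most to least valid, and decide on the first cue that discriminates – ignoring everything else. A minimal sketch (the cue names and values below are invented for illustration):

```python
# Sketch of the "take-the-best" heuristic (Gigerenzer & Goldstein):
# decide on the FIRST cue that discriminates between the options,
# ignoring all remaining cues. Example data is hypothetical.

def take_the_best(option_a, option_b, cues):
    """Return "A" or "B" per the first discriminating cue, or None.

    option_a, option_b: dicts mapping cue name -> 1 (present) or 0 (absent).
    cues: cue names ordered from most to least valid.
    """
    for cue in cues:
        a, b = option_a.get(cue, 0), option_b.get(cue, 0)
        if a != b:                  # this cue discriminates: decide now
            return "A" if a > b else "B"
    return None                     # no cue discriminates: guess

# Hypothetical question: which of two cities is larger?
cues = ["national_capital", "has_major_airport", "has_university"]
city_a = {"national_capital": 0, "has_major_airport": 1, "has_university": 1}
city_b = {"national_capital": 0, "has_major_airport": 0, "has_university": 1}
print(take_the_best(city_a, city_b, cues))  # → A (decided on the airport cue)
```

The striking finding is that this kind of frugal, one-reason decision rule can match or beat multiple-regression-style models in noisy, uncertain environments – which is Gigerenzer’s whole point about the adaptive toolbox.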
OK. That is a quick introduction to the current state of play in my mind. I will be back soon with a few examples and scenarios to illustrate how I think we can use this theory to do better in the workplace and improve our communication with patients.
In the meantime I highly recommend reading anything by Gigerenzer – e.g.:
– Better Doctors, Better Patients, Better Decisions: Envisioning Health Care 2020
– Risk Savvy
– Reckoning with Risk: Learning to Live with Uncertainty
I am a GP working in Broome, NW of Western Australia. I work as a hospital DMO (District Med Officer) doing Emergency, Anaesthetics, some Obstetrics and a lot of miscellaneous primary care. Also on the web as @broomedocs.