AUTHOR: JORDAN HARGREAVES
Heuristics, from the Greek heurískō meaning “to discover”, are cognitive processes for making decisions quickly from a relatively small amount of available information. Psychologists Dr Amos Tversky and Dr Daniel Kahneman developed the study of heuristics in the 1970s, building on the work of cognitive psychologist Dr Herbert A. Simon. Kahneman theorises that we make decisions in one of two ways: System 1 thinking, which is fast, instinctive and relatively subjective; and System 2 thinking, which is slow, analytical and considered more objective. When we need to make a decision quickly we typically use System 1 thinking; it’s automatic and relies on heuristics[1]. Think of a heuristic as a shortcut which allows you to generate a sufficient solution when a perfect one is not attainable within the constraints of the situation[2], such as clinical decisions made in resource-poor environments.

Since the publication of Kahneman and Tversky’s work, several books have argued that heuristics can actually be very reliable with the right application and should be harnessed as a tool for fast decision making. For clinicians, this seems like the holy grail: many of the important decisions we make are quite literally life or death, made under high pressure, and often with large amounts of information to process simultaneously.
As Malcolm Gladwell highlights in his book “Blink”[3], however, our intuition can be unreliable and open to unintentional misdirection. This article expands on what heuristics are, how our biases affect them, and how we can mitigate those effects.







The Horse, The Duck & The Cognitive Bias
Heuristics come in many forms. The ones you may be most familiar with in the medical world are adages we tell ourselves to help make decisions; for example, “if you hear hooves, think horses, not zebras”, meaning common things are common, similar to the phrase “if it walks and talks like a duck, it’s a duck”[4]. Although clinicians use these mental shortcuts all the time, they are not flawless. Tversky and Kahneman explained that the shortcuts we use to make a decision can get us to the right end point quickly but occasionally send us ‘off course’[5]. Although hoofbeats usually mean a horse, there may be a time when your heuristic tells you ‘that’s gotta be a horse’ when it’s actually a zebra clip-clopping along.
Imagine you’re flying home after a diving expedition. After reaching cruising altitude, one of the dive team starts complaining of a headache and photophobia. You go over to help. They tell you the headache is the worst one they’ve ever had. You recall a case you were told about at the dive centre with a similar presentation and think it’s probably decompression sickness, given that they’ve been diving recently and you’re now at altitude. Simple, right? Common things are common and this is a known risk to divers. However, jumping the gun without a full history and examination means you miss important differentials, including subarachnoid haemorrhage and meningitis.
Cognitive biases influence all our decisions, and the above example shows how they can make heuristics unreliable. Specifically, the case demonstrates priming and availability bias: because you’ve been diving recently and can easily recall a similar case that fits, you disregard other key pieces of information, such as the fact that it’s been more than 18 hours since the dive and the cabin is pressurised, both of which make decompression illness less likely[6,7]. Our biases, though unintentional, can misguide our judgement and lead us down the path that looks familiar. Recognising these biases allows us to mitigate them, and considering what we could be missing as a result of them can aid our decisions.
Dr Chris Drew has an article on his website summarising 22 types of heuristics; we recommend you give it a read here: https://helpfulprofessor.com/heuristics-examples-types/







The ER problem
Heuristics are decision-making tools which focus on a few pieces of information and ignore the rest to come up with a solution. This makes a decision much quicker but sacrifices ‘optimisation’ of the solution, as heuristics are limited to the knowledge, experience and biases of the person making the decision.
Algorithms are standardised steps designed to streamline decisions, also by taking into account only a few pieces of information. However, they differ from heuristics in that they are typically designed around the available evidence and best practice in a given field.
In the late 90s, Chicago’s Cook County Hospital had an issue in its Emergency Room (ER). Diagnosing acute coronary syndrome (ACS) in patients presenting with chest pain but no ST-segment changes was extremely difficult whilst waiting for troponins to come back, if they could even be tested for at the time[3]. As clinical judgement was the only tool immediately available, around 90% of patients with chest pain ended up being assigned to the coronary care unit (CCU), yet only around 25% of them actually had confirmed ACS. The high rate of false positives meant the unit became overcrowded, staff were overworked, and patient safety suffered massively.
So, if you’re an overworked, underfunded hospital being crippled by patient overcrowding, what can you do when you’re heading up a very specific creek without a paddle? Well, two researchers from the University of Michigan, Green and Mehr, had an idea. To reduce the burden on the hospital, an algorithm was created for physicians which focused on three specific pieces of information (sketched in code after the list below).
- Firstly, were ST changes present? If yes, the patient was admitted to CCU immediately.
- If not, the following question was asked: is chest pain the principal complaint? If not, the patient went to a ward bed for observation.
- If chest pain was the principal complaint, a third question was asked: are any of the following predictors present: previous MI, efficacy of nitrates, or T-wave changes? If no, the patient went to an observation bed; if yes to any of the predictors, they were admitted to CCU.
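To make the structure of this three-question tree concrete, here is a minimal Python sketch. The `Patient` fields and the `triage` function are our own illustrative names, not code from Green and Mehr’s study; only the three questions themselves come from the algorithm described above.

```python
# A minimal sketch of the three-question CCU triage tree described above.
# Field and function names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Patient:
    st_changes: bool            # ST-segment changes on the ECG
    chest_pain_principal: bool  # is chest pain the principal complaint?
    previous_mi: bool           # history of myocardial infarction
    nitrates_effective: bool    # did nitrates relieve the pain?
    t_wave_changes: bool        # T-wave changes on the ECG

def triage(p: Patient) -> str:
    # Question 1: ST changes mean immediate CCU admission.
    if p.st_changes:
        return "CCU"
    # Question 2: if chest pain is not the principal complaint,
    # a ward/observation bed is sufficient.
    if not p.chest_pain_principal:
        return "observation bed"
    # Question 3: any one of the three predictors is enough for CCU.
    if p.previous_mi or p.nitrates_effective or p.t_wave_changes:
        return "CCU"
    return "observation bed"

# Chest pain as the main complaint, relieved by nitrates: admit to CCU.
print(triage(Patient(False, True, False, True, False)))  # -> CCU
```

Note how the tree deliberately ignores everything else a clinician might know about the patient; that restriction is exactly what makes it fast and resistant to individual bias.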
The result? Unnecessary admissions to CCU dropped and patient outcomes improved[3].
This wasn’t a perfect solution and errors still occurred but, for a hospital that was overcrowded, understaffed, with exhausted clinicians and haemorrhaging money through unnecessary high-dependency bed use, the perfect solution wasn’t available. The algorithm reduced the time and ‘bandwidth’ physicians needed to make decisions by simplifying which information was required. Additionally, by setting hard and fast rules about what information should be considered, the effect of an individual physician’s heuristics and biases is mostly mitigated. Algorithms are now commonplace within healthcare and a vital part of acute medical treatment; one US study found that adherence to cardiac arrest algorithms correlates positively with return of spontaneous circulation[8].
The Ottawa Ankle Rules[9] and the NEXUS spine rules[10] are invaluable tools for ruling out ankle fractures and cervical-spine injuries, respectively. Though they are scoring systems rather than algorithms, their effect on decision making is the same: they reduce the effect of bias, human factors, and the bandwidth required to make a clinical decision. For example, imagine a member of your party has a bad fall on the side of a mountain and you’re worried they’ve damaged their c-spine. In poor weather conditions with frantic group members, the call to immobilise the patient and attempt extrication to an evacuation point carries risks to yourself and your group. The NEXUS spine rules have a sensitivity of 99.6% for cervical-spine injury in those under 65 years old[10], so five simple questions can quickly aid your clinical decision and cut through the information overload that stops you making one.
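As an illustration of just how few inputs such a rule needs, here is a rough sketch of the five NEXUS criteria as a single yes/no check. The criterion names are paraphrased from the NEXUS study[10]; the function and parameter names are our own, and this sketch is illustrative, not a validated clinical tool.

```python
# A rough sketch of the five NEXUS criteria as a single rule-out check.
# Criterion names paraphrased from the NEXUS study [10]; illustration only.

def nexus_low_risk(midline_tenderness: bool,
                   focal_neuro_deficit: bool,
                   altered_alertness: bool,
                   intoxication: bool,
                   distracting_injury: bool) -> bool:
    """True if the patient is low risk under NEXUS, i.e. negative
    for all five criteria, so imaging can likely be avoided."""
    return not any([midline_tenderness,
                    focal_neuro_deficit,
                    altered_alertness,
                    intoxication,
                    distracting_injury])

# An alert, sober casualty with no midline tenderness, no neurological
# deficit and no painful distracting injury meets the low-risk criteria.
print(nexus_low_risk(False, False, False, False, False))  # -> True
```

Any single ‘yes’ flips the answer and prompts immobilisation and imaging; the rule’s value in the field is precisely that it needs only these five observations.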
Human factors play a massive role in our decision-making process too. If, in the above case, an evacuation would be extremely costly to your employer and end the trip of a lifetime, the social pressure of not wanting to be the person who ends the trip could further lead to analysis paralysis[11]. Having an algorithm or set of rules to follow takes that pressure out of the equation, facilitating an objective, patient-centred clinical decision. We recommend you read our article on human factors for more information.







Take home messages
Heuristics are shortcuts we use to make decisions. They work by focusing on specific pieces of information and disregarding the rest.
Heuristics are general rules, and general rules apply generally. They should be treated as such, and the potential for error should be recognised.
Cognitive bias can affect our heuristics: our disregard for information that we deem unnecessary can be a huge source of error. Recognising that we have these biases can help us mitigate them by considering what we could be missing.
Algorithms and scoring systems are immensely useful in resource-poor settings. Their use minimises the required bandwidth and can provide evidence-based assurance to your decisions.









Are you interested in learning more about heuristics and other non-technical skills?
If so, why not check out our Remote and Restorative course? Whilst you’re there, why don’t you take a look at our other courses too?







Further reading
- Heuristic Examples Types – https://helpfulprofessor.com/heuristics-examples-types/
- Cognitive Bias in Clinical Medicine – ED O’Sullivan and SJ Schofield – https://www.rcpe.ac.uk/sites/default/files/jrcpe_48_3_osullivan.pdf
- Blink: The Power of Thinking Without Thinking – Malcolm Gladwell, 2005
References
- Marewski, J. N., & Gigerenzer, G. (2012). Heuristic decision making in medicine. Dialogues in Clinical Neuroscience, 14(1), 77–89. doi:10.31887/dcns.2012.14.1/jmarewski
- Kahneman, D. (2011). Thinking, Fast and Slow. UK: Penguin Books.
- Gladwell, M. (2005). Blink: The Power of Thinking Without Thinking. New York: Back Bay Books/Little, Brown & Company.
- Goldsmith, P. (2019). Cognitive bias and diagnosis heuristics. MDU Journal. Retrieved from https://mdujournal.themdu.com/issue-archive/summer-2019/cognitive-bias-and-diagnosis-heuristics
- Kahneman, D., Slovic, P., & Tversky, A. (1982). Judgment under Uncertainty: Heuristics and Biases. doi:10.1017/cbo9780511809477
- de la Cruz, R. A., Clemente Fuentes, R. W., Wonnum, S. J., & Cooper, J. S. (n.d.). Retrieved from https://pubmed.ncbi.nlm.nih.gov/28846248/
- Feldman, J., & Cooper, J. S. (n.d.). Retrieved from https://www.ncbi.nlm.nih.gov/books/NBK499855/
- McEvoy, M. D., Field, L. C., Moore, H. E., Smalley, J. C., Nietert, P. J., & Scarbrough, S. H. (2014). The effect of adherence to ACLS protocols on survival of event in the setting of in-hospital cardiac arrest. Resuscitation, 85(1), 82–87. doi:10.1016/j.resuscitation.2013.09.019
- Stiell, I. G. (1994). Implementation of the Ottawa Ankle Rules. JAMA: The Journal of the American Medical Association, 271(11), 827. doi:10.1001/jama.1994.03510350037034
- Hoffman, J. R., Wolfson, A. B., Todd, K., & Mower, W. R. (1998). Selective cervical spine radiography in blunt trauma: Methodology of the National Emergency X-Radiography Utilization Study (NEXUS). Annals of Emergency Medicine, 32(4), 461–469. doi:10.1016/s0196-0644(98)70176-3
- Thacker, J. (2021). The human factor in mountaineering and snow sports – going beyond facets. Retrieved from https://mountainassurance.co.uk/2020/10/06/the-human-factor-in-mountaineering-and-snow-sports-going-beyond-facets/