Do we do enough to identify our own biases in everyday life?
Inscribed on the frontispiece of the Temple of Apollo at Delphi are three maxims. The first, and arguably most famous, is “know thyself” — a simple phrase that holds the entirety of one’s existence in a few letters, and something I spend far too much time thinking about.
In April I wrote a piece on how our personal politics affects how we approach and interpret science communication. I ended the piece by saying “this requires us to introspect the political machinery that has shaped what we believe can be decoupled”. But introspection requires attention not only to the external environment that shaped us, but also to the quagmire of internalised beliefs and assumptions that influence our daily thoughts.
As scientists, we spend much of our time unpacking and deconstructing research. We’re trained to scrutinise methodologies and pick apart statistical interpretations, but we very rarely apply the same rigour to ourselves. We hide behind the stereotype of being rational beings who function on facts and logic, without looking deeper at the invisible hands that guide how we interpret the complex, multivariable datasets we’re confronted with in everyday life. These invisible hands are cognitive biases. Although they’re the product of evolved mechanisms for thinking faster and filtering inputs more efficiently, they can misfire, causing us to lose objectivity and make irrational judgments. Our own unique combinations are moulded throughout our lifetimes and are often difficult to recognise.
A useful resource I use to work on identifying my own biases is The School of Thought, a non-profit dedicated to critical thinking, creative thinking, and philosophy. While there are hundreds of different types of cognitive biases, The School of Thought has a fantastic website that focuses on the 24 most common ones and where we are most likely to encounter them out in the real world. I keep their poster permanently plastered above my desk, and whenever I’m confronted with a new argument in a topic I’m working on, I try to identify which biases might be influencing my opinion. If I find that I’m too readily agreeing with the argument: does it align with another argument I’m already in favour of (confirmation bias)? Is it being made by someone I respect or know (halo effect), or by someone similar to me (in-group bias)? Similarly, if I feel immediately dismissive of the argument at hand: is it because I feel as though I’m being coerced into accepting it and want to disagree (reactance), or because my mind was already set on believing another outcome (belief bias)?
The poster of cognitive biases, available for free download from https://yourbias.is/
I believe these types of exercises are important for making better choices and forming more robust opinions, particularly when we’re looking at science with a strong sociological influence, such as what we’re seeing now in the midst of the COVID-19 pandemic. As we debate the science that supposedly underpins policies, churn out social-media think pieces, and go to war in the comment sections, it’s important to remember that there are no easy answers, and certainly no absolute ones. We would have far more effective engagements if we understood what types of blinkers are altering our worldview.