By: Jim Bruce

… “If you have a brain, you’re biased.”1


 The Cambridge English Dictionary defines bias as a “personal opinion that influences your judgment.”  We all have such personal opinions.
Cognitive biases2, 3 are systematic deviations from rational judgment that arise as individuals create their own “subjective social reality” based on their perception of the information they receive from their senses.  Said differently, an individual's personal construction of social reality, rather than the objective input received from the senses, may dictate his or her behavior in the social world.  Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, or illogical interpretation.
Cognitive biases may also lead to more effective actions in a given context or enable faster decisions when timeliness is more valuable than accuracy.  Other cognitive biases are the “by-product” of human processing limitations, resulting from a lack of mental mechanisms, or simply from a limited capacity for information processing.
The notion of cognitive biases was first introduced by Amos Tversky and Daniel Kahneman in 19724 and drew upon their experiences observing individuals’ inability to reason intuitively in complex and difficult situations.  They, and others, demonstrated replicable ways in which human judgments and decisions often differ from rational choice.  In the decades since, the list of cognitive biases has continuously evolved based on research on human judgment and decision-making in cognitive science, social psychology, and behavioral economics.
How many cognitive biases are there?  Lots!  Wikipedia’s “List of cognitive biases” contains over 160 entries – arranged in three major categories:  decision-making, belief, and behavioral biases;  social biases;  and memory errors and biases. 
You may be wondering why this is important to you as a leader.  It’s important primarily because we live in a complex world where we cannot think of every possible option, and work through its consequences, when making decisions.  So, we necessarily rely on mental shortcuts that enable us to act more quickly than we otherwise would.  Author and educator Kendra Cherry5 provides examples of several of these biases:

  • Confirmation Bias – Favoring information that agrees with your existing beliefs and discounting information that disagrees.
  • Availability Bias – Placing greater value on information that comes to mind quickly.
  • Attentional Bias – Paying attention to some information about a situation while simultaneously ignoring other, perhaps more important, relevant information.
  • Functional Fixedness Bias – Pigeon-holing a staff member in a narrow job even though she has broader skills and capabilities.

Heidi Grant Halvorson and David Rock, in their paper “Breaking Bias,”6 note that “On the whole [cognitive] biases are helpful and adaptive.  They enable people to make quick efficient judgments and decisions with minimal cognitive effort.  But they can also blind a person to new information or inhibit someone from considering valuable options when making an important decision.” 
In the paper they group the long list of common biases into five categories based upon their underlying cognitive nature – similarity, expedience, experience, distance, and safety.  (Their research group, the NeuroLeadership Institute, has named this model SEEDS™.)  Each of the five categories can be described by a short sentence that helps us remember the model and what that category is all about:
Similarity – “People like me are better than others.”  Individuals tend to focus on things that portray themselves in the most positive light.  This leads to dividing people into an “ingroup,” who are like you and whom you see positively, and an “outgroup,” who are different from you and whom you perceive negatively.
Expedience – “If it feels right, it must be true.”  Our brains have two parallel decision-making systems, named “System 1” and “System 2” by Daniel Kahneman in his book Thinking, Fast and Slow.  System 1 thinking requires relatively little time and effort, while System 2 thinking is slower, more difficult to engage, and less fun, as it accesses a more complete set of your knowledge on the subject.  We tend to use System 1 thinking which, because it makes use of less information, may be in error.
Experience – “My perceptions are accurate.”  The brain has evolved to regard its own perceptions as direct and complete.  We believe that what we see is all there is to see and that it is accurate.  This view ignores many of the brain’s processes that create a fuller picture of reality.  Holding your own perception too tightly also creates significant tensions among team members as you work to develop a solution to the problem before you.
Distance – “Near is stronger than far.”  The closer an object, individual, or outcome is in space, time, or perceived ownership, the greater the value assigned to it.
Safety – “Bad is stronger than good.”  Losing $10 feels much worse than finding $10 feels good.  The safety bias overly focuses us on what we might lose rather than on what we might gain.
Each of these biases has broad applicability to your work as a leader.  Every action that you take occurs in a setting of competing biases.  Developing a better understanding of how your biases impact your personal work as well as that of your team is very important.  You may want to take some time this week to read the article “Breaking Bias.”6  I think that you’ll find it very insightful and helpful as you make decisions in the future. 
And, in next week’s Tuesday Reading, we’ll apply the SEEDS™ model from the Halvorson and Rock paper to some of the human resource issues you face in hiring, promotion, and reviewing performance.
Make it a great week for your team and for you.  .  .  .  jim
Jim Bruce is a Senior Fellow and Executive Coach at MOR Associates, and Professor of Electrical Engineering, Emeritus, and CIO, Emeritus, at the Massachusetts Institute of Technology, Cambridge, MA.
Notes and References:

  1. NeuroLeadership Institute Webinar, “Why Traditional Bias Training Doesn’t Work (and What Does),” January 31, 2018.
  2. Cognitive bias, Wikipedia.
  3. List of cognitive biases, Wikipedia. 
  4. Haselton, M. G., Nettle, D., & Andrews, P. W. (2005). The evolution of cognitive bias. In D. M. Buss (Ed.), The Handbook of Evolutionary Psychology, Hoboken, NJ: John Wiley & Sons, pp. 724–746.
  5. Kendra Cherry, How Cognitive Biases Influence How You Think and Act, January 4, 2018.
  6. Heidi Grant Halvorson and David Rock, Breaking Bias, strategy+business, July 2015.