Think Fast, Think Slow
Sometimes we need to react fast, automatically. For example, when we see a large truck speeding toward us as we stand at the edge of the street waiting for a traffic light to change. Or when we observe the subtle cues of a very dissatisfied client. At other times, we may find ourselves totally engrossed in the deep work1 of a seemingly intractable problem, and then our thoughts and actions need to proceed at a slower pace.
Daniel Kahneman addresses the systems the brain uses for these two kinds of thinking in his 2011 book Thinking, Fast and Slow.2 Kahneman is a psychologist noted for his work on the psychology of judgment and decision making and in behavioral economics, for which he received the 2002 Nobel Memorial Prize in Economic Sciences. Today, he is the Eugene Higgins Professor of Psychology and Professor of Psychology and Public Affairs, Emeritus, at the Woodrow Wilson School of Public and International Affairs at Princeton University.
The key takeaway from Thinking, Fast and Slow is that we have two modes of thinking: System 1, which “operates automatically and quickly, with little or no effort and no sense of voluntary control,” and System 2, which “allocates attention to the effortful mental activities that demand it, including complex computations.”
Kahneman2 tells us that “when we think of ourselves, we identify with System 2, the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do. Although System 2 believes itself to be where the action is, the automatic System 1 is the hero in the book. I [Kahneman] describe System 1 as effortlessly originating impressions and feelings that are the main sources of the explicit beliefs and deliberate choices of System 2.”
System 2 thinking requires your relatively undivided attention. If you do not pay attention, you will perform less well or not at all. Think about making a complex math calculation or completing a tax form.
Kahneman also wrote that “Systems 1 and 2 are both active whenever we are awake. System 1 runs automatically and System 2 is normally in a comfortable low-effort mode, in which only a fraction of its capacity is engaged. System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions, and feelings. If endorsed by System 2, impressions and intuitions turn into beliefs, and impulses turn into voluntary actions. When all goes smoothly, which is most of the time, System 2 adopts the suggestions of System 1 with little or no modification. You generally believe your impressions and act on your desires, and that is fine – usually.
“When System 1 runs into difficulty, it calls on System 2 to support more detailed and specific processing that may solve the problem of the moment. System 2 is mobilized when a question arises for which System 1 does not offer an answer… System 2 is activated when an event is detected that violates the world that System 1 maintains.”
Ameet Ranadive, Director of Product at Twitter, tells us in his essay, What I Learned from “Thinking Fast and Slow,”3 that “System 1 is continuously creating impressions, intuitions, and judgments based on everything we are sensing. In most cases we just go with the impression or intuition that System 1 generates.” It is in System 1 that our unconscious drivers, such as in-group/out-group bias, similarity bias, and confirmation bias, influence our judgment and decision making. We discussed biases in two Tuesday Readings in early March.4
A reasonable question to ask that we did not address in our earlier readings on bias is this: Just how are our biases formed? The simple answer is we don’t have a good answer to the question. We do know that biases are pervasive, we all have them. That our biases do not necessarily align with our declared beliefs. That our biases generally favor our own in-group. And, we also know that our biases are malleable. Our brains are exceedingly complex, and we can learn and adopt new associations and unlearn others.
The mental process behind this learning and unlearning is the target of television advertisements that are repeated in every ad block within a program known to be watched by a target audience. By repeating the message, the ad’s sponsor seeks to bias the target audience toward an action favorable to the advertiser (e.g., buying a specific product). When something is repeated enough times, we tend to believe it. Our brains have difficulty distinguishing between familiarity and truth. We are all gullible.
This same approach is used by political organizations, including those involved in the reported Russian interference in the 2016 U.S. Presidential Election, to influence a target audience to vote a favorable way. That audience is selected based on data about individual preferences, beliefs, etc., available from social media.5
The “Cognitive Reflection Test”6,7 devised in 2005 by Shane Frederick, Professor of Marketing, Yale School of Management, provides a set of simple examples illustrating how System 1 and System 2 thinking work. The test has three questions that give a simple indication of whether a person is using System 1 or System 2 thinking to answer them.
I urge you to answer each question as you read its text (and before you read the comments immediately following the test):
- A bat and a ball cost $1.10. The bat costs $1.00 more than the ball. How much does the ball cost? _____¢
- It takes five machines five minutes to make five widgets. How long will it take for 100 machines to make 100 widgets? _____ minutes
- In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to grow and cover the entire lake, how many days will it take for the patch to cover half of the lake? _____ days
What is particularly interesting here is that these questions tempt you, really your brain, to respond using System 1 thinking, that is, your human intuition. And here, your intuition fails you. Indeed, of the 3,428 university students who completed Frederick’s initial studies, 33% got all three questions wrong and 83% got the answer to at least one question wrong. MIT students got 48% of the answers correct, the highest percentage for any university. (Frederick was on the MIT faculty when he was conducting this research.)
The intuitive, though incorrect, answers are 1) 10¢, 2) 100 minutes, and 3) 24 days. The correct answers are 1) 5¢, 2) 5 minutes, and 3) 47 days. Frederick observed from the test that “Your intuition is not as good as you think.”
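If you want to see why the intuitive answers fail, the arithmetic behind each correct answer can be checked in a few lines of Python. (This is just an illustrative sketch of the reasoning, not part of Frederick’s test.)

```python
# 1) Bat and ball: ball + bat = 1.10 and bat = ball + 1.00,
#    so 2 * ball + 1.00 = 1.10, giving ball = 0.05 (5 cents, not 10).
ball = (1.10 - 1.00) / 2
assert abs(ball - 0.05) < 1e-9

# 2) Widgets: 5 machines make 5 widgets in 5 minutes, so one machine
#    makes one widget in 5 minutes. 100 machines working in parallel
#    still need only 5 minutes for 100 widgets (not 100 minutes).
minutes_per_widget_per_machine = 5
minutes_for_100_machines = minutes_per_widget_per_machine  # parallel work
assert minutes_for_100_machines == 5

# 3) Lily pads: the patch doubles daily and covers the lake on day 48,
#    so it covered half the lake one doubling earlier, on day 47 (not 24).
half_lake_day = 48 - 1
assert half_lake_day == 47
```

In each case, System 1 pattern-matches on a salient number (10, 100, 24), while one step of System 2 arithmetic gives the right answer.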
System 2 thinking, when intensely focused on a task, can make you effectively blind, even to strong visual stimuli that you would otherwise see. For an example of this effect, you may want to watch the short video, The Invisible Gorilla. Instructions for watching the video ask you to focus on a specific, demanding task (provided at the beginning of the video) while watching it. [Watch now.] We’ve all experienced this phenomenon to some degree in our regular lives. I work with the windows in my study facing a not-that-busy residential street. If I’m focused on my work, I miss people walking their dogs, cars, the school buses, the trash truck, etc. And, if my focus is less intense, I see them all.
So, what are some of the takeaways from our understanding of System 1 and System 2 thinking?
- It is very easy to fall into a System 1 trap: approaching an issue that requires detailed analysis with only your intuition, impressions, biases, or feelings. If there is hard data, it’s time for System 2.
- System 1 thinking often leads one to jump to conclusions, make overly quick judgments, and reach bad, wrong decisions. In other words, we can fool ourselves into thinking all the parts fit when they don’t.
- We can be blind to the obvious. (As an example, recall the gorilla video.)
- Our minds choose the familiar instead of the true. People evaluate the relative importance of an issue by the ease with which they can retrieve it from memory. Frequently mentioned topics populate the mind as others slip away from awareness. Similarly, the media chooses to report what they believe is currently on the public’s mind. In an authoritarian regime, pressure can be exerted to drive what is reported.
- Most of the time we rightly go with System 1’s recommendations. However, don’t ignore your brain’s unease suggesting that something isn’t quite right, that something unexpected has popped up, that something needs some critical thought. Putting in the extra time will be worth it.
- System 1 works to produce a coherent, believable story based on the available information. This may lead to a what-you-see-is-all-there-is (WYSIATI) conclusion: there is only limited information to work with, and much is missing. You may rely on others’ well-meaning but incorrect judgments and impressions rather than seeking what’s missing.
- It is easy to become overconfident about the future and to let our view of the future directly affect decisions in the short term.
- And, a fun fact: choosing bad fonts and harsh colors in a document will trigger System 2 thinking, leading people to work harder and more carefully on it.
- Studying your own weaknesses is extremely difficult. It is far easier to see others’ mistakes than our own.
There are lots of important ideas here that can, and most likely should, change how you think. My hope is that you’ll latch onto one or two ideas that are particularly important to you at this time and run with them. Then come back later for another two. I do believe it will pay significant dividends. Just learning to routinely ask for the data and the reasoning behind a decision may be worth the cost of learning how to think better.
Make it a great week for you and your team!
. . . . jim
Jim Bruce is a Senior Fellow and Executive Coach at MOR Associates, and Professor of Electrical Engineering, Emeritus, and CIO, Emeritus, at the Massachusetts Institute of Technology, Cambridge, MA.
References:
- Cal Newport, Deep Work: Rules for Focused Success in a Distracted World, Grand Central Publishing, January 2016.
- Daniel Kahneman, Thinking, Fast and Slow, Farrar, Straus, and Giroux, LLC, 2011.
- Ameet Ranadive, What I Learned from “Thinking Fast and Slow,” medium.com, February 2017.
- Tuesday Reading, Bias, March 6, 2018, and Mitigating Bias, March 13, 2018.
- Dipayan Ghosh and Ben Scott, Facebook’s New Controversy Shows How Easily Online Political Ads Can Manipulate You, Time, March 19, 2018.
- Shane Frederick, Cognitive Reflection Test, Wikipedia.
- Tara Kadioglu, Why Slow Thinking Wins, The Boston Globe, July 2015.