Introduction
The mysteries of the human brain have always intrigued researchers and psychologists, and the enigmatic realm of human cognition continues to captivate our attention.
In "Thinking, Fast and Slow", psychologist and Nobel laureate Daniel Kahneman explores the two distinct systems that govern our thinking. Drawing on decades of research, he explains that System 1 is fast, automatic, and intuitive, a product of our ancestors' learned experience and their need to react quickly to predators. System 2, by contrast, is slow, deliberate, and analytical. Imagine trying to find your best friend in a massive crowd at a stadium. If you stay focused on specific features to look for, such as their hair color, height, and clothes, and keep searching, you'll be more likely to spot them. But if you get distracted by the game, the noise, and every face in the crowd, you'll start glossing over those details and struggle to find them. That sustained, focused search is System 2 at work: it concentrates your attention and filters out distractions.
System 1 operates quickly and effortlessly, relying on mental shortcuts and patterns to make decisions. Imagine you're driving and the vehicle in front of you suddenly slams on its brakes. Without conscious thought, your foot instinctively moves to the brake pedal. This swift reaction is System 1 thinking: your brain relies on automatic reflexes and pattern recognition to respond rapidly to a familiar situation. System 2, in contrast, is slower and demands conscious effort and concentration. Consider solving a complex math problem or unraveling a challenging riddle. This deliberate engagement of your cognitive abilities is System 2 thinking, in which you consciously apply analytical reasoning to arrive at a solution.
Kahneman explains the powerful influence of these two thinking systems on various aspects of our lives, from personal relationships to economic behavior.
Let’s dive deeper into the impacts of these two systems in our daily lives.
The Impact of Laziness: Decision-Making and Intelligence
Priming and the Unconscious Influence on Our Thoughts and Actions
Our lack of constant conscious control over our thoughts and actions can be attributed to the prevalence of System 1 (rapid thinking), which frequently takes command and influences our conduct without our awareness. This can produce swift, intuition-driven judgments that bypass more deliberate reflection. Your response to a situation is therefore shaped by recently activated thoughts and knowledge, which your mind uses to fill in the blanks. This process is known as priming.
For example, if you are asked to guess a particular three-letter word that starts with 'c' and ends with 't', i.e. 'c_t', your response won't always be the same. If you were just discussing pets with your friends, you'll probably say the word is 'cat'. But if you had just been discussing baby care with your partner, you'd probably say it's 'cot'. Exposure to certain thoughts and concepts nudges us to think along similar lines afterward, and being in such a state means we have been 'primed'. Our bodies can be primed as well: you are more likely to move faster and be more active after reading an adventure novel or watching an action movie, and people primed with money act more individualistically and less cooperatively in other aspects of life. Priming is an unconscious process, a result of System 1 taking over, but because it significantly affects our real-life behavior, it's important to be aware of it when making important decisions.
Making Quick but Flawed Choices: The Halo Effect
Other instances of System 1 taking over are the halo effect and confirmation bias. Imagine you like one thing about a person, say Andrew, whom you met at the shops but don't know much about. Because you like that one aspect, you presume that everything else about him is positive too. So when you're in trouble, you will think of Andrew first because of the image you have created of him in your mind. It is as if he has a halo over his head, hence the name: the halo effect.
We develop a bias toward people of whom we have created a positive mental image, and we tend to agree with the constructs we have previously built, which elevates Andrew to near sainthood in our minds.
Next up is confirmation bias: people readily accept information that confirms their pre-held beliefs, and in the absence of other information, they tend to accept whatever is suggested to them.
For example, when asked "Is David a good football player?", even people who know nothing about David or football will tend to say yes, because the question itself associates David with football. Studies of suggestive questioning support this tendency.
Both confirmation bias and the halo effect happen because our minds want to act quickly. In the absence of more data and deliberate thinking, they can lead to serious errors of judgment and inaccurate conclusions, so they are another thing to watch out for.
Cognitive Shortcuts: Unveiling the Mind's Quick Decision-Making Tools
In various situations, our minds use shortcuts called heuristics to quickly understand our surroundings. Although heuristics are usually helpful for making quick decisions, overreliance on them can lead to mistakes in many situations. Two types of heuristics relevant to us are the substitution heuristic and the availability heuristic.
The substitution heuristic is when our mind answers an easier question instead of the one that was actually posed. An example of the substitution heuristic is when people are asked to rate their own happiness or life satisfaction. Instead of directly assessing their overall happiness, they often rely on an easier-to-answer question, such as "How many social interactions did you have today?" or "Did you experience any positive events recently?" These simpler questions serve as proxies for overall happiness, leading to potential inaccuracies in self-assessment.
The availability heuristic is when we overestimate the probability of something based on how easily it comes to mind or how frequently we hear about it. For example, people overestimate the likelihood of death from accidents compared to death from strokes, although, in reality, death from strokes is much more common. This happens because accidents are more frequently reported in the media and are more vivid in people's minds, leading them to believe that accidents are more common than they actually are.
The availability heuristic can thus bias decisions and judgments toward whatever is easiest to retrieve rather than toward objective statistical data. For instance, someone who frequently hears news reports about shark attacks may believe the likelihood of a shark attack is higher than it actually is, because those instances are vivid and memorable. This is System 1 at work again: the mind makes quick choices from easily accessible information, which can lead to biased judgments and decisions.
Numerical Navigations: How Our Difficulty with Statistics Leads to Mistakes We Could Avoid
Memory's Two Selves: How We Remember Events Differently.
Our minds have two memory selves – the experiencing self and the remembering self. The experiencing self records how we feel in the moment, giving an accurate account of the experience. On the other hand, the remembering self recalls the whole event afterward, but it is less accurate since it registers memories after the event is over. Two reasons contribute to this: duration neglect, where we ignore the total duration of the event and focus on specific memories, and the peak-end rule, where we give more importance to what happened at the end of the event. These memory biases can affect how we remember and interpret past experiences.
During "The Cold-Hand Experiment" conducted by Daniel Kahneman and his colleagues, participants submerged one hand in extremely cold water for a brief but intensely painful duration, and the other hand in the same cold water for a longer period but with a slightly less painful ending. When given the choice to repeat one of the trials, many participants surprisingly opted for the longer, less painful experience, even though it involved more overall discomfort. This outcome demonstrates the "peak-end rule," revealing that our memory and evaluation of past experiences are influenced more by the peak intensity of the experience (the extremely cold water) and how it concluded (with less discomfort) than by the overall duration of the ordeal.
This example highlights how the experiencing self and the remembering self can have different perceptions of the same event, leading to differing memories and judgments. And these effects, too, are linked to our two systems of thinking.
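The contrast between the two selves can be sketched numerically. This is a minimal illustration with invented discomfort scores, not data from the actual experiment; the scoring of memories as roughly the average of the peak and the end follows the peak-end rule Kahneman describes.

```python
# Hypothetical per-moment discomfort scores (0-10); not real experimental data.

def total_discomfort(samples):
    # What the experiencing self lives through: every moment counts.
    return sum(samples)

def peak_end_score(samples):
    # What the remembering self keeps: roughly the average of the worst
    # moment and the final moment (the peak-end rule).
    return (max(samples) + samples[-1]) / 2

short_trial = [8, 8, 8]        # brief but intensely painful throughout
long_trial = [8, 8, 8, 5, 4]   # same start, extended with a milder ending

print(total_discomfort(short_trial), total_discomfort(long_trial))  # 24 33
print(peak_end_score(short_trial), peak_end_score(long_trial))      # 8.0 6.0
```

The longer trial involves strictly more total discomfort (33 vs. 24), yet scores better in memory (6.0 vs. 8.0), which is consistent with participants choosing to repeat it.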
Shifting Mental Focus Shapes Thoughts and Actions.
Adjusting the focus of our minds can significantly impact our thoughts and behaviors. Shifting our attention and perspective can lead to different interpretations and decisions. For instance, focusing on the present moment may influence our immediate emotional responses, while contemplating the long-term consequences can lead to more rational and considered choices.
An experiment involving a group of judges showed that their decisions on whether to grant parole to prisoners were significantly influenced by the time of day. As the judges became mentally fatigued, they tended to default to the easier option of denying parole, which led to more unfavorable decisions for the prisoners. However, after a short break, when their minds were refreshed, the judges were more likely to consider the individual cases more carefully and make fairer decisions. This example highlights how the focus of the mind can dramatically impact decision-making and behavior.
Our minds are by default in a state of cognitive ease, a state of low focus and low energy use. In this state, System 1 dominates, leading to increased creativity and intuitive decision-making, but also making us more prone to errors. Conversely, in a state of cognitive strain, the analytical System 2 takes charge, reducing creativity but minimizing mistakes through more detailed thinking and higher awareness. You can deliberately adjust your mental energy to influence the effectiveness of your thinking: for tasks requiring creativity, like writing a novel, promoting cognitive ease can be a strategic approach, while tasks requiring accuracy benefit from a degree of cognitive strain.
Our minds, naturally seeking cognitive ease, construct coherent worldviews that align with familiar narratives, bolstering our confidence in judgments. However, this predisposition can lead to errors as we fill gaps with assumptions and biases, often drawing false conclusions from limited or contradictory evidence. Cognitive biases and heuristics, such as the availability heuristic, further influence our thinking, causing overestimations and biased responses.
While creating mental shortcuts helps simplify complexity, awareness of these tendencies is vital to mitigate overconfidence and judgment errors in decision-making.
The Probability Puzzle: How Presentation Shapes Our Decision-Making
The way we assess situations and handle challenges is heavily impacted by how they are presented to us. Even minor alterations in the wording or emphasis of a statement or question can have a significant influence on our approach.
Let’s take an example. In a decision-making scenario, participants were given two investment options. The first investment was portrayed as having a 90% chance of yielding positive returns, while the second was described as having a 10% chance of resulting in losses. Despite both options essentially conveying the same likelihood of success (90% success rate), individuals consistently favored the first investment due to its positive framing, illustrating how the presentation of information can significantly influence decision-making.
This example illustrates how the way information is presented, such as framing it in terms of success rate versus failure rate, can significantly influence our decision-making and judgment of risk.
This phenomenon extends to the concept of relative frequency versus statistical probability. People often respond differently to information presented in terms of how often an event occurs relative to others rather than its raw statistical probability.
Let’s look at an example. Would you avoid buying a home in an area where 1 out of 10 residents is a convict? Or would you avoid buying in an area where there’s a 10% chance of a resident being a convict? Even though both statements describe the same thing, experiments show you’re more likely to avoid the first option, because it is presented in terms of relative frequency, which has a greater impact on most people’s minds.
Denominator neglect is another cognitive bias where people focus on the numerator of a ratio while overlooking the denominator.
For instance, if people are told that one goalkeeper saved 70 goals out of 100, and another saved 0.7 goals out of every 1, they tend to favor the first, because the large numerator of 70 dominates in the mind even though the two rates are identical.
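The equivalences underlying both examples above are simple arithmetic, which makes the pull of framing all the more striking. A small sketch (using the hypothetical numbers from the text) confirms that the differently framed statements describe the same underlying rates:

```python
# Verify that differently framed statements express identical rates.
from fractions import Fraction

# "1 out of 10 residents" vs. "a 10% chance"
convicts_frequency = Fraction(1, 10)
convicts_probability = Fraction(10, 100)
assert convicts_frequency == convicts_probability

# "saved 70 goals out of 100" vs. "saved 0.7 goals out of every 1"
saves_big_numerator = Fraction(70, 100)
saves_small_numerator = Fraction(7, 10)
assert saves_big_numerator == saves_small_numerator

print(float(saves_big_numerator))  # 0.7
```

Denominator neglect means the mind weighs the 70 and the 0.7 directly, ignoring that each must be read against its own denominator.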
Human Decision-Makers: Why Rational Thinking Alone Doesn't Guide Our Choices.
Chapter 12
Details coming soon.