Imagine, for a second, two brains.



In Thinking, Fast and Slow, we are encouraged to confront our unconscious biases by imagining the brain as two systems.

This week’s video is about Daniel Kahneman’s Thinking, Fast and Slow.

It’s useful to imagine that we have two distinct systems in our brains. One models the world, trying to predict what comes next and save us the energy of engaging the second. In the process, it feeds the errors to our conscious attention.

Simple and coherent stories seem valid to us, even when they’re just post-hoc rationalizations of random events. Our self-reported confidence is determined by the coherence of the narrative we tell ourselves, not the quality of our judgement.

Three of the most robust and broadly applicable examples the book discusses are the anchoring, availability, and affect heuristics. You should know these three.

Below, you will find this week’s video from my Youtube channel, some quotes and visuals you might like to share, and a transcript of the video. If you find any of this useful, sign up for my newsletter – I share weekly, across a range of topics that might help you become more intuitive and knowledgeable, and take control of your own life.

This week’s video:

Illustrations and Quotes

Transcript:

This week, I’ll be talking about Daniel Kahneman’s Thinking, Fast and Slow. It’s about decision-making and how our amazing brains can go wrong in certain circumstances. Like most psychologists, he’s much too quick to generalize from games to life and from experimental subjects to human nature. But there’s a lot of value in this book that I wanted to share with you.
Firstly, you’ll be invited to imagine two systems in your brain: one automatic, unconscious system and one controlled, conscious system. Second, we’ll talk about what intuition is and why we can be overconfident in our own judgement. And finally, three automatic judgements your brain might be making for you, based on availability, anchoring, and emotions.
Section One: System 1 and System 2
What Kahneman invites us to do is conduct a thought experiment: to separate our brains into two separate systems, purely for the purpose of illustrating several psychological phenomena that are hard to explain if you think of your brain as a single entity.
So, the first of these he refers to as System One.
System One is the set of autonomous responses you’re largely unaware of: they filter what you see before you even see it, they affect how you make decisions and how you think about the world around you, and they control the insights that are passed on to System Two. What System One is very, very good at is modeling: working out what might happen next given the whole complex array of things being fed into you by your senses and your memory, connecting all of that together to create a congruent, coherent story, guessing what’s going to happen next, and preparing your body for it based on all of your past experiences, without you even thinking about it.
What it can’t do is anything to do with statistics. Kahneman’s interest, in many ways, came from economics, and his point is that System One is terrible at economics. It’s terrible at making financial decisions, and at other decisions in your life that involve statistical analysis or any kind of conscious thought, even when you think that System Two, your controlled attention, is in control, focusing you on the problem in front of you and helping you to make a good decision.
The information that’s fed into System Two is put there by System One, and it’s already biased in several ways, which makes the analysis of statistics very difficult to do in a controlled and, as he would call it, rational manner.
So System Two is activated in a few different situations, in a few different ways. Firstly, if your System One can’t explain the world around you: if something is incongruent, if something doesn’t fit the narrative, then System Two switches on automatically.
It finds something incongruent, something unexpected, to focus its attention on, latches onto it, and encourages you to wake up. The second is controlled attention: when you choose to give your attention to something. That’s really part of a personality trait. Kahneman refers to a spectrum: some people have a very active System Two that switches on more quickly than other people’s, and he might refer to them as rational or engaged, depending on which terminology he wants to use.
These are the people who try to think through problems in a programmatic, logical way and try to avoid the influence of their emotions (unsuccessfully, for Kahneman), but because they switch on earlier, they miss some of the traps that other people fall into. And at the other end of the spectrum are people who live most of their lives on autopilot, who have what Kahneman refers to as lazy brains.
It’s worth mentioning that none of this is literal. You don’t have a complete division between System One and System Two. But without thinking of your brain in that way, it’s very difficult to realize just how much influence your subconscious, or your System One as Kahneman refers to it, has over what you see, the decisions you make, and your life in general. So it’s a useful thought experiment.
One of the key questions that Kahneman asks throughout is: to what extent is System One trainable? How much better can your automatic responses to situations get, particularly situations involving statistics and economic decision-making? And for Kahneman, most of the cognitive biases and illusions he refers to are non-trainable; they can’t be fixed.
Your automatic responses will always be wrong. And that’s really where my caveat comes in, because he’s only referring to games; he’s referring to situations with a defined goal.
Section Two: Overconfidence
So what leads to overconfidence for Kahneman? It’s essentially our lazy brains. When we see a coherent narrative, we are very likely to feel confident in it. And indeed we ascribe coherence and causality when there is nothing but chance going on.
So he borrows the narrative fallacy from Taleb. The narrative fallacy is this: presented with a series of incoherent data that make no sense at all, you’ll find a story in there somewhere, something to explain how it all happened in the end. You’ll rationalize it post-hoc and come up with a narrative that looks inevitable. But if you analyzed it rationally, you’d probably find that at every turn there were a million different ways events could have gone. One example Kahneman gives is the rise of Google. In retrospect, you can look at all the wonderful business decisions they made and think of specific instances of brilliance.
But most business analysts would agree that, while they’re great in some ways, or at least successful as far as business is concerned, they are largely where they are due to a lot of chance. The role of circumstance is vastly underestimated when you try to pull together a coherent narrative, and our brains do that all the time, in our own lives and in what we observe around us. Our brains look for coherence where there might or might not be any, and it’s very difficult to judge the difference between the two without taking a view from outside your own narrative, without looking at alternative possibilities and things along those lines.
The second effect that he talks about as far as coherence is concerned is the halo effect.
So this is where, if you make an observation about somebody or something that seems good, that impression expands to the whole thing: the whole person, concept, idea, or process seems good. And what that means is it’s very difficult to notice that actually most things are nuanced: they come with benefits and costs.
And what the narrative fallacy and the halo effect do together is feed your brain a coherence that isn’t always there.
(intuition)
One of the most interesting areas that Kahneman goes into on this is around intuition.
So he had what he called an adversarial collaboration with Klein around intuition. They started from very different points of view on the ability of people to read the environment around them and make good predictions and good decisions, as compared to simple formulas, for instance.
And what they largely agreed on is that some intuitions, particularly emotional intuitions, are inherited and automatic, probably evolutionary. So our brains know many circumstances in which we should be scared or excited, and they react appropriately in those cases. But many other intuitions are practiced, like those of chess grandmasters, or of firefighters who know when to get out of a burning building.
Most expertise in those fields comes from deliberate practice. That’s what Klein focuses on, as opposed to the biases and heuristics that Kahneman focuses on.
The division between these two comes, as far as Kahneman and Klein are concerned, from the environment in which you operate. In a stable environment, one which is fairly regular, almost rules-based, and in which people get regular feedback on how good their decision-making is, our intuitions based on pattern recognition are really, really good.
But outside of those stable environments, particularly when it’s not very clear what we should be looking at and we don’t have clear rules of causation or games to follow, the brain is very liable to mix up coherence and confidence. When we can tell a coherent story, we feel confident about that story, and because of the way our System One can be misled by heuristics and biases, coherence and confidence should be treated very separately.
(three different ways in which our brains can be misled)
The first is called anchoring, which you might be familiar with.
It’s the idea that if you’re given a value before you estimate something, anything, your estimate is influenced by that value. One of the most memorable examples for me: people who were about to be asked a statistical question with a percentage answer were first shown a roulette wheel being spun.
And for this study, they weighted the wheel so that either a 10 or a 65 showed up, depending on which group you were in. The people who saw a 10 on the roulette wheel, which has nothing to do with the statistical question, gave answers that averaged 25%; the people who saw 65 averaged 45%.
So a roulette wheel, which has nothing to do with the statistical question being asked, influenced the responses people gave. And the same has been shown in many different settings. Anchoring is one of the most robust effects Kahneman studies throughout the book, and for anchoring it comes down to two things.
One of these is an error that we regularly make in our System Two, called under-correction. Those of you who drive will know that if you come off a high-speed road, it’s very easy to just continue driving at a reasonably high speed, in excess of the speed limit of the road you’re moving onto, or in excess of the speed you would normally drive on that road.
And that’s because your brain is liable to stick with its first impression, or its previous impression, and under-correct. The second is something called priming, which is a more subconscious effect that you’re normally less aware of.
The best example of that was a study which asked people about the temperature in Germany. In one sample, they were asked whether it was above or below 20 degrees; in the other, whether it was above or below 5 degrees. They were then shown a series of words.
And while their brains were being monitored to see how quickly they fired in response to those words, the participants who saw the figure of 20 (that’s 20 degrees centigrade, around 68 for those of you who usually use Fahrenheit, reasonably warm but not hot) were primed to more quickly recognize words associated with summer.
Those who saw five degrees in the previous question were primed to more quickly recognize words associated with winter. So priming doesn’t just apply to statistical questions; it also draws on the huge array of correlations that your brain has.
The second effect I was going to talk about is availability. That’s whether we can recall instances of something in our memory easily; if memories are readily available to us, we’re more likely to be guided by them.
So when you think about the risk of flooding, or the risk of just about anything, you’re often guided by extreme events. We have the high-water marks of rivers. We have the media constantly repeating stories designed to grab your attention by, effectively, scaring you.
They pick the most extreme of all the different stories they could tell. And because of that expectation, that when we’re talking about risk we’re talking about very serious, bad risk, people tend to overestimate things that are more extreme. They overestimate the likelihood of those things happening compared with things that are less available in our memories, even if those other things are far more common.
And the final one is the affect heuristic. We’re largely guided by emotions, and things that are more emotive to us are more memorable, and also more easily manipulated by other people.
If you read a positive piece on a technology, something that talks about how solar power will revolutionize the future, you’re more likely to underestimate any risks involved in that technology afterwards. Simply being more aware of the positive impact of the technology makes you less likely to think about the risks or the negatives.