The book starts by describing our two styles of thinking…
- System 1 – operates automatically and quickly, with little or no effort and no sense of voluntary control.
- System 2 – allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.
…and the next 30+ chapters describe situations where System 1 can get us into trouble.
The book focuses on the cognitive biases that fool humans. There was a stretch of about 100 pages, starting around chapter 4, where it began to feel repetitive and I felt like putting the book down.
That said, I kept going and enjoyed the remaining 2/3 of the book much more.
These were some of my favourite parts:
- Outcome bias – the tendency to construct a story around events after we already know the outcome and conclude that the outcome was inevitable. For example, many people claim that they ‘knew well before it happened that the 2008 financial crisis was inevitable’.
Kahneman points out that we only remember the intuitions that turned out to be true; no one talks about the ones that turned out to be false. He also cautions us to consider the role that luck plays in any success.
- Intuitions vs Formulas – Kahneman cites Paul Meehl’s finding that human decision makers are usually inferior to a simple prediction formula, even when they are given the score the formula suggests.
In the world of software this reliance on intuition often shows up in job interviews, where we choose who to hire based on an intuitive judgement. Kahneman suggests making a list of about 6 traits that are prerequisites for the position and then ranking each candidate against them; the candidate who scores highest is the best choice.
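The scoring approach above is simple enough to sketch in code. This is a minimal illustration, not anything from the book: the trait names and the 1–5 ratings are hypothetical examples I've made up.

```python
# A sketch of Kahneman's trait-scoring approach to hiring:
# rate each candidate independently on ~6 pre-chosen traits,
# then pick the highest total. Traits and scores are hypothetical.

def score_candidate(ratings):
    """Sum independent 1-5 ratings, one per pre-chosen trait."""
    return sum(ratings.values())

traits = ["technical skill", "communication", "reliability",
          "teamwork", "domain knowledge", "initiative"]

candidates = {
    "A": dict(zip(traits, [4, 3, 5, 4, 2, 3])),  # total 21
    "B": dict(zip(traits, [3, 5, 4, 4, 4, 4])),  # total 24
}

# The candidate with the highest total score is the best choice.
best = max(candidates, key=lambda name: score_candidate(candidates[name]))
print(best)  # prints "B"
```

The point of the mechanical sum is that each trait gets rated before an overall impression can contaminate the judgement.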
- Over-reliance on representativeness – we often judge the probability of something based on stereotypes rather than taking the base rate of the population into account.
e.g. if we saw a person reading The New York Times on the subway, which of the following is a better bet about this stranger?
- She has a PhD.
- She does not have a college degree.
If we judge based on representativeness, we’ll bet on the PhD because our stereotypes tell us that someone with a PhD is more likely to be reading The New York Times than a person without a college degree. However, there are many more non-graduates on the subway than PhDs, so the stranger most likely doesn’t have a college degree.
Kahneman encourages us to take the base rates of the population into account even if we have evidence about the case at hand. He uses the example of predicting how long a project will take and suggests using the data from previous similar projects as a baseline and then adjusting that based on our specific case.
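The subway bet can be made concrete with a little arithmetic. The probabilities below are hypothetical numbers of my own, chosen to be generous to the stereotype, yet the base rate still wins:

```python
# Hypothetical figures: even if a PhD is far more likely to read
# the Times, there are far fewer PhDs on the subway, so the
# "no college degree" hypothesis is still the better bet.

p_phd = 0.02            # assumed share of riders with a PhD
p_no_degree = 0.50      # assumed share with no college degree

p_nyt_given_phd = 0.40        # assumed P(reads NYT | PhD)
p_nyt_given_no_degree = 0.05  # assumed P(reads NYT | no degree)

# Joint probability of each hypothesis together with the evidence
joint_phd = p_phd * p_nyt_given_phd                     # ~0.008
joint_no_degree = p_no_degree * p_nyt_given_no_degree   # ~0.025

print(joint_no_degree > joint_phd)  # prints True
```

Multiplying the base rate by the likelihood of the evidence is the Bayesian discipline Kahneman is encouraging: the stereotype only supplies the second factor.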
- Framing – Kahneman talks about the impact that the way information is presented to us can have on the way we react to it. For example, he describes a study in which the merits of surgery for a condition are considered and the descriptions of the surgery are like so:
- The one month survival rate is 90%.
- There is 10% mortality in the first month.
People who were presented with the first option were much more likely to favour surgery than those shown the second option.
- The default option – people feel much more regret if they deviate from the normal choice and something goes wrong than if they stick with the default and something equally bad happens. This effect is well known to form designers, who make the option they want people to select the default.
There are many more insights and these are just a few that stood out for me.
I’d encourage you to read the book if you find human decision making interesting, although I’d try to forget everything I’ve said, as you’ve probably been inadvertently primed by me.