Summary: Michael Lewis wrote an incredibly engaging and thought-provoking book about the groundbreaking work in cognitive bias by Tversky & Kahneman. Always framed with story narratives, the book is about economics, psychology, business, productivity, and friendship.
Early this January, I read The Undoing Project. Specifically, I listened to the audiobook version. The book was an endless spring of interesting ideas about cognitive bias and why people make mistakes in their thought processes. It frames the ideas and biases within longer and shorter narratives, which is one reason it stays engaging throughout. If you are familiar with Michael Lewis' other works, like The Big Short, Moneyball, The Blind Side, and so on, it fits right alongside them—compelling human-driven narrative to explain complex topics of finance, psychology, sports, and economics.
The Undoing Project: A Friendship That Changed Our Minds
Michael Lewis (2016, W.W. Norton) (audiobook, read by Dennis Boutsikaris)
Kahneman & Tversky
The main characters of the book are Danny Kahneman and Amos Tversky, two Israeli psychologists who pioneered much of the work in cognitive bias. The book emphasizes the partnership and personal relationship between the two. They formed an incredibly prolific duo, a successful and productive pairing of the kind often termed a “fertile pair.” Fertile pairs appear in many different spheres, including entertainment, writing, academia, business, and elsewhere.
Availability heuristic
The availability bias is the tendency to overestimate the importance or frequency of an event by how easy it is to think of an example of it. For example, one study gave out 2 lists of names—one list of men’s names and another list of women’s names—and respondents had to judge afterward which list had been longer. If one list was 1 name shorter but included the names of some very famous people, the respondents remembered that list as being longer. It stuck out to them because the names were more prominent and recognizable. The list was easier to recall, so they exaggerated its size.
Anchoring heuristic
The anchoring bias is the tendency to perceive information relative to a reference point. For example, a study gave respondents a multiplication problem and only enough time to guess the answer, not to work it out. Mathematically, all the problems had the same answer, but half the respondents got a problem that started by multiplying small numbers, and half got a problem that started by multiplying higher numbers. The average guess of the first group (small numbers first) was much lower than the average guess of the second group (higher numbers first). They anchored themselves to the first information at the start of the problem, and that influenced their guesses.
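The book describes the setup in general terms; the commonly cited version of this study used the sequence 1×2×…×8 versus 8×7×…×1. A quick sketch of the arithmetic shows why the two framings anchor so differently even though the products are identical:

```python
import math

# Sketch of the anchoring experiment's arithmetic. The specific
# sequence here is the commonly cited version of the study, not a
# detail taken from the book.
ascending = [1, 2, 3, 4, 5, 6, 7, 8]   # "small numbers first"
descending = ascending[::-1]           # "higher numbers first"

# Both orderings multiply out to the same answer: 8! = 40320.
assert math.prod(ascending) == math.prod(descending) == 40320

# With no time to finish, respondents anchor on the first few partial
# products they can compute -- e.g. 1*2*3 = 6 versus 8*7*6 = 336 --
# which pulls the ascending group's guesses much lower.
print(math.prod(ascending))  # 40320
```

The first partial products (6 versus 336) act as the reference point, even though neither is close to the true answer.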
Representativeness heuristic
The representativeness bias is the tendency to evaluate situations based on how well they reflect the essential characteristics of a larger category. For example, the Linda problem asked respondents to make a guess about a hypothetical person named Linda. They were prompted with information about Linda that hinted at some political interests and some outspoken personality traits. Respondents had to guess which was more likely: (1) Linda is a bank teller; or (2) Linda is both a bank teller and active in the feminist movement. It is logically impossible for (2) to be more likely than (1), since “Linda is a bank teller” includes all situations where Linda is a feminist and all situations where she is not. Even so, in multiple different studies, respondents were more likely to pick (2) than (1). They judged that option (2) was a better depiction of Linda, because it seemed more representative of her.
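The logical claim above is the conjunction rule: the probability of two things together can never exceed the probability of either one alone. A tiny numerical illustration, with probabilities invented purely for the example (the book gives no numbers):

```python
# Hypothetical probabilities, chosen only for illustration.
p_teller = 0.05                # P(Linda is a bank teller)
p_feminist_given_teller = 0.9  # suppose nearly every such teller is a feminist

# P(teller AND feminist) = P(teller) * P(feminist | teller).
# Since the second factor is at most 1, the conjunction can never
# exceed P(teller) on its own -- no matter how representative the
# description of Linda feels.
p_both = p_teller * p_feminist_given_teller

assert p_both <= p_teller
print(f"P(teller) = {p_teller}, P(teller and feminist) = {p_both:.3f}")
```

Even with a conditional probability of 0.9, the conjunction comes out strictly smaller, which is exactly why picking option (2) is a fallacy.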
Simulation heuristic
The simulation bias is the tendency to overestimate the likelihood of an outcome by how easy it is to imagine. For example, people who lost while participating in a lottery or raffle were most upset if their ticket was only a single digit off from winning. Those whose tickets had all the digits wrong felt much less upset. So if the winning ticket was 54321, somebody with ticket 54322 felt more regret than a person with ticket 96278. Even though the tickets were random and a losing ticket is a losing ticket, it was easier to simulate winning if you only had to imagine one digit changing.
Many other biases
The list of biases in the book is far longer, and the details more interesting. The list of biases not covered in the book is greater still. So this is not meant to be a comprehensive list, just a sample of the interesting subjects covered. The book also delves into the background of many different characters, especially Danny and Amos.
Relationship to law, insurance, investments, etc.
The book is about the human perception of information. That makes it relevant to the practice of law, and to financial decision-making by insurers and investors. It’s about our biases in judging the quantity and frequency of risks and rewards. That applies to making investment decisions, to advising clients on legal risks, to insurers weighing relative risks, and to many other decisions made by lawyers, businesses, and individuals.
The book won’t get you to eliminate your own cognitive biases. But it can help you understand and identify them.