Abstract
Kahneman’s framework describes two thinking modes. System 1 is fast, intuitive, and automatic; System 2 is slow, deliberate, and analytical. Most judgements are driven by System 1, which relies on heuristics: mental shortcuts that save effort but produce predictable, systematic errors known as biases.
Background
Heuristics are mental shortcuts or rules of thumb that help people make quick decisions and solve problems efficiently, often by simplifying complex information. Heuristic research began in earnest in the 1950s and 1960s as psychologists moved away from simple behaviourist models toward cognitive explanations of judgement and decision-making.
Early work focused on how people form beliefs and make inferences under uncertainty. Herbert Simon, later a Nobel laureate in economics, introduced the idea of bounded rationality in the mid-1950s, arguing that because time and information are limited, people satisfice (settle for a good-enough option using simple decision rules) rather than optimise. He held that rational choice theory is an unrealistic description of human decision processes and called for psychological realism. This set the stage for more detailed study of the mental shortcuts that people actually use.
The landmark phase of heuristic research was driven by Amos Tversky and Daniel Kahneman in the early 1970s. Their experiments identified key heuristics (availability, representativeness, anchoring) and documented the systematic biases these shortcuts produce. Rather than viewing errors as random noise, Tversky and Kahneman demonstrated predictable departures from normative probability and utility theory. This work brought judgement research to the centre of the cognitive revolution and challenged the dominant economic model of rational agents. Kahneman’s empirical work on judgement and decision-making has reshaped epistemology by bringing descriptive psychology into contact with traditional normative questions about belief, rationality, and evidence.
Kahneman and Amos Tversky (1937–96) founded the branch of psychological research called 'heuristics-and-biases psychology' in the 1970s. It deals with the heuristics, or informal rules, that the human mind uses to make quick decisions. The best-known early statements of these heuristics appear in their seminal 1974 paper, "Judgment Under Uncertainty: Heuristics and Biases".
Kahneman acknowledged that heuristics often work well. The representativeness heuristic, for instance, is the tendency to judge the probability of an event by how closely the circumstances match one's mental image of such an event (the likelihood of rain seems higher on a day that starts out cloudy). Unfortunately, heuristics also leave people susceptible to various systematic errors in judgement, the "biases" in "heuristics and biases". One well-known bias that accompanies the representativeness heuristic is base-rate neglect, in which people ignore statistical information and overestimate the likelihood of rare but easy-to-imagine scenarios.
Through the 1970s and 1980s, research diversified: psychologists investigated a wider array of heuristics and biases (e.g., hindsight bias, confirmation bias), while economists and behavioural scientists began incorporating these findings into models of economic behaviour. The rise of prospect theory (Kahneman & Tversky, 1979) offered a formal account of how people evaluate gains and losses, explaining anomalies like loss aversion and reference dependence and bridging psychology with decision science.
In the 1990s and 2000s, heuristic research expanded into real-world domains — finance, medicine, law, and public policy — examining how cognitive shortcuts affect professional judgement and systemic outcomes. Experimental methods were complemented by field studies, neuroimaging, and computational modelling, which clarified mechanisms and boundary conditions of heuristics. Intervention research also grew, testing debiasing techniques such as decision aids, checklists, structured analytic techniques, and training to engage more deliberative processing.
Contemporary work treats heuristics as adaptive strategies rather than mere flaws: the fast-and-frugal heuristics programme formalised how simple rules can be effective in particular environments. At the same time, integrative approaches explore when heuristics perform well versus poorly, how people choose among strategies, and how social and cultural factors shape reliance on particular heuristics. Research today is interdisciplinary — spanning cognitive science, behavioural economics, neuroscience, and AI — and increasingly focused on designing institutions and tools that harness useful heuristics while mitigating costly biases.
Summary
Part 1
Introduction
Kahneman begins by describing the goal of his book: to give people a richer vocabulary for discussing and detecting errors in judgement. He offers a brief history of his own professional interest in the psychology of judgement and decision-making, illustrated by some examples of the successes and failures of human intuition. Finally, Kahneman provides a high-level outline of Thinking, Fast and Slow (2011), which begins by detailing the workings of two complementary "systems" of cognition and describes the heuristics, or rules of thumb, on which those systems rely.
In the "Origins" introduction, Kahneman explains the book’s purpose: to describe two systems of thinking – System 1 (fast, automatic, intuitive) and System 2 (slow, deliberate, analytical) – and to show how their interaction shapes judgements and choices. He frames the book as an attempt to synthesise decades of research in cognitive psychology and behavioural economics, much of it produced with Amos Tversky, to explain systematic errors people make and the predictable ways intuition and reasoning fail.
Chapter 1: The Characters of the Story
Kahneman introduces the two systems alluded to in the introduction. System 1 is the automatic, intuitive set of thought processes by which people often make decisions, sometimes without conscious awareness. System 2 is deliberate and rational, but it is also lazy, often superficially endorsing whatever intuitive judgement System 1 comes up with.
Moreover, the use of System 2 involves a focusing, and thus a narrowing, of attention. When System 2 is engaged on a problem, a person becomes much less aware of anything not immediately relevant to solving the problem. The systems come into conflict whenever a person must do something counterintuitive, such as "steer into the skid" on an icy road or make sense of seemingly contradictory visual data. At the end of Chapter 1, Kahneman is careful to point out that the two "systems" are really aliases for different ways of thinking. They are not distinct physical regions of the brain, separate neural pathways, or anything quite that concrete.
Chapter 2: Attention and Effort
System 1, Kahneman says, is "at the centre of the story", whereas System 2 is a mere "supporting character". This is in part because System 2 usually avoids effort, only getting involved in a decision if and insofar as it needs to. To give the reader a reference point for System 2 exertion, Kahneman describes, and invites the reader to try, a simple but challenging arithmetic exercise called the Add-1 task: participants see a sequence of single-digit numbers and must add 1 to each digit mentally (e.g., 3 → 4; 9 → 0). The task forces continuous, effortful computation and measures sustained attention and working memory load.
In experiments, he says, performance of the Add-1 task has been shown to render people "effectively blind" to irrelevant stimuli. Mental effort — the intense and sustained engagement of System 2 — also produces a suite of physiological effects, including dilated pupils and an elevated heart rate.
Kahneman also presents the famous optical illusion of two arrow-like figures, one with two heads pointing outward (<—>) and one with two tails (>—<). The lines themselves are identical in length, but almost everyone perceives the line in the two-headed arrow (<—>) as shorter than the other (>—<). This perception is remarkably hard to shake even if one is familiar with the illusion. The effect was first documented by the German psychiatrist Franz Carl Müller-Lyer (1857–1916) in 1889 and has been the subject of many explanatory efforts. Some theories emphasise the fact that human vision evolved to deal with three-dimensional environments, making it easily fooled by flat images. Kahneman uses this illusion because it shows the persistence and automaticity with which System 1 can make a mistake. No matter how many times one views the illusion, it is still likely to seem as though one line is shorter than the other. It takes conscious restraint — the exercise of System 2 — to recognise that one's intuition is making a systematic error in judgement.
Chapter 3: The Lazy Controller
Complex, methodical System 2 thinking demands self-control, particularly if such thinking is performed under time pressure. Kahneman notes some exceptions to this general trend: in a pleasant and well-studied psychological state called flow, all one's attention goes into the activity at hand, and no effort is needed to stick to the task. Still, in general, being cognitively busy leaves one with less willpower to resist such temptations as junk food or impulsive spending. This state of diminished self-control is known as ego depletion. Keen to avoid such an expenditure of willpower, System 2 seldom contradicts the intuitions of System 1 unless an obvious discrepancy prompts further investigation. Children who possess a strong and unusually active System 2, who are capable of delaying gratification and exercising self-restraint, are likely to perform better on intelligence tests later in life.
Toward the end of Chapter 3, Kahneman cites a classic study of willpower in young children: the famous "marshmallow test". Kahneman describes a version of the study that involved Oreo biscuits instead, but the basic experimental protocol was the same. The experimenter would offer a child one treat and promise to double the reward if the child could wait 15 minutes. About a third of the subjects managed to wait the full 15 minutes without eating the treat. As a group, these children were found to have better educational outcomes later in life, as measured by standardised test scores.
Chapter 4: The Associative Machine
System 1, Kahneman suggests, works largely by association. Once an idea has been "activated" — for instance, by reading a word — System 1 spontaneously searches for related and compatible ideas. Much of this associative work happens unconsciously, as can be observed in studies of so-called priming effects, patterns of behaviour and cognition that appear when a subject is primed with a particular stimulus. In one well-known study, students asked to assemble sentences from words with elderly themes (e.g., "grey" or "wrinkle") walked more slowly down the hall afterward. Kahneman cites several other studies in which a seemingly innocuous or irrelevant stimulus produced a conceptually related effect on a subject's thoughts or actions.
Chapter 5: Cognitive Ease
Next, Kahneman expands on a concept discussed briefly in Chapter 3. In deciding whether System 2 should be tapped to evaluate a decision, he says, System 1 relies on a perception of cognitive ease or its opposite, cognitive strain. The more strained System 1 is, the less effective its intuitions are, and the more likely System 2 is to be called in to consciously address the problem at hand. The causes of cognitive ease, however, include some wholly incidental features that have no bearing on whether a problem is easily solved by intuition — or whether a given proposition is likely to be true. Presenting a statement in a clear, bold font, for instance, makes it seem more familiar and less cognitively burdensome, prompting System 1 to treat the (often incorrect) intuitive answer as more plausible. Problems presented in a small, difficult-to-read font, by contrast, induce cognitive strain, engage System 2, and lead more participants to reject the incorrect intuitive answer and arrive at the correct one.
The rhyme-as-reason effect is another example of how cognitive ease can mislead: rhyming sayings are "judged more insightful" than phrases with near-identical meanings that do not rhyme. Mere exposure to just about any stimulus, provided it is not noxious, can lead a person to later associate that stimulus with feelings of familiarity and ease. When influenced by such feelings, an individual will rely even more heavily than usual on System 1, whether or not this reliance is warranted by the nature of the cognitive problem to be solved.
Anyone who has played Tetris also has a good visual model for cognitive ease and cognitive strain, two concepts introduced in the chapter. The slow pace, few blocks, and ample manoeuvring room at the beginning of the game create a condition of cognitive ease; the neatly fitted blocks on the screen are a picture of a mind at ease. As the pace quickens, blocks pile up, and pieces become harder to fit together, cognitive strain sets in. The haphazardly stacked blocks mirror a mind struggling to integrate large amounts of potentially contradictory information.
Expert Tetris players do not just think one block at a time. Patterns are imagined in advance, contingencies are planned for, and opportunities are identified. Thinking about what blocks might show up next, however, is a System 2 activity, both in Tetris and in life. The saying "out of sight, out of mind", though accurate enough as a general observation about human cognition, is especially true of System 1.
Chapter 6: Norms, Surprises, and Causes
Kahneman explains that System 1 is constantly engaged in "maintain[ing] and updat[ing] a model" of the world and of "what is normal in it". Against the backdrop of these mental norms, some events stand out as surprising, but the mind quickly adapts to surprises and fits them into the overarching pattern. Thus, an event System 2 knows to be rare can seem familiar and expected to System 1 simply because it matches up with a past experience. Faced with two consecutive surprises, System 1 works to weave them together into a pattern that makes neither event surprising. In doing so, System 1 often comes up with causal explanations, in which agency, blame, and intention are imputed even to inanimate objects. Such causal reasoning works adequately much of the time, but it works against the grain of any attempt to reason statistically. When phenomena have no clear single cause or must be considered in aggregate, statistical reasoning (a System 2 speciality) is necessary.
Chapter 7: A Machine for Jumping to Conclusions
The net effect of many System 1 intuitions is a tendency to jump to conclusions. "Conscious doubt" and the toleration of uncertainty require the deliberate, effortful engagement of System 2. Without such conscious scrutiny, people are prone to confirmation bias, in which evidence that fits pre-existing beliefs is given more weight than contradictory evidence (which may be dismissed entirely). A related bias is the halo effect, a "tendency to like (or dislike) everything about a person".
Fundamentally, Kahneman says, these biases spring from the fact that System 1 is tasked with fitting available information into a coherent story — not with seeking out more information to challenge the story or fill in gaps. Kahneman gives a colourful name to this basic cognitive tendency: "What You See Is All There Is", or WYSIATI for short. In later chapters, Kahneman will refer back to WYSIATI as a causal factor in many different types of biases.
Product marketing is another area in which the mood heuristic (discussed in Chapter 9) is widely deployed. In producing costly, high-profile advertisements featuring Santa Claus and Christmas lights, Coca-Cola cultivates positive emotions around its products with the aim of promoting brand loyalty year-round. The company is attempting to create what Kahneman calls a halo effect, in which a product's positive associations (good times, holiday cheer) ripple outward to influence impressions of more relevant features, such as taste or effects on health. Indeed, it is difficult to find a soft drink advertisement that says anything of substance about the actual product, and hard not to find one that appeals to the mood heuristic instead.
Chapter 8: How Judgements Happen
System 1, Kahneman asserts, understands the world in terms of basic assessments: simple, approximate, intuitive readings of the current situation. These assessments tend to sort events and objects into crude categories: aversive or attractive, threat or opportunity. A stranger on the street, for example, is instantly and unconsciously assessed as either "friendly" or "hostile" based on physical build, facial expression, and so forth. However, there are rather severe limits to the problems that can be solved by such assessments. System 1 is good at estimating averages, for instance, but very poor at estimating sums, or what Kahneman calls sum-like variables. Unfortunately, statistical and probabilistic reasoning often depends on the ability to handle such variables.
From the point of view of System 1, an easier task is intensity matching, in which corresponding values must be assigned to two variables with different dimensions. The seriousness of a crime, for example, can be intuitively matched to the severity of the punishment, just as the loudness of a noise can be matched to the brightness of a colour. Like other traits mentioned in this chapter, the tendency toward intensity matching simplifies everyday decision-making but greatly complicates the task of attempting to think statistically.
Chapter 9: Answering an Easier Question
A heuristic can be thought of as a way of substituting a more easily answered question for the one being asked. Kahneman calls these the heuristic question and the target question, respectively. For example, if the target question is "How happy are you with your life these days?", many people base their answer on the heuristic question "What is my mood right now?" This specific substitution is an example of the mood heuristic, in which people rely on an assessment of their current mood as a "shortcut" to answering a much more challenging question. It is easy and tempting to simply ask, "How do I feel about it?" rather than weigh the pros and cons of a situation or a course of action.
Part 2
Chapter 10: The Law of Small Numbers
Because System 1 tends to reason causally about events, it is easily fooled by small samples reporting extreme results. It is a basic statistical truth that small samples are more likely to display extreme outcomes: one is much more likely to see "all heads, no tails" when flipping four coins at once than when flipping eight. System 1's inability to account for this fact is what Kahneman and Tversky wryly termed "the law of small numbers".
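To make the arithmetic concrete, here is a minimal Python sketch (an illustration, not something from the book) that computes how much more often an "all heads" run shows up in a small sample than in one twice its size:

```python
# Probability of "all heads" when flipping n fair coins is (1/2) ** n,
# so halving the sample size makes the extreme outcome far more common.

def p_all_heads(n_coins: int) -> float:
    """Probability that every one of n_coins fair flips lands heads."""
    return 0.5 ** n_coins

if __name__ == "__main__":
    for n in (4, 8):
        print(f"{n} coins: P(all heads) = {p_all_heads(n):.4%}")
    # 4 coins: 6.2500% -- the "extreme" result is not even rare
    # 8 coins: 0.3906% -- the same extreme result is sixteen times rarer
```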
If one bets on a single number, say 7, on a roulette wheel, then one will either win or lose with every individual spin. Although there are 38 spaces on an American roulette wheel, a win and a loss are the only possible outcomes of a given spin from the gambler's point of view. Over a hundred or a thousand such spins, however, the observed proportion of wins will gradually approach 1 in 38, the actual probability of landing on 7 on any single spin. Interestingly, even trained professionals often design experiments with inadequate sample sizes, chosen by judgement rather than calculation. This is part of the root of the replication crisis: the widespread concern that many published findings in psychology fail to reproduce when independent researchers repeat the studies. The crisis revealed problems with research practices, statistical inference, incentives, and publication norms that produced unreliable or inflated effects, leading to the maxim: treat single studies, especially surprising ones with small samples, cautiously.
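A quick simulation (again illustrative; the function name and parameters are my own) shows why small runs of spins are so misleading: the observed win proportion only settles near the true 1-in-38 probability once the sample is large.

```python
import random

def win_proportion(n_spins: int, n_pockets: int = 38, seed: int = 1) -> float:
    """Simulate betting on one fixed pocket of an American roulette wheel
    (38 pockets, so the true win probability is 1/38) and return the
    observed fraction of winning spins."""
    rng = random.Random(seed)
    wins = sum(1 for _ in range(n_spins) if rng.randrange(n_pockets) == 0)
    return wins / n_spins

if __name__ == "__main__":
    print(f"True probability per spin: {1 / 38:.4f}")
    for n in (10, 100, 1_000, 100_000):
        print(f"{n:>7} spins: observed win proportion = {win_proportion(n):.4f}")
    # Ten or even a hundred spins routinely show zero wins (or several times
    # the expected rate); only the long run hugs 1/38, roughly 0.0263.
```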
Chapter 11: Anchors
Next, Kahneman introduces the anchoring effect, in which people are found to be extremely suggestible in making numerical estimates. The discovery of this effect is one of Kahneman and Tversky's most important joint contributions to the psychological literature. Exposure to a number, even one known to be randomly chosen, will influence a person's estimate of the height of a redwood tree, Gandhi's age at death, or the year George Washington was inaugurated. In each case, the respondent has some information: redwoods are very tall, Gandhi was old (but not hundreds of years old) when he was assassinated, and George Washington could not have become president before 1776. Yet a respondent primed with a high number ("Was Gandhi more or less than 144 years old when he died?") will still give a higher estimate than one primed with a low number. One mechanism behind the effect is anchoring and adjustment, in which an anchor is taken as a starting point early in the reasoning process and additional information produces adjustments that are typically insufficient.
Chapter 12: The Science of Availability
The availability heuristic is one well-studied means by which System 1 estimates frequencies. To decide how frequent or likely something is, people often rely instead on how easy (or difficult) it is to think of examples. This heuristic is at work when, for instance, spouses overestimate their own contributions to household chores so that the total of the two estimates is greater than 100%. Examples of chores one has done oneself are easier to come up with than chores done by one's partner. The impression of cognitive availability can itself be manipulated by asking for more or fewer examples. People asked to list 12 examples of their own assertive behaviour have a hard time filling the list and often come away with the impression that they are not very assertive after all.
Kahneman also describes both overestimation and underestimation based on available knowledge. In the United States, for instance, people frequently overestimate the likelihood that a spider bite will be dangerous, because the names of two highly venomous spiders, the black widow and the brown recluse, are familiar to many Americans from childhood onwards.
On the other hand, people display a casual attitude toward disease prevention when the disease in question is not highly salient. People with no family history of heart disease will underestimate the likelihood of developing it themselves. People whose friends do not get an annual flu shot are less likely to decide to do so themselves.
Chapter 13: Availability, Emotion, and Risk
Here, Kahneman fills out the previous discussion of the availability heuristic. Judgements of availability, he says, are skewed by media coverage, which is "itself biased toward novelty and poignancy". People are much more likely to hear about a fatal accident or a homicide in the news than about a death attributable to diabetes or asthma. Consequently, people exaggerate the likelihood of events they have come to fear while downplaying the likelihood of events that get less media attention. When a minor risk attracts heavy coverage, an availability cascade can take hold: public concern feeds further coverage, biases are magnified in the public imagination, and it becomes harder for moderate voices, conflicting information, or experts attempting to quantify the risk to be heard.
Chapter 14: Tom W's Speciality
Kahneman now introduces the representativeness heuristic and the related concept of a base rate. The representativeness of an event or person is the degree to which they seem typical ("representative") of a category. A preference for "neat and tidy" environments, for example, is often seen as representative of librarians. The heuristic aspect lies in people's tendency to trust representativeness over, or instead of, pertinent statistical information. Kahneman gives the example of an experiment in which he and Tversky asked respondents to predict the field of graduate study of a student named Tom W. The brief synopsis of Tom's personality made him seem representative of computer scientists, and most respondents guessed Tom's field accordingly. In doing so, they ignored the base rate: the small proportion of graduate students in computer science as compared to other fields.
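Bayes' rule makes the cost of ignoring the base rate explicit. The numbers below are invented for illustration (the original Tom W materials did not present them this way): a 3% base rate for computer science and a personality sketch that fits computer scientists four times as often as it fits everyone else.

```python
def posterior(prior: float, p_desc_given_cs: float, p_desc_given_other: float) -> float:
    """P(computer science | description), by Bayes' rule."""
    numerator = prior * p_desc_given_cs
    return numerator / (numerator + (1 - prior) * p_desc_given_other)

if __name__ == "__main__":
    # Hypothetical figures: 3% base rate; the description fits 80% of CS
    # students but also 20% of students in other fields.
    print(f"P(CS | description) = {posterior(0.03, 0.80, 0.20):.1%}")
    # ~11%: even a strongly "representative" sketch leaves computer science
    # an unlikely guess once the small base rate is taken seriously.
```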
Chapter 15: Linda: Less Is More
The representativeness heuristic, however, does not merely distort probabilistic estimates: it can sometimes lead people to make completely illogical guesses. In the famous "Linda" experiment of the 1980s, Kahneman and Tversky told subjects about a young woman named Linda who "is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice and also participated in antinuclear demonstrations." After reading this short descriptive paragraph, subjects were asked to determine which was more likely: "Linda is a bank teller" or "Linda is a bank teller and is active in the feminist movement. " Overwhelmingly, but illogically, respondents chose the latter — a tendency Kahneman calls the conjunction fallacy. The chapter concludes with a survey of other studies demonstrating this intuitive but erroneous "less-is-more" thinking.
The reason so many people get the Linda problem wrong lies in the tendency to conflate plausibility with probability. As Kahneman observes, this is an easy mistake to make, but the two concepts are logically distinct. "Linda is a bank teller and is active in the feminist movement" is a more plausible story than "Linda is a bank teller", but the probability of "bank teller" must be higher — or at least no lower — than the probability of "bank teller and feminist".
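A tiny sketch (with made-up numbers) shows why the conjunction can never win: every probability assigned to "bank teller and feminist" is a slice of the probability assigned to "bank teller".

```python
# Illustrative, invented probabilities for the Linda problem.
p_teller = 0.05                    # P(Linda is a bank teller)
p_feminist_given_teller = 0.60     # P(feminist | bank teller), however generous
p_teller_and_feminist = p_teller * p_feminist_given_teller

# The conjunction is a subset of the single event, so it cannot be more probable.
assert p_teller_and_feminist <= p_teller
print(f"P(teller) = {p_teller:.3f}, P(teller and feminist) = {p_teller_and_feminist:.3f}")
```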
Chapter 16: Causes Trump Statistics
Kahneman now draws a further distinction between statistical base rates and causal base rates. Statistical base-rate information is framed in broad terms: "85% of the cabs in the city are from the Green Cab Company." Causal base-rate information is more specific and seems, in its presentation, to suggest a direct connection to the individual case at hand: "Green cabs are involved in 85% of accidents." When both types of base-rate information are present, the statistical sort is much more likely to be thrown out or underweighted, because System 1 has trouble fitting it into a narrative. More depressingly — in Kahneman's view, at least — people "quietly exempt themselves" from statistics that fail to accord with their self-image.
Chapter 17: Regression to the Mean
The concept of a "jinx" is widespread in sports. If an athlete has an outstandingly good first day at a tournament, she is expected to perform less well on day two. The sportscasters covering the event will advance all sorts of causal explanations for the drop in performance: the athlete was nervous because of the higher expectations, she was exhausted from an unusually strenuous effort on day one, etc. What's really going on, Kahneman says, is somewhat less sensational. Almost any set of outcomes, from test scores to inches of daily rainfall, will follow a distribution in which extreme events are rare and "average" events are common. Thus, there is a strong statistical tendency for any extreme event to be followed by a less extreme one, a phenomenon called regression to the mean.
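A toy simulation (not from the book; the "skill plus luck" model and all numbers are assumptions) makes the same point without any talk of jinxes: select the best day-one performers and their day-two average falls back toward the pack, because their day-one luck does not repeat.

```python
import random
import statistics

def two_day_scores(n_athletes: int = 1000, seed: int = 0):
    """Each day's score is a fixed skill term plus fresh random luck."""
    rng = random.Random(seed)
    skills = [rng.gauss(0, 1) for _ in range(n_athletes)]
    day1 = [s + rng.gauss(0, 1) for s in skills]
    day2 = [s + rng.gauss(0, 1) for s in skills]
    return day1, day2

if __name__ == "__main__":
    day1, day2 = two_day_scores()
    cutoff = sorted(day1)[-len(day1) // 20]          # top 5% of day-one scores
    top = [i for i, score in enumerate(day1) if score >= cutoff]
    print(f"Top group, day-one mean: {statistics.mean(day1[i] for i in top):.2f}")
    print(f"Top group, day-two mean: {statistics.mean(day2[i] for i in top):.2f}")
    # Day two is systematically lower for this group: no jinx, just regression
    # to the mean, because extreme scores mix real skill with unrepeatable luck.
```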
Chapter 18: Taming Intuitive Predictions
Understanding regression to the mean allows one to identify and correct predictions that fail to take such regression into account. An extremely tall child is statistically likely to become a rather (but not extremely) tall adult. An extremely precocious kindergartener is likely to become a high achiever in college, but not to the same extreme degree witnessed in early childhood. In each case, the best statistical prediction lies between the single piece of extreme information (height or intelligence at one point in time) and the mean, toward which some regression is expected. Extreme predictions may be exciting, since they capture the imagination and getting them right is very gratifying, but they are seldom correct. Where accuracy is important, it pays to temper one's predictions to account for regression.
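One way to put this tempering into practice (a sketch of the general recipe Kahneman outlines, with invented numbers) is to start from the mean and move toward the intuitive, evidence-matched prediction only in proportion to how well the evidence actually correlates with the outcome:

```python
def tempered_prediction(mean_outcome: float,
                        intuitive_prediction: float,
                        correlation: float) -> float:
    """Regress an intuitive prediction toward the mean.

    correlation estimates how strongly the evidence predicts the outcome:
    1.0 means perfectly predictive, 0.0 means it tells us nothing.
    """
    return mean_outcome + correlation * (intuitive_prediction - mean_outcome)

if __name__ == "__main__":
    # Hypothetical example: the average college GPA is 3.0, a precocious
    # child's record "matches" a 3.9, but early precocity correlates only
    # about 0.3 with later performance.
    print(tempered_prediction(mean_outcome=3.0,
                              intuitive_prediction=3.9,
                              correlation=0.3))   # -> 3.27, closer to the mean
```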
Part 3
Chapter 19: The Illusion of Understanding
System 1 tends to try to reduce experiences into a cogent, coherent narrative. This tendency carries with it the risk that the past will come to seem inevitable, that the narrative will seem as though it could not have played out any other way. People often come to believe they knew all along what would happen in an inherently unpredictable situation, and they tacitly "revise the history of [their] beliefs" to match what actually ends up happening. Thus, a person may go from believing (before the fact) that President Nixon would not resign to "having always known" that Nixon would resign. This is not, Kahneman says, mere dishonesty to save face; people genuinely forget that they ever held the ultimately falsified belief. A vast body of popular literature — including many business books — springs from the attempt to find simple causes of success and failure after the fact.
Chapter 20: The Illusion of Validity
Attempts to predict the future are often beset by the illusion of validity, in which a demonstrably useless tool or method is nonetheless assumed to have some special value for forecasting what will happen. People will continue to defer to a particular test, interview, or other metric even when they know rationally that the test is "little better than [a] random guess". Closely related is the illusion of skill, in which the technical sophistication of a practice (say, financial analysis of stocks) is assumed to mean that the practice will be effective. Stock traders, Kahneman observes, believe deeply in the efficacy of their own skill even when they fail to outperform (or even significantly underperform) the market as a whole. It is not the experts' fault, he says, that the world is difficult to predict – but it is their responsibility to be honest about the limits of their predictive powers.
The stock traders Kahneman singles out are probably quite adept at a variety of tasks related to their jobs, such as finding and interpreting the major financial information about a publicly traded firm. The illusion is not that people are skilled when they really aren't, but that their skills are effective in scenarios where they in fact make no difference.
Chapter 21: Intuitions vs. Formulae
Because of the various systematic biases in human judgement, Kahneman proposes that algorithms and formulas can often do a better job than an expert's intuition. However, he acknowledges that there is a widespread "hostility to algorithms" in areas where the outcome may have a moral significance — for instance, in health care. After describing his own experiences in developing a scored survey for assigning soldiers to service branches, Kahneman invites the reader to "do it yourself". He offers some tips for developing a quantitative scoring system to be used in a situation in which intuitive judgement would be the norm.
Chapter 22: Expert Intuition: When Can We Trust It?
Although he maintains that the public's faith in expert judgement is sometimes misplaced, Kahneman does not rule out the existence of expert intuition in some specialised domains. He tells of a research programme undertaken with a colleague, psychologist Gary Klein. Klein is generally seen as an opponent of the heuristics-and-biases view of decision-making. In their research, Kahneman and Klein explored the limits of a decision paradigm called recognition-primed decision. This model treats expert intuition as the ability to recognise and act on patterns that might not be apparent to a layperson or a novice. Such recognition, Kahneman suggests, is at work when a chess grandmaster glances at a board and can tell immediately who will win the match, or when a fire captain orders his crew to evacuate just before a building starts to collapse. However, such intuitions can only arise in environments that are "sufficiently regular to be predictable" and can only be developed through long and consistent practice. In fields in which those conditions are lacking, expert intuition is likely to disappoint.
Chapter 23: The Outside View
In this chapter, Kahneman tells of his experiences as part of a team tasked with drafting a textbook on judgement and decision-making. The team's internal estimates of the time to complete the project averaged about three years, even though other textbook-writing projects of similar length and complexity had tended to take seven or more years. In fact, the book took nine years to complete, well within statistical expectations but far longer than the team members themselves had predicted. Kahneman takes this anecdote as an illustration of the need to consider the outside view — the dispassionate, neutral, statistically informed view — when planning a project. The inside view, which is overreliant on the specifics of the situation, is appealing but almost certainly inaccurate. Those who unrealistically favour the more optimistic inside view are committing what Kahneman calls the planning fallacy, allowing the detailed nature of their own forecasts to trump the relevant statistical information.
Chapter 24: The Engine of Capitalism
An "optimistic bias", Kahneman says, is not an altogether bad trait to possess. Optimists tend to live longer, be happier, and be more proactive in solving problems within their control. At the same time, optimism is indeed a type of cognitive bias, and it can be very costly at times. One consequence of optimism in business is the phenomenon of competition neglect, in which entrepreneurs assume that their decisions — irrespective of their competitors' actions — are the ultimate determiner of success or failure. Other types of optimism-induced overconfidence are evident in finance, in medicine, and in day-to-day life. As a means of (partly) counteracting optimistic bias, Kahneman passes along a suggestion from Gary Klein, who advocates conducting a premortem of any major plan before putting it into action. The goal of such an exercise is to imagine how the plan might fail despite the best intentions of all involved.
Part 4
Chapter 25: Bernoulli's Errors
In Part 4, Kahneman broaches the subject of behavioural economics, an area he credits Tversky with introducing him to. He begins by describing two "species" from behavioural economics: Econs, who are perfectly rational and consistent, and Humans, who have all the biases and inconsistencies of real human beings. Behavioural economics is concerned with refining economic models to address ways in which humans differ from Econs.
First, however, Kahneman lays out the groundwork of decision-making in traditional economics. In expected utility theory, a person's decisions are assumed to maximise utility, or the benefit derived from the choices available. This includes gambles: if a person prefers apples to bananas, then theoretically, that person will also "prefer a 10% chance to win an apple to a 10% chance to win a banana". Some principles of this theory can be traced back to the Swiss mathematician Daniel Bernoulli (1700–82), who observed that people react to relative, not absolute, changes in their wealth. This is just one way in which utility, the "psychological value or desirability" of a good, does not align perfectly with monetary value. Despite its brilliance, Kahneman says, Bernoulli's utility theory overlooked a key aspect of subjective value: its dependence on a reference point. It is, he asserts, the change in wealth that matters, not the final state of wealth itself. In particular, people place greater weight on losses than on gains. Faced with a sure gain or the chance of a greater gain, many will prefer the sure thing, but faced with a sure loss or the chance of a greater loss, most will prefer the gamble.
Chapter 26: Prospect Theory
In their 1970s research, Kahneman and Amos Tversky attempted to account for the gaps between expected utility theory and the psychology of real-life decision-making. They concluded that people in general experience loss aversion, meaning people would rather avoid losses than seek gains. This aversion can even be quantified as a ratio: typically, the two researchers found, people were twice as averse to loss as they were attracted to gain; a 50% chance to win $200 balances out a 50% chance to lose $100. At the same time, sensitivity to loss diminishes as the losses get larger, so that losing $200 is not "twice as bad", psychologically speaking, as losing $100. These insights combine to create a characteristic decision-making pattern: when faced with a win-or-lose gamble, most people will be very risk-averse, but when faced with only losing choices, people will risk a greater loss to have a chance of not losing at all. Though he admits it is not a perfect characterisation of economic decision-making, Kahneman describes prospect theory as a marked improvement over the earlier expected utility model.
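These properties (a reference point, diminishing sensitivity, and roughly two-to-one loss aversion) are often summarised with a value function of the kind Tversky and Kahneman later formalised. The sketch below uses the conventional textbook parameterisation, which is an approximation and not something stated in this chapter:

```python
def prospect_value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Prospect-theory value of a gain or loss x, measured from the reference point.

    alpha < 1 gives diminishing sensitivity; lam is the loss-aversion
    coefficient, so losses loom roughly twice as large as equal gains.
    """
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

if __name__ == "__main__":
    print(prospect_value(100), prospect_value(-100))   # a $100 loss hurts ~2.25x more
    first_hundred = prospect_value(100) - prospect_value(0)
    second_hundred = prospect_value(200) - prospect_value(100)
    print(first_hundred > second_hundred)               # True: diminishing sensitivity
```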
Chapter 27: The Endowment Effect
Further complicating the picture is the endowment effect: people tend to overvalue what they have once they have it. A person who is equally happy to receive either a raise or some added vacation time will, once they receive the raise, often be unwilling to trade it back for the vacation time. This effect occurs at all scales, from coffee mugs and chocolate bars to rare antiques and vintage wines. The possessor of a good — provided it is "held for use" and not merely a commodity or currency — will overrate the value of the good when considering a sale or exchange. Kahneman cites several experiments in which this effect is established and investigated.
Chapter 28: Bad Events
Yet another twist comes from what Kahneman calls negativity dominance: the tendency to see the one angry face in a crowd of smiles, but not vice versa. There is, Kahneman suggests, an evolutionary reason to give threats greater priority than opportunities, since an opportunity means nothing if one does not survive to enjoy it. Loss aversion, as discussed in Chapter 26, is one instance of negativity dominance; another is the aversion to falling short of a goal. Hence, Kahneman argues, golfers putt better for par than for a birdie. Negativity dominance has frustrating consequences for negotiations, whether economic or political: both parties feel that what they are giving up is more valuable than what they are getting in return.
Chapter 29: The Fourfold Pattern
A final piece of the prospect theory puzzle comes in the form of two complementary effects: the possibility effect and the certainty effect. The former alludes to the tendency to overweight the mere possibility of an unlikely event, as seen in lottery-ticket buyers around the world. The latter alludes to the premium paid for the last few percentage points between an almost-sure thing and a true certainty. Both effects pose a further challenge to expected utility theory, as Kahneman proceeds to show by recounting a famous economic puzzle posed by Maurice Allais (the "Allais paradox").
Kahneman returns to the basics of prospect theory and describes the "fourfold pattern" of preferences people display in the face of gains and losses. When people have a high probability of a gain, they fear disappointment and are generally willing to accept a smaller, sure-thing payment rather than gamble, but when they face a high probability of a loss, they are typically willing to take their chances. The pattern is reversed for low-probability events: a small chance of a large gain is preferred to a sure-thing settlement, while a smaller sure loss is preferred to a small chance of a large loss. Together, these four cells illustrate the interaction of loss aversion and diminishing sensitivity with the certainty and possibility effects. The chapter closes with some illustrations drawn from the world of litigation.
Chapter 30: Rare Events
In general, Kahneman says, people tend to "overestimate the probabilities of unlikely events" and "overweight unlikely events in their decisions". Several psychological mechanisms contribute to this tendency, including the availability heuristic and the tendency to prefer cognitive ease to cognitive strain. The broadest explanation, however, is that the vividness of rare events tends to give them a disproportionate share in decision-making. The mind's focus on vivid outcomes can lead to denominator neglect: in the phrase "1 out of 100,000 children will be disabled as a result of the vaccine", the emphasis tends to fall on the "1" rather than the "100,000". The more concretely the probability is represented (e.g., "1 child in 100,000" rather than "0.001% of children"), the more pronounced such overemphasis will be. "When it comes to rare probabilities," Kahneman soberly concludes, "our mind is not designed to get things quite right."
Chapter 31: Risk Policies
The discussion now turns to ways in which risk can be approached more systematically. Narrow framing of a problem — approaching it as a set of unrelated one-off choices — can lead to suboptimal outcomes in many cases. Broad framing is required to see how choices interact, to get a "big picture" that transcends individual losses and gains. To illustrate the distinction, Kahneman asks why someone might reject a single, 50/50, "win $200 or lose $100" bet but accept the option of making 100 such bets. If a broad frame is adopted, he says, it is obvious that the 100 bets are massively favourable to the gambler. Yet if each gamble is viewed as a distinct, isolated event, loss aversion kicks in: the thought of losing $100 is often more painful than the thought of winning $200. In a rare moment of direct advice to the reader, Kahneman points out that life itself contains many "small favourable gambles". Accepting them as a matter of course, he adds, will lead to a better result than rejecting each one individually out of loss aversion. This is an example of a risk policy – a commitment to handle a particular risk the same way every time, in order to come out ahead in the long run.
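The arithmetic behind the broad frame can be checked directly. This short sketch (an illustration, not from the book) computes the expected gain from 100 independent "win $200 or lose $100" coin-flip bets and the exact probability of ending up behind:

```python
from math import comb

def p_net_loss(n_bets: int = 100, win: int = 200, lose: int = 100) -> float:
    """Exact probability of finishing below zero after n_bets fair 50/50 bets."""
    total = 0.0
    for wins in range(n_bets + 1):
        net = wins * win - (n_bets - wins) * lose
        if net < 0:
            total += comb(n_bets, wins) * 0.5 ** n_bets
    return total

if __name__ == "__main__":
    expected_gain = 100 * (0.5 * 200 - 0.5 * 100)
    print(f"Expected gain over 100 bets: ${expected_gain:,.0f}")   # $5,000
    print(f"Probability of any net loss: {p_net_loss():.4%}")       # well under 1%
    # Seen as a bundle, the gambles are overwhelmingly favourable; seen one at
    # a time, each triggers loss aversion and tempts the gambler to refuse.
```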
Chapter 32: Keeping Score
Another quirk of human reasoning is mental accounting: the tendency to reckon up money and other resources in separate "accounts" rather than as a lump sum. Such mental accounts are, Kahneman observes, "a form of narrow framing", but they help in making sense of the world. One adverse consequence of mental accounting, however, is that it opens the door to the sunk-cost fallacy. People who spend money for a specific purpose — to see a football game, for instance — will often "throw good money after bad" if an added expense arises in connection with the event. Stockholders will sell winners rather than losers so as to close out their position as a gain rather than a loss. Regret and the anticipation of regret, along with feelings of moral responsibility, further complicate the effort to "keep score" of finite resources.
Chapter 33: Reversals
A preference reversal arises when people make one choice when presented simultaneously with a pair of options ("B over A") but another choice if the two options are considered in isolation ("A over B"). Kahneman cites such reversals as another feature of human economic reasoning not adequately explained by the rational-agent model. The solution to the apparent paradox, he says, is quite simple: people's assessments of value are based on the context in which the question is asked. A six-year-old boy who is 5 feet tall is "tall" relative to other boys his age, and a 16-year-old boy who is 5 feet 1 inch tall is "short" for his age, but in a direct comparison it is obvious that the latter boy is taller. The joint evaluation of the two boys produces a different type of comparison than the single evaluation of each boy relative to his age group. In economic decisions, too, this kind of context-based reversal can be observed: a fine that is high by one agency's standards may be a pittance by the standards of another agency.
Chapter 34: Frames and Reality
Kahneman explores the concept of framing effects, in which the mere wording of a decision problem substantially changes people's preferences. He reports that people are much more likely to recommend a procedure with a 90% survival rate than one with 10% mortality. They will endorse a programme that "saves 200 lives out of 600" but reject one that results in 400 deaths. These effects are important, Kahneman says, because often there is no underlying preference: the frame itself determines what moral intuitions people bring to bear. "An important choice," he remarks, "is controlled by an utterly inconsequential feature of the situation."
Chapter 35: Two Selves
This chapter introduces the two characters who will star in Part 5: the experiencing self and the remembering self. The experiencing self is present in the moment, undergoing pleasure or pain. The remembering self recalls experiences after the fact and makes decisions based on those memories. Memories themselves are prone to persistent biases, such as an overemphasis on the best (or worst) moments of an experience and, more troublingly, a neglect of duration. "We want pain to be brief and pleasure to last", Kahneman observes, but duration neglect ensures the remembering self will not make choices accordingly.
Chapter 36: Life as a Story
Kahneman observes that stories are often defined by their endings. This is as true, he maintains, of operas as it is of a person's lifetime. Citing research on perceived quality of life, Kahneman shows how "peaks and ends matter, but duration does not." That is, a person who lives 60 extremely happy years and then dies suddenly is consistently judged to have had a better life than one whose equally happy years are followed by five additional "slightly happy" years. When planning a vacation, likewise, the remembering self is in the driver's seat: many people would not bother to make a trip from which they could not bring back happy memories.
Chapter 37: Experienced Well-Being
Kahneman surveys some standard approaches to measuring happiness, methods he regards as overreliant on the remembering self rather than the experiencing self. He presents his own efforts to measure an individual's U-index, a term he and his colleagues used to denote the percentage of time a person spends in an unpleasant state. Such an index, he suggests, can also be applied in aggregate as a loose measure of a population's well-being. Things that profoundly affect well-being in the moment, Kahneman reports, may have a smaller or even opposite effect on overall life satisfaction, and vice versa.
Chapter 38: Thinking About Life
Finally, Kahneman digs deeper into the issues inherent in any attempt to measure life satisfaction. He reviews some experiments in which life satisfaction measures are shifted considerably by trivial occurrences (e.g., finding a coin) as well as by major events (e.g., a recent marriage). Collectively, Kahneman describes these results as evidence of the focusing illusion, in which people give exaggerated importance to whatever they are thinking about at the moment. In considering the purchase of a new car, for example, people routinely overestimate how big a role the car will play in their overall happiness. This illusion compromises predictions about what will bring future happiness.
Conclusion
Thinking, Fast and Slow concludes with a review of the three main dichotomies presented in the book: System 1 versus System 2; Humans versus Econs; the remembering self versus the experiencing self.
The "two selves" are treated first since they appeared most recently. Humans rely on their imperfect memories in making decisions, so the remembering self is at the helm whenever a choice is made about one's future happiness. This is true even though the experiencing self — the self that lives out the moment-by-moment results of the decision — may experience pain or pleasure that differs systematically from those predicted by the remembering self. What makes for a pleasant experience is not always what makes for a happy memory, and people often make suboptimal choices because they forget important features of their past experiences. Kahneman urges psychologists and policymakers to adopt definitions of well-being that integrate a person's experiences, memories, and goals.
'Econs' (a term borrowed from Richard Thaler and Cass Sunstein) are the fictitious, perfectly logical beings who follow the simple rules of rational-agent economics. They do not make mistakes in reasoning or succumb to cognitive biases. Humans, as contrasted with Econs, are the logically fallible beings who populate the real world and participate in its economy. The libertarian approach to economic policy, Kahneman suggests, is predicated on the assumption that people are Econs, always rationally following their own self-interest. Behavioural economists, like Kahneman and his colleagues, are sceptical about the rationality of Humans and favour policy programmes designed to help people make better choices. Kahneman cites several examples of libertarian paternalism, an approach in which people are steered toward, but not coerced into, responsible choices about their health and finances (cf. Thaler and Sunstein's Nudge).
Finally, Kahneman returns to Systems 1 and 2, a pair introduced in the earliest chapters. These are Kahneman's terms for the two different systems the human mind uses to solve problems and produce judgements. System 1, he reminds the reader, is the "fast" system of largely unconscious cognitive patterns used to find quick, approximate solutions. System 2 is the deliberate, "slow" system that sometimes serves to correct the errors of intuition and bias. Recognising one's cognitive errors is difficult; somewhat easier is the task of recognising errors in others' thinking. In either case, Kahneman suggests, the vocabulary offered in Thinking, Fast and Slow makes the effort more fruitful.
Themes
Two styles of thinking, and both have flaws.
The two styles provide the book’s title. The fast style, which Kahneman calls System 1, represents intuition. System 2 is called upon when System 1 gets out of its depth, but System 2 is “lazy”. Both systems make mistakes. System 1 often substitutes feeling for reason, or substitutes an easy question for a hard one, or leaps to a conclusion based on stereotypes. Often such habits of thought are routine enough to be called heuristics, rules of thumb that make thinking easier but lead to predictable kinds of errors.
Both systems are liable to focus on certain facts to the exclusion of others, and both systems are fond of narrative, especially in the form of causal explanations, even in situations where the true story is about the actions of blind chance. Neither system has a deep understanding of probability or statistics, but System 2 can at least be trained. Unfortunately, the deficiencies of Systems 1 and 2 do not lead to humility. On the contrary, people tend to be blind to their errors and overconfident in their judgements.
The standard model of rational choice
Economics, as taught in most introductory textbooks, assumes that human beings seek to maximise expected utility, which is almost, but not quite, the same thing as maximising expected monetary payoff. According to prospect theory, Kahneman’s alternative account of choice, real human beings differ in several respects from the ideal choosers described by expected utility theory. Real human beings:
(1) Grow attached to the status quo and prefer it over what might otherwise be seen as an equally good alternative.
(2) Discriminate more finely between small gains and losses, relative to the status quo, than between large ones.
(3) React more strongly to losses than to equal-sized gains, a behaviour called “loss aversion”.
Real human beings also give too much weight to rare events, especially ones that are easily and vividly imagined (perhaps because of media coverage of such events). Finally, real human beings are poor judges of their own happiness. Their memories of past events fail to reflect the actual mix of enjoyment and unhappiness they experienced at the time. Consequently, people are not good at choosing what will bring them the most pleasure.
Organisations can foster more rational judgements and choices.
Although Kahneman catalogues dozens and dozens of mistakes people are prone to, he is not merely pessimistic. Organisations and governments can adopt and encourage practices that foster more rational judgements and choices and thereby make people better off. For example, an organisation can promote rational planning by mandating review of how similar projects have turned out and by inviting planners to imagine how the project could fail (an exercise called a “premortem”).
An organisation can also improve its projections of future employee performance by relying less on interviewers’ intuition and more on simple, algorithm-based evaluations. Both organisations and governments can optimise the outcomes of many individual decisions by framing decisions properly. For example, participation in a company plan, or in a voluntary organ donor programme, can be encouraged by making participation the default, with a cost-free opt-out option.
Heuristics and our hunter-gatherer brain
In hunter‑gatherer societies, where time and information were limited, shortcut strategies — such as "copy the successful", "prefer familiar foods", and "trust kin" — reduced costly deliberation and improved survival. Those same heuristics still steer modern choices: we rely on social proof when choosing products, favour calorie‑dense foods that once meant energy security, and default to familiar routines under stress. Efficient heuristics can misfire in novel contexts, producing biases like overgeneralisation, stereotyping and susceptibility to persuasive cues that were rare in ancestral environments.
Early humans who quickly inferred causes for rustling in the bushes or patterns in resource distributions were more likely to survive. This predisposition toward seeing intent or structure — sometimes where none exists — helps with learning and social coordination but also fuels conspiracy thinking, superstition, and overinterpretation of random events today. In social settings, pattern‑seeking combines with prestige bias (learning from high‑status others) to create powerful cultural transmission. A few successful models can spread norms and skills rapidly through imitation rather than explicit instruction.
In bands of a few dozen people, reputation, reciprocity, and norms were the main enforcement mechanisms; shame, gossip, and punishment maintained cooperation. These evolved sensitivities explain why modern humans are deeply affected by social approval and status signals, even when those signals are mediated by screens. Social heuristics like "punish free‑riders" and "reward cooperators" still underpin collective action, but in large, anonymous societies, the feedback loops are weaker, making cooperation harder to sustain without institutions that recreate small‑group accountability.
Hunters and gatherers lived with acute uncertainty and immediate consequences, so valuing near‑term rewards and immediate learning was rational. Today, that same bias makes long-term planning (retirement savings, preventive health, and climate action) difficult. Abstract, delayed benefits fail to compete with vivid, immediate gratifications. Practical fixes — micro‑rewards, visible progress, and social commitments — work because they align modern systems with ancestral heuristics rather than trying to override them entirely.