Book Summary: Thinking Fast and Slow By Daniel Kahneman (FSG, NY: 2011)

Summarized by Erik Johnson

Daniel Kahneman’s aim in this book is to make psychology, perception,

irrationality, decision making, errors of judgment, cognitive science,

intuition, statistics, uncertainty, illogical thinking, stock market gambles,

and behavioral economics easy for the masses to grasp. Despite his

charming and conversational style, this book was difficult for me because I

am accustomed to thinking fast. As a service to my fellow automatic,

intuitive, error-making, fast thinkers I offer this simple (dumbed down)

summary of what is a very helpful book. Writing this summary taught me how to think

harder, clearer, and with fewer cognitive illusions. In short, how to think slower. Now if

only I’d do it.

INTRODUCTION

This book is about the biases of our intuition. That is, we assume certain things

automatically without having thought through them carefully. Kahneman calls those

assumptions heuristics1 (page 7). He spends nearly 500 pages listing example after

example of how certain heuristics lead to muddled thinking, giving each a name such as

“halo effect,” “availability bias,” “associative memory,” and so forth. In this summary I

distill Kahneman’s heuristics into a numbered list of errors of judgment.2

PART ONE: TWO SYSTEMS

CHAPTER ONE: THE CHARACTERS OF THE STORY

Our brains are composed of two characters, one that thinks fast, System 1, and one that

thinks slow, System 2. System 1 operates automatically, intuitively, involuntarily, and

effortlessly—like when we drive, read an angry facial expression, or recall our age.

System 2 requires slowing down, deliberating, solving problems, reasoning, computing,

focusing, concentrating, considering other data, and not jumping to quick conclusions—

like when we calculate a math problem, choose where to invest money, or fill out a

complicated form. These two systems often conflict with one another. System 1 operates

on heuristics that may not be accurate. System 2 requires effort to evaluate those

heuristics and is prone to error. The plot of his book is how to “recognize situations in

which mistakes are likely and try harder to avoid significant mistakes when stakes are

high,” (page 28).

1 Synonyms include “rules of thumb,” “presuppositions,” “cognitive illusions,” “bias of judgment,” “thinking errors,” “dogmatic assumptions,” “systematic errors,” “intuitive flaws.”

2 Kahneman did not number his list but I will do so for ease of understanding, citing page numbers as I go. My paragraph summaries are clear but I of course encourage interested readers to go to the book itself to read up on each heuristic in more detail.


CHAPTER TWO: ATTENTION AND EFFORT

Thinking slow affects our bodies (dilated pupils), attention (limited observation), and

energy (depleted resources). Because thinking slow takes work we are prone to think fast,

the path of least resistance. “Laziness is built deep into our nature,” (page 35). We think

fast to accomplish routine tasks and we need to think slow in order to manage

complicated tasks. Thinking fast says, “I need groceries.” Thinking slow says, “I will not

try to remember what to buy but write myself a shopping list.”

CHAPTER THREE: THE LAZY CONTROLLER

People on a leisurely stroll will stop walking when asked to complete a difficult mental

task. Calculating while walking is an energy drain. This is why being interrupted while

concentrating is frustrating, why we forget to eat when focused on an interesting project,

why multi-tasking while driving is dangerous, and why resisting temptation is extra

hard when we are stressed. Self-control shrinks when we’re tired, hungry, or mentally

exhausted. Because of this reality we are prone to let System 1 take over intuitively and

impulsively. “Most people do not take the trouble to think through [a] problem,” (page 45).

“Intelligence is not only the ability to reason; it is also the ability to find relevant

material in memory and to deploy attention when needed,” (page. 46). Accessing memory

takes effort but by not doing so we are prone to make mistakes in judgment.

CHAPTER FOUR: THE ASSOCIATIVE MACHINE

Heuristic #1: PRIMING. Conscious and subconscious exposure to an idea “primes” us

to think about an associated idea. If we’ve been talking about food we’ll fill in the blank

SO_P with a U but if we’ve been talking about cleanliness we’ll fill in the blank SO_P

with an A. Things outside of our conscious awareness can influence how we think. These

subtle influences also affect behavior, “the ideomotor effect,” (page 53). People reading

about the elderly will unconsciously walk slower. And people who are asked to walk

slower will more easily recognize words related to old age. People asked to smile find

jokes funnier; people asked to frown find disturbing pictures more disturbing. It is true:

if we behave in certain ways our thoughts and emotions will eventually catch up. We can

not only feel our way into behavior, we can behave our way into feelings. Potential for

error? We are not objective rational thinkers. Things influence our judgment, attitude,

and behavior that we are not even aware of.

CHAPTER FIVE: COGNITIVE EASE

Heuristic #2: COGNITIVE EASE. Things that are easier to compute, more familiar,

and easier to read seem more true than things that require hard thought, are novel, or

are hard to see. “Predictable illusions inevitably occur if a judgment is based on the

impression of cognitive ease or strain,” (page 62). “How do you know that a statement is

true? If it is strongly linked by logic or association to other beliefs or preferences you hold,

or comes from a source you trust and like, you will feel a sense of cognitive ease,” (page

64). Because things that are familiar seem more true, teachers, advertisers, marketers,

authoritarian tyrants, and even cult leaders repeat their message endlessly. Potential for

error? If we hear a lie often enough we tend to believe it.

CHAPTER SIX: NORMS, SURPRISES, AND CAUSES

Heuristic #3: COHERENT STORIES (ASSOCIATIVE COHERENCE). To make

sense of the world we tell ourselves stories about what’s going on. We make associations

between events, circumstances, and regular occurrences. The more these events fit into

our stories the more normal they seem. Things that don’t occur as expected take us by

surprise. To fit those surprises into our world we tell ourselves new stories to make them

fit. We say, “Everything happens for a purpose,” “God did it,” “That person acted out of

character,” or “That was so weird it can’t be random chance.” Abnormalities, anomalies,

and incongruities in daily living beg for coherent explanations. Often those explanations

involve 1) assuming intention, “It was meant to happen,” 2) causality, “They’re homeless

because they’re lazy,” or 3) interpreting providence, “There’s a divine purpose in

everything.” “We are evidently ready from birth to have impressions of causality, which

do not depend on reasoning about patterns of causation,” (page 76). “Your mind is ready

and even eager to identify agents, assign them personality traits and specific intentions,

and view their actions as expressing individual propensities,” (page 76). Potential for

error? We posit intention and agency where none exists, we confuse causality with

correlation, and we make more out of coincidences than is statistically warranted.

CHAPTER SEVEN: A MACHINE FOR JUMPING TO CONCLUSIONS

Heuristic #4: CONFIRMATION BIAS. This is the tendency to search for and find

confirming evidence for a belief while overlooking counter examples. “Jumping to

conclusions is efficient if the conclusions are likely to be correct and the costs of an

occasional mistake acceptable, and if the jump saves much time and effort. Jumping to

conclusions is risky when the situation is unfamiliar, the stakes are high, and there is no

time to collect more information,” (page 79). System 1 fills in ambiguity with automatic

guesses and interpretations that fit our stories. It rarely considers other interpretations.

When System 1 makes a mistake System 2 jumps in to slow us down and consider

alternative explanations. “System 1 is gullible and biased to believe, System 2 is in

charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy,”

(page 81). Potential for error? We are prone to over-estimate the probability of unlikely

events (irrational fears) and accept uncritically every suggestion (credulity).

Heuristic #5: THE HALO EFFECT. “This is the tendency to like or dislike everything

about a person—including things you have not observed,” (page 82). The warm emotion

we feel toward a person, place, or thing predisposes us to like everything about that

person, place, or thing. Good first impressions tend to positively color later negative

impressions and conversely, negative first impressions can negatively color later positive

impressions. The first to speak their opinion in a meeting can “prime” others’ opinions. A

list of positive adjectives describing a person influences how we interpret negative


adjectives that come later in the list. Likewise, negative adjectives listed early colors

later positive adjectives. The problem with all these examples is that our intuitive

judgments are impulsive, not clearly thought through, or critically examined. To remind

System 1 to stay objective, to resist jumping to conclusions, and to enlist the evaluative

skills of System 2, Kahneman coined the abbreviation, “WYSIATI,” what you see is all

there is. In other words, do not lean on information based on impressions or intuitions.

Stay focused on the hard data before us. Combat overconfidence by basing our beliefs not

on subjective feelings but on critical thinking. Increase clear thinking by giving doubt and

ambiguity its day in court.

CHAPTER EIGHT: HOW JUDGMENTS HAPPEN

Heuristic #6: JUDGMENT. System 1 relies on its intuition, the basic assessments of

what’s going on inside and outside the mind. It is prone to ignore “sum-like variables,”

(page 93). We often fail to accurately calculate sums but rely instead on often unreliable

intuitive averages. It is prone to “matching,” (page 94). We automatically and

subconsciously rate the relative merits of a thing by matching dissimilar traits. We are

prone to evaluate a decision without distinguishing which variables are most important.

This is called the “mental shotgun” approach (page 95). These basic assessments can

easily replace the hard work System 2 must do to make judgments.

CHAPTER NINE: AN EASIER QUESTION

Heuristic #7: SUBSTITUTION. When confronted with a perplexing problem, question,

or decision, we make life easier for ourselves by answering a substitute, simpler question.

Instead of estimating the probability of a certain complex outcome we rely on an estimate

of another, less complex outcome. Instead of grappling with the mind-bending

philosophical question, “What is happiness?” we answer the easier question, “What is my

mood right now?” (page 98). Even though highly anxious people activate System 2 often,

obsessing and second-guessing every decision, fear, or risk, it is surprising how often

System 1 works just fine for them. Even chronic worriers function effortlessly in many

areas of life while System 1 is running in the background. They walk, eat, sleep, breathe,

make choices, make judgments, trust, and engage in enterprises without fear, worry, or

anxiety. Why? They replace vexing problems with easier problems. Potential for error?

We never get around to answering the harder question.

Heuristic #8: AFFECT. Emotions influence judgment. “People let their likes and

dislikes determine their beliefs about the world,” (page 103). Potential for error? We can

let our emotional preferences cloud our judgment and either under or over estimate risks

and benefits.

PART TWO: HEURISTICS AND BIASES

CHAPTER TEN: THE LAW OF SMALL NUMBERS


Heuristic #9: THE LAW OF SMALL NUMBERS. Our brains have a difficult time

with statistics. Small samples are more prone to extreme outcomes than large samples,

but we tend to lend the outcomes of small samples more credence than statistics warrant.

System 1 is impressed with the outcome of small samples but shouldn’t be. Small

samples are not representative of large samples. Large samples are more precise. We err

when we intuit rather than compute (see page 113). Potential for error? We make

decisions on insufficient data.
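To see the law of small numbers in action, here is a minimal simulation sketch (my illustration, not from the book; the 70% threshold and the sample sizes are arbitrary assumptions): small samples from a fair 50/50 process produce “extreme” results far more often than large ones.

```python
import random

def extreme_rate(sample_size, trials=10_000, threshold=0.7):
    """Fraction of samples whose share of heads is >= threshold,
    drawing from a fair 50/50 process."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if heads / sample_size >= threshold:
            extreme += 1
    return extreme / trials

# Small samples hit "extreme" outcomes often; large samples almost never do.
print(f"n=10:   {extreme_rate(10):.2%}")    # roughly 17%
print(f"n=1000: {extreme_rate(1000):.2%}")  # effectively 0%
```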

Heuristic #10: CONFIDENCE OVER DOUBT. System 1 suppresses ambiguity and

doubt by constructing coherent stories from mere scraps of data. System 2 is our inner

skeptic, weighing those stories, doubting them, and suspending judgment. But because

disbelief requires lots of work System 2 sometimes fails to do its job and allows us to slide

into certainty. We have a bias toward believing. Because our brains are pattern

recognition devices we tend to attribute causality where none exists. Regularities occur

at random. A run of 50 heads in a row seems unnatural, but flip a coin long enough and

the odds are that 50 heads in a row would eventually

happen. “When we detect what appears to be a rule, we quickly reject the idea that the

process is truly random,” (page 115). Attributing oddities to chance takes work. It’s easier

to attribute them to some intelligent force in the universe. Kahneman advises, “accept

the different outcomes were due to blind luck” (page 116). Many facts in this

world are due to chance and do not lend themselves to explanations. Potential for error?

Making connections where none exist.
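A small simulation sketch of this point (my own illustration, not Kahneman’s; the flip counts are arbitrary): even a perfectly fair coin routinely produces streaks that look like a rule.

```python
import random

def longest_run(n_flips):
    """Length of the longest streak of identical outcomes in n fair flips."""
    longest = current = 0
    last = None
    for _ in range(n_flips):
        flip = random.choice("HT")
        current = current + 1 if flip == last else 1
        last = flip
        longest = max(longest, current)
    return longest

# The median longest streak in 1,000 flips is around 10 in a row --
# random data is streakier than intuition expects.
results = sorted(longest_run(1000) for _ in range(101))
print("median longest streak:", results[50])
```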

CHAPTER ELEVEN: ANCHORS

Heuristic #11: THE ANCHORING EFFECT. This is the subconscious phenomenon of

making incorrect estimates due to previously heard quantities. If I say the number 10

and ask you to estimate Gandhi’s age at death you’ll give a lower number than if I’d said

to you the number 65. People adjust the sound of their stereo volume according to

previous “anchors,” the parents’ anchor is low decibels, the teenager’s anchor is high

decibels. People feel 35 mph is fast if they’ve been driving 10 mph but slow if they just

got off the freeway doing 65 mph. Buying a house for $200k seems high if the asking

price was raised from $180k but low if the asking price was lowered from $220k. A 15

minute wait to be served dinner in a restaurant seems long if the sign in the window says,

“Dinner served in 10 minutes or less” but fast if the sign says, “There is a 30 minute wait

before dinner will be served.” Potential for error? We are more suggestible than we

realize.

CHAPTER TWELVE: THE SCIENCE OF AVAILABILITY

Heuristic #12: THE AVAILABILITY HEURISTIC. When asked to estimate numbers

like the frequency of divorces in Hollywood, the number of dangerous plants, or the

number of deaths by plane crash, the ease with which we retrieve an answer influences

the size of our answer. We’re prone to give bigger answers to questions that are easier to

retrieve. And answers are easier to retrieve when we have had an emotional personal


experience. One who got mugged overestimates the frequency of muggings, one exposed

to news about school shootings overestimates the number of gun crimes, and the one

who does chores at home overestimates the percentage of the housework they do. When

both parties assume they do 70% of the house work somebody is wrong because there’s no

such thing as 140%! A person who has experienced a tragedy will overestimate the

potential for risk, danger, and a hostile universe. A person untroubled by suffering will

underestimate pending danger. When a friend gets cancer we get a check-up. When

nobody we know gets cancer we ignore the risk. Potential for error: under- or over-

estimating the frequency of an event based on ease of retrieval rather than statistical

calculation.

CHAPTER THIRTEEN: AVAILABILITY, EMOTION, AND RISK

Heuristic #13: AVAILABILITY CASCADES. When news stories pile up our statistical

senses get warped. A recent plane crash makes us think air travel is more dangerous

than car travel. The more we fear air travel the more eager news reporters are to

sensationalize plane crashes. A self-reinforcing feedback loop is set in motion, a cascade of fear.

“The emotional tail wags the rational dog,” (page 140). Potential for error? Overreacting

to a minor problem simply because we hear disproportionately more negative news

stories than positive ones.

CHAPTER FOURTEEN: TOM W’S SPECIALTY

Heuristic #14: REPRESENTATIVENESS. Similar to profiling or stereotyping,

“representativeness” is the intuitive leap to make judgments based on how closely

something resembles a stereotype, without taking into consideration other factors:

probability (likelihood), statistics (base rate), or sampling sizes. Baseball scouts used to

recruit players based on how closely their appearance resembled that of other good players. Once

players were recruited based on actual statistics the level of play improved.

Just because we like the design of a book cover doesn’t mean we’ll like the contents. You

can’t judge a book by its cover. A start-up restaurant has a low chance of survival

regardless of how much you like their food. Many well-run companies keep their facilities

neat and tidy but a well-kept lawn is no guarantee that the occupants inside are

organized. To discipline our lazy intuition we must make judgments based on probability

and base rates, and question our analysis of the evidence used to come up with our

assumption in the first place. “Think like a statistician,” (page 152). Potential for error:

Evaluating a person, place, or thing on how much it resembles something else without

taking into account other salient factors.

CHAPTER FIFTEEN: LINDA: LESS IS MORE

Heuristic #15: THE CONJUNCTION FALLACY (violating the logic of probability).

After hearing priming details about a made up person (Linda), people chose a plausible

story over a probable story. Logically, it is more likely that a person will have one

characteristic than two characteristics. That is, after reading a priming description of

Linda, respondents were more likely to give her two characteristics, which is statistically

improbable. It is more likely Linda would be a bank teller (one characteristic) than a

bank teller who is a feminist (two characteristics). “The notions of coherence, plausibility,

and probability are easily confused by the unwary,” (page 159). The more details we add

to a description, forecast, or judgment the less likely it is to be probable. Why? System

1 thinking overlooks logic in favor of a plausible story. Potential for error: committing a

logical fallacy when our intuition favors what is plausible but improbable over what is

less plausible but more probable.
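The conjunction rule behind the Linda problem can be shown with a few lines of arithmetic (the probabilities below are invented for illustration): however the numbers are chosen, P(A and B) can never exceed P(A).

```python
# Assumed, illustrative probabilities -- any values obey the same rule.
p_teller = 0.05                 # P(Linda is a bank teller)
p_feminist_given_teller = 0.30  # P(she is a feminist, given she is a teller)

p_teller_and_feminist = p_teller * p_feminist_given_teller

print(f"P(bank teller)              = {p_teller:.3f}")               # 0.050
print(f"P(bank teller AND feminist) = {p_teller_and_feminist:.3f}")  # 0.015

# The conjunction rule: adding a detail can only shrink the probability.
assert p_teller_and_feminist <= p_teller
```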

CHAPTER SIXTEEN: CAUSES TRUMP STATISTICS

Heuristic #16: OVERLOOKING STATISTICS. When given purely statistical data we

generally make accurate inferences. But when given statistical data and an individual

story that explains things we tend to go with the story rather than statistics. We favor

stories with explanatory power over mere data. Potential for error: stereotyping, profiling,

and making general inferences from particular cases rather than making particular

inferences from general cases.

CHAPTER SEVENTEEN: REGRESSION TO THE MEAN

Heuristic #17: OVERLOOKING LUCK. Most people love to attach causal

interpretations to the fluctuations of random processes. “It is a mathematically inevitable

consequence of the fact that luck played a role in the outcome….Not a very satisfactory

theory—we would all prefer a causal account—but that is all there is,” (page 179). When

we remove causal stories and consider mere statistics we’ll observe regularities, what is

called the regression to the mean. Those statistical regularities—regression to the

mean—are explanations (“things tend to even out”) but not causes (“that athlete had a

bad day but is now ‘hot’”). “Our mind is strongly biased toward causal explanations and

does not deal well with ‘mere statistics,’” (page 182). Potential for error: seeing causes

that don’t exist.
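A toy simulation sketch (my own model, not Kahneman’s data) makes regression to the mean visible: when performance is skill plus luck, the top scorers in round one fall back toward the average in round two with no causal story required.

```python
import random

random.seed(1)
n = 10_000
skill = [random.gauss(0, 1) for _ in range(n)]
round1 = [s + random.gauss(0, 1) for s in skill]  # performance = skill + luck
round2 = [s + random.gauss(0, 1) for s in skill]  # same skill, fresh luck

# Pick the top 10% of round-one performers.
top = sorted(range(n), key=round1.__getitem__, reverse=True)[: n // 10]

mean = lambda xs: sum(xs) / len(xs)
print("top group, round 1 average:", round(mean([round1[i] for i in top]), 2))  # ~2.5
print("top group, round 2 average:", round(mean([round2[i] for i in top]), 2))  # ~1.2
```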

CHAPTER EIGHTEEN: TAMING INTUITIVE PREDICTIONS

Heuristic #18: INTUITIVE PREDICTIONS. Conclusions we draw with strong

intuition (System 1) feed overconfidence. Just because a thing “feels right” (intuitive)

does not make it right. We need System 2 to slow down and examine our intuition,

estimate baselines, consider regression to the mean, evaluate the quality of evidence, and

so forth. “Extreme predictions and a willingness to predict rare events from weak

evidence are both manifestations of System 1. It is natural for the associative machinery

to match the extremeness of predictions to the perceived extremeness of the evidence on which it is

based—this is how substitution works,” (page 194). Potential for error: unwarranted

confidence when we are in fact in error.


PART THREE: OVERCONFIDENCE

CHAPTER NINETEEN: THE ILLUSION OF UNDERSTANDING

Heuristic #19: THE NARRATIVE FALLACY. In our continuous attempt to make

sense of the world we often create flawed explanatory stories of the past that shape our

views of the world and expectations of the future. We assign larger roles to talent,

stupidity, and intentions than to luck. “Our comforting conviction that the world makes

sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance,”

(page 201). This is most evident when we hear, “I knew that was going to happen!” Which

leads to…

Heuristic #20: THE HINDSIGHT ILLUSION. We think we understand the past,

which implies the future should be knowable, but in fact we understand the past less

than we believe we do. Our intuitions and premonitions feel more true after the fact.

Once an event takes place we forget what we believed prior to that event, before we

changed our minds. Prior to 2008 financial pundits predicted a stock market crash but

they did not know it. Knowing means showing something to be true. Prior to 2008 no one

could show that a crash was true because it hadn’t happened yet. But after it happened

their hunches were retooled and became proofs. “The tendency to revise the history of

one’s beliefs in light of what actually happened produces a robust cognitive illusion,”

(page 203). Potential for error: “We are prone to blame decision makers for good decisions

that worked out badly and to give them too little credit for successful moves that appear

obvious only after the fact. When the outcomes are bad, the clients often blame their

agents for not seeing the handwriting on the wall—forgetting that it was written in

invisible ink that became legible only afterward. Actions that seemed prudent in

foresight can look irresponsibly negligent in hindsight,” (page 203).

CHAPTER TWENTY: THE ILLUSION OF VALIDITY

Heuristic #21: THE ILLUSION OF VALIDITY. We sometimes confidently believe our

opinions, predictions, and points of view are valid when confidence is unwarranted. Some

even cling with confidence to ideas in the face of counter evidence. “Subjective confidence

in a judgment is not a reasoned evaluation of the probability that this judgment is correct.

Confidence is a feeling, which reflects the coherence of the information and the cognitive

ease of processing it” (page 212). Factors that contribute to overconfidence: being dazzled

by one’s own brilliance, affiliating with like-minded peers, and overvaluing our track

record of wins and ignoring our losses. Potential for error: Basing the validity of a

judgment on the subjective experience of confidence rather than objective facts.

Confidence is no measure of accuracy.

CHAPTER TWENTY-ONE: INTUITIONS VS. FORMULAS

Heuristic #22: IGNORING ALGORITHMS. We overlook statistical information and

favor our gut feelings. Not good! Forecasting, predicting the future of stocks, diseases, car


accidents, and weather should not be influenced by intuition but they often are. And

intuition is often wrong. We do well to consult check lists, statistics, and numerical

records and not rely on subjective feelings, hunches, or intuition. Potential for error:

“relying on intuitive judgments for important decisions if an algorithm is available that

will make fewer mistakes,” (page 229).

CHAPTER TWENTY-TWO: EXPERT INTUITION: WHEN CAN YOU TRUST IT?

Intuition means knowing something without knowing how we know it. Kahneman’s

understanding is that intuition is really a matter of recognition, being so familiar with

something that we arrive at judgments quickly. Chess players “see” the chess board, fire

fighters “know” when a building is about to collapse, art dealers “identify” marks of

forgeries, parents have a “sixth sense” when their kids are in danger, readers “read”

letters and words quickly, and friends “are familiar” with their friends from a distance.

Kids become experts at video games, motorists become expert drivers, and chefs become

intuitive cooks. How? Recognition—either over long periods of exposure, or quickly in a

highly emotional event (accidents). Intuition is immediate pattern recognition, not magic.

Heuristic #23: TRUSTING EXPERT INTUITION. “We are confident when the story

we tell ourselves comes easily to mind, with no contradiction and no competing scenario.

But ease and coherence do not guarantee that a belief held with confidence is true. The

associative machine is set to suppress doubt and to evoke ideas and information that are

compatible with the currently dominant story,” (page 239). Kahneman is skeptical of

experts because they often overlook what they do not know. Kahneman trusts experts

when two conditions are met: the expert is in an environment that is sufficiently regular

to be predictable and the expert has learned these regularities through prolonged

practice. Potential for error: being misled by “experts.”

CHAPTER TWENTY-THREE: THE OUTSIDE VIEW

Heuristic #24: THE PLANNING FALLACY means taking on a risky project—litigation,

war, opening a restaurant—confident of the best case scenario without seriously

considering the worst case scenario. If we consult others who’ve engaged in similar

projects we’ll get the outside view. Failure to do this increases the potential for failure.

Cost overruns, missed deadlines, loss of interest, waning urgency all result from poor

planning. Potential for error: “making decisions based on delusional optimism rather

than on a rational weighting of gains, losses, and probabilities,” (page 252). In other words,

poorly planned grandiose projects will eventually fail.

CHAPTER TWENTY-FOUR: THE ENGINE OF CAPITALISM

Heuristic #25: THE OPTIMISTIC BIAS. We are prone to neglect facts, others’ failures,

and what we don’t know in favor of what we know and how skilled we are. We believe the

outcome of our achievements lies entirely in our own hands while neglecting the luck

factor. We don’t appreciate the uncertainty of our environment. We suffer from the


illusion of control and neglect to look at the competition (in business start-ups for

example). “Experts who acknowledge the full extent of their ignorance may expect to be

replaced by more confident competitors, who are better able to gain the trust of clients,”

(page 263). Being unsure is a sign of weakness so we turn to confident experts who may

be wrong. Potential for error: unwarranted optimism which doesn’t calculate the odds

and therefore could be risky.

PART FOUR: CHOICES

CHAPTER TWENTY-FIVE: BERNOULLI’S ERRORS

Heuristic #26: OMITTING SUBJECTIVITY. We often think an object has only

intrinsic objective value. A million dollars is worth a million dollars, right? Wrong.

Magically making a poor person’s portfolio worth a million dollars would be fabulous!

Magically making a billionaire’s portfolio worth a million dollars would be agony! One

gained, the other lost. Economists have erred by failing to consider a person’s

psychological state regarding value, risk, anxiety, or happiness. The 18th-century economist

Bernoulli thought money had utility (a fixed worth) but failed to consider a person’s

reference point. Potential for error: Making decisions on pure logic without considering

psychological states.

Heuristic #27: THEORY-INDUCED BLINDNESS. “Once you have accepted a theory

and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws. If

you come upon an observation that does not seem to fit the model, you assume that there

must be a perfectly good explanation that you are somehow missing,” (page 277). When

the blinders fall off the previously believed error seems absurd and the real

breakthrough occurs when you can’t remember why you didn’t see the obvious. Potential

for error: Clinging to old paradigms that have outlived their validity.

CHAPTER TWENTY-SIX: PROSPECT THEORY

Kahneman’s claim to fame is Prospect Theory (for which he won the Nobel prize in

economics). Economists used to believe that the value of money was the sole determinant

in explaining why people buy, spend, and gamble the way they do. Prospect Theory

changed that by explaining three things: 1) the value of money is less important than the

subjective experience of changes in one’s wealth. In other words, the loss or gain of $500

is psychologically positive or negative depending on a reference point, how much money

one already has. 2) We experience diminished sensitivity to changes in wealth: losing

$100 hurts more if you start with $200 than if you start with $1000. And 3) we are loath

to lose money!
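These three points can be sketched as a prospect-theory-style value function. The parameters below (exponent 0.88, loss-aversion factor 2.25) are the commonly cited Tversky-Kahneman (1992) estimates, assumed here for illustration rather than taken from this book’s text.

```python
def value(x, alpha=0.88, loss_aversion=2.25):
    """Subjective value of a change x relative to the reference point."""
    if x >= 0:
        return x ** alpha                  # diminishing sensitivity to gains
    return -loss_aversion * (-x) ** alpha  # losses loom larger than gains

for amount in (100, -100, 500, -500):
    print(f"{amount:+5d} -> subjective value {value(amount):+7.1f}")
# +100 -> about +57.5, but -100 -> about -129.4: losing hurts ~2.25x more.
```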

Heuristic #28: LOSS AVERSION. “You just like winning and dislike losing—and you

almost certainly dislike losing more than you like winning,” (page 281). System 1

thinking compares the psychological benefit of gain with the psychological cost of loss


and the fear of loss usually wins. Potential for error: passing by a sure win in order to

avoid what we think might be a possible loss even when the odds are in favor of winning.

CHAPTER TWENTY-SEVEN: THE ENDOWMENT EFFECT

Heuristic #29: THE ENDOWMENT EFFECT. An object we own and use is more

valuable to us than an object we don’t own and don’t use. Such objects are endowed with

significance and we’re unwilling to part with them for two reasons: we hate loss and it

has a history with us. Thus we won’t sell a beloved, useful object unless a buyer offers

significant payment. Objects we don’t like or use sell for less (or we even give them away).

Potential for error: Clinging to objects for sentimental reasons at considerable loss of

income.

CHAPTER TWENTY-EIGHT: BAD EVENTS

Heuristic #30: LOSS AVERSION. People will work harder to avoid losses than to

achieve gains. Golfers putt more carefully for par, to avoid a bogey (losing a stroke), than

for a birdie (gaining a stroke). Contract negotiations stall when one

party feels they’re making more concessions (losses) than their disputant. People will

work harder to avoid pain than to achieve pleasure. Even animals fight more fiercely to

maintain territory than to increase territory. Potential for error: underestimating our

own and others’ attitudes toward loss and gain. They are asymmetrical.

CHAPTER TWENTY-NINE: THE FOURFOLD PATTERN

Heuristic #31: THE POSSIBILITY EFFECT. When highly unlikely outcomes are

weighted disproportionately more than they deserve we commit the possibility effect

heuristic. Think of buying lottery tickets.

Heuristic #32: THE CERTAINTY EFFECT. Outcomes that are almost certain are

given less weight than their probability justifies. Think of lawyers who offer a “less than

perfect” settlement before the trial which would result in an “almost certain victory.”

Heuristic #33: THE EXPECTATION PRINCIPLE. The two heuristics above have this

in common: “decision weights that people assign to outcomes are not identical to the

probabilities of these outcomes, contrary to the expectation principle” (page 312).

THE FOURFOLD PATTERN

HIGH PROBABILITY (certainty effect):

GAINS: 95% chance to win $10,000. Fear of disappointment; risk averse; accept an unfavorable settlement.

LOSSES: 95% chance to lose $10,000. Hope to avoid loss; risk seeking; reject a favorable settlement.

LOW PROBABILITY (possibility effect):

GAINS: 5% chance to win $10,000. Hope of large gain; risk seeking; reject a favorable settlement.

LOSSES: 5% chance to lose $10,000. Fear of large loss; risk averse; accept a favorable settlement.


This means people attach values to gains and losses rather than wealth, and decision

weights assigned to outcomes are different from probabilities. The fourfold pattern of

preferences accounts for this. Potential for error:

1. People are risk averse when they look at the prospects of a large gain. They’ll lock

in a sure gain and accept a less than expected value of the gamble.

2. When the possible prize is extremely large, as with a lottery ticket, the buyer is indifferent

to the fact that their chance of winning is extremely small. Without the ticket they

cannot win, but with the ticket, they can at least dream.

3. This explains why people buy insurance. We’ll pay insurance because we’re buying

protection and peace of mind.

4. This explains why people take desperate gambles. They accept a high probability

of making things worse in exchange for a slight hope of avoiding the loss

they are facing. This type of risk taking can turn a bad situation into a

disaster.
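A minimal sketch tying the four cells to expected value (the $10,000 stakes come from the table above; the behavioral notes paraphrase the pattern): in every cell the typical choice deviates from the expected value in the direction the fourfold pattern predicts.

```python
# Each row: (description, probability, outcome, typical behavior).
cases = [
    ("95% chance to win $10,000 ", 0.95,  10_000, "risk averse: take less than EV for sure"),
    (" 5% chance to win $10,000 ", 0.05,  10_000, "risk seeking: pay more than EV to play"),
    ("95% chance to lose $10,000", 0.95, -10_000, "risk seeking: reject settlements near EV"),
    (" 5% chance to lose $10,000", 0.05, -10_000, "risk averse: pay more than EV to insure"),
]
for label, p, outcome, behavior in cases:
    print(f"{label}: expected value {p * outcome:+10,.0f} -> {behavior}")
```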

CHAPTER THIRTY: RARE EVENTS

Heuristic #34: OVERESTIMATING THE LIKELIHOOD OF RARE EVENTS. It

makes more sense to pay attention to things that are likely to happen (rain tomorrow)

than to things that are unlikely to happen (terrorist attacks, asteroids, terminal

illness, floods and landslides). We tend to overestimate the probabilities of unlikely

events, and we tend to overweight the unlikely events in our decisions. This heuristic

joins forces with the availability cascade (#13) and cognitive ease (#2) heuristics above.

We are more likely to choose the alternative in a decision that is described with explicit

vividness, repetition, and relative frequencies rather than abstract probabilities. Potential for error:

succumbing to fear mongers who manipulate data in favor of their cause.

CHAPTER THIRTY-ONE: RISK POLICIES

Heuristic #35: THINKING NARROWLY. Most of us are so risk averse we avoid all

gambles. This is wrong, says Kahneman, since some gambles are clearly in our favor and

by avoiding them we lose money. One way to decrease risk aversion is to think broadly,

looking at the aggregate wins over many small gambles. Thinking narrowly, looking only

at short term losses, paralyzes us. But thinking broadly is non-intuitive. It’s a System 2

task that takes work. We are therefore wired by System 1 to make economically

irrational choices (saying no to easy money). The limit of human rationality is so stark that

Kahneman calls it a “hopeless mirage” (page 335). The ideal of logical consistency is not

achievable by our limited minds. Potential for error: passing by risks in our favor.
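A small simulation sketch of broad versus narrow framing (the 50% win $200 / 50% lose $100 gamble is a standard example of a favorable bet; the exact stakes are illustrative): any single play loses half the time, but a bundle of 100 plays almost never does.

```python
import random

def bundle_total(n_plays):
    """Net result of n independent plays of: 50% win $200, 50% lose $100."""
    return sum(200 if random.random() < 0.5 else -100 for _ in range(n_plays))

trials = 10_000
lose_one = sum(bundle_total(1) < 0 for _ in range(trials)) / trials
lose_100 = sum(bundle_total(100) < 0 for _ in range(trials)) / trials

print(f"P(end up behind after   1 play ): {lose_one:.1%}")  # about 50%
print(f"P(end up behind after 100 plays): {lose_100:.2%}")  # well under 1%
```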

CHAPTER THIRTY-TWO: KEEPING SCORE

Many have a System 1 calculator in their head that “keeps score” not only of the

potential financial gains and losses of a transaction but also of the emotional risks,

rewards, and possible regrets of our financial decisions. “The emotions that people attach


to the state of their mental accounts are not acknowledged in standard economic theory,”

(page 343).

Heuristic #36: THE DISPOSITION EFFECT. We are often willing to sell money-

earning stocks because it makes us feel like wise investors, and less willing to sell losing

stocks because it’s an admission of defeat. This is irrational since we’d earn more money

by selling the losers and clinging to the winners.

Heuristic #37: THE SUNK COST FALLACY. To avoid feeling bad about cutting our

losses and being called a failure, we tend to throw good money after bad, stay too long in

abusive marriages, and stay in unhappy careers. This is optimism gone haywire.

Heuristic #38: FEAR OF REGRET. Regret is an emotion we know well, so we

try to avoid making decisions that might lead to regret. However, we’re terrible at

predicting how intense those feelings of regret will be. It often hurts less than we think.

CHAPTER THIRTY-THREE: REVERSALS

Heuristic #39: IGNORING JOINT EVALUATIONS. We make decisions differently

when asked to make them in isolation than when asked to make them in comparison

with other scenarios. For example, a victim in a robbery will be awarded a higher

compensation when there are poignant factors involved (the victim was visiting a store

he rarely visited), but will be awarded a lower compensation if harmed while in his usual

shopping location. When locations are compared (joint evaluation) we realize the victim’s

location is insignificant and we reverse our original compensation amount. “Joint

evaluation highlights a feature that was not noticeable in single evaluation but is

recognized as decisive when detected,” (page 359). Potential for error: making decisions

in isolation. We should do comparison shopping, compare sentences for crimes, and

compare salaries for different jobs. Failure to do so limits our exposure to helpful norms.

CHAPTER THIRTY-FOUR: FRAMES AND REALITY

Heuristic #40: IGNORING FRAMES. How a problem is framed determines our choices

more than purely rational considerations would imply. More drivers end up as organ

donors when donation is the default and they must check a box to opt out than when they

must check a box to opt in. We are more willing to pay extra for gas when using a credit card (vs. cash) if

the fee is framed as a “loss of cash discount” rather than an “added credit card surcharge.” Doctors

prefer interventions whose outcomes are framed as a “one-month survival rate of 90%” over

interventions framed as a “10% mortality rate.” Both sentences mean the same

thing statistically but the frame of “survival” has greater emotional value than “mortality

rates.” “The meaning of a sentence is what happens in your associative machinery while

you understand it…In terms of the associations they bring to mind—how System 1 reacts

to them—the two sentences really ‘mean’ different things,” (page 363). “Reframing is

effortful and System 2 is lazy,” (page 367). Potential for error: Thinking we make


decisions in an objective bubble when in fact there are subjective factors at work about

which we are unaware.

PART FIVE: TWO SELVES

CHAPTER THIRTY-FIVE: TWO SELVES

Heuristic #41: IGNORING OUR TWO SELVES. We each have an “experiencing” self

and a “remembering” self. The latter usually takes precedence over the former. That is, I

can experience 13 days of vacation bliss but if on the 14th day things go bad I tend to

remember the vacation as negative. My memory overrides my experience. The same with a

blissful 40-minute record that ends with a scratch. We remember the scratch, not

the 39 previous minutes of musical enjoyment. “Confusing experience with the memory

of it is a compelling cognitive illusion—and it is the substitution that makes us believe a

past experience can be ruined. The experiencing self does not have a voice,” (page 381).

Heuristic #42: THE PEAK END RULE. How an experience ends seems to hold greater

weight in our memory than how an experience was lived. Similar to the previous

heuristic, the peak-end rule is shorthand for remembering an experience by how it felt at

its most intense moment (the peak) and at its end, not by the experience as a whole.

Heuristic #43: DURATION NEGLECT. Another corollary of the two selves: the

duration of an unpleasant or pleasant experience doesn’t seem to be as important as the

memory of how painful or pleasurable the experience was.
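A toy calculation sketch of the peak-end rule and duration neglect (the pain ratings below are invented): the longer episode contains more total pain yet is remembered as milder, because memory tracks the peak and the end, not the duration.

```python
def remembered_pain(ratings):
    """Peak-end approximation: the mean of the worst moment and the last."""
    return (max(ratings) + ratings[-1]) / 2

short_episode = [2, 8, 7]           # ends near its worst moment
long_episode = [2, 8, 7, 5, 4, 3]   # same peak, plus a milder tail

print("remembered, short:", remembered_pain(short_episode))  # 7.5
print("remembered, long: ", remembered_pain(long_episode))   # 5.5
print("total pain, short:", sum(short_episode))              # 17
print("total pain, long: ", sum(long_episode))               # 29
```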

CHAPTER THIRTY-SIX: LIFE AS HISTORY

Heuristic #44: NARRATIVE WHOLENESS (my user friendly name). When we

evaluate how well our own and others’ lives have been lived we do well to consider the whole

narrative and not just the end. But because of the previous three heuristics we are prone

to devalue a long, sacrificial, generous life if at the end (or even after death) we discover

episodes of selfishness, etc. “A story is about significant events and memorable moments,

not about time passing. Duration neglect is normal in a story, and the ending often

defines its character” (page 386). Potential for error: paying more attention to longevity

than quality, making decisions based on how memorable it will be rather than how

exciting and enriching the experience itself will be, and experiencing a moment of

pleasure and forfeiting our reputation of integrity.

CHAPTER THIRTY-SEVEN: EXPERIENCED WELL BEING

Heuristic #45: VALUING A REMEMBERING SELF OVER AN EXPERIENCING

SELF. Since most of us rely on unreliable memories we do well to keep in mind what our

experiences were like during them, not just at their conclusion. How many of our waking

moments are spent in unpleasant emotions or negative states? They are hard to recall!

“Our emotional state is largely determined by what we attend to, and we are normally


focused on our current activity and immediate environment,” (page 394). A person stuck

in traffic can still be happy because they’re in love, or a person who is grieving may still

remain depressed while watching a comedy. Potential for error: not paying attention to

what we are doing, letting experiences happen without reflection, and going with the flow

with no attempt to alter our schedules, activities, or experiences.

CHAPTER THIRTY-EIGHT: THINKING ABOUT LIFE

Heuristic #46: AFFECTIVE FORECASTING. Which factor leads to a happier life:

duration or experiences? Would a 20-year life with many happy experiences be better

than a 60-year life with many terrible experiences? Which would you rather be: happy or

old? We are terrible at predicting what will make us happy. When asked the very

difficult question, “Overall, how happy is your life?” we substitute an easier question,

“How happy am I right now?” (See heuristic #7). “…the responses to global well-being

questions should be taken with a grain of salt” (page 399). People make decisions based

on what will make them happy in the future but when it’s achieved the happiness doesn’t

last. We don’t know our future selves very well.

Heuristic #47: THE FOCUSING ILLUSION. “Nothing in life is as important as it is

when you are thinking about it,” (page 402). This means when we’re asked to evaluate a

decision, life satisfaction, or preference we err if we focus on only one thing. How we

answer, “What would make you happy?” depends on many factors and rarely is one factor

determinant. Yet folks regularly focus on one issue—income, weather, health,

relationships, pollution, etc.—and ignore other important factors. “How much pleasure do

you get from your car?” Depends on how much you value the stereo, mileage, looks, age,

cost, comfortable seats, tilt of steering wheel, etc. The fact is, our evaluations are often

based on the heuristic that while we are thinking of a thing we generally think better of

it, forgetting how infrequently we actually think about those things (income, weather,

health, stereo, mileage, looks, etc). What initially strikes our fancy is absorbed into daily

living, we adapt, we acclimate, we experience the initial pleasure less intensely as time

progresses. “The remembering self is subject to a massive focusing illusion about the life

that the experiencing self endures quite comfortably,” (page 406).

Heuristic #48: MISWANTING. (Daniel Gilbert’s phrase). We exaggerate the effect of a

significant purchase or changed circumstances on our future well-being. Things that are

initially exciting eventually lose their appeal.

CONCLUSIONS

SUMMARY OF THE TWO SELVES. It’s absurd that people willingly choose longer periods

of more pain that end pleasantly over shorter periods of less pain that end terribly. But

such are the powers of heuristics #41, 42, 43, and 45.

SUMMARY OF ECONS AND HUMANS. Kahneman made infrequent mentions of

“econs and humans” so I do not emphasize them in my book summary. Here’s the gist of


his complaint. Economists (“the Chicago school”) operate on the assumption that

consumers are rational (“internally consistent,” “logically coherent,” “adhering to rules of

logic,” page 411) and always will do the rational thing. If not, that’s their loss. Kahneman

as a behavioral economist of course disagrees and suggests that heuristics influence our

choices, which are often irrational and counterintuitive; we need help making better choices.

The Chicago School are libertarians who want government to keep out of the way and let

people make their own choices, good or bad (provided they don’t hurt others). Economic

behaviorists suggest giving people a nudge is sometimes necessary (regulation, writing

clearer contracts, truth in advertising, etc).

SUMMARY OF TWO SYSTEMS. “This book has described the workings of the mind as

an uneasy interaction between two fictitious characters: the automatic System 1 and the

effortful System 2,” (page 415).

SYSTEM 1 vs. SYSTEM 2

System 1: Subconscious values, drives, and beliefs that influence our “gut reactions.”
System 2: Articulates judgments, makes choices, endorses or rationalizes ideas and feelings.

System 1: Jumps to conclusions regarding causality.
System 2: Makes up stories to either confirm or deny those conclusions.

System 1: Operates effortlessly.
System 2: Requires conscious effort to engage.

System 1: Can be wrong but is more often right.
System 2: Can be wrong or right depending on how hard it works.

System 1: Influenced by heuristics.
System 2: Examines those heuristics when so inclined.

“The way to block errors that originate in System 1 is simple in principle: recognize the

signs that you are in a cognitive minefield, slow down, and ask for reinforcement from

System 2,” (page 417).
