Even statisticians were not good intuitive statisticians.
Until geographical separation made it too difficult to go on, Amos and I enjoyed the extraordinary good fortune of a shared mind that was superior to our individual minds and of a relationship that made our work fun as well as productive.
“Intuition is nothing more and nothing less than recognition.”
Part 5 describes recent research that has introduced a distinction between two selves, the experiencing self and the remembering self, which do not have the same interests.
The authors note that the most remarkable observation of their study is that people find its results very surprising. Indeed, the viewers who fail to see the gorilla are initially sure that it was not there—they cannot imagine missing such a striking event. The gorilla study illustrates two important facts about our minds: we can be blind to the obvious, and we are also blind to our blindness.
Constantly questioning our own thinking would be impossibly tedious, and System 2 is much too slow and inefficient to serve as a substitute for System 1 in making routine decisions. The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high.
Why call them System 1 and System 2 rather than the more descriptive “automatic system” and “effortful system”? The reason is simple: “Automatic system” takes longer to say than “System 1” and therefore takes more space in your working memory. This matters, because anything that occupies your working memory reduces your ability to think.
We found that people, when engaged in a mental sprint, may become effectively blind. The authors of The Invisible Gorilla had made the gorilla “invisible” by keeping the observers intensely busy counting passes.
One of the significant discoveries of cognitive psychologists in recent decades is that switching from one task to another is effortful, especially under time pressure.
Modern tests of working memory require the individual to switch repeatedly between two demanding tasks, retaining the results of one operation while performing the other. People who do well on these tests tend to do well on tests of general intelligence.
The most effortful forms of slow thinking are those that require you to think fast.
In addition to the physical effort of moving my body rapidly along the path, a mental effort of self-control is needed to resist the urge to slow down. Self-control and deliberate thought apparently draw on the same limited budget of effort.
System 1 has more influence on behavior when System 2 is busy, and it has a sweet tooth.
People who are cognitively busy are also more likely to make selfish choices, use sexist language, and make superficial judgments in social situations.
The bold implication of this idea is that the effects of ego depletion could be undone by ingesting glucose, and Baumeister and his colleagues have confirmed this hypothesis in several experiments.
Restoring the level of available sugar in the brain had prevented the deterioration of performance.
The authors of the study plotted the proportion of approved requests against the time since the last food break. The proportion spikes after each meal, when about 65% of requests are granted. During the two hours or so until the judges’ next feeding, the approval rate drops steadily, to about zero just before the meal.
As cognitive scientists have emphasized in recent years, cognition is embodied; you think with your body, not only with your brain.
The notion that we have limited access to the workings of our minds is difficult to accept because, naturally, it is alien to our experience, but it is true: you know far less about yourself than you feel you do.
Feeling that one’s soul is stained appears to trigger a desire to cleanse one’s body, an impulse that has been dubbed the “Lady Macbeth effect.”
“The world makes much less sense than you think. The coherence comes mostly from the way your mind works.”
When you are in a state of cognitive ease, you are probably in a good mood, like what you see, believe what you hear, trust your intuitions, and feel that the current situation is comfortably familiar. You are also likely to be relatively casual and superficial in your thinking. When you feel strained, you are more likely to be vigilant and suspicious, invest more effort in what you are doing, feel less comfortable, and make fewer errors, but you also are less intuitive and less creative than usual.
Larry Jacoby, the psychologist who first demonstrated this memory illusion in the laboratory, titled his article “Becoming Famous Overnight.”
A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth.
It is entirely legitimate for you to enlist cognitive ease to work in your favor, and studies of truth illusions provide specific suggestions that may help you achieve this goal.
If you care about being thought credible and intelligent, do not use complex language where simpler language will do.
If you use color, you are more likely to be believed if your text is printed in bright blue or red than in middling shades of green, yellow, or pale blue.
Finally, if you quote a source, choose one with a name that is easy to pronounce.
Remember that System 2 is lazy and that mental effort is aversive. If possible, the recipients of your message want to stay away from anything that reminds them of effort, including a source with a complicated name.
The results tell a clear story: 90% of the students who saw the CRT in normal font made at least one mistake in the test, but the proportion dropped to 35% when the font was barely legible. You read this correctly: performance was better with the bad font. Cognitive strain, whatever its source, mobilizes System 2, which is more likely to reject the intuitive answer suggested by System 1.
Mood evidently affects the operation of System 1: when we are uncomfortable and unhappy, we lose touch with our intuition.
“I’m in a very good mood today, and my System 2 is weaker than usual. I should be extra careful.”
The most important aspect of both examples is that a definite choice was made, but you did not know it. Only one interpretation came to mind, and you were never aware of the ambiguity. System 1 does not keep track of alternatives that it rejects, or even of the fact that there were alternatives. Conscious doubt is not in the repertoire of System 1.
The moral is significant: when System 2 is otherwise engaged, we will believe almost anything. System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy. Indeed, there is evidence that people are more likely to be influenced by empty persuasive messages, such as commercials, when they are tired and depleted.
The deeper truth is that there is nothing to explain.
However, sustaining doubt is harder work than sliding into certainty. The law of small numbers is a manifestation of a general bias that favors certainty over doubt.
The simple answer to these questions is that if you follow your intuition, you will more often than not err by misclassifying a random event as systematic. We are far too willing to reject the belief that much of what we see in life is random.
The exaggerated faith in small samples is only one example of a more general illusion—we pay more attention to the content of messages than to information about their reliability, and as a result end up with a view of the world around us that is simpler and more coherent than the data justify.
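A minimal simulation sketch of that exaggerated faith (my illustration, not data from the text; the fair coin and the 70% "extreme" threshold are assumptions): small samples from a purely random process cross an extreme threshold far more often than large ones, which is exactly the pattern an intuitive observer is tempted to read as systematic.

```python
import random

random.seed(1)

def share_of_extreme_samples(sample_size, trials=10_000, threshold=0.7):
    """Fraction of samples in which heads make up >= threshold or <= 1 - threshold."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        share = heads / sample_size
        if share >= threshold or share <= 1 - threshold:
            extreme += 1
    return extreme / trials

for n in (10, 50, 200):
    print(n, round(share_of_extreme_samples(n), 3))
# Roughly a third of 10-flip samples look "extreme"; almost no 200-flip samples do.
# The coin never changed, yet the small samples invite a causal story.
```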
Jumping to conclusions is a safer sport in the world of our imagination than it is in reality.
We see the same strategy at work in the negotiation over the price of a home, when the seller makes the first move by setting the list price. As in many other games, moving first is an advantage in single-issue negotiations—for example, when price is the only issue to be settled between a buyer and a seller.
My advice to students when I taught negotiations was that if you think the other side has made an outrageous proposal, you should not come back with an equally outrageous counteroffer, creating a gap that will be difficult to bridge in further negotiations. Instead you should make a scene, storm out or threaten to do so, and make it clear—to yourself as well as to the other side—that you will not continue the negotiation with that number on the table.
They instructed negotiators to focus their attention and search their memory for arguments against the anchor. The instruction to activate System 2 was successful. For example, the anchoring effect is reduced or eliminated when the second mover focuses his attention on the minimal offer that the opponent would accept, or on the costs to the opponent of failing to reach an agreement. In general, a strategy of deliberately “thinking the opposite” may be a good defense against anchoring effects, because it negates the biased recruitment of thoughts that produces these effects.
you should assume that any number that is on the table has had an anchoring effect on you, and if the stakes are high you should mobilize yourself (your System 2) to combat the effect.
The mere observation that there is usually more than 100% credit to go around is sometimes sufficient to defuse the situation.
people who had just listed twelve instances rated themselves as less assertive than people who had listed only six. Furthermore, participants who had been asked to list twelve cases in which they had not behaved assertively ended up thinking of themselves as quite assertive!
A professor at UCLA found an ingenious way to exploit the availability bias. He asked different groups of students to list ways to improve the course, and he varied the required number of improvements. As expected, the students who listed more ways to improve the class rated it higher!
As predicted, participants whose experience of fluency was “explained” did not use it as a heuristic; the subjects who were told that music would make retrieval more difficult rated themselves as equally assertive when they retrieved twelve instances as when they retrieved six.
Suppose you are told that the three-year-old boy who lives next door frequently wears a top hat in his stroller. You will be far less surprised when you actually see him with his top hat than you would have been without the warning.
Merely reminding people of a time when they had power increases their apparent trust in their own intuition.
“Because of the coincidence of two planes crashing last month, she now prefers to take the train. That’s silly. The risk hasn’t really changed; it is an availability bias.”
An inability to be guided by a “healthy fear” of bad consequences is a disastrous flaw.
Michael Lewis’s bestselling Moneyball
Norbert Schwarz and his colleagues showed that instructing people to “think like a statistician” enhanced the use of base-rate information, while the instruction to “think like a clinician” had the opposite effect.
It is useful to remember, however, that neglecting valid stereotypes inevitably results in suboptimal judgments. Resistance to stereotyping is a laudable moral position, but the simplistic idea that the resistance is costless is wrong.
The classic experiment I describe next shows that people will not draw from base-rate information an inference that conflicts with other beliefs. It also supports the uncomfortable conclusion that teaching psychology is mostly a waste of time.
Changing one’s mind about human nature is hard work, and changing one’s mind for the worse about oneself is even harder.
I had stumbled onto a significant fact of the human condition: the feedback to which life exposes us is perverse. Because we tend to be nice to other people when they please us and nasty when they do not, we are statistically punished for being nice and rewarded for being nasty.
success = talent + luck
great success = a little more talent + a lot of luck
The fact that you observe regression when you predict an early event from a later event should help convince you that regression does not have a causal explanation.
A business commentator who correctly announces that “the business did better this year because it had done poorly last year” is likely to have a short tenure on the air.
“Perhaps his second interview was less impressive than the first because he was afraid of disappointing us, but more likely it was his first that was unusually good.”
precocious
Here are the directions for how to get there in four simple steps: (1) start with an estimate of average GPA; (2) determine the GPA that matches your impression of the evidence; (3) estimate the correlation between your evidence and GPA; (4) if the correlation is .30, move 30% of the distance from the average to the matching GPA.
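A minimal sketch of those four steps with invented numbers (the average, the "matching" GPA, and the correlation are assumptions for illustration only):

```python
# Step 1: baseline — an estimate of the average GPA (assumed value).
average_gpa = 3.0
# Step 2: the GPA that matches your impression of the evidence (assumed value).
matching_gpa = 3.8
# Step 3: your estimate of the correlation between the evidence and GPA.
correlation = 0.30

# Step 4: move from the average toward the matching value by the size of the correlation.
predicted_gpa = average_gpa + correlation * (matching_gpa - average_gpa)
print(predicted_gpa)  # 3.24 — a regressive prediction, much closer to the mean than 3.8
```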
Paradoxically, it is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle. Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.
A general limitation of the human mind is its imperfect ability to reconstruct past states of knowledge, or beliefs that have changed. Once you adopt a new view of the world (or of any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed.
Hindsight bias has pernicious effects on the evaluations of decision makers. It leads observers to assess the quality of a decision not by whether the process was sound but by whether its outcome was good or bad.
Leaders who have been lucky are never punished for having taken too much risk. Instead, they are believed to have had the flair and foresight to anticipate success, and the sensible people who doubted them are seen in hindsight as mediocre, timid, and weak. A few lucky gambles can crown a reckless leader with a halo of prescience and boldness.
Everything makes sense in hindsight.
In a memorable example, Dawes showed that marital stability is well predicted by a formula: frequency of lovemaking minus frequency of quarrels. You don’t want your result to be a negative number.
The important conclusion from this research is that an algorithm that is constructed on the back of an envelope is often good enough to compete with an optimally weighted formula, and certainly good enough to outdo expert judgment.
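A minimal synthetic-data sketch of that conclusion (my illustration, not Dawes's data; the predictor weights and noise level are assumptions): a unit-weight, back-of-the-envelope score in the spirit of lovemaking-minus-quarrels tracks the outcome nearly as well as a statistically fitted one.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000
x1, x2 = rng.standard_normal(n), rng.standard_normal(n)   # two relevant cues
outcome = 0.7 * x1 + 0.3 * x2 + rng.standard_normal(n)    # assumed "true" process plus noise

unit_weight_score = x1 + x2                                # equal weights, no fitting at all
beta, *_ = np.linalg.lstsq(np.column_stack([x1, x2]), outcome, rcond=None)
fitted_score = np.column_stack([x1, x2]) @ beta            # "optimally" weighted score

print(np.corrcoef(unit_weight_score, outcome)[0, 1])       # roughly 0.56
print(np.corrcoef(fitted_score, outcome)[0, 1])            # roughly 0.61 — barely better
```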
do not trust anyone—including yourself—to tell you how much you should trust their judgment.
Whether professionals have a chance to develop intuitive expertise depends essentially on the quality and speed of feedback, as well as on sufficient opportunity to practice.
Our conclusion was that for the most part it is possible to distinguish intuitions that are likely to be valid from those that are likely to be bogus. As in the judgment of whether a work of art is genuine or a fake, you will usually do better by focusing on its provenance than by looking at the piece itself. If the environment is sufficiently regular and if the judge has had a chance to learn its regularities, the associative machinery will recognize situations and generate quick and accurate predictions and decisions. You can trust someone’s intuitions if these conditions are met.
this is what always happens when a project ends reasonably well: once you understand the main conclusion, it seems it was always obvious.
“Did he really have an opportunity to learn? How quick and how clear was the feedback he received on his judgments?”
irrational perseverance: the folly we displayed that day in failing to abandon the project. Facing a choice, we gave up rationality rather than give up the enterprise.
Seymour’s forecast from his inside view was not an adjustment from the baseline prediction, which had not come to his mind. It was based on the particular circumstances of our efforts. Like the participants in the Tom W experiment, Seymour knew the relevant base rate but did not think of applying it.
This is a common pattern: people who have information about an individual case rarely feel the need to know the statistics of the class to which the case belongs.
Can overconfident optimism be overcome by training? I am not optimistic.
But of course the main reason that decision theorists study simple gambles is that this is what other decision theorists do.
As the psychologist Daniel Gilbert observed, disbelieving is hard work, and System 2 is easily tired.
You know you have made a theoretical advance when you can no longer reconstruct why you failed for so long to see the obvious.
“He suffers from extreme loss aversion, which makes him turn down very favorable opportunities.”
the disadvantages of a change loom larger than its advantages, inducing a bias that favors the status quo.
“She didn’t care which of the two offices she would get, but a day after the announcement was made, she was no longer willing to trade. Endowment effect!”
“When they raised their prices, demand dried up.”
Images of the brain showed an intense response of the amygdala to a threatening picture that the viewer did not recognize. The information about the threat probably traveled via a superfast neural channel that feeds directly into a part of the brain that processes emotions, bypassing the visual cortex that supports the conscious experience of “seeing.”
“Bad emotions, bad parents, and bad feedback have more impact than good ones, and bad information is processed more thoroughly than good. The self is more motivated to avoid bad self-definitions than to pursue good ones.”
Economic logic implies that cabdrivers should work many hours on rainy days and treat themselves to some leisure on mild days, when they can “buy” leisure at a lower price.
Because negotiators are influenced by a norm of reciprocity, a concession that is presented as painful calls for an equally painful (and perhaps equally inauthentic) concession from the other side.
opprobrium
“This reform will not pass. Those who stand to lose will fight harder than those who stand to gain.”
“Each of them thinks the other’s concessions are less painful. They are both wrong, of course. It’s just the asymmetry of losses.”
However, when an unlikely event becomes the focus of attention, we will assign it much more weight than its probability deserves.
This is where businesses that are losing ground to a superior technology waste their remaining assets in futile attempts to catch up. Because defeat is so difficult to accept, the losing side in wars often fights long past the point at which the victory of the other side is certain, and only a matter of time.
My experience illustrates how terrorism works and why it is so effective: it induces an availability cascade. An extremely vivid image of death and damage, constantly reinforced by media attention and frequent conversations, becomes highly accessible, especially if it is associated with a specific situation such as the sight of a bus.
adding irrelevant but vivid details to a monetary outcome also disrupts calculation.
A good attorney who wishes to cast doubt on DNA evidence will not tell the jury that “the chance of a false match is 0.1%.” The statement that “a false match occurs in 1 of 1,000 capital cases” is far more likely to pass the threshold of reasonable doubt.
As expected from prospect theory, choice from description yields a possibility effect—rare outcomes are overweighted relative to their probability. In sharp contrast, overweighting is never observed in choice from experience, and underweighting is common.
“We shouldn’t focus on a single scenario, or we will overestimate its probability. Let’s set up specific alternatives and make the probabilities add up to 100%.”
“They want people to be worried by the risk. That’s why they describe it as 1 death per 1,000. They’re counting on denominator neglect.”
every simple choice formulated in terms of gains and losses can be deconstructed in innumerable ways into a combination of choices, yielding preferences that are likely to be inconsistent.
We have neither the inclination nor the mental resources to enforce consistency on our preferences, and our preferences are not magically set to be coherent, as they are in the rational-agent model.
Decision makers who are prone to narrow framing construct a preference every time they face a risky choice. They would do better by having a risk policy that they routinely apply whenever a relevant problem arises. Familiar examples of risk policies are “always take the highest possible deductible when purchasing insurance” and “never buy extended warranties.”
The outside view is a broad frame for thinking about plans. A risk policy is a broad frame that embeds a particular risky choice in a set of similar choices.
Exaggerated optimism protects individuals and organizations from the paralyzing effects of loss aversion; loss aversion protects them from the follies of overconfident optimism.
An organization that could eliminate both excessive optimism and excessive loss aversion should do so. The combination of the outside view with a risk policy should be the goal.
“I decided to evaluate my portfolio only once a quarter. I am too loss averse to make sensible decisions in the face of daily price fluctuations.”
“Each of our executives is loss averse in his or her domain. That’s perfectly natural, but the result is that the organization is not taking enough risk.”
the main motivators of money-seeking are not necessarily economic. For the billionaire looking for the extra billion, and indeed for the participant in an experimental economics project looking for the extra dollar, money is a proxy for points on a scale of self-regard and achievement.
As a result, we refuse to cut losses when doing so would admit failure, we are biased against actions that could lead to regret, and we draw an illusory but sharp distinction between omission and commission, not doing and doing, because the sense of responsibility is greater for one than for the other.
To implement this rational behavior, System 2 would have to be aware of the counterfactual possibility: “Would I still drive into this snowstorm if I had gotten the ticket free from a friend?” It takes an active and disciplined mind to raise such a difficult question.
The investor has set up an account for each share that she bought, and she wants to close every account as a gain. A rational agent would have a comprehensive view of the portfolio and sell the stock that is least likely to do well in the future, without considering whether it is a winner or a loser.
The sunk-cost fallacy keeps people for too long in poor jobs, unhappy marriages, and unpromising research projects.
Averaging over several such pairs of cases, awards to victims of personal injury were more than twice as large in joint than in single evaluation.
The system of administrative penalties is coherent within agencies but incoherent globally.
The objective outcomes are precisely identical in the two frames, and a reality-bound Econ would respond to both in the same way—selecting either the sure thing or the gamble regardless of the frame—but we already know that the Human mind is not bound to reality.
Remarkably, the “rational” individuals were not those who showed the strongest neural evidence of conflict. It appears that these elite participants were (often, not always) reality-bound with little conflict.
manipulation—but we must get used to the idea that even important decisions are influenced, if not governed, by System 1.
The version in which cash was lost leads to more reasonable decisions. It is a better frame because the loss, even if tickets were lost, is “sunk,” and sunk costs should be ignored. History is irrelevant and the only issue that matters is the set of options the theater patron has now, and their likely consequences.
“They will feel better about what happened if they manage to frame the outcome in terms of how much money they kept rather than how much they lost.”
Peak-end rule: The global retrospective rating was well predicted by the average of the level of pain reported at the worst moment of the experience and at its end. Duration neglect: The duration of the procedure had no effect whatsoever on the ratings of total pain.
If the objective is to reduce patients’ memory of pain, lowering the peak intensity of pain could be more important than minimizing the duration of the procedure. By the same reasoning, gradual relief may be preferable to abrupt relief if patients retain a better memory when the pain at the end of the procedure is relatively mild. If the objective is to reduce the amount of pain actually experienced, conducting the procedure swiftly may be appropriate even if doing so increases the peak pain intensity and leaves patients with an awful memory.
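A minimal sketch of the peak-end rule and duration neglect, using invented minute-by-minute pain ratings on a 0–10 scale (the numbers and the two procedures are assumptions for illustration):

```python
short_procedure = [2, 4, 7, 8]             # ends at its worst moment
long_procedure  = [2, 4, 7, 8, 5, 3, 1]    # same peak, but tapers off gently

def remembered_pain(ratings):
    """Peak-end score: the average of the worst moment and the final moment."""
    return (max(ratings) + ratings[-1]) / 2

print(remembered_pain(short_procedure))            # 8.0
print(remembered_pain(long_procedure))             # 4.5 — remembered as much milder
print(sum(short_procedure), sum(long_procedure))   # 21 vs 30 — experienced totals say the opposite
```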
I find it helpful to think of this dilemma as a conflict of interests between two selves (which do not correspond to the two familiar systems). The experiencing self is the one that answers the question: “Does it hurt now?” The remembering self is the one that answers the question: “How was it, on the whole?” Memories are all we get to keep from our experience of living, and the only perspective that we can adopt as we think about our lives is therefore that of the remembering self.
Confusing experience with the memory of it is a compelling cognitive illusion—and it is the substitution that makes us believe a past experience can be ruined. The experiencing self does not have a voice. The remembering self is sometimes wrong, but it is the one that keeps score and governs what we learn from living.
If we had asked them, “Would you prefer a 90-second immersion or only the first part of it?” they would certainly have selected the short option. We did not use these words, however, and the subjects did what came naturally: they chose to repeat the episode of which they had the less aversive memory.
Other classic studies showed that electrical stimulation of specific areas in the rat brain (and of corresponding areas in the human brain) produces a sensation of intense pleasure, so intense in some cases that rats who can stimulate their brain by pressing a lever will die of starvation without taking a break to feed themselves.
The cold-hand study showed that we cannot fully trust our preferences to reflect our interests, even if they are based on personal experience, and even if the memory of that experience was laid down within the last quarter of an hour! Tastes and decisions are shaped by memories, and the memories can be wrong.
“You are thinking of your failed marriage entirely from the perspective of the remembering self. A divorce is like a symphony with a screeching sound at the end—the fact that it ended badly does not mean it was all bad.”
A story is about significant events and memorable moments, not about time passing. Duration neglect is normal in a story, and the ending often defines its character.
This is how the remembering self works: it composes stories and keeps them for future reference.
As expected from this idea, Diener and his students also found a less-is-more effect, a strong indication that an average (prototype) has been substituted for a sum. Adding 5 “slightly happy” years to a very happy life caused a substantial drop in evaluations of the total happiness of that life.
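A minimal sketch of that less-is-more pattern with invented yearly happiness ratings on a 0–10 scale (the numbers are assumptions for illustration): adding slightly happy years raises the sum of lifetime happiness but lowers the average, and evaluators who substitute the average for the sum judge the longer life as less happy.

```python
very_happy_life = [9] * 30             # thirty very happy years
extended_life   = [9] * 30 + [5] * 5   # the same life plus five mildly pleasant years

for life in (very_happy_life, extended_life):
    print(sum(life), round(sum(life) / len(life), 2))
# 270 9.0  vs  295 8.43 — more total happiness, but a lower prototype/average,
# so the extended life is rated as less happy overall.
```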
Odd as it may seem, I am my remembering self, and the experiencing self, who does my living, is like a stranger to me.
We found that American women spent about 19% of the time in an unpleasant state, somewhat higher than French women (16%) or Danish women (14%).
It appears that a small fraction of the population does most of the suffering—whether
The biggest surprise was the emotional experience of the time spent with one’s children, which for American women was slightly less enjoyable than doing housework.
The Americans were far more prone to combine eating with other activities, and their pleasure from eating was correspondingly diluted.
Can money buy happiness? The conclusion is that being poor makes one miserable, and that being rich may enhance one’s life satisfaction, but does not (on average) improve experienced well-being.
“The easiest way to increase happiness is to control your use of time. Can you find more time to do the things you enjoy doing?”
A disposition for well-being is as heritable as height or intelligence, as demonstrated by studies of twins separated at birth.
The same principle applies to other goals—one recipe for a dissatisfied adulthood is setting goals that are especially difficult to attain.
Teenagers’ goals influence what happens to them, where they end up, and how satisfied they are.
Nothing in life is as important as you think it is when you are thinking about it.
much narrower question: “How much pleasure do you get from your car when you think about it?”
“She thought that buying a fancy car would make her happier, but it turned out to be an error of affective forecasting.”
“Buying a larger house may not make us happier in the long term. We could be suffering from a focusing illusion.”
An objective observer making the choice for someone else would undoubtedly choose the short exposure, favoring the sufferer’s experiencing self. The choices that people made on their own behalf are fairly described as mistakes.
The only test of rationality is not whether a person’s beliefs and preferences are reasonable, but whether they are internally consistent. A rational person can believe in ghosts so long as all her other beliefs are consistent with the existence of ghosts.
Moreover, you will believe the story you make up. But System 2 is not merely an apologist for System 1; it also prevents many foolish thoughts and inappropriate impulses from overt expression.
The acquisition of skills requires a regular environment, an adequate opportunity to practice, and rapid and unequivocal feedback about the correctness of thoughts and actions. When these conditions are fulfilled, skill eventually develops, and the intuitive judgments and choices that quickly come to mind will mostly be accurate.
The voice of reason may be much fainter than the loud and clear voice of an erroneous intuition, and questioning your intuitions is unpleasant when you face the stress of a big decision.