ALPHABETICAL BRAIN® VOCABULARY

HUMANIST SECULAR
SCIENCE STAR
DANIEL KAHNEMAN

August 18, 2022

green separator

THINKING, FAST AND SLOW
by Daniel Kahneman.
Farrar, Straus and Giroux,
2013, 2011
(499 pages)

green separator

Quote = "Daniel Kahneman's book is long but readable. It presents provocative theories and groundbreaking research and clearly explains both. He engages readers in a lively conversation about how we think.The book reveals where we can and cannot trust our intuitions and how we can tap into the benefits of 'slow thinking.' He discusses practical and enlightening insights into how choices are made in both our business and our personal lives... And he explains how we can use different techniques to guard against the mental glitches that often get us into trouble. In addition, he analyzes how the human brain makes irrational decisions and falls prey to various mental 'traps.' Finally, he looks at how the 'experienced happiness' of individuals is quite distinct from the 'remembered happiness' of people." (Paraphrased slightly by webmaster from publisher's blurb)

Quote = "Kahneman postulates two systems of thinking that operate simultaneously but often at odds: intuitive and deliberative, or fast and slow, respectively... He claims that 'fast judgments' dominate to a greater extent than we know and frequently operate to our disadvantage." (Paraphrased slightly by webmaster from publisher's blurb)

Quote = "A key discovery that overcame a situation that he calls 'theory induced blindness' was that outcomes are better defined by gains and losses than by sums of wealth. It refers mainly to 'fast-thinking' mistakes but can occur in 'slow thinking' when our assumptions are wrong or they simply interfere with seeing (perceiving) correctly... For example, the impact of overconfidence on corporate strategies, the difficulties of predicting what will make us happy in the future, the profound effect of cognitive biases on everything from playing the stock market to planning your next vacation can all be understood by knowing how the two systems of thought shape our judgments and decisions." (Paraphrased slightly by webmaster from publisher's summary)

green separator
BOOK OUTLINE
green separator

Note = Numbers in parentheses refer to pages

INTRODUCTION (3-15)

PART 1 — TWO SYSTEMS (17-105)

1) THE CHARACTERS OF THE STORY (19-30)

2) ATTENTION AND EFFORT (31-38)

3) THE LAZY CONTROLLER (39-49)

4) THE ASSOCIATIVE MACHINE (50-58)

5) COGNITIVE EASE (59-70)
    note = Causes of cognitive ease --- diagram (60)

    note = Illusions of remembering (60-61)

    note = Illusions of truth (61-62)

    note = How to write a persuasive message (62-64)
6) NORMS, SURPRISES, AND CAUSES (71-78)

7) A MACHINE FOR JUMPING TO CONCLUSIONS (79-88)

8) HOW JUDGMENTS HAPPEN (89-96)

9) ANSWERING AN EASIER QUESTION (97-105)
    note = Substituting one question for another can be a good strategy for solving difficult problems (98-99)

    note = The mood heuristic for happiness (101-103)

    note = The affect heuristic (103-105)
PART 2 — HEURISTICS AND BIASES (106-195)

10) THE LAW OF SMALL NUMBERS (109-118)

11) ANCHORS (119-128)
    note = Anchoring as adjustment (120-122)
12) THE SCIENCE OF AVAILABILITY (129-136)

13) AVAILABILITY, EMOTION, AND RISK (137-145)

14) TOM W'S SPECIALTY (146-155)
    note = To be useful, your beliefs should be constrained by the logic of probability (154)
15) LINDA — LESS IS MORE (156-165)

16) CAUSES TRUMP STATISTICS (166-174)

17) REGRESSION TO THE MEAN (175-184)
    note = Eureka story of training Israeli fliers and their trainer's mistake of attaching a causal interpretation to the inevitable fluctuations of a random process (175-176)

    note = Understanding regression via Galton's discovery in his 1886 publication (179-184)
18) TAMING INTUITIVE PREDICTIONS (185-195)

PART 3 — OVERCONFIDENCE (197-265)

19) THE ILLUSION OF UNDERSTANDING (199-208)

20) THE ILLUSION OF VALIDITY (209-221)

21) INTUITIONS VS. FORMULAS (222-233)
    note = Intuition adds value even in the justly derided selection interview, but only after a disciplined collection of objective information and disciplined scoring of separate traits (231-232)

    note = Suggestions about evaluators making judgments vs formulas (233)
22) EXPERT INTUITION: WHEN CAN WE TRUST IT? (234-244)

23) THE OUTSIDE VIEW (245-254)
    note = Mitigating the planning fallacy (251-254)
24) THE ENGINE OF CAPITALISM (255-265)

PART 4 — CHOICES (267-374)

25) BERNOULLI'S ERRORS (269-277)

26) PROSPECT THEORY (278-288)

27) THE ENDOWMENT EFFECT (289-299)

28) BAD EVENTS (300-309)

29) THE FOURFOLD PATTERN (310-321)

30) RARE EVENTS (322-333)

31) RISK POLICIES (334-341)

32) KEEPING SCORE (342-352)

33) REVERSALS (353-362)

34) FRAMES AND REALITY (363-374)
    note = [1] Emotional framing (364-367)

    note = [2] Good frames (371-374)
PART 5 — TWO SELVES (375-407)

35) TWO SELVES (377-385)
    note = The meaning of "utility" according to Bentham vs. "wantability" during the last 100 years (375)

    [1] Experienced utility (378)

    [2] Experience and memory (378-381)

    [3] Which self should count (381-384)

    [4] Biology vs. Rationality (384-385)

    [5] Speaking of two selves (385)
36) LIFE AS A STORY (386-390)
    [1] Amnesic vacations (388-390)

    [2] Speaking of life as a story (390)
37) EXPERIENCED WELL-BEING (391-397)
    [1] Experienced well-being (392-397)

    [2] Speaking of experienced well-being (397)
38) THINKING ABOUT LIFE (398-407)
    [1] The focusing illusion (402)

      note = "Nothing in life is as important as you think it is when you are thinking about it." (402)

      note = "How much pleasure do you get from your car?" (403)

    [2] Speaking of thinking about life (407)
CONCLUSIONS (408-418)
    [1] Two selves (408-415)

    [2] Econs and humans (411-415)

    [3] Two systems (415-418)
APPENDIX A — Judgment under uncertainty (419-432)

APPENDIX B — Choices, Values, and Frames (433-446)
    note = Concluding remarks (446)
NOTES (449-481)

ACKNOWLEDGMENTS (483)

INDEX (485-499)

green separator
AUTHOR NOTES, SUMMARY,
AND BOOK DESCRIPTION

green separator

AUTHOR NOTES = Daniel Kahneman received the Nobel Prize in Economic Sciences in 2002 for his pioneering work with Amos Tversky on decision making. He was born on March 5, 1934, in Tel Aviv, and his research focused on human judgment and decision making under uncertainty. The book was a major New York Times bestseller --- Winner of the National Academy of Sciences Best Book Award in 2012 --- Selected by the New York Times Book Review as one of the ten best books of 2011 --- One of The Globe and Mail's Best Books of the Year 2011 --- One of The Economist's 2011 Books of the Year --- One of The Wall Street Journal's Best Nonfiction Books of the Year 2011 --- 2013 Presidential Medal of Freedom Recipient --- Kahneman's work with Amos Tversky is the subject of Michael Lewis's The Undoing Project: A Friendship That Changed Our Minds.

SUMMARY = In this long but readable book, Kahneman presents provocative theories and groundbreaking research and, moreover, clearly explains both. He postulates two systems of thinking that operate simultaneously but are often at odds: intuitive and deliberative, or fast and slow, respectively. Fast judgments dominate to a greater extent than we know, and often to our disadvantage. A key discovery that overcame an effect Kahneman terms "theory induced blindness" (which refers mainly to fast-thinking mistakes but can occur in slow thinking when our assumptions are wrong or simply interfere with seeing) was that outcomes are better defined by gains and losses than by sums of wealth. The book is destined to be a classic.

BOOK DESCRIPTION = Kahneman, the renowned psychologist and winner of the 2002 Nobel Prize in Economics, takes us on a groundbreaking tour of the mind and explains the two systems that drive the way we think. System 1 is fast, intuitive, and emotional; System 2 is slower, more deliberative, and more logical. The impact of overconfidence on corporate strategies, the difficulties of predicting what will make us happy in the future, the profound effect of cognitive biases on everything from playing the stock market to planning our next vacation --- each of these can be understood only by knowing how the two systems shape our judgments and decisions.

Engaging the reader in a lively conversation about how we think, Kahneman reveals where we can and cannot trust our intuitions and how we can tap into the benefits of slow thinking. He offers practical and enlightening insights into how choices are made in both our business and our personal lives --- and how we can use different techniques to guard against the mental glitches that often get us into trouble. He analyzes how the human brain makes irrational decisions and falls prey to mental "traps," and he looks at how the "experienced happiness" of individuals is quite distinct from their "remembered happiness."

A key discovery that overcame an effect Kahneman terms "theory induced blindness" was that outcomes are better defined by gains and losses than by sums of wealth. It refers mainly to "fast-thinking" mistakes but can occur in "slow thinking" when our assumptions are wrong or simply interfere with our seeing (perception).

green separator
BOOK REVIEWS
green separator

LIBRARY JOURNAL REVIEW = Daniel Kahneman (psychology, emeritus, Princeton) won the 2002 Nobel Prize in Economics for his work with Amos Tversky on decision making. In this large, readable book, Kahneman presents provocative theories and groundbreaking research and, moreover, clearly explains both. He postulates two systems of thinking that operate simultaneously but often at odds: intuitive and deliberative, or fast and slow, respectively. Fast judgments dominate to a greater extent than we know and to our disadvantage. A key discovery that overcame an effect Kahneman terms "theory induced blindness" (which refers mainly to fast-thinking mistakes but can occur in slow thinking when our assumptions are wrong or simply interfere with seeing) was that outcomes are better defined by gains and losses than by sums of wealth. "Prospect theory," an idea Kahneman developed with Tversky, posits that, when all our options are bad, we tend to take riskier paths. With Kahneman's expert help, readers may understand this mix of psychology and economics better than most accountants, therapists, or elected representatives. VERDICT A stellar accomplishment, a book for everyone who likes to think and wants to do it better. -- E. James Lieberman, George Washington Univ. Sch. of Medicine, Washington, DC.

BOOKLIST REVIEW = Decision making tends to be intuitive rather than logical. Kahneman has dedicated his academic research to understanding why that is so. This work distills his and colleagues' findings about how we make up our minds and how much we can trust intuition. Clinical experiments on psychology's traditional guinea pigs --- college students --- abound and collectively batter confidence in System 1, as Kahneman calls intuition. All sorts of biases, sporting tags like the halo effect (i.e., unwarranted attribution of positive qualities to a thing or person one likes), bedevil accurate appraisal of reality. According to Kahneman, intuitive feelings often override System 2, or thinking that requires effort, such as simple arithmetic. Exemplifying his points in arenas as diverse as selecting military officers, speculating in stocks, hiring employees, and starting up businesses, Kahneman accords some reliability to intuitive choice, as long as the decision maker is aware of cognitive illusions (the study of which brought Kahneman the 2002 Nobel Prize in Economics). Kahneman's insights will most benefit those in leadership positions, yet they will also help the average reader to become a better car buyer. -- Gilbert Taylor

CHOICE REVIEW = This is an important book from an important scholar. Winner (along with his late colleague, Amos Tversky) of the 2002 Nobel Prize in economics --- for work on decision making --- Kahneman (emeritus, psychology, Princeton; public affairs, Princeton's Woodrow Wilson School of Public and International Affairs) practically invented the discipline of behavioral economics and more generally transformed the entire approach to the psychology of decision making. This book makes, among many other things, two major contributions. First, Kahneman provides a substantial review and synthesis of the body of research he did with Tversky. Second, he explicates, and organizes the work around, his recent model of ways of thinking: system 1 --- fast, intuitive, emotional; system 2 --- slower, more deliberative, logical. Kahneman explores the consequences of this distinction in a variety of domains. Summing Up: Essential. Upper-division undergraduates through faculty and professionals; general readers. -- R. Levine, California State University, Fresno

green separator
EXCERPT - CHAPTER 1
green separator

TO OBSERVE YOUR MIND
IN AUTOMATIC MODE


[Glance at the image of the angry woman's face in the book (figure 1); the photograph is not reproduced here.]


Your experience as you look at the woman's face seamlessly combines what we normally call seeing and intuitive thinking. As surely and quickly as you saw that the young woman's hair is dark, you knew she is angry. Furthermore, what you saw extended into the future. You sensed that this woman is about to say some very unkind words, probably in a loud and strident voice. A premonition of what she was going to do next came to mind automatically and effortlessly. You did not intend to assess her mood or to anticipate what she might do, and your reaction to the picture did not have the feel of something you did. It just happened to you. It was an instance of fast thinking.

Now look at the following problem:

17 × 24

You knew immediately that this is a multiplication problem, and probably knew that you could solve it, with paper and pencil, if not without. You also had some vague intuitive knowledge of the range of possible results. You would be quick to recognize that both 12,609 and 123 are implausible. Without spending some time on the problem, however, you would not be certain that the answer is not 568. A precise solution did not come to mind, and you felt that you could choose whether or not to engage in the computation. If you have not done so yet, you should attempt the multiplication problem now, completing at least part of it.

You experienced slow thinking as you proceeded through a sequence of steps. You first retrieved from memory the cognitive program for multiplication that you learned in school, then you implemented it. Carrying out the computation was a strain. You felt the burden of holding much material in memory, as you needed to keep track of where you were and of where you were going, while holding on to the intermediate result. The process was mental work: deliberate, effortful, and orderly --- a prototype of slow thinking. The computation was not only an event in your mind; your body was also involved. Your muscles tensed up, your blood pressure rose, and your heart rate increased. Someone looking closely at your eyes while you tackled this problem would have seen your pupils dilate. Your pupils contracted back to normal size as soon as you ended your work --- when you found the answer (which is 408, by the way) or when you gave up.
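
Note = One way to make the System 2 computation explicit (a worked decomposition added by the webmaster, not part of the book's text): 17 × 24 = (17 × 20) + (17 × 4) = 340 + 68 = 408.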

TWO SYSTEMS

Psychologists have been intensely interested for several decades in the two modes of thinking evoked by the picture of the angry woman and by the multiplication problem, and have offered many labels for them. I adopt terms originally proposed by the psychologists Keith Stanovich and Richard West, and will refer to two systems in the mind, System 1 and System 2.

System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration. The labels of System 1 and System 2 are widely used in psychology, but I go further than most in this book, which you can read as a psychodrama with two characters.

When we think of ourselves, we identify with System 2, the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do. Although System 2 believes itself to be where the action is, the automatic System 1 is the hero of the book. I describe System 1 as effortlessly originating impressions and feelings that are the main sources of the explicit beliefs and deliberate choices of System 2. The automatic operations of System 1 generate surprisingly complex patterns of ideas, but only the slower System 2 can construct thoughts in an orderly series of steps. I also describe circumstances in which System 2 takes over, overruling the freewheeling impulses and associations of System 1. You will be invited to think of the two systems as agents with their individual abilities, limitations, and functions.

In rough order of complexity, here are some examples of the automatic activities that are attributed to System 1:

EXAMPLES OF SYSTEM 1 THINKING
    [1] Detect that one object is more distant than another.

    [2] Orient to the source of a sudden sound.

    [3] Complete the phrase "bread and..."

    [4] Make a "disgust face" when shown a horrible picture.

    [5] Detect hostility in a voice.

    [6] Answer to 2 + 2 = ?

    [7] Read words on large billboards.

    [8] Drive a car on an empty road.

    [9] Find a strong move in chess (if you are a chess master).

    [10] Understand simple sentences.

    [11] Recognize that a "meek and tidy soul with a passion for detail" resembles an occupational stereotype.
All these mental events belong with the angry woman--they occur automatically and require little or no effort. The capabilities of System 1 include innate skills that we share with other animals. We are born prepared to perceive the world around us, recognize objects, orient attention, avoid losses, and fear spiders. Other mental activities become fast and automatic through prolonged practice. System 1 has learned associations between ideas (the capital of France?); it has also learned skills such as reading and understanding nuances of social situations. Some skills, such as finding strong chess moves, are acquired only by specialized experts. Others are widely shared. Detecting the similarity of a personality sketch to an occupational stereotype requires broad knowledge of the language and the culture, which most of us possess. The knowledge is stored in memory and accessed without intention and without effort.

Several of the mental actions in the list are completely involuntary. You cannot refrain from understanding simple sentences in your own language or from orienting to a loud unexpected sound, nor can you prevent yourself from knowing that 2 + 2 = 4 or from thinking of Paris when the capital of France is mentioned. Other activities, such as chewing, are susceptible to voluntary control but normally run on automatic pilot. The control of attention is shared by the two systems. Orienting to a loud sound is normally an involuntary operation of System 1, which immediately mobilizes the voluntary attention of System 2. You may be able to resist turning toward the source of a loud and offensive comment at a crowded party, but even if your head does not move, your attention is initially directed to it, at least for a while. However, attention can be moved away from an unwanted focus, primarily by focusing intently on another target.

The highly diverse operations of System 2 have one feature in common: they require attention and are disrupted when attention is drawn away. Here are some examples:

EXAMPLES OF SYSTEM 2 THINKING
    [1] Brace for the starter gun in a race.

    [2] Focus attention on the clowns in the circus.

    [3] Focus on the voice of a particular person in a crowded and noisy room.

    [4] Look for a woman with white hair.

    [5] Search memory to identify a surprising sound.

    [6] Maintain a faster walking speed than is natural for you.

    [7] Monitor the appropriateness of your behavior in a social situation.

    [8] Count the occurrences of the letter a in a page of text.

    [9] Tell someone your phone number.

    [10] Park in a narrow space (for most people except garage attendants).

    [11] Compare two washing machines for overall value.

    [12] Fill out a tax form.

    [13] Check the validity of a complex logical argument.
In all these situations you must pay attention, and you will perform less well, or not at all, if you are not ready or if your attention is directed inappropriately. System 2 has some ability to change the way System 1 works, by programming the normally automatic functions of attention and memory. When waiting for a relative at a busy train station, for example, you can set yourself at will to look for a white-haired woman or a bearded man, and thereby increase the likelihood of detecting your relative from a distance. You can set your memory to search for capital cities that start with N or for French existentialist novels. And when you rent a car at London's Heathrow Airport, the attendant will probably remind you that "we drive on the left side of the road over here." In all these cases, you are asked to do something that does not come naturally, and you will find that the consistent maintenance of a set requires continuous exertion of at least some effort.

The often used phrase "pay attention" is apt: you dispose of a limited budget of attention that you can allocate to activities, and if you try to go beyond your budget, you will fail. It is the mark of effortful activities that they interfere with each other, which is why it is difficult or impossible to conduct several at once. You could not compute the product of 17 × 24 while making a left turn into dense traffic, and you certainly should not try. You can do several things at once, but only if they are easy and undemanding. You are probably safe carrying on a conversation with a passenger while driving on an empty highway, and many parents have discovered, perhaps with some guilt, that they can read a story to a child while thinking of something else.

Everyone has some awareness of the limited capacity of attention, and our social behavior makes allowances for these limitations. When the driver of a car is overtaking a truck on a narrow road, for example, adult passengers quite sensibly stop talking. They know that distracting the driver is not a good idea, and they also suspect that he is temporarily deaf and will not hear what they say.

Intense focusing on a task can make people effectively blind, even to stimuli that normally attract attention. The most dramatic demonstration was offered by Christopher Chabris and Daniel Simons in their book The Invisible Gorilla. They constructed a short film of two teams passing basketballs, one team wearing white shirts, the other wearing black. The viewers of the film are instructed to count the number of passes made by the white team, ignoring the black players. This task is difficult and completely absorbing. Halfway through the video, a woman wearing a gorilla suit appears, crosses the court, thumps her chest, and moves on. The gorilla is in view for 9 seconds. Many thousands of people have seen the video, and about half of them do not notice anything unusual. It is the counting task --- and especially the instruction to ignore one of the teams --- that causes the blindness. No one who watches the video without that task would miss the gorilla. Seeing and orienting are automatic functions of System 1, but they depend on the allocation of some attention to the relevant stimulus. The authors note that the most remarkable observation of their study is that people find its results very surprising. Indeed, the viewers who fail to see the gorilla are initially sure that it was not there --- they cannot imagine missing such a striking event. The gorilla study illustrates two important facts about our minds: we can be blind to the obvious, and we are also blind to our blindness.

PLOT SYNOPSIS

The interaction of the two systems is a recurrent theme of the book, and a brief synopsis of the plot is in order. In the story I will tell, Systems 1 and 2 are both active whenever we are awake. System 1 runs automatically and System 2 is normally in a comfortable low-effort mode, in which only a fraction of its capacity is engaged. System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions, and feelings. If endorsed by System 2, impressions and intuitions turn into beliefs, and impulses turn into voluntary actions. When all goes smoothly, which is most of the time, System 2 adopts the suggestions of System 1 with little or no modification. You generally believe your impressions and act on your desires, and that is fine --- usually.

When System 1 runs into difficulty, it calls on System 2 to support more detailed and specific processing that may solve the problem of the moment. System 2 is mobilized when a question arises for which System 1 does not offer an answer, as probably happened to you when you encountered the multiplication problem 17 × 24. You can also feel a surge of conscious attention whenever you are surprised. System 2 is activated when an event is detected that violates the model of the world that System 1 maintains. In that world, lamps do not jump, cats do not bark, and gorillas do not cross basketball courts. The gorilla experiment demonstrates that some attention is needed for the surprising stimulus to be detected. Surprise then activates and orients your attention: you will stare, and you will search your memory for a story that makes sense of the surprising event. System 2 is also credited with the continuous monitoring of your own behavior --- the control that keeps you polite when you are angry, and alert when you are driving at night. System 2 is mobilized to increased effort when it detects an error about to be made. Remember a time when you almost blurted out an offensive remark and note how hard you worked to restore control. In summary, most of what you (your System 2) think and do originates in your System 1, but System 2 takes over when things get difficult, and it normally has the last word.

The division of labor between System 1 and System 2 is highly efficient: it minimizes effort and optimizes performance. The arrangement works well most of the time because System 1 is generally very good at what it does: its models of familiar situations are accurate, its short-term predictions are usually accurate as well, and its initial reactions to challenges are swift and generally appropriate. System 1 has biases, however, systematic errors that it is prone to make in specified circumstances. As we shall see, it sometimes answers easier questions than the one it was asked, and it has little understanding of logic and statistics. One further limitation of System 1 is that it cannot be turned off. If you are shown a word on the screen in a language you know, you will read it --- unless your attention is totally focused elsewhere.

CONFLICT

Figure 2 is a variant of a classic experiment that produces a conflict between the two systems. You should try the exercise before reading on.

You were almost certainly successful in saying the correct words in both tasks, and you surely discovered that some parts of each task were much easier than others. When you identified upper- and lowercase, the left-hand column was easy and the right-hand column caused you to slow down and perhaps to stammer or stumble. When you named the position of words, the left-hand column was difficult and the right-hand column was much easier.

These tasks engage System 2, because saying "upper/lower" or "right/left" is not what you routinely do when looking down a column of words. One of the things you did to set yourself for the task was to program your memory so that the relevant words (upper and lower for the first task) were "on the tip of your tongue." The prioritizing of the chosen words is effective and the mild temptation to read other words was fairly easy to resist when you went through the first column. But the second column was different, because it contained words for which you were set, and you could not ignore them. You were mostly able to respond correctly, but overcoming the competing response was a strain, and it slowed you down. You experienced a conflict between a task that you intended to carry out and an automatic response that interfered with it.

Conflict between an automatic reaction and an intention to control it is common in our lives. We are all familiar with the experience of trying not to stare at the oddly dressed couple at the neighboring table in a restaurant. We also know what it is like to force our attention on a boring book, when we constantly find ourselves returning to the point at which the reading lost its meaning. Where winters are hard, many drivers have memories of their car skidding out of control on the ice and of the struggle to follow well-rehearsed instructions that negate what they would naturally do: "Steer into the skid, and whatever you do, do not touch the brakes!" And every human being has had the experience of not telling someone to go to hell. One of the tasks of System 2 is to overcome the impulses of System 1. In other words, System 2 is in charge of self-control.

ILLUSIONS

To appreciate the autonomy of System 1, as well as the distinction between impressions and beliefs, take a good look at figure 3.

This picture is unremarkable: two horizontal lines of different lengths, with fins appended, pointing in different directions. The bottom line is obviously longer than the one above it. That is what we all see, and we naturally believe what we see. If you have already encountered this image, however, you recognize it as the famous Müller-Lyer illusion. As you can easily confirm by measuring them with a ruler, the horizontal lines are in fact identical in length.

Now that you have measured the lines, you--your System 2, the conscious being you call "I"--have a new belief: you know that the lines are equally long. If asked about their length, you will say what you know. But you still see the bottom line as longer. You have chosen to believe the measurement, but you cannot prevent System 1 from doing its thing; you cannot decide to see the lines as equal, although you know they are. To resist the illusion, there is only one thing you can do: you must learn to mistrust your impressions of the length of lines when fins are attached to them. To implement that rule, you must be able to recognize the illusory pattern and recall what you know about it. If you can do this, you will never again be fooled by the Müller-Lyer illusion. But you will still see one line as longer than the other.

Not all illusions are visual. There are illusions of thought, which we call cognitive illusions. As a graduate student, I attended some courses on the art and science of psychotherapy. During one of these lectures, our teacher imparted a morsel of clinical wisdom. This is what he told us: "You will from time to time meet a patient who shares a disturbing tale of multiple mistakes in his previous treatment. He has been seen by several clinicians, and all failed him. The patient can lucidly describe how his therapists misunderstood him, but he has quickly perceived that you are different. You share the same feeling, are convinced that you understand him, and will be able to help." At this point my teacher raised his voice as he said, "Do not even think of taking on this patient! Throw him out of the office! He is most likely a psychopath and you will not be able to help him."

Many years later I learned that the teacher had warned us against psychopathic charm, and the leading authority in the study of psychopathy confirmed that the teacher's advice was sound. The analogy to the Müller-Lyer illusion is close. What we were being taught was not how to feel about that patient. Our teacher took it for granted that the sympathy we would feel for the patient would not be under our control; it would arise from System 1. Furthermore, we were not being taught to be generally suspicious of our feelings about patients. We were told that a strong attraction to a patient with a repeated history of failed treatment is a danger sign--like the fins on the parallel lines. It is an illusion--a cognitive illusion --- and I (System 2) was taught how to recognize it and advised not to believe it or act on it.

The question that is most often asked about cognitive illusions is whether they can be overcome. The message of these examples is not encouraging. Because System 1 operates automatically and cannot be turned off at will, errors of intuitive thought are often difficult to prevent. Biases cannot always be avoided, because System 2 may have no clue to the error. Even when cues to likely errors are available, errors can be prevented only by the enhanced monitoring and effortful activity of System 2. As a way to live your life, however, continuous vigilance is not necessarily good, and it is certainly impractical. Constantly questioning our own thinking would be impossibly tedious, and System 2 is much too slow and inefficient to serve as a substitute for System 1 in making routine decisions. The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high. The premise of this book is that it is easier to recognize other people's mistakes than our own.

USEFUL FICTIONS

You have been invited to think of the two systems as agents within the mind, with their individual personalities, abilities, and limitations. I will often use sentences in which the systems are the subjects, such as, "System 2 calculates products."

The use of such language is considered a sin in the professional circles in which I travel, because it seems to explain the thoughts and actions of a person by the thoughts and actions of little people inside the person's head. Grammatically the sentence about System 2 is similar to "The butler steals the petty cash." My colleagues would point out that the butler's action actually explains the disappearance of the cash, and they rightly question whether the sentence about System 2 explains how products are calculated. My answer is that the brief active sentence that attributes calculation to System 2 is intended as a description, not an explanation. It is meaningful only because of what you already know about System 2. It is shorthand for the following: "Mental arithmetic is a voluntary activity that requires effort, should not be performed while making a left turn, and is associated with dilated pupils and an accelerated heart rate."

Similarly, the statement that "highway driving under routine conditions is left to System 1" means that steering the car around a bend is automatic and almost effortless. It also implies that an experienced driver can drive on an empty highway while conducting a conversation. Finally, "System 2 prevented James from reacting foolishly to the insult" means that James would have been more aggressive in his response if his capacity for effortful control had been disrupted (for example, if he had been drunk).

System 1 and System 2 are so central to the story I tell in this book that I must make it absolutely clear that they are fictitious characters. Systems 1 and 2 are not systems in the standard sense of entities with interacting aspects or parts. And there is no one part of the brain that either of the systems would call home. You may well ask: What is the point of introducing fictitious characters with ugly names into a serious book? The answer is that the characters are useful because of some quirks of our minds, yours and mine.

A sentence is understood more easily if it describes what an agent (System 2) does than if it describes what something is, what properties it has. In other words, "System 2" is a better subject for a sentence than "mental arithmetic." The mind --- especially System 1 --- appears to have a special aptitude for the construction and interpretation of stories about active agents, who have personalities, habits, and abilities. You quickly formed a bad opinion of the thieving butler, you expect more bad behavior from him, and you will remember him for a while. This is also my hope for the language of systems.

Why call them System 1 and System 2 rather than the more descriptive "automatic system" and "effortful system"? The reason is simple: "Automatic system" takes longer to say than "System 1" and therefore takes more space in your working memory. This matters, because anything that occupies your working memory reduces your ability to think. You should treat "System 1" and "System 2" as nicknames, like Bob and Joe, identifying characters that you will get to know over the course of this book. The fictitious systems make it easier for me to think about judgment and choice, and will make it easier for you to understand what I say.

SPEAKING OF SYSTEM 1 AND SYSTEM 2:

"He had an impression, but some of his impressions are illusions. This was a pure System 1 response."

"She reacted to the threat before she recognized it. This is your System 1 talking. Slow down and let your System 2 take control."

green separator


green separator
produced by
Infinite Interactive Ideas®