Daniel Kahneman is the most respected and influential psychologist in America, if not the world. Who else in psychology has been awarded a Nobel Prize in economics while achieving major breakthroughs in the fields of cognition, decision theory, memory studies, life narratives, and happiness? Now Kahneman has produced a landmark book that deserves all the praise it is getting. In five hundred densely packed pages he offers an overview and synthesis of decades of the fascinating research that has been his life’s work.
Readers making their way through the thirty-eight succinct chapters of Thinking, Fast and Slow will enjoy not only a crash course in cognitive psychology, but also an intriguing and even entertaining story. Kahneman’s research began in the 1950s when, as a young Israeli army psychologist, he was assigned the task of assessing the leadership potential of recruits. The assessment team worked hard, conducting interviews and behavioral exercises to arrive at their predictions. Yet follow-up studies of the recruits’ actual performance revealed that the team’s predictive judgments were often faulty. What puzzled Kahneman was that his fellow assessors were reluctant to change their methods even in the face of statistical proof that those methods were not working. Why were highly intelligent individuals so resistant to rational thinking?
From this beginning, Kahneman embarked on years of research, seeking a better understanding of human thinking. His work, he explains in these pages, took off after he “had the luck” to meet and work with Amos Tversky, a fellow Israeli academic trained in mathematics and decision theory. Kahneman’s phrasing is important, because luck and the role of random chance are recurring themes in his conceptual scheme. In his view, most people underestimate or ignore the role chance plays in events, clinging instead to causal explanations—even faulty ones—and granting too much effect to their own actions. Working together to understand these patterns of cognition, Kahneman and Tversky discovered a host of irrational biases, heuristic rules of thumb, cognitive illusions, and systematic errors built into human thinking—even the thinking of trained experts like themselves. Their innovative research findings produced the oft-cited papers that led to the 2002 Nobel Prize.
Kahneman enlivens his hefty book with participatory exercises, concrete examples, and personal comments, all intended to help show how the mind works. As his title proclaims, his decades of research with Tversky and other collaborators have led him to identify two kinds of thinking, which he introduces here as fictional players in a psychodrama. System 1 functions fast and automatically, the product of eons of evolutionary selection—a kind of survival machine that remains on whenever a human organism is awake. Working with effortless, marvelous power, System 1 generates a ceaseless stream of assessments, intentions, emotions, associations, norms, causal explanations, and belief confirmations. It is also insensitive to ambiguity, sometimes substitutes an easier question for a harder one, exaggerates rare probabilities, employs stereotypes, and exhibits a negativity bias, provoking stronger reactions to losses than to gains. Most crucially, it operates by what Kahneman calls WYSIATI, or What You See Is All There Is. When information crucial to a correct solution or decision is not immediately available, System 1 confidently ignores its absence. It constructs a coherent causal story by jumping to conclusions from what is given—then sails on, ignoring other possible explanations.
By contrast, System 2 thinks logically, statistically, and contextually. This system can follow rules, calculate, deliberately compare objects, choose, make decisions, and—most important—control and correct System 1. Yet because it is an innately “lazy controller,” operating by the law of least effort, it can allow biases, illusions, and systematic errors to go unexamined. Kahneman views laziness as deeply built into human beings; and since deliberate slow thinking requires effort and energy, the errors produced by the fast intuitive responses of System 1 are endorsed by System 2 and become the self’s conscious beliefs, attitudes, and intentions. The problem is compounded by the fact that many people lack the educational skills to assess the reliability of, say, what they hear or read in the media. Indeed, this year’s political debates would seem to provide a dismal demonstration of the power of biased System 1 thinking.
Though familiarity with Kahneman has until now been limited largely to specialists, his research on the systematic errors of fast thinking has already made its way into our cultural mainstream. Many Americans will recognize how advertising manipulates “framing effects,” “affect biases,” or “availability biases” to influence consumer choices and decisions. For instance, when we answer a question or make a decision, we tend to use the most readily available information in our minds, the information that surfaces first. As Malcolm Gladwell’s hugely popular book Blink attested, a feeling of effortless confidence accompanies instantaneous intuitive responses. Kahneman grants the power of intuitions stemming from training and experience but also cautions against fast thinking as a ubiquitous source of error. It takes energy to question or resist a coherent, plausible story constructed by System 1. Far easier to follow, for instance, an affect bias, whereby we leap to the conclusion that, if we like somebody or something, then he or it is the best choice.
The anatomy of cognition offered up in this magisterial book reveals with humbling clarity what flawed and faulty thinkers we are. Human beings in Kahneman’s view are brilliantly designed to think intuitively in causal, metaphorical, and associative ways, but not well prepared to think in the slow, controlled, rational, and statistical ways that can help correct fast thinking. Yes, fast thinking is essential to survival—and to living creatively and pleasurably—but it can also be biased, narrow, self-involved, and dangerously prone to error. To lead a balanced life, individuals need to appreciate the creativity of System 1 but, when stakes are high, be ready to slow down and apply System 2’s controlled logical thinking. This means that more education to develop logical thinking is in order. One can be blind to one’s own blindness, Kahneman knows; his goal is to give us the tools to think better about how we think. And, in the interim, to help us lead more informed conversations around the office water cooler.
Thinking, Fast and Slow presents an enormous range of research—a feast impossible to do justice to in a review. Still, there are other subjects I would love to hear Kahneman engage. Perhaps in his next book he could discuss how he reconciles his views on the role of luck with his advocacy of reason and the scientific method. Though he comments here and there on how benighted scientists, economists, academics, and expert decision-makers can be, he still champions the value of logic and mathematics. But where do mathematics and logic originate? Are they also part of the evolutionary package? And while the moral implications of fast and slow thinking seem obvious, Kahneman mostly skirts the “ought” questions—other than “Don’t be stupid.” He recounts his dismay when his students persist in adhering to irrational conclusions despite being confronted with contrary evidence. In one class a student, accused of being irrational and illogical, replies by shouting back, “So what?” Good question. I’d like to hear Kahneman discuss how he moves from his belief in reason to the other values he holds as a person.
Moral lessons may seem like a stretch in response to the rigorously scientific work described in this book. But for me Kahneman’s research on cognition strengthens the traditional claim that reason is central to acquiring moral virtues. Describing his work processes, Kahneman recounts how he and his colleagues might struggle for years to solve a problem, grappling with a question, turning it this way and that, testing and retesting. Slowly a solution would emerge—and then, ironically, the results could seem perfectly simple and obvious. To my mind this process of creative mulling resembles what moralists call discernment. And are there no moral implications to the fact that the slow, controlled rational thinking of System 2 provides a broader and more universal perspective on decision-making than that put forth by System 1’s survival-oriented operations? On the contrary. Recognizing that there is more to know in an uncertain world can further humility and openness to corrective change; reasoned thinking can push back against pride, overconfidence, and the biased judgments of others. Clearly, one moral imperative readers can take away from Kahneman’s excellent book is the importance of avoiding the trap of self-deception. My wanting something to be true, after all, doesn’t make it so.
The novelist Iris Murdoch claimed that the effort to pay attention and see reality more accurately is essential to moral growth—and, by way of illustration, she told the story of how a mother-in-law overcame her initial rejection of her “hopeless” daughter-in-law by a slow process of positive attention and reasoned reflection. Overriding irrational instant aversions may be the major moral role that reason can play. For that matter, recognizing that realities exist beyond what you see is a first step in religious awareness. Another step, perhaps especially for Christians, is embracing the truth that we are fallible; indeed, as Kahneman reminds us, we are built to err.
Reason and logic exert a mysterious overriding force upon the mind’s assent to truth. Thinking, Fast and Slow lets us revise an ancient truth with an ironic addendum. While the heart may have reasons which reason cannot know, it’s often best to take reason for your guide.