Daniel Kahneman is a psychologist who admits to knowing little about economics. Still, his work with Amos Tversky on the psychology of judgment and decision-making won him a Nobel Prize in Economic Sciences in 2002. This goes to show that exciting research is often to be found across rather than within disciplines.

Thinking, Fast and Slow (Penguin Books 2012) summarizes Kahneman’s research and its implications for everyday life and public policy. If you like solving simple puzzles, taking small gambles or testing your judgment capacity, then you will like the book and also learn a lot from it—about how you think, or rather, about how you often fail to think.

Kahneman’s main argument is that our thinking brain is organized in two systems. System 1 is fast because it relies on accumulated knowledge codified in our neural networks over the course of evolution. The problem is that much of this codified knowledge is not always appropriate to contemporary living; as a result, System 1 is often wrong. System 2, by contrast, is both reflective and critical, able to ponder what it knows and also what it does not know, but it is lazy.

The interplay between the two systems explains a number of widespread and recurring decision errors that have been observed in psychological experiments. Here is a selection:

When thinking fast …

• we are more likely to accept as true that which is easier to understand or more emotionally charged;
• we ignore base-rates, use stereotypes instead and see causality when there is none;
• we make decisions based on confirming evidence alone (confirmation bias);
• we either dismiss small risks or take them too seriously;
• we rewrite our memories so as to have always been right (hindsight bias);
• we judge people on first impressions alone (halo effect);
• we often fail to take an ‘outside’ view and/or use comparative information;
• we fight losing fights (loss aversion and sunk-cost fallacy);
• we think in terms of losses and gains rather than outcomes;
• we discount time and duration, focusing on peak and end experiences.

Most of the above heuristics and biases are amusing to observe in the lab. On paper, as above, they appear harmless. Yet they are frequently the source of serious errors of judgment like the following:

• We believe media reports that package their message in simple, populist language, without questioning the source;
• We judge people differently according to their religion, color, ethnicity or sexual orientation;
• We give rare risks, such as terrorism, a great deal of attention, albeit only for as long as they dominate public debate;
• We evaluate students according to whether we like them or depending on how close we think they come to confirming our expectations;
• We invest lots of money in infrastructure projects without serious planning;
• We remain in relationships long after these have ceased to be fulfilling;
• We have trouble accepting defeat and therefore fight wars too long at too high cost;
• We suppress positive memories relating to experiences that ended badly.

When relying on their ‘fast’ brains, experts are as prone to the above errors as non-experts. The exceptions are those who have studied a field long enough and well enough to have internalized data and information as knowledge. Nonetheless, Kahneman argues, when judging the ability of experts to make valid forecasts, one should also take into account the regularity of the environment they are observing. Forecasts about society, polity and the economy are difficult precisely because these are the areas that are least regular.

A weakness of Thinking, Fast and Slow concerns naming. ‘Fast’ and ‘slow’ are good descriptive terms for the two systems of our brain that Kahneman is discussing. However, in an attempt to generalize his findings into a more encompassing theory of the mind, he goes a step further and draws parallels with other classificatory schemes such as intuitive vs. rational thinking, ‘Humans’ vs. ‘Econs,’ or the ‘experiencing self’ vs. the ‘remembering self.’

In doing so, he himself falls victim to confirmation bias, favoring those research findings that confirm his views while granting little attention to experiments that point in another direction, or simply to alternative interpretations.

For instance, there is much more to ‘intuitive’ thinking than its being, perhaps, fast. Research on memory and life stories is still in its infancy, considering that what is remembered often changes with time. The cultural variation in judgments is only now beginning to be adequately addressed, which is also one reason why behavioral economics is proving less successful than anticipated. The role of emotions is not limited to providing a frame for a story. And finally, there is the unconscious, a space (and term) that Kahneman smartly avoids throughout his book.

Figuring out how our brains work will probably take longer than figuring out how to survive on the moon. In any case, the research by Kahneman and his associates definitely represents one important building block.
