Before we start, try a little experiment. You’re going to need some walking space, nothing more. Start walking, get a nice steady pace going. Now, try to find the answer to this sum: “2+2=?” Did you manage it? Unless you’re the kind of mouth-breather who has trouble manoeuvring a knife and fork at the same time, it should have posed no problem. Now, get your stroll on again. Try this one on for size: “23×42=?” Those of you with exceptional gifts for mental arithmetic may have managed it; for the rest of us mere mortals it caused at least a broken stride, and probably a dead stop. In fact, unbeknownst to us, our heart rates increased and our pupils noticeably dilated.
In Thinking, Fast and Slow, Nobel laureate Daniel Kahneman gives us a guided tour of two distinct mental systems we constantly and unconsciously use for processing different kinds of data. System 1 gives us an up-to-date picture of our surroundings, dealing with easy problems and reacting at lightning speed – you might be tempted to call it ‘instinct’. System 2, on the other hand, handles the trickier situations and requires a great deal more energy, hence the discomfort caused by the second calculation. It’s more aligned with what we’d call our logical and rational selves, as opposed to the more emotional character of System 1. Though in daily life we are aware of neither system, nor of the switching between them, they guide our every action and, as Kahneman takes great relish in demonstrating, lead us into all manner of cognitive biases.
Kahneman orders his book into a series of discussions on the various heuristics, or mental shortcuts, which System 1 takes in order to keep us comfortable. It turns out that we always prefer to be lazy, no matter how diligent we profess to be. System 1 will always take whatever shortcuts it can to minimise effort, response times and anxiety. (And don’t worry if it sounds like we have two autonomous homunculi controlling our lives; it’s just easy shorthand to help us understand our mental operations.)
One of the easiest to demonstrate is the availability heuristic. A grisly yet compelling example of this heuristic at work is the United States immediately following the 9/11 attacks (taken from Dan Gardner’s wonderful Risk: The Science And Politics Of Fear). The media during that period was pushing one transport-related story only – planes are bad. The images of the burning towers were everywhere you looked. Everyone knows the death toll, the 3,000 who lost their lives that day. Few are aware that in the following year over half that number again died on America’s roads as a direct result of choosing to drive instead of fly.
In the terminology of Thinking, Fast and Slow, this was an error of System 1. It was asked the question, “Should I fly or drive to this meeting?” but instead of passing the work on to meticulous but labour-intensive System 2, it substituted another, easier question: “What information about flying is close to hand?” The answer was obvious, yet the information to hand failed to mention that flying remained a far safer option than driving. The results of 1,600 such errors speak for themselves.
Throughout the rest of the book Kahneman details several other such heuristics which lead us down alleyways of faulty logic. The framing heuristic describes how a slight and logically insignificant change in the wording of a statement or question can produce massive differences in our reactions to it. Would you prefer a medical treatment which gives you a 90% chance of living, or one which gives you a 10% chance of dying? Depends which system is at work. There is also loss aversion, a remarkably powerful heuristic whose operation results in the stunning and repeatedly demonstrated fact that optimists are healthier, happier and live longer than pessimists. (Unfortunately, reading this made me even more of a pessimist. Thanks, Daniel.) By combining studies of loss aversion, overconfidence and our tendency to simplify the world around us, Thinking, Fast and Slow manages to unite all manner of previously inexplicable human behaviour under one relatively simple explanation.
Don’t worry, though, it’s not all bad news. Kahneman is at pains to point out that for the vast majority of our lives Systems 1 and 2 perform their assigned tasks immaculately. Our continued existence is testament to their efficacy. The main point he wishes to drive home is that with some effort we can become more aware of their failings. With this in mind we can hope to reduce our own susceptibility to cognitive biases, although as a disclaimer he does note that it is far, far easier to notice these shortcomings in others and hence this book is aimed more at ‘gossips and critics’. Despite this warning, anyone interested in skepticism, critical thinking and psychology should devour this book and start spreading the word.
Then start annoying their friends with calls of “Ha, your System 1 is a jerk!”.