Toolbox: System 1 and System 2 Thinking

“Toolbox” articles delve into a new way of looking at values, with a view to using these techniques in future articles.

Philosophers have emphasized the key role of rationality in human decision-making for millennia. To Aristotle, reason was humanity’s defining characteristic, the one faculty that separates people from beasts. The central importance of reason was further developed during the Enlightenment through the works of Mill, Smith, and Kant.

However, it was in the last century that the cult of rationality reached its extreme in the form of homo economicus, or “the rational man,” the cold, calculating, self-interested figure that stalks economics textbooks across the world. When faced with an important decision, homo economicus feels nothing. Like a supercomputer, they calculate the expected utility of every possible outcome, apply a discount rate to future gains (probably pegged to the interest rate on a U.S. Treasury bill), and select the option that maximizes their overall utility.

No doubt, the model of homo economicus has led to impressive developments in the social sciences, from the influential writings of Milton Friedman to the power of game theory. Even so, it should always be remembered that homo economicus is a model, and like all models, it’s wrong (although it can be useful). From Blink to Nudge, best-sellers keep showing that homo economicus doesn’t exist. Our decision-making is often driven by emotions and heuristics (i.e., mental rules of thumb that let people sidestep the arduous process of reasoning) rather than by rationality.

One way of understanding this phenomenon was popularized by Nobel laureate Daniel Kahneman, most famously in his book Thinking, Fast and Slow (his ideas are summed up in this article for Scientific American). Kahneman outlines two forms of thinking: System 1 is quick and instinctive, used for tasks like solving a simple math problem (e.g. 2 + 2 = ?) or driving a car on an empty road. System 2, its slow and rational cousin, handles more deliberate thought, such as calculating the product of 24 and 17 or parking in a tight space.

These two systems are separate, but related. Under the homo economicus model, there is only System 2, and it is infallible. However, here in the real world, System 1 is the star of the show. The vast majority of our decisions are made with System 1, mostly because System 2 takes a lot of work to use. Thinking rationally and deliberately is exhausting. No one can do it 100% of the time (or even 10% of the time). And most of the time it isn’t even necessary; normal people don’t reach for a pen and paper when they need to calculate 2 + 2, right? As a result, people lean heavily on System 1, which leads to all sorts of systematic errors in reasoning.

However, sometimes the methodical processes of System 2 can override the rapid decisions of System 1 and correct errors. To demonstrate, let’s draw out a System 1 error through the famous Monty Hall problem:

Suppose you’re on a game show, and you’re given the choice of three doors: Behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what’s behind the doors, opens another door, say No. 3, which has a goat. He then says to you, “Do you want to pick door No. 2?” Is it to your advantage to switch your choice?

Source: Wikipedia

The answer is yes; you always want to switch your choice. In fact, changing your door doubles your chance of winning, from 33% to 66%.

If you haven’t been exposed to this problem before, your System 1 is probably objecting to this answer. “Once the host shows you the goat, there are two doors left,” your gut screams. “It’s a 50/50 chance! It doesn’t matter if you change the door!” And you wouldn’t be alone. 87% of people who are exposed to the problem stay with the door they have selected, which is the suboptimal answer. Even famous mathematicians get this problem wrong!

Let’s try to activate your System 2 thinking and figure out this brain teaser. Here are the three stages of the problem:

1) You select one of three doors.

Your chance of being correct is 33%. So far, so good. You probably didn’t need to think much about this, so your System 1 is working fine.

2) The host opens one of the two remaining doors, and there is a goat behind it.

Here’s where System 1 makes an error. This step does not change the odds to 50/50, even though there are now only two doors left. Keep in mind that the host will never show you the door with the car behind it because that would ruin the game. If you were right with your first pick, then the host has two goats that he could show you, and he selects one at random. However, if you were originally wrong (remember, this is 66% likely), then he only has one door that he could open at this stage. The other door has the car.

3) Now you have the option to switch.

If you were right at stage 1 (33% chance), you should stay with your choice. But if you were wrong (66% chance), then you should change your selection. Since you don’t know if you were right or wrong at stage 1, it’s best to play the odds. On balance, your chances double when you change doors.
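If you’d like to check that case-by-case logic for yourself, here is a minimal Python sketch of my own (the door numbering and the assumption that you always pick door No. 1 first are just illustrative choices); it simply enumerates the three equally likely places the car could be:

```python
# Enumerate the three equally likely car positions. You always pick door 1;
# the host then opens a door that is neither your pick nor the car.
doors = [1, 2, 3]
stay_wins = 0
switch_wins = 0

for car in doors:
    pick = 1  # your initial choice at stage 1
    # Stage 2: the host reveals a goat (never your door, never the car).
    host_opens = next(d for d in doors if d != pick and d != car)
    # Stage 3: switching means taking the one remaining closed door.
    switched = next(d for d in doors if d != pick and d != host_opens)
    stay_wins += (pick == car)
    switch_wins += (switched == car)

print(f"Staying wins in {stay_wins} of 3 cases; switching wins in {switch_wins} of 3")
# Staying wins in 1 of 3 cases; switching wins in 2 of 3
```

(When your first pick is right, the host could open either goat door; the sketch just takes the first one, which doesn’t change the tally.)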


The System 1 error becomes clearer if we consider the game on a larger scale (say, 100 doors with 99 goats and one car). You select a door at random (say, door 56), and one by one, the host opens 98 doors, showing goat after goat. Eventually, there are only two doors left: the one you selected (door 56) and another door (say, door 14). Do you switch? To help you decide, ask yourself this question: why didn’t the host open door 14? There are two possible reasons. Either your original pick was right and the host left door 14 closed at random (1% chance), or door 14 is the one hiding the car (99% chance).
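If you’d rather see those odds fall out of an actual run than take my arithmetic on faith, here is a rough Monte Carlo sketch of that 100-door version (the helper name and the trial count are my own illustrative choices):

```python
import random

def play(n_doors=100, switch=True):
    """One round: the host opens every door except your pick and one other,
    never revealing the car. Returns True if you win."""
    car = random.randrange(n_doors)
    pick = random.randrange(n_doors)
    if not switch:
        return pick == car
    # The other closed door is the car if you missed it (n-1 times out of n),
    # or a random goat door if your first pick happened to be right.
    if pick == car:
        other = random.choice([d for d in range(n_doors) if d != pick])
    else:
        other = car
    return other == car  # switching means taking the other closed door

trials = 100_000
switch_rate = sum(play(switch=True) for _ in range(trials)) / trials
stay_rate = sum(play(switch=False) for _ in range(trials)) / trials
print(f"Switching wins about {switch_rate:.1%}; staying wins about {stay_rate:.1%}")
# Typically prints something close to 99.0% and 1.0%
```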

I hope by now your System 2 has overridden your System 1, and you understand the mechanics of this problem. If you still don’t, that’s the fault of my explanation. Thankfully, smarter people than me have tackled it using equations, tree diagrams, and computer models. The result is the same: switching your door means you win 66% of the time. Trust me on this one.
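For the equation-minded, one way to write it down is with Bayes’ theorem. This is only a sketch, assuming the standard rules: you pick door No. 1, the host opens door No. 3, and he chooses randomly between the two goat doors whenever your first pick was right:

```latex
% H_3 = "the host opens door 3" after you pick door 1; each door starts at 1/3.
% P(H_3 | car at 1) = 1/2,  P(H_3 | car at 2) = 1,  P(H_3 | car at 3) = 0.
P(\text{car at 2} \mid H_3)
  = \frac{P(H_3 \mid \text{car at 2})\, P(\text{car at 2})}
         {\sum_{i=1}^{3} P(H_3 \mid \text{car at } i)\, P(\text{car at } i)}
  = \frac{1 \cdot \tfrac{1}{3}}
         {\tfrac{1}{2} \cdot \tfrac{1}{3} + 1 \cdot \tfrac{1}{3} + 0 \cdot \tfrac{1}{3}}
  = \frac{2}{3}.
```

Sticking with door No. 1 wins the remaining 1/3 of the time, which is exactly the 33% vs. 66% split described above.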

But even if you do get it now, do you see how hard that was? If you had already been exposed to this problem, do you remember how confused you were the first time you tried to figure it out? It’s not a pleasant experience when System 1 and System 2 thinking collide; that discomfort is called cognitive dissonance. I’d further note that this is a relatively straightforward example; there is a right answer, and it has been proven with the gold standard of System 2 thinking: mathematics.

Bringing the discussion back to values: my basic argument is that our value judgements have all the hallmarks of System 1 thinking: emotional, reactive, and illogical. No matter how much we’d like to believe that our values are rooted in System 2 thinking, they’re probably not. I’ll delve into this issue next week!
