OPINION | 7 August 2019

New frontiers: re-establishing System 1/System 2 truths


In the latest of their new frontiers series, The Behavioural Architect’s Crawford Hollingworth and Liz Barker look at some of the myths that now surround Daniel Kahneman’s System 1 and System 2 thinking.

Arguably the most famous theory in behavioural science was popularised by Nobel Laureate Daniel Kahneman and describes the process of ‘thinking fast and slow’, otherwise known as System 1 and System 2 thinking. This model has been widely adopted due to its simplicity and intuitive nature. Nowadays, even if you don’t know anything about behavioural science, you’ve probably heard of Kahneman and would recognise the phrase System 1 and 2.

In this article, we want to re-establish the foundations that underpin our behavioural science knowledge. We cannot explore new frontiers on unstable footing, and so we need to investigate how misperceptions about this theory have arisen over the years.

System 1 & 2 – a refresh

For centuries, philosophers, psychologists, and scientists alike have distinguished between intuitive and conscious reasoning; from Descartes’ mind-body dualism in the 17th century to Posner and Snyder’s formal depiction of the (first) dual process model of the mind in 1975. However, it was not until Daniel Kahneman included the terms system 1 and system 2 in his 2011 bestselling book Thinking, Fast and Slow that the distinction between automatic and deliberate thought processes became popularised.

  • System 1 “is the brain’s fast, automatic, intuitive approach”[1]. System 1 activity includes the innate mental activities that we are born with, such as a preparedness to perceive the world around us, recognise objects, orient attention, avoid losses – and fear spiders!
  • System 2 is “the mind’s slower, analytical mode, where reason dominates”[2]. Usually, system 2 is activated when we do something that does not come naturally and requires some sort of conscious mental exertion.

The theory was intended to be a helpful analogy to guide our understanding of how our minds process information – and it does an admirable job of this.

The dual-system theory has travelled from the world of academia into popular language and mainstream thinking.

On the one hand, it is a welcome bridge across the often-criticised gap between academia and the ‘real world’. On the other, however, in its transition from academia to pop culture, the original theory of system 1 and 2 seems to have lost some of its depth, nuance and detail. Central to this misunderstanding is the idea that system 1 and system 2 are literal representations of our brain structure. Additionally, many of the more popular ‘sound-bites’ from the book have been reproduced and disseminated without the context and constraints provided when they are read in situ.

As behavioural science progresses, it is important to be wary of the myths that have arisen.

Three key facts counter the myths that have emerged in popular media:

  1. The brain is not literally divided into two
  2. System 1 and 2 work in tandem, not as separate entities
  3. Both systems can be biased and can make mistakes – neither one is categorically ‘good’ or ‘bad’.

FACT 1: The brain is not literally divided into two

Just as the common myth that people are either right or left brained has been proved false, we also know there aren’t actually sections of the brain with system 1 or system 2 stamped on them. Kahneman clearly states that “there is no one part of the brain that either of the systems would call home”[3].

The idea of left-brain and right-brain thinking is persistent, and many people continue to believe that the left side is responsible for analytical thinking, while the right side is more creative. It is easy to understand why system 1 and system 2 type thinking have been mistakenly associated with this idea. System 2’s rational, logical thinking is analogous with the ‘left brain’ and similarly system 1 thinking seems easily associated with the idea of an intuitive, artistic right brain.

These ideas are fundamentally incorrect, however.

All this being said, neuroscientists have found that some regions of the brain are slightly more associated with one of the two systems[4].

For example, this body of evidence indicates that affective cognition (system 1-type thinking for emotional responses) is associated with the mesolimbic dopamine reward system. This pathway is responsible for the release of dopamine. Given that human beings tend to seek instant gratification, dopamine plays a key role in “thinking fast”[5]. On the other hand, the frontal and parietal cortices have been linked to the analytic system of decision-making (system 2), so these regions are more associated with our complex reasoning and higher-order “slow” thinking.

The separation of brain functions for decision-making and perceived specialisation has given rise to the multiple systems hypothesis. However, it is the combination of information gathered from the multiple systems – mesolimbic pathway, frontal cortex, and parietal cortex – that helps to produce our decisions. In other words, while different regions may be more or less relevant for either system 1 or system 2, no region is dedicated solely to system 1-type decision-making, nor any solely to system 2.

FACT 2: System 1 and 2 work in tandem, not as separate entities

Another myth or common misconception is that system 1 and 2 are hierarchical processes with one occurring before the other. People often think system 1 thinking occurs first, with system 2 thinking following. Kahneman points out that almost all processes are a mix of both systems, and it is important to emphasise that the systems are complementary.

Importantly, unconscious processes such as emotion (system 1) play a vital role in our more logical reasoning (system 2), and it is this integrative approach that makes our decision-making meaningful, and often more effective and purposeful[6].

Ellen Peters and her colleagues conducted an experiment in which they gave participants tasks that required processing numbers. Unsurprisingly, participants who had high levels of numeracy outperformed those who were less numerate. Numeracy has previously been linked to an improved ability to use system 2 reasoning effectively.

However, they also found that participants with greater numeracy skills were able to use system 1 reasoning more frequently and reliably. Importantly, they also found that over time, the consistent and effective use of system 2 reasoning calibrates system 1 processing, making it more effective, which in turn promotes better systematic (system 2) reasoning, essentially creating a feedback loop.

Outside of an experimental setting, everyday tasks provide further evidence for the teamwork of systems 1 and 2. Language is one example: we communicate deliberately, but during the flow of conversation we don’t tend to rehearse grammatical rules – they are applied without conscious thought. Physical activity is another. Recent research suggests that exercise is partly habit-driven, yet also requires conscious oversight to be successfully completed[7].

FACT 3: Both systems can be biased and can make mistakes

A myth that has developed around systems 1 and 2 is that system 1 is the source of bias, and system 2 is called up as the ‘voice of reason’ to correct biases in our thinking.

Both are actually susceptible to bias and both can make mistakes. For example, system 1 may have gathered accurate information, yet system 2 may process this poorly and make a mistake. Conversely, system 1 may have gathered biased information and so despite system 2 processing it accurately, the conclusion may be incorrect due to a biased starting point.

Confirmation bias is a good example of how both systems can be affected by bias: we may notice and more easily remember information that supports our existing beliefs (a system 1 activity), while also being motivated to analyse new information in ways that support those beliefs (a system 2 activity).

In the medical field, for example, it was long thought that diagnostic errors were caused by system 1-type reasoning, and clinicians were advised to think more slowly and gather as much information as possible. However, later reviews found that experts were just as likely to make errors when attempting to be systematic and analytical. Research by behavioural scientists such as Gerd Gigerenzer has shown that more information and slower processing do not always lead to the most accurate answer. Diagnosing patients and making treatment decisions using mental shortcuts and evidence-based rules of thumb can perform just as well, if not better. This discovery led to the creation of ‘fast and frugal’ decision trees for patient diagnosis, where doctors needed to ask only three crucial diagnostic questions. When tested, this method improved accurate diagnosis of heart disease by 15-25%[8].
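The logic of a fast-and-frugal tree fits in a few lines of code. The following is a minimal sketch: the question wording and variable names are illustrative assumptions loosely modelled on the Green & Mehr coronary-care tree, not the researchers’ exact protocol:

```python
def admit_to_coronary_care(st_segment_changes: bool,
                           chest_pain_is_chief_complaint: bool,
                           any_other_risk_factor: bool) -> bool:
    """A fast-and-frugal decision tree: three yes/no questions asked in a
    fixed order, each of which can trigger an immediate 'exit' decision,
    so no evidence needs to be weighed or combined."""
    if st_segment_changes:                 # Q1: exit -> admit immediately
        return True
    if not chest_pain_is_chief_complaint:  # Q2: exit -> regular nursing bed
        return False
    return any_other_risk_factor           # Q3: the final cue decides
```

The speed comes from stopping at the first question that settles the case: a patient with no ST changes and no chest pain is triaged after only two questions.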

Researchers decided to test the accuracy of various heuristics (a system 1 activity) across a number of real-world situations and compare this accuracy against more complex decision-making strategies as the benchmark[9].

One of the tasks in their experiment was to predict which of two cities (Los Angeles or Chicago) had a higher rate of homelessness based on some basic initial data points provided. They compared the accuracy of predictions using three common heuristics (take-the-best, tallying and minimalist[10]) with two baseline complex predictive strategies (linear regression and naive Bayes[11]) and found that when faced with limited initial data, heuristic strategies actually outperformed complex strategies[12].
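Two of these heuristics are simple enough to sketch directly. The 0/1 cue lists below are hypothetical values for illustration, not the study’s actual city data; take-the-best assumes the cues are already sorted by validity:

```python
def take_the_best(cues_a, cues_b):
    """Compare the two alternatives cue by cue, in descending order of
    validity; the first cue that discriminates decides, and all the
    remaining cues are ignored."""
    for a, b in zip(cues_a, cues_b):
        if a != b:
            return "A" if a > b else "B"
    return None  # no cue discriminates: guess


def tallying(cues_a, cues_b):
    """Ignore cue validity: give every cue equal weight and pick the
    alternative with the larger count of positive cues."""
    diff = sum(cues_a) - sum(cues_b)
    if diff == 0:
        return None  # tie: guess
    return "A" if diff > 0 else "B"
```

With, say, `cues_a = [1, 0, 1]` and `cues_b = [1, 1, 0]`, take-the-best stops at the second cue and picks B, while tallying counts two positive cues each and declares a tie – the two heuristics can disagree on the same data.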

Kahneman explains that “system 1 is not a machine for making errors, it usually functions beautifully”[13].

While these myths possess considerable intuitive appeal, it would be a shame – and more importantly, damaging to the field – if their simplistic descriptions drowned out the more fascinating story of how our brains really work. The theory of system 1 and system 2 is incredibly useful as a way to understand the complexities of human decision-making.

By The Behavioural Architects’ Crawford Hollingworth and Liz Barker


[1] The Harvard Gazette (2014). Layers of choice. Retrieved from https://news.harvard.edu/gazette/story/2014/02/layers-of-choice/

[2] The Harvard Gazette (2014). Layers of choice. Retrieved from https://news.harvard.edu/gazette/story/2014/02/layers-of-choice/

[3] Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux. p. 29.

[4] Camerer, C., Loewenstein, G., & Prelec, D. (2005). Neuroeconomics: How Neuroscience Can Inform Economics. Journal of Economic Literature, 43(1), 9-64.

[5] TheEconReview. (2017). What Neuroscience Has to Say about Decision-Making. Retrieved from https://theeconreview.com/2017/01/13/what-neuroscience-has-to-say-about-decision-making/

[6] Peters, E., Västfjäll, D., Slovic, P., Mertz, C. K., Mazzocco, K., & Dickert, S. (2006). Numeracy and decision making. Psychological Science, 17(5), 407-413.

[7] Gardner, B., & Rebar, A. L. (2019). Habit Formation and Behaviour Change. In Oxford Research Encyclopaedia of Psychology; Rhodes, R. E., & Rebar, A. L. (2018). Physical activity habit: Complexities and controversies. In The Psychology of Habit, 91-109. New York: Springer.

[8] Green, L., & Mehr, D. R. (1997). What alters physicians’ decisions to admit to the coronary care unit? The Journal of Family Practice, 45(3), 219-226.

[9] Katsikopoulos, K. V., Schooler, L. J., & Hertwig, R. (2010). The robust beauty of ordinary information. Psychological Review, 117, 1259-1266.

[10] The take-the-best heuristic assumes that cues are processed in order of validity: it compares the two alternatives on a single cue at a time until a cue is found that distinguishes between them. Tallying is a heuristic that simply tallies cues for or against each alternative. The minimalist heuristic assesses alternatives against randomly ordered cues, stopping when one alternative has a positive cue and the other does not. (All cues receive the same weight in the tallying and minimalist heuristics.)

[11] A linear regression predicts a statistical relationship between cues with a linear functional form; naive Bayes selects the alternative with the higher probability of having the higher criterion value, given the alternatives’ entire cue profile.

[12] Hertwig, R., & Pachur, T. (2015). Heuristics, history of. In International Encyclopaedia of the Social & Behavioural Sciences. Elsevier.

[13] New Scientist. (2018). We've got thinking all wrong. This is how your mind really works. Retrieved from https://www.newscientist.com/article/mg24032040-300-weve-got-thinking-all-wrong-this-is-how-your-mind-really-works/