Further reflections on the limitations of intuitive thinking
For most of us, our intuitions serve us well in everyday life. Intuition allows us to make quick decisions, form impressions and recognise intentions without engaging our deeper cognitive reasoning. Think about how we suddenly become aware of potentially dangerous situations, work out simple maths equations, sense someone else's emotional state, repeat common tasks with ease or understand non-verbal communication and gestures. Put another way, intuition allows us to arrive at an answer to a problem or situation without really thinking about it. These examples highlight how we use our intuitive, experiential knowledge to engage with the world around us. And while we need this fast processing to deliver quick decisions and responses, our intuitions have limitations that can lead to poor or even wrong decisions.
Most of the time our intuition and automatic responses serve us well. But when they fail, they can fail dramatically. Our intuitions are based on recognition of previous problem patterns and experiences, as well as on socio-cultural values and biases. We can question our intuitions and assumptions with our slower-acting cognitive processes, but we are sometimes prone to believing we have already reached the correct decision (albeit an intuitive one).
Intuition, Cognition and their shortcomings
In his book ‘Thinking, Fast and Slow’, the psychologist Daniel Kahneman described two systems that handle our thinking processes - System 1, our automatic and fast intuitive system, and the slower System 2, which we use for cognitive, effortful activities and statistical problems (Kahneman, 2011). System 1 is ideal for decision-making that does not require any meaningful cognitive effort. For example, moving out of the way of a falling item, adding up 2 + 2, walking up stairs, or tying our shoelaces. None of these tasks requires us to invest a great deal of mental effort. Intuition comes into play with the easy choices we make every day.
In contrast, System 2 is a slower system that handles our logical thinking and self-control. This system requires an investment of mental effort to reach a decision or outcome. As a slower cognitive system, it sometimes fails to overcome the speed and availability of the heuristic-based System 1. Kahneman describes it as our 'lazy' system, one that requires thoughtfulness and effort on our part to engage. So why is this a problem? Our intuition can sometimes be based on incorrect assumptions, mental fixedness and availability biases, and can be influenced by our emotions or irrational reasoning. We are also prone to using the 'mental ease' and availability of System 1, and so avoid the 'mental strain' of engaging System 2.
Emotions and irrational reasoning can also have a major influence on our System 1. Professor of Psychology and Behavioural Economics Dan Ariely has published a number of books about human irrationality. His research and supporting experiments have shown how we can be swayed by our emotions - even when we think we are being completely rational. We are prone to making poor decisions driven by irrational emotions, which prevent us from considering better alternatives.
We are also prone to priming and to being over-confident in our intuitive decisions. Priming is a human phenomenon whereby we can be influenced by different types of suggestion. Research has discovered that priming can occur directly (e.g. someone suggesting a word or image) as well as indirectly (e.g. something occurs within our environment or immediate surroundings, but does not register directly with our cognitive system). Both Ariely and Kahneman found that we are more open to priming than we may be aware of.
Thoughts, images, words, gestures and even our emotional state can prime our decision making - and the effects can be lasting, influencing our decision-making and problem-solving patterns far into the future (Ariely, 2010). We assume ‘I did X when this happened before, so I will do X again’. We repeat the same outcome in similar future situations, and this response pattern then gets incorporated into our cognitive values and belief system. And all based on an initial priming.
In another study, on mood, people primed for good moods were observed to rely more on System 1: they made more intuitive decisions, were more creative, operated with greater mental ease, and were more likely to make logical errors. People primed for bad moods relied more on System 2: they were more suspicious and vigilant, and applied a higher level of analytical thinking.
Sometimes over-confidence in our intuition can also lead to poor or detrimental outcomes. We can be less mindful of other possible solutions to problems (see my blog on mental and functional fixedness) because we are absolutely convinced of the correctness of our decisions. We fail to test our assumptions and can become entrenched in believing in our decisions. What is also surprising is how our emotions and biases affect what we accept as a rational thinking process.
Jumping to conclusions is efficient if the conclusions are likely to be correct and the costs of an occasional mistake acceptable, and if the jump saves much time and effort. Jumping to conclusions is risky when the situation is unfamiliar, the stakes are high, and there is no time to collect more information. These are the circumstances in which errors are probable, which may be prevented by a deliberate intervention of System 2. (Kahneman, 2011)
We can counter this behaviour by checking our decisions with System 2. But this requires us to invest meaningful mental effort - the very 'mental strain' that System 1, which always prefers to operate with mental ease, seeks to avoid. Engaging System 2 means investing effort in thinking through the problems, decisions and implications. It requires us to switch attention from System 1 to System 2, and that switch is something our over-confidence can block from occurring.
Implications for design and the user
I wrote my initial article on the shortcomings of an Intuitive UI approach when designing user interfaces for end-users. There are a number of areas we need to be mindful of as designers. Firstly, we need to test our assumptions about the implications of an Intuitive UI approach. Are we assuming everyone will simply understand the user interface? Are we over-confident that all tasks and flows will have successful outcomes? If unchecked, we may be adding frustration and confusion for our customers and end-users. One initial problem could be the assumption of an innate, universal language or imagery. Jesse Prinz wrote about a number of international studies that tested exactly these issues - innate understanding of language, gestures, imagery, behaviours and values across different cultures. The results highlighted a high degree of variance in the way different cultures think about logic, reasoning, social norms, images and values. And just as importantly, these cultural differences are not fixed but extremely fluid (Prinz, 2012).
Designing to think slowly
Understanding the contexts and intent of what our customers and end-users are trying to achieve is always a good starting point. Applying an approach that includes unambiguous language, simple ‘learned behaviours’ and user-testing of our design decisions will ensure better outcomes for everyone. Framing a task or context for the end-user, as well as including contextual feedback, will also help. The benefits of user-testing to counter our assumptions and biases cannot be overstated. Assuming everyone will simply ‘get’ an intuitive user interface means some of our end-users will misinterpret icons, labels, links, tasks and interactions. I have personally encountered this in user-testing sessions, where patterns we assumed would be easy to use actually led to confusion for some customers. The reason may have been that those customers' intuitive decisions were based on their own previous experiences, as well as on incorrect assumptions.
The choices we make as designers can greatly affect the usability and user experience for our customers. If we are mindful of the variances in the way individuals think, behave and arrive at solutions, then we can deliver more meaningful user experiences and appropriate user interfaces.
D. Kahneman (2011). Thinking, Fast and Slow. Penguin Books, London.
D. Ariely (2010). The Upside of Irrationality: The Unexpected Benefits of Defying Logic at Work and at Home.
J. Prinz (2012). Beyond Human Nature: How Culture and Experience Shape Our Lives. Penguin Books, London.