Here’s a riddle. Solve it quickly with the first answer that comes into your head, and then scroll down.
If a plane crashes exactly on the border between the US and Canada, where are the survivors buried?
The answer might seem obvious – or at least apparently so – but it's easy to miss the key facts, and the question draws a surprising variety of answers.
‘They go back to their home countries’ or ‘Half and half’ have come up for me when I’ve asked this.
But the answer of course is that they’re not buried anywhere: they are survivors!
If you missed the key facts here, don't worry – you're not alone. This kind of slip was studied by Daniel Kahneman and Amos Tversky, and it's an example of 'System 1' thinking as described in Kahneman's book 'Thinking, Fast and Slow'.
In brief, Kahneman highlights two types of thinking that we all do:
System 1: automatic, instinctive, fast. 'What's your name?', 'What's 2 + 2?', complete the title: 'Romeo & ….'. This is also the home of our cognitive biases – the patterns, cues and categories of thought that let us decide or interpret quickly.
System 2: slow, deliberate and effortful. 'Count the blue cars', 'What's 88 x 7?', trying to remember where you left your keys, or doing something for the first time.
We know that our brains have a limited energy budget, and you will have experienced how tiring a cognitively demanding, System 2 day can be – an advanced workshop, say. It follows that our brains will use quick, low-energy System 1 responses wherever possible, and in the majority of cases this is exactly what we want: we can think and act fast.
But when we really need to listen, and especially if the situation is complex, System 1’s speed and pattern/category matching can send us in the wrong direction and cause issues, for example:
- First impressions at interviews – 'I liked the candidate from the handshake' (halo effect: people with good handshakes seem effective, and they had a good handshake, so they must be effective too)
- Assuming cause/effect – “The network blipped last week too so it must be that” (availability heuristic: recent memories holding greater influence)
- Blaming the ‘usual suspects’ – “Everyone’s had problems with that supplier, it must be them again!” (bandwagon effect: believing something that many others do)
- Underestimating costs or time – 'I'll have that done by Friday' (planning fallacy: underestimating how long a task will take, even with experience of similar tasks)
- Automatically supporting a fact presented by a senior colleague (anchoring bias: your own opinion is pulled towards their statement)
And so on. And so what?
The reason I wanted to share this article was to introduce or remind people about the concept. There are many complex situations in our organisations that require more intentional and System 2-based responses, and being aware should give you and your teams the opportunity to build triggers, practices and fail-safes when needed.
You know that moment when you’re walking with someone and need to stop to answer a question? That’s a shift to System 2, and that’s what I mean.
Switching to System 2
Here are four areas, with example interventions, where you might encourage System 2 thinking in your teams:
Troubleshooting

As a simplification, let's assume there are two types of problems: those where you are certain of the cause, and all the others. With the first type, go ahead – put out the fire, replace the component, turn the power back on – there's no great benefit from thinking deeply.
But for the second type, or if your 'certain' fix didn't work, you need to slow down and check the facts and the assumptions you may be making. In my experience, 'keep trying things until we fix it' is never better than some thought and intentional action, and you can promote that behaviour with a few good practices.
I strongly recommend Kepner-Tregoe's (KT) approaches here – you can save significant time in finding causes (or meaningful next actions), and the framework also speeds things up by making it much easier for different teams to collaborate.
Check out www.kepner-tregoe.com and specifically the Microsoft case study for more info here.
Decision making

As with troubleshooting, we can split decisions into two categories. If you've already made up your mind that you need that red Porsche, a System 2 approach isn't going to do anything more than justify your decision.
However, when you've got multiple options to choose from, or there are significant consequences to think about (time, materials, money, reputation and so on), encouraging people into a System 2 mindset can be valuable.
There is a field of ‘decision science’ that you can look to for inspiration. Most systems (including KT again) attempt to separate the needs (what a decision should achieve) from the options. Doing these discretely helps to prevent biases or preferences for options or vendors influencing the criteria for the decision.
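That separation of needs from options can be sketched as a simple weighted-criteria matrix: agree the criteria and their weights first, before looking at any option, then score each option against them. The criteria, weights, vendors and ratings below are all hypothetical, purely to illustrate the shape of the exercise.

```python
# Hypothetical weighted-criteria sketch: the needs (criteria and weights)
# are fixed before any option is examined, so a preferred vendor can't
# quietly reshape the decision criteria.
criteria = {"cost": 0.5, "reliability": 0.3, "support": 0.2}  # weights sum to 1

def score(ratings: dict[str, float]) -> float:
    """Weighted sum of 1-10 ratings against the pre-agreed criteria."""
    return sum(criteria[c] * ratings[c] for c in criteria)

# Options are scored only after the criteria are locked in
options = {
    "Vendor A": {"cost": 8, "reliability": 6, "support": 7},
    "Vendor B": {"cost": 5, "reliability": 9, "support": 8},
}
best = max(options, key=lambda name: score(options[name]))
```

The point isn't the arithmetic – it's that the group commits to what 'good' means before System 1 preferences for a familiar option can anchor the discussion.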
Hiring

As a branch of decision making, hiring people for your team or organisation is one of the most important activities of any leader. Get it right – find people who bring new ideas, approaches or experience – and there's a massive upside. Get it wrong, and at the very least you'll waste time.
And research tells us that we make up our mind about an interview candidate within the first 20 seconds of conversation, so there’s a big risk of System 1 decisions and biases here. First impressions are more powerful than you might imagine, and that can extend to candidate selection in the first place, like CV/Resume reviews.
Like decision making you can look to decision science, and there are also good practices out there that can help. Look at Google’s approaches to hiring (Work Rules by Laszlo Bock, e.g. the practice of getting multiple people to speak with a candidate) and the Tech Talent Charter for a free toolkit with many links and ideas that can help with hiring and attracting diverse candidates in an unbiased, System 2 way.
Check out https://www.techtalentcharter.co.uk/ and specifically the toolkit available free from this page: https://www.techtalentcharter.co.uk/toolkit
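The multiple-interviewer practice mentioned above can be sketched very simply: each interviewer records a score against the same rubric before seeing anyone else's, and the panel works from the aggregate. The interviewer names and scores here are hypothetical.

```python
# A minimal sketch of aggregating independent interview scores.
# Each interviewer rates the candidate 1-4 against a shared rubric,
# without seeing the others' scores, so one strong first impression
# (a System 1 reaction) cannot anchor the whole panel.
from statistics import mean

def panel_score(scores: dict[str, float]) -> float:
    """Average of independently recorded interviewer scores."""
    return mean(scores.values())

scores = {"interviewer_1": 3.0, "interviewer_2": 2.5, "interviewer_3": 3.5}
print(panel_score(scores))  # 3.0
```

Independence is the key design choice: averaging only de-biases the result if the scores were formed without cross-contamination.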
Estimating

How good are we at estimating how long a piece of work will take? In my experience, pretty poor. There are so many complex elements to predicting a task that estimates are invariably wrong: misunderstanding what the task is, assuming nothing will change or come up, a desire to please (and demonstrate how competent and effective you are), or a desire not to displease (sand-bagging a task to buy yourself time), and so on.
So what to do? Agile teams – recognising that this is a problem – have been using intentionally System 2 practices in their planning for some time, for example 'planning poker'. Simply put, a team estimates tasks together using a system that prevents anchoring (everyone reveals their estimate simultaneously) and uses relative sizing to promote conversation and a more accurate assessment.
Check out the planning poker cards on the Agile Business Consortium for more information (or search ‘planning poker’ for more): https://www.agilebusiness.org/store/ViewProduct.aspx?id=13685520
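The reveal step described above can be sketched in a few lines: estimates are collected before any is shown, identical cards mean consensus, and any spread triggers a conversation rather than a silent average. The team members and estimates here are hypothetical.

```python
# A minimal sketch of the planning-poker reveal step.
# Cards use the common modified-Fibonacci deck.
DECK = [1, 2, 3, 5, 8, 13, 20, 40]

def reveal(estimates: dict[str, int]) -> dict:
    """All estimates are collected before any is shown, preventing anchoring."""
    values = list(estimates.values())
    spread = max(values) - min(values)
    return {
        "estimates": estimates,    # revealed simultaneously
        "consensus": spread == 0,  # identical cards: accept the estimate
        "discuss": spread > 0,     # divergence: outliers explain, then re-vote
    }

# Hypothetical round for one task: the outlier prompts a discussion,
# which is exactly the System 2 conversation the practice is designed for.
round1 = reveal({"Ana": 3, "Ben": 3, "Chloe": 13})
print(round1["discuss"])  # True
```

Note that the divergence itself is the valuable output – Chloe may know something about the task that Ana and Ben don't.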
You will find many more examples no doubt, and overall I hope this has inspired you to look for instances where it would pay dividends for you to slow down. Less haste, more speed really does matter.
Ask your teams to think about this concept too – especially in retrospectives and wash-ups. Ask specific questions and probe to find whether a thinking pattern gave you the result you didn't want.