OPINION | 19 April 2016

BE Bites: How to de-bias decision making


Knowing that we are prone to making biased decisions doesn't always help us to avoid doing so, says Crawford Hollingworth. But there are strategies to keep us on the straight and narrow.


Behavioural scientists and psychologists such as Nobel prize winner Daniel Kahneman acknowledge that merely knowing we can make biased, irrational decisions does not necessarily de-bias our thinking and decision-making. It is more effective to put carefully designed frameworks and strategies in place that channel our thinking towards rational, well-considered decisions and away from narrow thinking marred by bias.

Decision-making biases often occur when our emotions and personal motivations come into play and influence our preferences. For example, groups often suffer from what Cass Sunstein calls ‘happy talk’ – when group members insist that all is going well, is likely to go even better, and that there is nothing to worry about. Teams endeavour to make sure ‘no boats are rocked’, which breeds a culture of overconfidence.

Another common bias is normalcy bias: assuming something won’t happen in the future simply because it has not happened before. Or selecting interview candidates ‘because we like them’, letting our emotions guide our choice rather than using a more structured framework.

And we often try to solve problems with the same solution we have always used, sticking to the status quo and failing to think laterally and creatively to come up with a far more effective solution.

To tackle some of these challenges, Jack Soll and John Payne, professors at Duke University, and Katherine Milkman, assistant professor at the Wharton School of Business, have recently compiled ‘A User’s Guide to Debiasing’.

One of the common behavioural biases they identify is how narrowly we often think about the future, planning for only one forecast, scenario or outcome. We typically hate uncertainty, and the continuous ‘what ifs?’ can expend considerable mental energy. So we prefer to preserve our mental resources: we let ourselves be a little lazy, perhaps overconfident, assured that a specific (often desired) outcome is certain to happen, which means we can stick to imagining and planning for only one future.

Soll, Payne and Milkman then go on to consider four different strategies to widen our thinking and tackle this ‘only one future’ outlook:

  1. Make three estimates or forecasts for the future: make a low, a medium and a high estimate for the forecast or outcome. Don’t just state a range – we tend to give a narrower range when we do that, whereas we give wider estimates when we think about the low and high values separately. As a guide, the low and high estimates should be unlikely, but still possible.
  2. Forecast twice: make one forecast, then assume it was wrong and guess again (without anchoring to the first!). Then take an average of the two – the statistical logic behind this is sketched just after this list. Research shows that when we think about a problem a second time, we consider it from a different perspective and may reconsider. If you can ‘sleep on it’ before making your second guess, even better.
  3. Conduct a premortem: postmortems try to understand the cause of a past failure; ‘premortems’ imagine potential failures and try to explain their likely causes, but in a less critical way than a traditional devil’s advocate strategy might. The premortem helps to reduce optimism bias because it makes risks more salient and vivid. Daniel Kahneman suggests that “doing a premortem on a plan that is about to be adopted won’t cause it to be abandoned. But it will probably be tweaked in ways that everybody will recognise as beneficial. So the premortem is a low-cost, high-payoff kind of thing.”
  4. Take an outsider’s view: imagine you are observing the decision you face from the outside. What would someone on the outside advise, or think was likely? Then consider the fate of other, similar ventures – how did they fare? This helps to reduce the planning fallacy – our tendency to underestimate how long something will take – because we can be guided by the experience of similar projects.
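
For readers who want to see why averaging two guesses works, here is a toy simulation. It is a minimal sketch in Python, not code from Soll and colleagues’ paper: the ‘true’ value of 100, the error level and the assumption that the two guesses are independent are all illustrative.

    import random

    # Toy Monte Carlo: compare one noisy forecast with the average of two.
    # Illustrative assumptions (not from the paper): the quantity being
    # forecast is 100.0 and each guess is unbiased with normally
    # distributed error.
    TRUE_VALUE = 100.0
    TRIALS = 100_000

    def noisy_guess():
        """One imperfect forecast: right on average, but imprecise."""
        return random.gauss(TRUE_VALUE, 15.0)

    single_error = 0.0
    averaged_error = 0.0
    for _ in range(TRIALS):
        first, second = noisy_guess(), noisy_guess()  # a fresh second look
        single_error += abs(first - TRUE_VALUE)
        averaged_error += abs((first + second) / 2.0 - TRUE_VALUE)

    print(f"Mean error of a single guess:    {single_error / TRIALS:.2f}")
    print(f"Mean error of the averaged pair: {averaged_error / TRIALS:.2f}")

Run it and the averaged pair comes out roughly 30% more accurate: the error of an average of two independent, unbiased guesses shrinks by a factor of about 1.4. In reality a second guess tends to anchor on the first, making the two correlated and the gain smaller – which is exactly why the strategy warns against anchoring and suggests sleeping on it.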

Governments, intelligence agencies and even some companies have long recognised these decision-making flaws and put counter-measures in place, such as red teaming – designating groups or individuals whose job it is to challenge and counter the prevailing viewpoint. But there are many more strategies and frameworks that organisations can use systematically, and they can be incredibly valuable, even to the brightest of people.

As Jack Soll and his colleagues note: “Even the smartest people exhibit biases in their judgements and choices. It’s foolhardy to think we can overcome them through sheer will. But we can anticipate and outsmart them by nudging ourselves in the right direction when it’s time to make a call.”

One well-known application of taking an outsider’s view comes from Intel, the microprocessor company. In 1985, its core business was still the one it was founded on – manufacturing memory chips – but it was struggling to compete with Japanese manufacturers.

Andy Grove, then Intel’s president (and later its chief executive and chairman), recalls: “I looked out of the window at the Ferris wheel of the Great America amusement park revolving in the distance, then I turned back to Gordon [Moore, then the company’s chief executive] and I asked: ‘If we got kicked out and the company brought in a new CEO, what do you think he would do?’ Gordon answered without hesitation: ‘He would get us out of memories [memory chips].’ I stared at him, numb, then said: ‘Why shouldn’t you and I walk out the door, come back and do it ourselves?’”

Crawford Hollingworth is founder of The Behavioural Architects.

References:

Soll, J.B., Milkman, K.L. and Payne, J.W. (2015) ‘A User’s Guide to Debiasing’, in The Wiley Blackwell Handbook of Judgment and Decision Making. Wiley.

Grove, A. (1996) Only the Paranoid Survive. Doubleday.
