
It’s all in the mind…

Way back in 1970, Roger Hargreaves’ 6-year-old son Adam asked his father a question. It was a question that only a child could ask. Adam said, ‘Daddy, what does a tickle look like?’

In response, Hargreaves, a cartoonist, drew a picture of a round orange blob with a face and long rubbery arms. It became the main character in his first book, Mr Tickle. At first he had difficulty finding a publisher willing to take on the work, but eventually the book was published and went on to be the start of the Mr Men series of books, which have sold over 90 million copies and are favourites of children all over the world.

Hargreaves' great creations came about because he listened to what, on first hearing, sounds like a very silly question – one which Mr Silly might have asked. But silly questions challenge conventional ideas and prompt lateral thinking. They use the power of imagination to great effect, and so can help managers, and particularly risk managers, think the unthinkable.

Whilst our brains are capable of such moments of true creation, they can also work against us at times. Sticking with children for a moment: we all know that playgrounds need to be dangerous enough to be challenging, but in the pursuit of total safety they are often made ‘too safe’, and therefore unused and unloved. In such circumstances there’s a real risk that children grow up without any real feel for risk, and so expose themselves to even greater threats in adulthood. People, including children (who are people too, after all, despite appearances at times), don’t seek to minimise risk; they seek to optimise it. They drive and swim and fight and love and play so that they can achieve what they desire, but they push themselves a little at the same time too, so that they continue to develop. Thus, if things are made too safe, people (including children) start to figure out ways to make them dangerous again. We all prefer to live ‘on the edge’: there, we can be confident in our experience and also confront whatever unknowns we might face in the future. We’re hardwired to enjoy risk. Overprotected, we will fail when something dangerous, unexpected and full of opportunity appears, as it inevitably will.

So what about adults? We are all wary of sharks – they’re threatening-looking creatures with big teeth, and disaster movies tell us they’re scary. But did you know that coconuts are reputed to kill more people every year than sharks – around 150 versus 5 – thanks to their habit of falling from tall trees onto unsuspecting passers-by? This is an example of a heuristic – a mental shortcut that helps us make decisions and judgements quickly. In this case it happens because our brains are hard-wired to help us avoid dangerous situations. The trouble is, this hard-wiring is deeply inherited and dates back to our ancestors in the jungles. It leads to the so-called ‘fight or flight’ response. Daniel Kahneman, in his book ‘Thinking, Fast and Slow’ (Penguin Books, 2012), drawing on decades of work with Amos Tversky, describes this as System 1 thinking – it’s our sub-conscious or so-called ‘crocodile’ brain at work.

It’s not, however, the full picture. Kahneman goes on to describe System 2 thinking as well – this is what happens when we stop and apply logic to a situation. The trouble is, System 1 is reputedly around 200 times more powerful than System 2, so all of us easily lapse into System 1 thinking, even when we know it’s wrong. Ever driven somewhere and arrived without being conscious of the journey? That’s System 1 at work. Here’s another example. A bat and ball together cost £1.10, and the bat costs £1 more than the ball. So, how much is the ball? Think quickly, using System 1, and you’ll probably say 10p; stop and think for a bit, engaging System 2, and it’s obvious it’s 5p. This aspect of how our brains work can fool us all into making incorrect decisions at work – and that affects risk. The effect is worse when we’re tired, stressed or under pressure. As risk managers, we can help our organisations make better decisions by deliberately taking a step back when key decisions are being made. Never make a key decision right at the end of the working week, for instance: come next week, you’ll regret it.
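
If you want to see the System 2 arithmetic spelled out, here’s a minimal sketch (in Python, purely for illustration) that simply checks every possible ball price in pence:

```python
# Minimal sketch: brute-force check of the bat-and-ball puzzle (prices in pence).
for ball in range(0, 111):
    bat = ball + 100            # the bat costs £1 more than the ball
    if ball + bat == 110:       # together they cost £1.10
        print(f"ball = {ball}p, bat = {bat}p")  # prints: ball = 5p, bat = 105p
```

The intuitive 10p answer fails the check: a 10p ball means a £1.10 bat, and £1.20 in total.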

Working in teams raises other heuristic issues that can impact on our response to risk. Whilst not an exhaustive list, these include:

  • Affinity (‘like me’) bias – we tend to trust people like ourselves more than we trust others. Think of your closest colleagues, then discount all of those of the same gender, then the same race and finally those with the same work or social background. Anyone left? If not, seriously consider taking a more diverse approach to future hires.
  • Stereotype bias – we often see this at play when we recruit someone because we had a positive experience with someone quite similar previously – or, of course, the opposite. We’re then surprised when they end up performing totally differently.
  • Anchoring – this is where we rely too much on the first piece of information obtained when making a decision. An expectation of a high level of injuries at work, for instance, can make us feel that a reduction is good, whereas what’s really needed is elimination of all injuries.
  • Cognitive dissonance – where our actions don’t follow what we believe: smoking even though we know it’s bad for us, for instance. At work, the Challenger space shuttle disaster was a classic example. Engineers had warned that the cold weather could cause the O-ring seals to fail, yet the launch still went ahead.
  • Group think – where we change our view or decision to ‘fit in’ with the crowd. This is often at the root of many an industrial accident or company failure. For instance, the failure of the western world’s banking system ten years ago could be explained by group think in the banking sector, coupled with affinity bias.
  • Availability bias – where we overestimate the importance of the data we have, over and above the data we still need to source.
  • Confirmation bias – where we decide an answer then go on to prove it’s the right one, discounting all other options.
  • Gambler’s fallacy – believing that future probabilities are altered by past events, when in fact they are unchanged (the short simulation after this list illustrates the point).
  • Ostrich effect – avoiding negative information by pretending it doesn’t exist.
  • Risk compensation – taking bigger risks when perceived safety increases, and being more careful when perceived risk increases. Studies suggest, for instance, that we cycle faster when wearing a helmet.
  • Authority bias – giving too much credence to the wearing of a uniform, for instance. This is one reason airline pilots wear uniforms: so that passengers follow their instructions in an emergency.
  • Status quo bias – ‘if it ain’t broke, don’t fix it’ – preferring the current state of affairs, or business model, over change. Many a company – think Kodak film or Blockbuster video – has failed as a result of this.
  • Courtesy bias – giving an opinion or reaching a conclusion that is viewed as more socially acceptable, so as to avoid causing offence or creating controversy.
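
To make the gambler’s fallacy concrete, here’s a minimal sketch (in Python, purely for illustration) that simulates a fair coin and measures the chance of heads immediately after a run of three tails. Intuition says a head is ‘due’; independence says the probability is unchanged:

```python
import random

# Minimal sketch: estimate P(heads) immediately after three tails in a row.
# If past flips really altered future probabilities, this would drift from 0.5.
random.seed(42)
flips = [random.choice("HT") for _ in range(1_000_000)]

after_three_tails = [
    flips[i] for i in range(3, len(flips)) if flips[i - 3:i] == ["T", "T", "T"]
]
p_heads = after_three_tails.count("H") / len(after_three_tails)
print(f"P(heads after TTT) ~= {p_heads:.3f}")  # prints a value very close to 0.500
```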

‘If everyone is thinking alike, someone isn’t thinking’ – General George S Patton

So, apart from recognising that these tendencies exist, how do we counter them? Some simple techniques include:

  • Framing decisions properly, by proactively setting out their context.
  • Not rushing to solve a problem.
  • Allowing the most junior person to give their view or solution first.
  • Seeking opposing and contradictory evidence.
  • Gaining expert opinion.
  • Nudging, such as how supermarkets get us to choose healthy snacks by placing them by the checkout.
  • Considering alternatives. Using ‘what if’ analysis.
  • And building true diversity into your teams – not just of defined characteristics such as gender and race, but of social and business background too.

Many practitioners are still of the opinion that risk management is all about tools and techniques, or standards and mathematics. Like so much in life, however, this is only part of the picture. Whether we’re thinking of catastrophic risk failures such as Deepwater Horizon or the failure of the western world’s banking system, or major business successes such as the growth of the Californian technology giants, psychology lies at the heart of them all. Ignore the brain at your peril.

Written by Steve Fowler FIIRSM

23 May 2019
