Silver lining on a storm cloud.

The last few weeks have seen some devastating natural hazards become enormous and compounding disasters. Three hurricanes came in quick succession – Harvey, Irma and Maria – through Central America and the Caribbean islands. Hideous flooding throughout parts of Asia saw whole cities there under water too. Three earthquakes in Mexico reduced buildings and people’s lives to rubble. And two volcanoes – one on Vanuatu, and another in Bali – are still ready to fire at any moment. Without even considering any of the human-originating hazards (like the threat of nuclear warfare in the north Pacific, or the hundreds of thousands of refugees fleeing conflicts in multiple countries), there’s certainly been a lot requiring attention, support and funds from so many governments and aid organisations.

I have been paying very close attention to all of these, as they have enormous relevance to my current work for a major international NGO. I’ve spent the majority of this year ensconced in the world of emergency preparation and disaster risk reduction, trying to find ways for communities to better prepare themselves and reduce their own sources of risk – all with the aim of stopping a natural hazard from becoming a humanitarian disaster.

Many studies and reports over the years have highlighted just how many millions of lives and livelihoods could have been spared through better preparation and risk reduction, not to mention the billions of dollars spent on recovery and response. Hazards have been occurring for far longer than we have been around as a species (in fact, we can thank their presence for many awesome things we love and rely upon – volcanic soils, islands, reefs, etc)… so why is it that humans (with all our complex brain-bits) are still not preparing for these events?

Well, it gets a little tricky to pin down one or two reasons. With all those complex brain-bits, we can thank multiple biases and habits (often interacting together) for establishing this recurring pattern of human history.

Recently, some smart peeps put their heads together and looked at some of our more commonly occurring cognitive biases, and the roles they have played in some of history’s biggest disasters – noting that many of these interact and overlap with each other. Six of the more common biases were outlined in a book published late last year called “The Ostrich Paradox” by Robert Meyer and Howard Kunreuther. If we are to help communities get better prepared, then we have to look at many of these biases at once, and across different segments of the public, to give ourselves the best shot at improving the track record here. Interestingly, they named the book after ostriches because, despite the Hollywood cartoon versions of ostriches we are accustomed to, ostriches don’t REALLY stick their heads in the sand (SPOILER ALERT). In fact, ostriches are flightless birds – and they’ve evolved some really neat ways to avoid danger that don’t involve merely flying away. The authors argue we should be more like the ostrich – i.e. be aware of our cognitive biases and limitations and evolve some tricks to overcome them.

If this sounds of interest, then I highly recommend buying/reading the whole book (and also because I endorse supporting good science literature like this, and this one has real-world application) – but this might give you a taste of what they covered…

The six biases they discuss are:

  1. Myopia bias
  2. Inertia bias
  3. Simplification bias
  4. Herding bias
  5. Amnesia bias
  6. Optimism bias

Myopia refers to our tendency to think short-term when it comes to planning. We know we should invest in longer-term strategies to reduce impacts or mitigate risk, but more often than not we don’t adopt them… we like living in the present. It means we can ignore potential longer-term futures and the less-than-great outcomes that may come with them…

Inertia… the classic bias that keeps us doing the default – the status quo – for as long as possible. It raises its head in a few ways, because the defaults we tend to stick to vary immensely. Sometimes taking action is your default, but in an emergency where it’s better to stay put, that can lead to serious outcomes. Sometimes you don’t take action when you should – because the action was new to you, or perhaps meant having to weigh up lots of options…

Simplification bias plays quite a role in emergency preparedness, because a lot of preparing for emergencies involves considering risk and the likelihood of things occurring. Humans are not great at computing and understanding probabilities – especially small ones – so we group things into rough categories, from ‘no chance of it occurring’ to ‘small chance of it occurring’. This gets vague quickly. It also means we overestimate the impact of taking some small preventative action to prepare, because we tend to categorise that too… We can get our assessment of the risk wrong in the first place, and then overestimate the impact of anything we did to reduce it.

“I’ll have what she’s having…” – whether it be a breakfast dish at a cafe or looking to our neighbours for their emergency planning. We are a social species, and herding bias is a big one in emergencies. We look to others for cues on what is appropriate for us to do. Is everyone else doing it? Well then, I’m on board! We like to be similar… even if that means we all similarly do something really unhelpful in ALL of our situations. This bias means humans often hesitate too long in a disaster situation – looking to others for cues on what to do, and how to behave.

Amnesia bias is probably one of the more depressing biases for me. We have an enormous capacity to learn, but we often learn best by making mistakes and improving upon our performance. If we do something right the first time, the experience may not actually be as impactful upon us. Planning for an emergency is a bit like this: if you get it right, then logically you won’t experience quite as bad an outcome in the emergency itself, but then you don’t necessarily remember it as well either… because, well, it all kind of went ok. It doesn’t get reinforced in our minds the way it would if we hadn’t prepared and had perhaps lost all of our possessions. Coupled with this, as time goes on we tend to forget details, and our emotional memories can become a little dulled. Memories do fade… even for those who’ve experienced major disasters, and memories of emotions fade even more. Despite our incredible capacity to learn, the way in which we learn does not always help us in situations like these.

And finally, the optimism bias… which links quite a bit to simplification and amnesia too… because we tend to think that, even if a disaster is likely, WE won’t be nearly as affected as others. A good example of this is knowing the high divorce rates in Western developed countries – but never considering that the statistics might apply to you… because somehow you’ll overcome those statistics. Your relationship will be FINE and continue on FOR EVER AND EVER AND EVER…. but those stats have gotta come from somewhere, eh? In fact, there are a couple of things at play in our heads here – thinking we’re more immune to bad things happening is just one of them. We also have a tendency to focus on the low probability of an event occurring on any given day, rather than consider the probability over our lifetimes (a much longer time period). AND we also think something is more likely to occur if we can more easily imagine it. This all leads to a mindset of ‘well, it’s so unlikely it’s probably not worth worrying about… and even if something did happen, I’m sure we’ll be fine.’
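The gap between the daily (or yearly) odds and the lifetime odds is easy to underestimate, so here’s a quick back-of-envelope sketch. The 1% annual figure is purely hypothetical – think of a so-called “1-in-100-year” flood – and the 30-year window is just a stand-in for the time you might live in one house:

```python
# Illustrative only: how a "small" annual risk compounds over time.
annual_risk = 0.01   # hypothetical 1-in-100-year event
years = 30           # hypothetical time living in the same house

# P(at least one occurrence) = 1 - P(it never happens in any year)
lifetime_risk = 1 - (1 - annual_risk) ** years
print(f"{lifetime_risk:.0%}")  # roughly 26%
```

A risk that feels negligible in any single year turns into better-than-one-in-four odds over a few decades – which is exactly the framing our brains tend to skip.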

Knowing that these biases (and others) are having these effects on our minds means we can get better at building ways to overcome them into our emergency education and planning programs. The book outlines many ways we can start to overcome them, but I’ll leave it there for now. It is great to see more and more research being shared, and an emphasis on involving psychology when it comes to helping communities reduce their disaster risks and impacts – such as this piece published in The Conversation this week. The more we do this, the more effective we’ll be, and perhaps the more lives will be saved.

Despite being delightfully complex people, we don’t always do what is best for us. Even I – having worked in this area for almost a year now – still haven’t managed to ‘formalise’ my own emergency preparation plan. I’m still patting myself on the back for purchasing a small ‘in-car first aid kit’ when I first started this job. It’s a start. But it is also a glaring case of optimism bias in action. That kit is not going to do much when the real crap hits the fan.
