September 30, 2020 by Carol Jardine, President of Canadian P&C Operations, Wawanesa Insurance
As The Ostrich Paradox, by Robert Meyer and Howard Kunreuther, puts it:

“On the eve of 9/11, the risk of a terrorist attack was probably the furthest thing from most people’s minds. If someone had told investors there was a 1-in-100 chance that a catastrophic terrorist attack would occur in any year during the lease of the World Trade Center, they might have given this risk some consideration but still assumed it was not worth worrying about. But here’s the rub: While a 1-in-100 chance of a disaster occurring in any one year is indeed quite small, extrapolated out over the life of a 99-year lease, that same risk becomes quite large. To be precise, there would be a 63% chance that a catastrophic attack would happen at least once in 100 years. That is a risk an investor would take more seriously.”
I believe insurers have a social responsibility to provide loss prevention and risk mitigation advice in addition to insurance protection. As Board Chair of the Institute for Catastrophic Loss Reduction (ICLR), I collaborate with knowledgeable and influential peers on the Board to reduce the risk of injury and property damage, champion disaster prevention research, and help Canadians understand their catastrophe risk.
We all have an important role to play in loss prevention – one that requires knowledge and changes in behaviour. Our goal is to help brokers talk to customers in a way that helps them truly understand climate threats so they will take action to reduce their losses from catastrophes.
Why we think ‘it won’t happen to me’
We do ourselves a disservice by talking about 1-in-100-year events. A 1-in-100 chance of a disaster in any given year means there’s actually a 63% chance of the event happening at least once in 100 years. Put even more meaningfully: that’s roughly a one-in-five chance of the event happening within 25 years!
Customers don’t expect to own their home for 100 years, so they think “it probably won’t happen to me.” But what if we told them there’s a one-in-five chance of that same event happening within 25 years? And even if the event happened last year, there is still a one-in-five chance it will happen again in the next 25 years.
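The arithmetic behind these figures is simple compounding. As an illustrative sketch (not from the article), assuming each year’s risk is independent:

```python
def prob_at_least_once(p: float, n: int) -> float:
    """Chance that an event with annual probability p occurs at least
    once over n independent years: 1 - (1 - p)^n."""
    return 1 - (1 - p) ** n

# A "1-in-100-year" event over a century: ~63%
print(f"{prob_at_least_once(0.01, 100):.0%}")
# The same event over a 25-year mortgage: ~22%, roughly one in five
print(f"{prob_at_least_once(0.01, 25):.0%}")
```

The same formula explains why last year’s flood changes nothing: each year carries the same 1-in-100 chance, so the 25-year figure stays the same.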
Our ability to foresee catastrophes has never been better, yet we consistently fail to heed warnings; we rebuild in flood and wildfire areas, and don’t build appropriately in tornado and hail zones. Our language and behaviour need to change.
Making it real for customers
This adjusted way of looking at risk is certainly more realistic to someone with a 25-year mortgage. Homeowners are more likely to protect themselves for an event that has a 1 in 5 chance of taking place before their mortgage is paid off.
The Ostrich Paradox goes on to explain the following in the context of residents living on a 100-year floodplain:
“…time scales over decades are hard enough to grasp mentally, much less over a century. The tendency to ignore this risk might be exacerbated if the location recently experienced an actual flood. In that case, the ‘once-in-a-century’ reference might wrongly be construed as implying that the home is safe for another 99 years.”
By simply changing our language, we can make it real, and hopefully people will be more likely to see the value in taking protective actions today.
The good news: behavioural biases can be changed
The Ostrich Paradox reminds us that individuals, communities and institutions need a systematic approach to overcome six key cognitive biases to be better prepared for catastrophic events brought on by climate change.
Here’s a quick summary of the six biases to help us change our customers’ minds about catastrophe risk.
Myopia
Humans simply aren’t good at accepting effort and costs for delayed (and uncertain) benefits. This is because we see our immediate needs more clearly than those in the distant future.
The book cites a study relating to Hurricane Sandy where coastal residents were asked about their intention to evacuate. When the storm was still far away, 55% said they’d evacuate. But when the storm became imminent 12 hours later, the focus shifted to the logistics of evacuating. In the end, only 19% evacuated.
Amnesia
Our memories of pain are short-lived, which lets us persevere through temporary discomfort to learn (think of the many falls while learning to ride a bike). But this bias backfires when we think about investing in protection against low-probability, high-consequence events: we quickly forget the lessons of past disasters.
“People resettle in floodplains, stock market crashes come in cycles, and careless drivers suffer repeated crashes.” Disasters seem destined to repeat themselves, but protective actions more often go unrewarded than rewarded.
Optimism
Optimism may be the most dangerous bias when preparing for catastrophes, as it is responsible for the “it won’t happen to me” mentality. We see too much of this with insurance customers. People believe they’re more immune than others to threats; even when they admit a broader threat is likely to occur, they’re less likely to take actions (like buying insurance) to reduce their personal exposure.
Inertia
Inertia gets dangerous in high-stakes decisions. When confronted by a threat – especially one we’ve never experienced before – our natural reaction is to do nothing. And when it comes to things people don’t want to think about, such as insurance, the tendency is to avoid them altogether. This default provides an easy mental exit, and it can mean people fail to act when action is required.
Simplification
When we problem-solve, our brains tend to process only the cues they perceive to be large; when too many cues compete for our attention, we look for shortcuts.
This matters because protecting against low-probability events such as catastrophes typically requires a series of actions. When we simplify the problem, we might take the first step or two but leave other critical protective measures incomplete.
Herding
People imitate the decisions of those around them – even when the crowd is no better informed than its least informed member. The consequences, of course, can be dire.
From an insurance standpoint, perhaps it means a client makes insurance decisions based on what coverage their neighbour has, rather than consulting with a broker or doing their own research.
We all have a role to play
The Ostrich Paradox concludes by explaining how community planners can use behavioural risk methodology to create preparedness plans that work with, rather than against, these natural biases. It also acknowledges that “the responsibility of safety lies in the hands, not just of individuals, but also elected officials concerned with disasters’ impacts on the general public.”
There are clearly steps to be taken at all levels to reduce catastrophic loss from large-scale climate events. We encourage you, as brokers, to consider which biases may be preventing your clients from pursuing the best coverage, and how you and your team can help your customers and communities prepare.
After all, as the book reminds us, “It’s hard to convince people that the best return on an insurance policy is no return at all.”
We encourage you to leverage loss prevention resources such as those published by the ICLR – and, of course, our Wawanesa blog – to help change the conversation about catastrophe risk and loss prevention.