Canadian Underwriter
Feature

Evolving Geo-Risk Research


October 1, 2014   by Canadian Underwriter



Throughout history, people have strived to better understand natural catastrophes. Yet significant improvements in prediction remained elusive until the technological advances of the 20th century.

Meteorologists, for example, began using aerial observation and satellite imagery to forecast weather, while seismologists developed sophisticated instruments for measuring the earth’s movements. Today, more than ever, growing frequency and severity of weather-related loss events highlight the need for geo-risk research.

The insurance industry, of course, has a vested interest in the topic. Costly events, such as the 1906 earthquake and subsequent fires in San Francisco, made the significance of geological risks painfully clear to insurers and reinsurers, who began early on to use retrospective analysis in estimating natural catastrophe risk.

However, a string of outlier natural catastrophes at the beginning of the 1970s suggested a change in dynamics, which – if borne out – would render purely retrospective modelling obsolete. At about the same time, the scientific community began pointing to the rising temperature of the earth’s atmosphere and its effects on ocean temperatures, polar ice caps and glaciers, as well as an increasing level of carbon dioxide (CO2) in the atmosphere and its effects on absorption of the sun’s energy.

In light of those developments, the insurance industry needed a deeper scientific understanding and better analytical tools to model nat-cat risks in a changing climate. Recognizing that need, Munich Re’s efforts starting in the 1970s eventually culminated in its Geo Risks Research unit. Established 40 years ago, the unit initially had two focuses: to determine whether or not natural catastrophes were, indeed, on the rise – and, if so, which types of events were increasing – and to develop more sophisticated risk modelling to calculate the cost and capacity of insurance products.

Early on, geo-risk researchers concluded that while seismic events exhibited no long-term increase, weather-related events had, in fact, grown in frequency and intensity. With the combination of continuously increasing computing power and growing databases, geo-risk expertise within the insurance industry and in the global scientific community developed dynamically over the years.

By the late 1980s, probabilistic risk modelling – which goes beyond retrospective analysis and extrapolation to include theoretically possible events based on known physical parameters – was introduced. This has proven useful in preparedness for extremely low-frequency, high-loss occurrences, such as the 2011 Tohoku, Japan earthquake and tsunami.
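To make the distinction from purely retrospective analysis concrete, the sketch below simulates a stylized probabilistic event set: annual event counts and individual event losses are drawn from assumed distributions rather than read off a historical record. Every distribution and parameter value here is a hypothetical illustration, not a description of Munich Re’s actual models.

import numpy as np

# Minimal sketch of probabilistic nat-cat loss modelling (illustrative only).
# Assumptions: event counts per year follow a Poisson distribution and
# individual event losses follow a lognormal distribution. All parameters
# below are invented and not drawn from any real catastrophe model.

rng = np.random.default_rng(seed=42)

YEARS = 100_000                   # number of simulated years
EVENT_RATE = 0.8                  # average events per year (Poisson mean)
LOSS_MU, LOSS_SIGMA = 2.0, 1.2    # lognormal parameters (losses in $ millions)

annual_losses = np.zeros(YEARS)
for year in range(YEARS):
    n_events = rng.poisson(EVENT_RATE)
    if n_events:
        # Sum the losses of all simulated events in this year.
        annual_losses[year] = rng.lognormal(LOSS_MU, LOSS_SIGMA, n_events).sum()

print(f"Average annual loss: ${annual_losses.mean():.1f}M")
print(f"Worst simulated year: ${annual_losses.max():.1f}M")

Simulating enough years produces loss outcomes well beyond anything in the historical record, which is precisely what makes the probabilistic approach useful for extremely rare, high-loss events.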

GEO-RISK RESEARCH TODAY

Major reinsurers, including Munich Re, have dedicated considerable resources to geo-risk research. For example, the Geo Risks Research unit maintains the world’s largest natural catastrophe database, containing more than 35,000 data sets, and its modelling instruments are used by stakeholders in the public and private sectors, primary insurers and Munich Re itself. The unit’s most recent addition is a high-resolution, flood-mapping tool introduced in 2013.

Insurers can now use risk models for natural hazards with return intervals ranging from once in a year to once in thousands of years. Geo-risk experts are also continuously learning from large-scale events and how they unfold.
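A "once in N years" figure of this kind is simply the loss level exceeded with annual probability 1/N. The short sketch below, again using invented numbers, shows how such return-period losses can be read off a set of simulated annual losses like the one generated above.

import numpy as np

def return_period_loss(annual_losses: np.ndarray, return_period: float) -> float:
    """Empirical loss level exceeded with annual probability 1 / return_period."""
    exceedance_prob = 1.0 / return_period
    return float(np.quantile(annual_losses, 1.0 - exceedance_prob))

# Placeholder annual losses (in $ millions); in practice these would come
# from a probabilistic simulation such as the sketch above.
rng = np.random.default_rng(seed=1)
annual_losses = rng.lognormal(2.0, 1.5, size=100_000)

for rp in (10, 100, 1_000):
    print(f"1-in-{rp}-year loss: ${return_period_loss(annual_losses, rp):.0f}M")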

For example, financial impact calculations take into account the demand surge that drives up the costs of materials and labour related to relief and reconstruction in the wake of a major catastrophe, as was the case during Hurricane Katrina and the subsequent flooding that devastated New Orleans in 2005.
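As a rough illustration of where such an adjustment might enter a loss calculation, the hypothetical post-processing step below scales a modelled loss upward once an event is large enough to strain local repair capacity. The threshold and surge factors are invented for illustration and are not taken from any actual model.

# Hypothetical demand-surge adjustment (illustrative only): when an event's
# modelled loss exceeds a threshold, materials and labour costs are assumed
# to inflate, so the ground-up loss is scaled by a surge factor.

def apply_demand_surge(ground_up_loss: float,
                       threshold: float = 500.0,   # $ millions, assumed
                       max_surge: float = 1.3) -> float:
    """Scale a modelled loss to reflect post-event price inflation."""
    if ground_up_loss <= threshold:
        return ground_up_loss
    # Surge grows with event size, capped at max_surge (30% here).
    surge = min(max_surge, 1.0 + 0.1 * (ground_up_loss / threshold - 1.0))
    return ground_up_loss * surge

print(apply_demand_surge(200.0))    # below threshold: unchanged
print(apply_demand_surge(2_000.0))  # large event: surged loss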

Lessons learned in recent years also include the potential scope of cumulative losses, for instance, through business interruption as a result of a natural catastrophe, as occurred during the severe flooding in Thailand from mid-2011 through early 2012.

Still, the quality of nat-cat risk modelling varies widely from one type of event to another. Hurricane formation and persistence, for example, depend on a number of measurable factors, including sea surface temperature and upper-level winds.

Based on these criteria, researchers are able to reach conclusions on future frequency and severity. However, severe convective storms (tornadoes, strong straight-line winds, hailstorms, intense precipitation), which involve multiple and variable contributing factors, are much more difficult to model with a reasonable degree of accuracy.

In addition, tornadoes and hailstorms tend to strike in random and chaotic patterns, compounding the challenge of predicting which assets are at risk.

Severe convective storms produce damage from large hailstones, powerful straight-line wind gusts, lightning strikes, torrential downpours and tornadoes. The devastation from flying objects, including branches and uprooted trees or even vehicles, can be extreme.

Typically, homes take the brunt of the damage, but no assets are entirely immune when severe convective storms strike.

PUTTING KNOWLEDGE TO WORK

With the frequency and intensity of weather-related natural catastrophes increasing, the insurance industry and society as a whole must adapt. Here, the challenge is to keep the big picture in view and avoid distraction by short-term trends.

For example, Munich Re recently reported there were comparatively few major natural catastrophe losses worldwide during the first half of 2014. As of the end of June, overall economic losses were US$42 billion and insured losses were US$17 billion, considerably below the 10-year averages of US$95 billion and US$25 billion, respectively. But as Torsten Jeworrek, Munich Re’s board member responsible for global reinsurance business, cautioned: “There has been no change in the overall risk situation. Loss minimization measures must remain at the forefront of our considerations.”

In Canada, a country historically known for comparatively low natural hazard risk, the upward trend in frequency and intensity of weather-related natural catastrophes has been felt sharply.

Five of the costliest events since 1990 occurred within the four-year period from 2010 to 2013, including the heavy flooding and severe storms that hit southern Alberta and southern Ontario last year, causing more than $7 billion in total economic losses, around $2.5 billion of it insured. That made 2013 by far the most expensive natural catastrophe year in Canada’s history.

The Disaster Financial Assistance Arrangement (DFAA), the cost-sharing program between the federal government and Canada’s provinces and territories, has made 96% of all payments in its 45-year existence within the past 18 years. While DFAA averaged nine payouts annually between 2000 and 2010, it made 26 in 2011 alone.

Whether or not this trend is related to climate change has become less of an issue than the pressing need for preparedness.

Paul Kovacs, executive director of the Institute for Catastrophic Loss Reduction (ICLR), shares the view that climate change is affecting extreme weather in Canada, but points to a number of potentially avoidable factors that are proving to be the biggest drivers of the increase in storm-related damage, including aging infrastructure, growing populations and development in at-risk areas.

Accordingly, ICLR – which identifies the actions and people who can contribute to loss reduction in terms of infrastructure, land use, building codes and general public awareness – is dedicated to providing decision-makers with the information and tools they need to better understand hazards and reduce vulnerability. One of ICLR’s co-operation partners in this task is the Insurance Institute for Business & Home Safety, a U.S.-based independent organization that works with the construction industry and building authorities to improve weather readiness.

“You can’t prevent extreme rainfall, storms and flooding. But if you’re ready, you can prevent them from becoming disasters,” Kovacs says.

In Canada, where flooding is expected to remain the number one driver of nat-cat losses, much can be achieved by zoning construction and development out of harm’s way. Recent advances in flood mapping are also a key asset.

Convective storms will clearly require much more research before a comparable level of risk modelling can be reached.

Climate change and new severe weather patterns may represent one of the biggest challenges facing humankind. Whatever responses it demands, the insurance industry and geo-risk researchers have an important role to play.

