January 24, 2014 by Canadian Underwriter
Having complete and comprehensive data before running a catastrophe model is crucial when assessing earthquake risk, AIR Worldwide's experts argue, reiterating a point stressed by the federal regulator in its recently revised guideline on the issue.
Uncertainty in data input creates uncertainty in modelling, David Lalonde, senior vice president in the consulting and client services group at AIR Worldwide, noted during a webinar Thursday hosted by Canadian Underwriter, titled "Earthquake Risk: From Science to Compliance."
“Garbage in, garbage out – it’s a common refrain in catastrophe modelling,” he said. “Simply put, if the exposure data being input into the model is incomplete or inaccurate, the model cannot be expected to generate accurate loss estimates.”
“OSFI recognizes this by dedicating an entire principle to exposure data,” he added.
Lalonde was referring to the Office of the Superintendent of Financial Institutions' Earthquake Exposure Sound Practices guideline (B-9), a revised version of which was released last year. It sets out three main points on assessing data: ensuring data integrity, data verification and data limitation.
“Senior management needs to understand the data requirements of the model(s) used and place a high priority on the quality of data and its timely capture,” guideline B-9 notes.
OSFI also says that data should be subject to at least an annual review, if not more frequently, by “individuals independent of those responsible for data collection and data quality.”
Raw exposure data, including a property's location, risk characteristics (such as construction, age, and height), replacement value, and policy conditions, all needs to be accurate for sound modelling, AIR Worldwide noted in a recent issue brief on earthquake risk models and regulatory compliance in Canada.
“Prior to catastrophe loss analysis, model users should assess the quality of the exposure data and enhance it where possible,” the briefing notes.
“Users should also perform reasonability checks on exposure data summaries, including minimum replacement value, maximum replacement value, and the average value per risk.”
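The reasonability checks described in the brief, such as examining the minimum, maximum, and average replacement value per risk, can be sketched in a few lines of code. The field names and dollar thresholds below are illustrative assumptions, not AIR's actual rules:

```python
# Sketch of a reasonability check on exposure data summaries.
# Thresholds and field names are hypothetical examples.

def exposure_summary(risks):
    """Return min, max, and average replacement value for a portfolio."""
    values = [r["replacement_value"] for r in risks]
    return {
        "min_value": min(values),
        "max_value": max(values),
        "avg_value": sum(values) / len(values),
    }

def reasonability_flags(summary, floor=10_000, ceiling=50_000_000):
    """Flag summaries whose extremes fall outside plausible bounds."""
    flags = []
    if summary["min_value"] < floor:
        flags.append("minimum replacement value suspiciously low")
    if summary["max_value"] > ceiling:
        flags.append("maximum replacement value suspiciously high")
    return flags

portfolio = [
    {"replacement_value": 450_000},
    {"replacement_value": 1_200_000},
    {"replacement_value": 300},  # likely a data-entry error
]
summary = exposure_summary(portfolio)
print(summary["avg_value"])          # average value per risk
print(reasonability_flags(summary))  # the 300 entry trips the floor check
```

A check like this will not correct bad data on its own, but it surfaces entries that merit investigation before a loss analysis is run.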
AIR itself uses a four-part method to ensure data consistency, accuracy and completeness for organizations using earthquake models.
In part, that includes benchmarking the organization’s data against the industry by using its own exposure databases, which it develops in each country where it models.
It also tries to reduce uncertainty by adding missing data or replacing questionable exposure data using a replacement cost estimator or using industry averages to distribute unknown values.
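Distributing industry averages over unknown values, as described above, amounts to filling gaps in a risk record with defaults. The averages and field names here are invented placeholders, not values from AIR's databases:

```python
# Hypothetical sketch: replace missing exposure attributes with
# industry-average values. All defaults below are made up.

INDUSTRY_AVERAGES = {
    "year_built": 1985,
    "num_stories": 2,
    "construction": "wood_frame",
}

def augment_record(record, defaults=INDUSTRY_AVERAGES):
    """Fill missing (None or absent) fields with industry-average values."""
    filled = dict(record)
    for field, default in defaults.items():
        if filled.get(field) is None:
            filled[field] = default
    return filled

risk = {"location": "Vancouver, BC", "year_built": None, "construction": None}
print(augment_record(risk))
```

Known values pass through untouched; only unknowns are replaced, which keeps the augmentation from overwriting data the insurer actually collected.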
In terms of selecting an earthquake model for risk management, “a basic reasonability test is how well modeled output compares to actual loss experience,” AIR’s recent issue brief notes.
“There are a variety of ways to do this, including: examining whether modeled industry and company losses and associated exceedance probabilities make sense for large historical events; comparing modeled and reported average annual losses; and assessing how the model performs in real time as an event unfolds.”
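One of the comparisons listed above, modeled versus reported average annual losses (AAL), reduces to simple arithmetic once annual loss figures are in hand. The loss figures below are invented for illustration:

```python
# Illustrative sketch of comparing modeled and reported average annual
# losses (AAL). All dollar figures are fabricated examples.

def average_annual_loss(annual_losses):
    """AAL is the mean of the annual losses (simulated or reported)."""
    return sum(annual_losses) / len(annual_losses)

def relative_difference(modeled_aal, reported_aal):
    """How far modeled AAL deviates from reported loss experience."""
    return abs(modeled_aal - reported_aal) / reported_aal

modeled = [12.0, 30.0, 8.0, 50.0]    # simulated annual losses ($M)
reported = [15.0, 25.0, 10.0, 40.0]  # company loss experience ($M)
m_aal = average_annual_loss(modeled)
r_aal = average_annual_loss(reported)
print(round(relative_difference(m_aal, r_aal), 3))
```

A large gap between the two figures is the "reasonability test" failing: it suggests either the model or the exposure data feeding it deserves a closer look.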
In his presentation, Lalonde also emphasized that earthquake risk management requires an annual review at minimum, but should be an ongoing discussion.
To view the archived webinar, visit http://bit.ly/CanadaEarthquakeRiskWebinar