
Industry urged to get high-quality data into the hands of decision-makers


December 4, 2020 | by Adam Malik



Getting consistent, high-quality scientific data into the hands of decision makers can protect people from the fallout of bad choices made in the lead-up to a catastrophe, such as what has happened in Fort McMurray, Alta., a catastrophe data expert says.

But it will take a coordinated effort to determine the right — and most effective — data to collect, says Caleb White, managing partner at Climate Engine, a climate data company.

White wants to see scientific data collected consistently and put into the hands of homeowners and local and federal government representatives so that a collaborative process to reduce risk can take place.

This didn’t happen in Fort McMurray, and still isn’t happening today, White says. He blames the absence of key data for poor decisions that have left Fort McMurray residents enduring multiple rounds of home rebuilding in recent years. A major wildfire tore through Fort McMurray in 2016, causing about $3.8 billion in insured damage; just four years later, spring flooding in 2020 caused another $522 million in insured damage. White called the situation “sickening.”

He was speaking on a panel at CatIQ Connect’s quarterly webinar event alongside Robin Bourke, engineering advisor at Public Safety Canada, and Andrew Smith, chief operations officer at Fathom, a flood mapping and modelling company.

Moderator Shawna Peddle, program director for Co-operators Community Funds at The Co-operators, pointed out that not only are risk maps outdated across the country, but they’re inconsistent.

“So how can we bring together the right stakeholders across all jurisdictions — [which use] different data, different methods, different philosophies, [and] different resources — to create tools that are consistent enough for effective decision making across the board?” she asked the panel in a session called, The Future of Flooding: Mapping Risk for a Changing Planet.

Clockwise from top left: moderator Shawna Peddle of The Co-operators, Andrew Smith of Fathom, Robin Bourke of Public Safety Canada and Caleb White of Climate Engine take part in the panel The Future of Flooding: Mapping Risk for a Changing Planet during CatIQ Connect’s quarterly webinar event on Dec. 3.

One approach is to look at the availability of remote sensing data, which includes data gathered from drones, planes, LIDAR (light detection and ranging), satellites, and anything else they can get their hands on, White replied.

“And that’s one piece of the equation. The other piece is the actual science,” he added.

The science part is tricky, White said, because there are a number of different modelling techniques; picking just one wouldn’t be the right way to move forward. “It should be a complementary ensemble,” he explained.
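To make the idea of a “complementary ensemble” concrete, here is a minimal sketch that averages gridded flood-depth estimates from several models and flags where they disagree. The model names, grid and numbers are purely illustrative assumptions, not anything presented by the panel.

```python
# Illustrative sketch only: combining flood-depth estimates from several
# hypothetical models into a simple ensemble over the same grid.
import numpy as np

# Each array is one model's predicted flood depth (metres) over a 2x3 grid.
model_outputs = {
    "hydraulic_model": np.array([[0.2, 0.0, 1.1], [0.4, 0.9, 0.0]]),
    "statistical_model": np.array([[0.3, 0.1, 0.9], [0.5, 1.2, 0.1]]),
    "machine_learning_model": np.array([[0.1, 0.0, 1.3], [0.3, 1.0, 0.0]]),
}

# Stack the members and summarize: the mean gives a central estimate,
# while the spread (standard deviation) shows where the models disagree.
stack = np.stack(list(model_outputs.values()))
ensemble_mean = stack.mean(axis=0)
ensemble_spread = stack.std(axis=0)

print("Ensemble mean depth (m):\n", ensemble_mean)
print("Model disagreement (m):\n", ensemble_spread)
```

The point of the ensemble is not any single number but the pairing of an estimate with a measure of how much the underlying models agree, which is information a decision maker can act on.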

For example, White’s company is doing a project in the United States involving evapotranspiration data (evapotranspiration is basically the movement of water to the air from sources such as the soil, canopy interception, and water bodies). The purpose of the project is to bring together leading scientists from around the world, and to agree on the right approach and the right data.
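For readers unfamiliar with the term, evapotranspiration can be pictured with the standard catchment water-balance relation (evapotranspiration ≈ precipitation − runoff − change in storage). The snippet below works through that arithmetic with made-up figures; it is a back-of-the-envelope illustration, not part of White’s project.

```python
# Back-of-the-envelope water balance for a catchment over one season.
# ET ≈ P - Q - dS, with all terms in millimetres of water (illustrative values).
precipitation_mm = 300.0    # P: rain and snowmelt reaching the surface
runoff_mm = 120.0           # Q: streamflow leaving the catchment
storage_change_mm = 30.0    # dS: change in soil moisture, snowpack, reservoirs

evapotranspiration_mm = precipitation_mm - runoff_mm - storage_change_mm
print(f"Estimated evapotranspiration: {evapotranspiration_mm:.0f} mm")  # 150 mm
```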

“In terms of data wrangling, I think it’s on all of us to get out of the weeds,” White said. “Because ultimately we need to enable the decision makers. We need to enable homeowners at the citizen level, [and] we need to enable communities at the municipal level. There are provincial considerations, and then the work that Robin and his team are doing at the federal level.”

Such a program is crucial for helping decision makers mitigate risks in certain parts of the country, White said. He raised the example of Fort McMurray, which has experienced two major catastrophic events in just a few years.

“People [are] now rebuilding their homes twice. I find that just sickening,” White said. “I feel like, as an industry, we need to get out of our own way a little bit. We need to start collaborating and start the conversations of, ‘Let’s enable the decision makers.’ Because nobody cares what resolution the LIDAR data is, or whether it was open-source satellite imagery, or Planet data.

“What is the end result of it? What is the output of it? That’s what decision makers truly care about. And that’s what we as a society should care about in terms of building resilience. So we need to enable ourselves to give the people in the decision-making chair the capability to make those informed decisions.”

The issue, he pointed out, is not technology. “The issue is, can you compile all of this data in…the right formats, the right aggregated layers, and make it available for decision making?”
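As a rough illustration of what “aggregated layers” for decision making can mean, the sketch below normalizes three hypothetical hazard layers that already share a grid and combines them with weights into a single composite score. The layers, weights and values are assumptions chosen for the example, not a description of any real product.

```python
# Illustrative sketch: combining gridded layers into one decision-ready score.
import numpy as np

# Three hypothetical layers already resampled to the same grid:
# flood depth (m), wildfire likelihood (0-1), distance to emergency services (km).
flood_depth = np.array([[0.0, 0.5], [1.2, 0.1]])
wildfire_likelihood = np.array([[0.2, 0.7], [0.4, 0.1]])
distance_to_services = np.array([[2.0, 8.0], [5.0, 1.0]])

def normalize(layer):
    """Rescale a layer to 0-1 so layers with different units can be combined."""
    return (layer - layer.min()) / (layer.max() - layer.min())

# Illustrative weights reflecting how much each layer matters to the decision.
weights = {"flood": 0.5, "fire": 0.3, "access": 0.2}

composite_risk = (
    weights["flood"] * normalize(flood_depth)
    + weights["fire"] * normalize(wildfire_likelihood)
    + weights["access"] * normalize(distance_to_services)
)

print("Composite risk (0 = lowest, 1 = highest):\n", composite_risk.round(2))
```

The technical steps are simple; the hard part, as the panel noted, is agreeing on which layers, weights and formats are the right ones so that every jurisdiction reads the same map the same way.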

 

Feature image by iStock.com/triloks

