Data is key to navigating severe climate events

Digital Insurance spoke with Behzad Salehoun, Canada insurance lead at Capco, about how insurers are navigating severe and frequent weather events with advanced data techniques and climate science. Salehoun supports insurance companies in business-led tech transformations.

Responses have been lightly edited for clarity.

Can you speak to the aftermath of catastrophic events like Hurricane Helene and the impact of flood insurance gaps?

You're getting, unfortunately, the worst of both worlds happening at the same time, with catastrophic losses like Hurricane Helene and the wildfires out west. In Canada, we had record wildfires and flooding as well, happening at the same time. Across North America, you're seeing this quite badly. So a lot of the property and casualty insurance companies have invested in climate science and climate analytics. For example, one of the insurers that we work with has two climate scientists on its 200-person data science team. That team does all of this modeling and analysis in order to come up with, effectively, a better model for predicting current and future losses.

In terms of capabilities, many insurers are looking to develop things like model granularity. Can they be even more granular in terms of their portfolio's outputs and individual location losses? The most sophisticated insurers are already close. But what we're talking about is going from thousands or millions of parameters to billions of parameters. At that kind of scale, the applications and tools that enable this capability become challenging to build, and it turns into a large-scale technical implementation.
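
To make that scale problem concrete, here is a minimal sketch of per-location loss modeling. Everything in it is hypothetical: the portfolio size, the peril list, and the parameter distributions are invented for illustration, and a real catastrophe model would layer per-event, per-coverage, and secondary parameters on top, which is how counts climb toward the billions Salehoun describes.

```python
# Hypothetical sketch of location-level loss modeling; all numbers invented.
import numpy as np

rng = np.random.default_rng(42)

n_locations = 1_000_000  # a large personal-lines portfolio (illustrative)
n_perils = 3             # e.g. flood, wildfire, wind (illustrative)

# Per-location, per-peril parameters: annual event frequency, mean damage
# ratio, plus a per-location insured value.
frequency = rng.gamma(shape=2.0, scale=0.01, size=(n_locations, n_perils))
mean_damage_ratio = rng.beta(a=2.0, b=20.0, size=(n_locations, n_perils))
insured_value = rng.lognormal(mean=12.5, sigma=0.5, size=n_locations)

# Expected annual loss (EAL) per location: sum over perils of
# frequency * mean damage ratio, scaled by insured value.
eal = (frequency * mean_damage_ratio).sum(axis=1) * insured_value

n_params = frequency.size + mean_damage_ratio.size + insured_value.size
print(f"parameters stored: {n_params:,}")  # already 7,000,000 in this toy
print(f"portfolio expected annual loss: ${eal.sum():,.0f}")
print(f"ten riskiest locations: {np.argsort(eal)[-10:]}")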

A lot of it is about workflow optimization: being able to do various kinds of scenario analysis and marginal analysis, and ultimately driving speed and performance, in order to be able to price right and stay available in a market. This is going to be one of the challenges for some regions in North America, where a lot of insurance companies may not want to participate because they've deemed those markets too challenging, too risky to price appropriately. Addressing that requires a lot of collaboration with various government bodies, local and regional entities, and a lot of sharing of this sophisticated data, ultimately resulting in what we call risk engineering. Risk engineering is really about working with insureds to identify areas of improvement for risk management.

For example, another insurance company is working with a resilience company that retrofits homes in wildfire-prone regions to make them more resistant to wildfires. You might have homes that are able to withstand a fire for a longer period of time, so insurance companies are incentivizing homeowners to actually implement these types of solutions. A whole ecosystem of risk management solutions is being developed today to actively prevent or lower the overall impact of a catastrophic loss, whatever the weather event may be. It's not just the models; it's a lot of the actions and activities that need to be put in place to make this work.

Would you expand on the challenges around data granularity?

The granularity piece is a significant data management problem. Insurance companies of varying sizes have varying quality of data. In some situations insurers are data poor, and they may be working very closely with their reinsurance partners to capture more data and richer information.

And then there's a skill-set challenge. Having data scientists and actuaries who are, I would say, sophisticated around data management is incredibly important, to be able to take advantage of that data, unlock it, and get down to the level of granularity one wishes to achieve.

The other piece is having the systems that support this. From a systems perspective, cloud and cloud management solutions are maturing today, and this is part and parcel of an insurer's overarching technology strategy. So it's not just the underwriters and the actuaries who have to work with these systems; they also have to work closely with leadership at these insurance companies to get the systems they need to achieve this.

The last piece of the puzzle is the applications that support this. It's not just big databases that need to be developed, but also applications that support model versioning, workflow optimization, and being able to do scenario or marginal analysis. There are layers of capabilities required to achieve that model granularity.

The second piece is risk engineering. It is about risk assessment, and about doing risk remediation before the event happens. Prior to a hurricane, prior to a flood, we want to make a home or a commercial building more resistant and increase its survivability, so that when a wildfire does happen, the damage is not as severe. That's not traditionally what insurance companies have been doing, and now they're being more forward-looking and proactive in terms of risk prevention.

A lot of the insurance companies are working with local government bodies, for example communities or cities and towns. If they've identified specific areas that are prone to flooding, they will work with the local government on how to make that area more resistant.
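
The scenario and marginal analysis mentioned above can be sketched in a few lines. The following is a hypothetical illustration, not any insurer's actual model: it builds a simulated year-loss table and asks how much one additional risk moves the portfolio's 1-in-100 annual loss, which is the basic shape of a marginal analysis.

```python
# Hypothetical marginal analysis on a simulated year-loss table.
import numpy as np

rng = np.random.default_rng(7)
N_YEARS = 50_000  # simulated catastrophe years (illustrative)

def annual_losses(frequency: float, severity_mu: float) -> np.ndarray:
    """Total loss per simulated year: Poisson event counts, lognormal severities."""
    counts = rng.poisson(frequency, size=N_YEARS)
    losses = np.zeros(N_YEARS)
    for year in np.nonzero(counts)[0]:
        losses[year] = rng.lognormal(mean=severity_mu, sigma=1.0,
                                     size=counts[year]).sum()
    return losses

def var_99(losses: np.ndarray) -> float:
    """1-in-100 annual loss (99th percentile), a common tail metric."""
    return float(np.percentile(losses, 99))

portfolio = annual_losses(frequency=0.8, severity_mu=13.0)   # existing book
candidate = annual_losses(frequency=0.05, severity_mu=12.0)  # one new risk

# Marginal analysis: tail metric with the risk, minus tail metric without it.
marginal = var_99(portfolio + candidate) - var_99(portfolio)
print(f"portfolio 1-in-100 loss:  ${var_99(portfolio):,.0f}")
print(f"marginal 1-in-100 impact: ${marginal:,.0f}")
```

Running this per candidate risk, across millions of locations and many scenarios, is the kind of workload that pushes insurers toward the cloud platforms and optimized workflows described above.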

What does that collaboration between insurers and external partners look like?

It is really about protecting communities and protecting homes, so there is quite a bit of alignment between what a local government is trying to achieve for its community and what an insurance company is trying to achieve. Ultimately, no one wants to see premiums increase exponentially. As climate risk becomes more challenging to predict, there are going to be situations where premiums in certain areas get so high that coverage becomes extremely unaffordable for people, so it makes sense for local governments to work closely with insurers to really reduce these risks.

How can technologies like predictive analytics help?

Floodplain data, for example, has existed for a really long time. A lot of this type of data has been around, and really now it's about collaboration: the sophisticated insurers that are looking at this stuff working closely with communities to develop the risk prevention infrastructure that needs to be put in place. That requires a lot of investment, and it's somewhat outside the domain of what I can support, but from that collaboration comes the ability to facilitate the coverage and risk management of these areas. For example, there are areas in North America that are effectively insurance deserts.

I would say it's a difficult situation to be in because of the numbers. The latest I saw, Helene is going to cost about $160 billion, and those figures just keep getting bigger every year. So from that perspective, collaboration needs to happen. In the Canadian market, for example, a consortium is being stood up to look at this and provide content and research to entities to support moving forward.
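
Salehoun notes that floodplain data has existed for a long time; a toy example of putting it to use follows. The boundary and property coordinates below are made up, and real work would draw on official flood-mapping datasets rather than a hand-drawn polygon (the sketch assumes the shapely library is installed).

```python
# Toy illustration of flagging properties against floodplain data.
# Coordinates are invented; real floodplain boundaries would come from
# official flood-mapping datasets.
from shapely.geometry import Point, Polygon

# A hypothetical floodplain boundary as (longitude, latitude) pairs.
floodplain = Polygon([
    (-79.40, 43.64), (-79.35, 43.64),
    (-79.35, 43.68), (-79.40, 43.68),
])

# Hypothetical insured locations.
properties = {
    "policy-001": Point(-79.38, 43.66),  # inside the floodplain
    "policy-002": Point(-79.30, 43.70),  # outside it
}

for policy_id, location in properties.items():
    if floodplain.contains(location):
        print(f"{policy_id}: flood-prone, candidate for risk engineering")
    else:
        print(f"{policy_id}: outside the mapped floodplain")
```

In practice, a screen like this is the first step toward the risk prevention infrastructure and community collaboration he describes.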

Have you noticed a demand for data scientists and engineers?

Talent acquisition has always been a bit of a challenge from a technology perspective in the insurance market. With all of the Gen AI that's coming out lately, that's just going to get a bit more difficult, simply because insurance companies should be looking at utilizing not just traditional data science methods, but also tooling like Gen AI across a lot of different scenarios and use cases.

The best way today is working directly with universities and creating programs in collaboration with them that give university students the opportunity to work closely with insurance companies: collaborations, internships, and so on. That's the only way we can get folks to have a taste of insurance. No one coming out of university thinks, 'Okay, my career is going to go into insurance,' unless they study actuarial science.

Insurance companies are also at the tip of the spear on climate change. They're the ones that feel it the earliest, and so from that perspective, it's actually a very interesting space to be in right now.

Anything else you would like to share?

I think the risk engineering part is actually quite interesting. Insurers are collaborating with third-party vendors to do this kind of work, and we are already seeing insurers acquire these vendors and make them part of their offering. That piece is going to grow quite significantly, especially in commercial insurance, where you might have high-value factories or warehouses that may be impacted by a wildfire or a flood. It makes sense to deploy risk engineering specialists to a particular client or insured site to support their risk management activities. You'll see a lot of that forward deployment of risk managers from insurance companies. It's happening now, and you're going to see more of it in the future.