The Center for Data Innovation spoke with Tanguy Touffut, CEO of Descartes Underwriting, a company headquartered in France using advanced data science to offer innovative insurance products internationally. Touffut discussed how the company is using innovative data sources to improve risk modeling and process claims faster.
The interview has been edited.
Christophe Carugati: How does your company use data science to underwrite insurance policies better?
Tanguy Touffut: Data science is at the core of our business model, and we use it throughout the entire insurance process. It helps us design innovative insurance products tailored to our brokers' and clients' needs. Data science is also key to a better assessment of risks and client vulnerabilities, enabling us to offer a fair and competitive price to our brokers and their clients. This has become paramount in the current market environment, where companies are struggling to cut or contain costs. Data science can ensure quick claim payment as well. Take the United States, for instance, where large corporations wait an average of 550 days to receive a claim payment, whereas we can process a payout in near real time thanks to remote sensing technologies.
Carugati: How is the insurance industry adapting to the greater availability of data?
Touffut: The insurance industry is undergoing a paradigm shift. In the past, there was a clear advantage given to incumbents. Having proprietary data, sometimes spanning more than a century, was key to underwriting risks properly.
Today, external and publicly available data from satellites, the Internet of Things, and other remote sensors can provide accurate and reliable data points. This has lowered the barriers to entry. We see that “tech” start-ups mastering artificial intelligence techniques can now price some perils better than well-established industry players, despite those incumbents' historical data and massive resources.
To give you an example, hail storms represent a large risk for many corporates, such as car manufacturers or renewable energy producers with solar farms. In the old world, the only way to price hail risk was with hail maps based on local observations and insurers’ historical loss data. Today, we can develop and run powerful algorithms to significantly enrich and correct such maps, largely thanks to the analysis of specific satellite and sensor data. In other words, we track the overshooting tops of the cumulonimbus clouds responsible for hailstorms, or we use local sensors to assess the size, number, and intensity of hailstones.
Carugati: What types of data are most useful for risk modeling?
Touffut: It depends on the perils you want to model—cyclone, flood, drought, earthquake, etc.—as there is a plethora of data. The growing availability of data from satellites, synthetic-aperture radars, weather stations, gauges, and the like is particularly useful for our approach of modeling the underlying phenomena directly.
If we take bushfires in Australia as an example, using satellite imagery is a game-changer. The latest satellites can analyze very precisely the burned areas as well as the salvage ratio. With this data, sending loss adjusters into the field becomes unnecessary.
Carugati: What hurdles do you face in collecting and processing data in a timely fashion, and how do you overcome them?
Touffut: In most cases, we can collect and process data in real time. For specific risks such as hurricanes in the United States, we may have to wait a bit to assess wind speed as accurately as possible. For example, U.S. reconnaissance aircraft fly into hurricanes and collect valuable data that can slightly correct or better calibrate satellite readings.
Of course, the potential costs incurred for some data sources remain a challenge. Plenty of valuable public data is available from NASA, ESA, or JAXA, but there are also many private companies developing specific technologies to better capture specific pieces of information. For instance, if you use drones, the costs could be high, especially if you want to insure a large territory.
Carugati: What other data-related services are you offering?
Touffut: Claim handling is the moment of truth in the insurance industry. When required, we may provide services related to calculating the loss incurred. However, our key focus is to make insurance products fit for purpose: transparent, affordable, and providing quick claim payment when needed. We are not seeking to become a data provider nor to commercialize services not directly related to insurance products. Our potential market is already greater than $100 billion GWP (editor’s note: GWP, or gross written premium, is the total revenue from a contract expected to be received by an insurer before deductions for reinsurance or ceding commissions), and the challenge our products seek to address, climate risks and weather-related business interruption, is only growing with climate change.