Canadian Underwriter

Data variety and velocity seen as main challenges of big data: Celent


April 29, 2013   by Canadian Underwriter



The major challenge for insurers in the era of big data currently appears to relate less to data volume than to data velocity and variety, according to survey findings released last week by Celent.


The survey of 276 Celent insurance contacts found that the velocity and variety of data are the most important challenges insurers face, while data volume, finding value and data veracity are less problematic, notes a statement from the international financial research and consulting firm.

Celent issued a report in February that spells out five tiers of data capability: spectator, experimenter, practitioner, innovator and scientist. The new survey follows up on that model; its findings are detailed in Perceptions and Misconceptions of Big Data in Insurance.

The survey sample – covering North America, Latin America, Europe, the Middle East and Africa, and Asia-Pacific – includes respondents active in the property and casualty line of business (49%), in life (24%) and in firms that offer both (34%).

More than 80% of respondents experience difficulty in collecting and analyzing data quickly enough, and more than 70% have difficulty in dealing with unstructured or semi-structured data, notes the report.

“When data lacks structure, analytics tools tend to be inconsistent in delivering reliable results. Therefore, insurers need to appoint specific people to identify and modify certain data before calculation can be performed again,” Celent recommends.

Another key finding was that although a minority of insurers are using external data, they understand there is potential value in leveraging external data sources – such as those now openly available via Internet sites, social networks and media – in different domains of their businesses. “One of the major problems, though, is finding an efficient approach to include these data sources in their daily analysis.”

The report points out that fewer than one insurer in five uses data from open government schemes, a proportion that drops to one in 10 for other types of data sources, such as private customer data that customers willingly share, data from customer-owned devices and data from social networks.

“Therefore, we think insurers have not really captured the importance of mixing data from different sources to the benefit of their activities,” Celent notes.

“Many insurers are not able to perform data analysis quickly because they do not have the necessary computation resources to run algorithms on a large volume of data,” the report states. “Sources of data insurers can and need to leverage to perform certain types of analysis are growing rapidly. In this context, we think insurance companies will have to address the volume challenge with new investments in technologies improving the performance of their analytic tools.”

Survey responses also indicated insurers are taking a pragmatic approach to data and adopting new data technologies, but significant gaps exist in their perception of both their competitors and their customers.

Celent views that as a key strategic issue. “Insurers should not believe that competitors are laggards, and they must treat data as critical raw material,” the report states.

“An insurer not investing in big data technologies is behind the competition already. An insurer cautiously investing in these types of technologies over the last year to two years is in line with the competition – not building a lead.”

With regard to customers, “overall, insurers are still seeing customers through the usual ‘good quality versus price’ paradigm. More than one insurer in 10 believes that customers are still expecting the same old products, services and offerings that they are used to,” notes the report. “In other words, insurance companies do not seem to feel pressure from customers’ expectations to change the way they use data, whether that’s to provide innovative insurance products or better meet underserved client segments.”

Other key survey findings include the following:

  • insurers leverage data analysis technologies to improve their business basics, with growing interest in cloud-based analytics, but big data-related technologies remain unknown to many insurers;
  • insurers think their industry still struggles to find value in big data; and
  • pricing optimization, customer segmentation, underwriting and fraud detection are the priority analyses insurers want to perform when leveraging big data infrastructure.

“In many mature insurance markets (especially general insurance), loss ratios deteriorated as fraud became more organized and structured,” the report states. “We think fraud mitigation tools can offer strong value to insurers when they are coupled with big data infrastructure with a number of out-of-the-box solutions already implemented on big data techniques and technology,” it adds.


