Data quality in insurance firms: an issue highlighted by the PRA

12/11/2019


The PRA letter dated 5 November 2019, addressed to the CEOs of British insurance companies, underscores the poor quality of data within insurers and its consequences for the calculation of insurance reserves.

These quality problems are attributed to factors such as inflexible legacy systems, insufficient controls over initial data capture, and data stored in formats that hinder its re-use and analysis.

The regulator has also identified these issues in its use of Solvency II quantitative reporting, and announces increased plausibility checks and follow-up enquiries for insurers.

The PRA letter is here.


Extract:

Data quality issues

A number of firms are struggling with the consequences of poor‐quality underlying data. Causes
include inflexible legacy systems, insufficient controls over initial data capture, and data being stored
in formats which prevent effective analysis. In a number of cases we have observed actuarial teams
and other control functions having to spend significant proportions of time and resource cleaning
data for analysis rather than using the data to inform high‐quality decisions. This limits firms’ ability
to identify and respond to trends quickly. Responsibility for ensuring satisfactory firm‐wide data
quality goes beyond any single control function such as the actuarial function. Management would
therefore benefit from considering their data quality management strategy both at the level of
individual control functions and across the business as a whole.

The PRA also makes extensive use of Solvency II quantitative reporting relating to technical
provisions. We have identified a number of data quality issues in these reports as part of our review
work this year. We plan to increase our plausibility checks on this data and your teams can expect
follow up enquiries from us when regulatory reporting data appears to us to be inaccurate.