Do you have a CRISIS in data governance?

At the last Data for ERM & Solvency II conference hosted by Insurance ERM, MBE partner Vibeke Edvardsen chaired a forum on “Ensuring that your data governance framework is sufficiently agile to cope with the changing regulatory landscape”.


Discussions with Colin Lethbridge, Solvency II Data Governance Lead at AVIVA, and David Lodge, Global Data Governance Manager at Allianz, examined how current frameworks could be adapted to provide optimum service and whether they were flexible enough to accommodate the next wave of industry requirements.

As the debate unfolded, it became apparent that the real cause for concern was not the tools and frameworks, but the data itself, leading to some critical questioning of how we understand and value the data we use.

  • Do we really understand the difference between usable data and what is just information?
  • Do we ensure that data producers and data consumers understand one another, i.e. who captures and processes the data, where and how?
  • Do we create and promote enduring quality partnerships, enabling clear articulation of business need to facilitate the supply of the right data?

These questions are echoed in a recent project MBE has undertaken with a major client to shape and embed their Data Governance framework.

Working with stakeholders, we developed a list of fundamental considerations to avoid a CRISIS in Data Governance: Control, Risk, Information, Standardisation, Integrity and Source.

Control – How well do we control data flow through the end-to-end process?

There has been a clear shift in risk management: the industry is adopting a more regulation-driven strategy to ensure compliance within ERM functions, coupled with a focus on maximising value.

As changing regulation calls for more interaction, companies face the challenge of integrating traditional finance departments with actuarial and compliance under a single governance structure. Unifying these previously silo-based functions also means establishing a carefully considered balance of skills and resources to enable high quality, sustainable processes underpinned by both preventative and reactive controls.

Risk – What are the risks and what do we do to mitigate them?

In the current economic climate insurers are being forced to address their existing delivery models and ensure that risk is clearly managed in conjunction with appropriate controls.
Designing and implementing a framework in which risk factors are clearly identifiable and linked to corresponding controls promotes value and helps drive down operating costs on a risk-adjusted basis.
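
To make the idea concrete, here is a minimal sketch (in Python, with entirely hypothetical risk and control names of our own invention, not drawn from any client engagement): a risk register that links each identified risk to the controls that mitigate it, so that any risk left without a corresponding control is immediately visible.

    # Toy sketch of a risk register with explicit risk-to-control links
    # (all risk and control names are hypothetical).
    risks = {
        "R1": "Incomplete policy data received from the source system",
        "R2": "Manual adjustment applied without review",
        "R3": "Spreadsheet formula changed without version control",
    }
    controls = {
        "C1": ("Reconciliation of record counts against source", ["R1"]),  # preventative
        "C2": ("Four-eyes sign-off on all manual adjustments", ["R2"]),    # reactive
    }

    # Any risk not covered by at least one control is flagged for attention.
    covered = {r for _, mitigated in controls.values() for r in mitigated}
    uncovered = set(risks) - covered
    print("Risks without a corresponding control:", sorted(uncovered) or "none")

Run on the example above, the check flags R3 - exactly the kind of disconnection between risk factors and controls that leaves operating cost unmanaged.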

Information – Do we understand the difference between information and usable data?

With pressures on quality, cost and delivery, there is not always the time to pause and differentiate between what is simply information and what is true data.

By definition, information is ‘facts provided or learned about something’, while data, in our environment, is ‘facts and statistics collected together for analysis, forming the basis for reasoning or calculation’. Appreciating the distinction between the ‘how, when and why’ of the information versus the ‘what’ of the data is a critical factor in the quality of process outputs.

Standardisation – When my apple is your orange.

How much time and effort do we waste in handling and assimilating the same data in different formats and languages?

A lack of standardisation continues to be one of the biggest data management challenges we find with our clients: different teams use and share the same data, but under different names or presented in slightly different ways. Typically, this manifests when we try to link processes with their respective input and output data and find a wide array of disconnections and logic failures.

Overcoming this takes time and requires detailed discussions to properly understand the data and establish simple rules around data sharing and management. The key benefits of enforcing standardisation include making data easily identifiable and minimising in-process assimilation time.
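
As a simple illustration of such rules, here is a minimal sketch (in Python, with hypothetical field names of our own): a shared data dictionary that maps each team's local field names onto one agreed canonical name before data is exchanged, so the same data can no longer travel under different labels.

    # Minimal sketch of a canonical data dictionary (all field names hypothetical).
    CANONICAL = {
        "policy_no": "policy_id",      # underwriting team's name for the field
        "PolicyNumber": "policy_id",   # finance team's name
        "prem": "gross_written_premium",
        "GWP": "gross_written_premium",
    }

    def standardise(record: dict) -> dict:
        """Rename every field to its canonical name; reject names nobody has agreed."""
        out = {}
        for name, value in record.items():
            if name in CANONICAL:
                out[CANONICAL[name]] = value
            elif name in CANONICAL.values():   # already canonical
                out[name] = value
            else:
                raise KeyError(f"'{name}' is not in the data dictionary - agree a canonical name first")
        return out

    # Two teams supplying "the same" data under different names arrive in one shape:
    print(standardise({"policy_no": "P-123", "prem": 1500.0}))
    print(standardise({"PolicyNumber": "P-123", "GWP": 1500.0}))
    # Both print: {'policy_id': 'P-123', 'gross_written_premium': 1500.0}

The dictionary itself is the product of those detailed discussions; once agreed, the mapping makes data easily identifiable and removes the in-process assimilation effort described above.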

Integrity – Is your data fit for purpose?

No matter how innovative and robust systems and processes may be, or how confident we are in the results, there is still a need to understand the intended business purpose of the data we produce and the cost of poor quality.

During the data governance journey with our clients we find ourselves frequently revisiting the same questions:

  • Is the data fit for purpose?
  • How vulnerable is the data to manipulation?
  • Who are the suppliers and customers of our data, what are their needs and expectations, and are the outputs we provide fit for their intended purpose?
  • What are the value-adding elements?
  • What are the critical factors to achieving acceptable data quality?

Data integrity relies on clear and open communication across teams and departments, with quality partnerships ensuring that each part of the process is fully aware of both upstream and downstream demands, and of the implications of failing to meet those requirements.

We have found that producing a simple quarterly Data Governance report, identifying all the processes in scope and impacts at each stage, helps to avoid issues and errors while strengthening working relationships.
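
A minimal sketch of what such a report can capture follows (in Python; the structure and all process names are illustrative assumptions of ours, not a prescribed format): each process in scope, its owner, its upstream and downstream dependencies, and any data quality issues logged in the quarter.

    # Minimal sketch of a quarterly Data Governance report
    # (structure and all names are illustrative assumptions).
    from dataclasses import dataclass, field

    @dataclass
    class ProcessEntry:
        name: str            # process in scope, e.g. "Policy data extract"
        owner: str           # accountable data owner
        upstream: list       # processes supplying its inputs
        downstream: list     # processes consuming its outputs
        issues: list = field(default_factory=list)  # data quality issues this quarter

    def render_report(quarter: str, entries: list) -> str:
        """Plain-text summary: every process in scope and its impacts at each stage."""
        lines = [f"Data Governance report - {quarter}"]
        for e in entries:
            lines.append(f"\n{e.name} (owner: {e.owner})")
            lines.append(f"  fed by:     {', '.join(e.upstream) or 'none'}")
            lines.append(f"  feeds into: {', '.join(e.downstream) or 'none'}")
            for issue in e.issues:
                lines.append(f"  issue:      {issue}")
        return "\n".join(lines)

    print(render_report("Q3", [
        ProcessEntry("Policy data extract", "Underwriting", [], ["Reserving model"],
                     ["Late delivery delayed the reserving run"]),
        ProcessEntry("Reserving model", "Actuarial",
                     ["Policy data extract"], ["Solvency II reporting"]),
    ]))

Even at this level of simplicity, the report makes every handover visible, which in our experience is where most issues and errors arise.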

Source – Who owns your data?

With increased regulation and evolving business processes, it is easy to lose sight of who ultimately owns and is responsible for our data.
Data ownership and responsibility is a key question for every data quality function: clear guidance is necessary for all stakeholders who use and interact with the data, enabling a clear line of communication and engagement in the event of data quality issues.

Despite Solvency II being the biggest regulatory change in decades, it is not yet fully embedded within firms.

The PRA has not given specific guidance on how to implement data governance. It has, however, shared resources to facilitate the process. For example, the “Solvency II: internal model approval process data review findings” report, published in February 2016, has proven to be an insightful resource, particularly around data flows.

Thanks to Solvency II, data is finally the number one priority on the risk management agenda, and with further regulatory developments such as IFRS 4 Phase II and the General Data Protection Regulation, this will continue to be the case.

Businesses are finally turning the spotlight on their data management and governance frameworks. There is a dawning appreciation that, irrespective of systems, processes and skills, we can only ever be as good as our data allows us to be – meaning that data control, risk, information, standardisation, integrity and source are fundamental to consistency and quality in the continued evolution of the industry.


Vibeke Fennell