The guide to setting up Data Quality monitoring in practice

A proven six-step approach and accelerator for Data Quality Management

How to transform the regulatory requirements of BCBS #239 articles and the DAMA DMBOK theoretical principles into a practical and value-oriented solution? How to convince supervisors that your Data Quality Management is sound and functional? How to derive ongoing insight into the impact of incomplete and inaccurate datasets? Find the answers and best practices for these questions and more in this blog - the third and final episode of a three-blog series.

A short overview of this blog series

In the first blog we outlined the renewed focus and momentum for Data Quality Management in the banking sector and, as a consequence, the need for a solid foundation for a Data Quality Management practice. In blog 2 we described the first two steps of our six-step approach. First, we discussed the positioning of Data Quality Management in the organisation and the corresponding governance. We continued with the identification of Critical Data Elements (CDEs) and the setting of business requirements. In this third and final blog of the series we focus on the remaining four steps, including the Deloitte accelerator tool and our best practices.

Step three: Designing data quality rules

Each of the CDEs will have to be carefully analysed in order to determine the set of business requirements applicable to that CDE. These business requirements serve as direct input for the data quality rules associated with the corresponding CDE. Data quality rules come in all shapes and sizes: e.g. business checks versus technical checks, signalling versus blocking rules, and IT versus process rules. Finally, the data quality rules are translated into a language suitable for the technology stack in use, so that they can be implemented in the IT process.

In order to construct these data quality rules, guidelines based on DAMA DMBOK (2017) dimensions are used. There are six dimensions: Completeness, Accuracy, Uniqueness, Consistency, Timeliness and Validity. These dimensions serve as input for the translation of business requirements into technical data quality checks which can be monitored through a dashboard.
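
To make this translation tangible, consider the minimal sketch below. It is purely illustrative and not part of any specific tooling: the dataset, column names and rules are assumptions, but it shows how requirements along the Completeness, Uniqueness, Validity and Consistency dimensions could be expressed as record-level checks.

```python
import pandas as pd

# Hypothetical loan extract; the column names and values are illustrative only.
loans = pd.DataFrame({
    "loan_id":      ["L001", "L002", "L002", "L004"],
    "exposure_eur": [125000.0, None, 98000.0, 54000.0],
    "rating":       ["AA", "B+", "B+", "ZZ"],
})

# Each check returns a boolean Series: True means the record passes the rule.
checks = {
    # Completeness: the exposure amount must be populated.
    "exposure_complete": loans["exposure_eur"].notna(),
    # Uniqueness: a loan_id may occur only once.
    "loan_id_unique": ~loans["loan_id"].duplicated(keep=False),
    # Validity: the rating must come from the agreed reference list.
    "rating_valid": loans["rating"].isin(["AAA", "AA", "A", "BBB", "BB", "B+", "B"]),
    # Consistency: exposures may not be negative (missing values fail this check too).
    "exposure_non_negative": loans["exposure_eur"] >= 0,
}

for name, passed in checks.items():
    print(f"{name}: {passed.mean():.0%} of records pass")
```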

Step four: Setting thresholds for the CDEs

As we focus on the risk and finance environment, we deal with a multitude of regulatory reporting requirements. One of the fundamental guiding principles is that the data quality rules should target the granular data in the original source, the so-called "golden source". Fixing data quality issues at the source is highly efficient: if multiple regulatory reports use the same data, you only need to fix it once. When all CDEs and data quality rules are kept in one central location, it is easier to handle scenarios in which the same CDEs used for different reports need different data quality parameters, and to extend these rules based on (slightly) different regulatory requirements. Specific filters can be applied to each data quality rule by defining the parameters or thresholds that the data quality check must adhere to. Say, for instance, that 99% completeness is required.
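
As a simple, hedged illustration of what such a parameter looks like in operation, the sketch below compares the observed completeness of a single, hypothetical CDE with the 99% threshold mentioned above; the data and names are assumptions for illustration only.

```python
import pandas as pd

# Hypothetical values for one CDE; the figures are illustrative only.
exposure_eur = pd.Series([125000.0, None, 98000.0, 54000.0])

THRESHOLD = 0.99  # the 99% completeness requirement from the example above

pass_rate = exposure_eur.notna().mean()  # share of populated records
status = "OK" if pass_rate >= THRESHOLD else "BREACH"
print(f"exposure_eur completeness: {pass_rate:.1%} "
      f"(threshold {THRESHOLD:.0%}) -> {status}")
```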

While building up the business rules repository, the organisation should continuously consider how it identifies, documents and prioritises its CDEs. It should understand the relationship between CDEs, product portfolios and regulatory reports. The correct thresholds and KPIs need to be defined for individual CDEs and for combinations of CDEs, and the impact of each data quality issue needs to be understood in order to prioritise correctly.
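
One way to make that impact explicit, shown here only as an illustrative assumption rather than a prescribed scoring method, is to weigh each failing rule by how far it sits below its threshold and how many reports depend on the CDE:

```python
# Hypothetical prioritisation of data quality issues; the scoring formula
# and all figures are illustrative assumptions, not a prescribed methodology.
issues = [
    {"cde": "exposure_eur",  "reports_affected": 4, "pass_rate": 0.95, "threshold": 0.99},
    {"cde": "rating",        "reports_affected": 1, "pass_rate": 0.80, "threshold": 0.98},
    {"cde": "maturity_date", "reports_affected": 2, "pass_rate": 0.99, "threshold": 0.99},
]

for issue in issues:
    shortfall = max(0.0, issue["threshold"] - issue["pass_rate"])
    issue["priority"] = shortfall * issue["reports_affected"]

# The largest score indicates where remediation delivers the most value first.
for issue in sorted(issues, key=lambda i: i["priority"], reverse=True):
    print(f"{issue['cde']:<14} priority score: {issue['priority']:.3f}")
```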

Step five: Implementing data quality rules in the IT process

The designed data quality rules need to be documented and registered in a central place within the organisation in order to actively manage and track them. Ideally, a tool facilitates steps one through four so that the data quality rules can be successfully embedded in the governance and IT process. An example is a self-service repository model via SharePoint that allows users to add and edit their own CDEs and adjust their definitions. Openly storing the data quality rule repository allows users to search for all DQ rules associated with a CDE or regulatory requirement. Another option for documenting the repository is a data governance platform such as Collibra.
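
Purely as a sketch of what a single repository entry might capture (the field names below are assumptions, not the actual SharePoint or Collibra model), a rule could be registered together with its CDE, DAMA dimension, threshold and the regulations it supports, which makes the repository searchable by CDE or regulatory requirement:

```python
from dataclasses import dataclass, field

@dataclass
class DataQualityRule:
    """Illustrative repository entry for a single data quality rule."""
    rule_id: str
    cde: str                  # Critical Data Element the rule applies to
    dimension: str            # DAMA dimension, e.g. Completeness or Validity
    expression: str           # the technical check, in the stack's own language
    threshold: float          # minimum required pass rate, e.g. 0.99
    regulations: list[str] = field(default_factory=list)

repository = [
    DataQualityRule("DQ-001", "exposure_eur", "Completeness",
                    "exposure_eur IS NOT NULL", 0.99, ["FINREP", "COREP"]),
    DataQualityRule("DQ-002", "rating", "Validity",
                    "rating IN (SELECT code FROM ref_ratings)", 0.98, ["COREP"]),
]

# Find all rules associated with a given regulatory requirement.
corep_rules = [rule.rule_id for rule in repository if "COREP" in rule.regulations]
print(corep_rules)  # ['DQ-001', 'DQ-002']
```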

Step six: Monitoring data quality and resolving issues

The above outlines the backend of the tooling that facilitates the data quality monitoring process. The results of the data quality rules should be made available via a dashboard on which active Data Quality Management can be performed. What to expect from such a dashboard for data quality? (A sketch of how these figures could be derived follows the list.)

  • For each CDE, how many data quality checks passed/failed in the latest iteration?
  • What are the CDEs with the most significant issues – in other words, material impact?
  • What is the breakdown of data quality issues per regulation or product portfolio?
  • What is the trend in the data quality issues: is data quality improving, stabilising or deteriorating?
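
To illustrate how such figures could be derived, the sketch below aggregates assumed rule results per CDE into a pass/fail breakdown for the latest run and a simple trend indicator; the data and structure are hypothetical, not the actual accelerator dashboard.

```python
import pandas as pd

# Hypothetical check results from two monitoring runs; all figures are illustrative.
results = pd.DataFrame({
    "run_date":      ["2024-05-31"] * 3 + ["2024-06-30"] * 3,
    "cde":           ["exposure_eur", "rating", "maturity_date"] * 2,
    "checks_passed": [95, 80, 99, 97, 85, 99],
    "checks_failed": [5, 20, 1, 3, 15, 1],
})

results["fail_rate"] = results["checks_failed"] / (
    results["checks_passed"] + results["checks_failed"]
)

# Pass/fail breakdown per CDE in the latest run.
latest = results[results["run_date"] == results["run_date"].max()]
print(latest[["cde", "checks_passed", "checks_failed"]])

# Trend: is the average fail rate improving or deteriorating between runs?
trend = results.groupby("run_date")["fail_rate"].mean()
print(trend.diff())  # a negative change means data quality is improving
```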

Based on the governance within the organisation, this dashboard allows the right departments to monitor data quality and to take the first step towards resolving issues. In addition, such a dashboard serves as input for Data Quality Incident and Problem Management. Data incidents can be collected, so that each incident can be investigated and its root cause identified. Ultimately, this improves data quality and helps prevent these incidents from recurring. Hence, this step requires clear data governance, with clearly specified roles and responsibilities for following up on Data Quality Incident and Problem Management.

Getting the most out of Data Quality Management: best practices

The six-step approach outlines the best practice for Data Quality Management. Deloitte offers end-to-end advice and solutions based on specific questions and challenges. Organisations should challenge themselves with questions such as:

  • How to transform the regulatory requirements of BCBS #239 articles and the DAMA DMBOK theoretical principles into a practical and value-oriented solution? 
  • How to convince supervisors that our Data Quality Management is sound and functional?
  • How to derive ongoing insight into the impact of incomplete and inaccurate datasets?

Deloitte offers an accelerator (the Data Quality Framework). This is a fully functional tool that comes with a useful repository of CDEs targeting widely applicable regulations, the most common data quality rules, and a technical stack to automatically run these rules and report on them. The repository of CDEs and data quality rules has been collected and fine-tuned over the years, drawing on Deloitte's in-depth experience in the banking domain.

Would you like to know more or discuss the data quality challenge in your organisation? Please reach out to us.
