Master data management


Master data management (MDM) is a method for defining and managing an organization's critical data so that, through data integration, it provides a single point of reference. Poor master data quality can harm a business in several ways: it reduces business transparency, thereby jeopardizing effective management, and it slows down processes because additional manual workarounds and extra steps become necessary.

Poor data quality can have several layers of causes, both internal and external, and only when all potential causes of poor master data have been identified can the corresponding countermeasures be taken.

Data quality framework

A data quality framework ensures a methodical approach to measuring, maintaining and improving data quality. The objectives of the framework are:

  • To provide a concept for ensuring high master data quality by measuring, analysing and correcting master data based on the errors found
  • To define data quality dimensions and business rules
  • To manage master data effectively through clearly defined target levels for data quality and maintenance performance
  • To measure data quality and create quality reports
  • To establish clear roles and responsibilities for master data quality measurement and the data quality management process
  • To enable sustainable master data cleansing after the first go-lives of MDG
  • To provide a flexible framework whose concepts can be enhanced with specific business rules, defined step by step by the Information Owners

Five key concepts of the framework

The framework is based on five key concepts to ensure a holistic approach towards data quality management:

  1. DQM process: A clearly defined process ensures that data quality is
    measured and managed on a regular basis; it also includes system support for reporting and data analytics
  2. Reports: Structured reporting enables the business to identify
    problematic data quality areas and provides decision support for data quality improvement measures
  3. Roles & responsibilities: Clearly assigned roles and responsibilities ensure that data
    quality is not a one-time effort but is rooted in the organization
  4. Business rules: Business rules define what "good" data quality should look like,
    not only during migration but also during operational execution
  5. KPIs: KPIs are based on the business rules defined per object, type or attribute; they should be aggregated for each dimension and also define target levels for data quality
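To make the interplay of business rules, KPIs and target levels concrete, the following is a minimal sketch in Python. The rule names, attributes and target levels are illustrative assumptions, not a specific MDG or product configuration:

```python
# Minimal sketch: business rules as predicates over master data records,
# with a pass-rate KPI per rule checked against an assumed target level.
# Rule names, attributes and thresholds are illustrative assumptions.

RULES = {
    "completeness": lambda r: bool(r.get("name")) and bool(r.get("country")),
    "validity": lambda r: r.get("country", "") in {"CZ", "DE", "US"},
    "uniqueness_key": lambda r: bool(r.get("id")),
}

TARGET_LEVELS = {"completeness": 0.98, "validity": 0.95, "uniqueness_key": 1.0}


def measure(records):
    """Return the pass rate per rule and whether the target level is met."""
    report = {}
    for name, rule in RULES.items():
        passed = sum(1 for record in records if rule(record))
        rate = passed / len(records) if records else 0.0
        report[name] = {"rate": rate, "target_met": rate >= TARGET_LEVELS[name]}
    return report


customers = [
    {"id": "C1", "name": "Alpha s.r.o.", "country": "CZ"},
    {"id": "C2", "name": "", "country": "DE"},            # fails completeness
    {"id": "C3", "name": "Gamma GmbH", "country": "XX"},  # fails validity
]

print(measure(customers))
```

In a real framework the per-rule rates would be aggregated into KPIs per data quality dimension and fed into the structured reports described above.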

MDM provides a single centralized location for metadata content and enables developers and business users to understand the origins, definitions, meanings and rules associated with master data.
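A centralized metadata store of this kind can be pictured as a registry keyed by attribute, where each entry records the origin, definition, owner and associated rules. This is a hypothetical sketch; the field names and example values are assumptions, not the schema of any particular MDM product:

```python
# Illustrative sketch of a centralized metadata registry entry.
# Field names and example values are assumptions for illustration only.
from dataclasses import dataclass, field


@dataclass
class MetadataEntry:
    attribute: str                # master data attribute, e.g. "customer.country"
    definition: str               # business meaning of the attribute
    origin: str                   # system of record the data comes from
    owner: str                    # Information Owner responsible for the rules
    rules: list = field(default_factory=list)  # associated business rules


registry = {
    "customer.country": MetadataEntry(
        attribute="customer.country",
        definition="ISO 3166-1 alpha-2 country code of the customer",
        origin="ERP",
        owner="Sales Data Steward",
        rules=["must be a valid ISO country code"],
    )
}

print(registry["customer.country"].owner)
```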

Contact us

Donovan Spronk

Partner AI & Data | AWS Alliance Lead CE

Donovan is a Partner in the Consulting department and Leader of the AI & Data team within Deloitte in the Czech Republic. Donovan leads the AWS Alliance in CE, which brings the joint power of Deloitte...