Data and technology infrastructure: The cornerstone to an effective regulatory reporting program
A firm’s data architecture and the enabling infrastructure, which support aggregating fit-for-purpose data for reporting, are fundamental to implementing an effective regulatory reporting program. This includes the capability to integrate different data sources in a highly automated fashion. Yet implementing a firm-wide data architecture and IT infrastructure to meet these objectives remains challenging for many institutions because of legacy issues and pressure on costs and technology. At the same time, banking organizations face ever-increasing regulatory demands for detailed data with shorter turnaround times. As a result, a fragmented automation environment persists across the banking industry.
August 20, 2018 | Financial services
Following the previous blog in our series on the maturity of regulatory reporting, here we discuss the need for a firm-wide data architecture and the technology capabilities to manage these data.
Challenges to an effective infrastructure
As the volume, complexity, and granularity of regulatory reporting, and regulatory expectations for it, have increased, firms have not matured their data and technology environments as much as other areas (e.g., governance). The high investment and long runway required for strategic transformation of data architectures and IT platforms is a major obstacle for institutions to overcome. Firms face several other significant challenges in achieving a fully effective data environment, including:
- Data are held by data owners in silos, with attributes maintained from a business view and less attention paid to regulatory reporting or compliance needs
- Legacy systems and platforms persist (often from prior M&A activity) that require significant cost to modernize
- No central function exists to set standards for data architecture and IT infrastructure, or, even where such a function exists, the governance structure to ensure compliance with firm-wide standards either is absent or is ineffective due to a lack of accountability
A firm-wide approach to data can help ensure high-quality data by establishing a single source of data that is used many times, across many reports. Such an approach also reinforces that regulatory reporting is an activity with shared responsibility across the organization. Our point of view on the state of regulatory reporting in the banking industry describes the maturity level of regulatory reporting by component. As it relates to data architecture and the enabling infrastructure, an optimized regulatory reporting organization has:
- Centralized data repositories that contain granular, product-level information, available and vetted for consumption by reporting teams
- Firm-wide, consistent application of automation technology to data analytics and source-to-report mapping
- Firm-wide governance over authoritative sources of data and data definitions, supported by metadata policies and tools that are easily accessible and searchable
- Measures for data quality that are integrated with the firm’s accountability policy
The traditional regulatory reporting data model requests specific data elements for a specific report from each business line, in either a pre-defined template or through access to a shared data repository (e.g., a data warehouse). This model was designed to meet specific report production cycles. Centralizing data into data warehouses or data lakes is an important step in the maturity of the regulatory reporting process. The next step in this evolution is storing all product and transaction data, with all the attributes used for reporting and to meet business needs. This approach allows regulatory reports to simply be an output of mapping data elements into the regulatory reporting templates. Coupled with a report automation platform, the report preparation process becomes much more efficient and effective.
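To make this concrete, a report over a centralized data store can be expressed as a set of mapping rules rather than a bespoke per-report extract. The sketch below is illustrative only: the field names (`product_type`, `balance`) and the line items are hypothetical, not drawn from any actual reporting template.

```python
# Minimal sketch: a report as a mapping over a shared, granular data set.
# Field names and line items are illustrative assumptions.
transactions = [
    {"product_type": "loan", "balance": 1200.0},
    {"product_type": "loan", "balance": 800.0},
    {"product_type": "deposit", "balance": 500.0},
]

# Each report line item is defined as a selection rule over the shared data.
template = {
    "total_loans": lambda rec: rec["product_type"] == "loan",
    "total_deposits": lambda rec: rec["product_type"] == "deposit",
}

def produce_report(records, template):
    """Aggregate granular records into report line items via mapping rules."""
    return {
        line_item: sum(r["balance"] for r in records if selector(r))
        for line_item, selector in template.items()
    }

report = produce_report(transactions, template)
print(report)  # {'total_loans': 2000.0, 'total_deposits': 500.0}
```

Because the template only references the shared data set, adding or revising a report becomes a change to the mapping rules, not to the upstream data collection.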
Often, there is a need to merge data sets to meet reporting requirements, including the integration of financial and non-financial (unstructured) data. This requires the capability to integrate data between repositories and interoperability between IT platforms. One of the largest hurdles firms face is integrating data sets and providing interoperability between legacy systems. A conceptual IT and data architecture cannot be fully implemented until legacy systems are modernized to comply with the firm’s data and IT strategy. Achieving this state requires a firm’s willingness to move away from a business line-centric approach, which culminates in siloed data and separate technology strategies. To do so, senior management should support the cultural change by instituting accountability for the business lines to conform to the enterprise framework and by funding firm-wide data and reporting infrastructure and new tools and technologies. This means implementing a governance structure that makes certain that business lines or regions do not depart from the firm-wide architecture parameters.
Integration should go beyond data availability and provide the capability to efficiently interact with common analytical tools. Data owners in the business lines and corporate functions should have the capability to:
- Access the data
- Apply shared analytical tools to analyze the data for quality assurance
- Gain business value from the underlying data
As the granularity and volume of data increase, analytical tools that can easily integrate with data sources become a critical need.
Firm-wide approach to data ownership and data definitions
Well-designed data architectures, data accessibility, and an effective supporting infrastructure can only be realized when the underlying data are well defined and all needed attributes are available. The foundation of any data repository is a data dictionary, which requires a firm-wide policy for defining the terms and needed attributes of each data element. Optimized organizations have a robust approval process for data definitions that includes stakeholders throughout the firm, with corporate regulatory reporting holding final approval. Data stewards throughout the firm should be responsible for maintaining the metadata repository.
The data stewards should also be skilled in using the metadata repository to search for data already available to meet new needs or requirements. To ensure the use of common data elements, and that data users understand the content of the data they use to meet their business needs, a metadata repository should be easily accessible and searchable. The metadata repository should include workflow tools to support the data definition approval process. As discussed in our change management blog, data definitions may change, which requires a firm-wide policy for revising data definitions based on new interpretations. The metadata workflow tool should support the change management process.
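The essentials of such a metadata repository can be sketched as a small data structure: each element carries a definition, an owner, an approval status, and tags that make it searchable. Everything below (element names, tags, the status values) is a hypothetical illustration, not a reference to any particular metadata tool.

```python
from dataclasses import dataclass, field

@dataclass
class DataElement:
    """One entry in the data dictionary (names and tags are illustrative)."""
    name: str
    definition: str
    owner: str
    status: str = "draft"  # draft -> pending_approval -> approved
    tags: list = field(default_factory=list)

class MetadataRepository:
    def __init__(self):
        self._elements = {}

    def register(self, element):
        self._elements[element.name] = element

    def approve(self, name):
        # In this sketch, final approval rests with corporate regulatory reporting.
        self._elements[name].status = "approved"

    def search(self, term):
        """Case-insensitive search over names, definitions, and tags."""
        term = term.lower()
        return [
            e for e in self._elements.values()
            if term in e.name.lower()
            or term in e.definition.lower()
            or any(term in t.lower() for t in e.tags)
        ]

repo = MetadataRepository()
repo.register(DataElement(
    "gross_balance", "Outstanding principal before charge-offs",
    owner="lending", tags=["balance"],
))
hits = repo.search("balance")  # data stewards reuse elements found this way
```

In practice the repository would also version definitions and route status changes through the approval workflow, but the search-before-create pattern shown here is the core of reusing common data elements.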
Data lineage should also be available through the metadata repository. Data lineage provides crucial information for understanding the source of the data, the data owner, and any transformations the data may have gone through before being stored in the data repository. This information is essential in assessing data quality, risk, and overall conformance with firm-wide data policies.
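A lineage record can be represented simply as an ordered list of hops, each naming the system, the owner, and any transformation applied. The sketch below assumes invented system and owner names; it shows how even a minimal lineage structure lets a reviewer flag fields that have been heavily transformed.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LineageStep:
    """One hop in a data element's path (all names here are illustrative)."""
    system: str
    owner: str
    transformation: str  # "none" if the data passed through unchanged

def lineage_summary(steps):
    """Summarize a lineage path so reviewers can flag transformation-heavy fields."""
    transforms = [s for s in steps if s.transformation != "none"]
    return {"hops": len(steps), "transformations": len(transforms)}

path = [
    LineageStep("loan_origination", "lending_ops", "none"),
    LineageStep("gl_subledger", "finance", "currency conversion"),
    LineageStep("reg_data_lake", "corporate_reporting", "aggregation"),
]
print(lineage_summary(path))  # {'hops': 3, 'transformations': 2}
```

A real metadata tool would capture far richer detail (timestamps, transformation logic, approvals), but the owner-per-hop structure is what ties lineage back to accountability.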
Measures and accountability
A firm’s data architecture and the supporting infrastructure capabilities should support:
- Risk assessments
- Data quality measures
- Accountability, certification, and attestation processes
That is, the chief financial officer, controllers, and chief data officer should be able to easily obtain detailed data to understand risk; for example, where an extensive number of transformations occur. Measuring data quality is necessary to understand data weaknesses and the risk of using the data. Effective measures of data quality, including KPIs, require the data repository to maintain all versions of the data and a record of all actors who touch the data. These capabilities need to integrate effectively with the firm’s accountability policy. Quantitative measures of data quality can help ensure that appropriate actions are taken when the policy is breached.
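One common data-quality KPI is completeness: the share of records with all required fields populated. The sketch below is a minimal illustration, assuming hypothetical field names and an illustrative tolerance; a real accountability policy would define its own thresholds and escalation steps.

```python
def completeness(records, required_fields):
    """Share of records with all required fields populated (a basic DQ KPI)."""
    if not records:
        return 1.0  # vacuously complete when there is nothing to check
    complete = sum(
        all(r.get(f) not in (None, "") for f in required_fields)
        for r in records
    )
    return complete / len(records)

# Illustrative records: the second is missing a required balance.
records = [
    {"id": 1, "balance": 100.0, "owner": "lending"},
    {"id": 2, "balance": None, "owner": "lending"},
]
kpi = completeness(records, ["balance", "owner"])  # 0.5

THRESHOLD = 0.95  # hypothetical tolerance set by the accountability policy
if kpi < THRESHOLD:
    print(f"DQ breach: completeness {kpi:.0%} below threshold {THRESHOLD:.0%}")
```

Tracking such KPIs over time, and attributing breaches to the data owners identified in the lineage, is what connects measurement to accountability.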
An increasingly common control used to ensure data quality is the certification and attestation process, which reinforces accountability across a firm. The supporting technology for the regulatory reporting process should include workflow that automates these processes. Data lineage information is needed to ensure all data owners and senior management accurately capture the appropriate level of assurance over data quality.
When a strategic firm-wide data architecture and the enabling infrastructure are implemented, data quality should increase through clearer definitions and effective tools for quality assurance. Strategic data and technology solutions can also increase efficiency by freeing staff from manual, inefficient process steps to focus more on the data itself. Optimized regulatory reporting organizations therefore use the foundations of their data architectures to sustain effective regulatory reporting programs.
This publication contains general information only and Deloitte is not, by means of this publication, rendering accounting, business, financial, investment, legal, tax, or other professional advice or services. This publication is not a substitute for such professional advice or services, nor should it be used as a basis for any decision or action that may affect your business. Before making any decision or taking any action that may affect your business, you should consult a qualified professional advisor.
Deloitte shall not be responsible for any loss sustained by any person who relies on this publication.