Assessing data quality programs for regulatory reporting
Banks are struggling to sustain an appropriate level of data quality in their regulatory, risk, and management reporting. What steps can banks take to create an effective data quality program?
July 24, 2019 | Financial services
The current environment of data quality and regulatory reporting
Firms continue to face challenges in sustaining the appropriate level of data quality in their regulatory, risk, and management reporting. Legacy systems and fragmented infrastructure throughout the data supply chain are formidable obstacles to efficient reporting processes with high-quality data. Remediation of data issues or implementation of new regulatory requirements is often limited to tactical interim solutions as the firm plans for a strategic solution. In reality, tactical solutions often remain in place and become permanent, and strategic solutions fall by the wayside because of cost pressures or new initiatives with higher priorities.
To achieve a sustainable solution and drive long-term efficiency, firms should invest in data infrastructure and data quality controls. Until that occurs, however, the risk of poor data quality must be mitigated by a strong internal control environment. These controls include an effective data quality program with a quality assurance (QA) function. Outlined below are some well-developed practices to consider when creating an effective data quality program, including a robust QA function.
Regulators' expectations regarding data quality controls
Heightened regulatory expectations for the quality of data used for supervisory and risk management purposes are forcing firms to take new approaches to managing data quality. To meet these expectations and increase internal reliance on data, firms need a well-controlled reporting and data process supported by a well-designed data infrastructure. An effective reporting and data framework can be divided into four components:
- Governance and oversight: Leading governance structures should have the following attributes: senior management oversight, accountability framework, monitoring of data quality metrics and issue resolution, and a firm-wide training program.
- Data infrastructure: An integrated data infrastructure includes end-to-end data process flow with lineage for critical data elements, the integrity of source systems data, and metadata repositories (including data dictionaries).
- Internal controls: An effective internal control environment includes standard policies and procedures, a firm-wide data quality program, data/process flow control activities, issue tracking process, and an independent QA function that focuses on reporting accuracy and completeness, including the integrity of the underlying data.
- Internal audit: An internal audit program should cover the design and operating effectiveness of the regulatory reporting controls (including the effectiveness of the QA function) and the quality of reporting data, with transaction-level testing and validation of self-identified issues and regulators’ findings.
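As an illustration of the data infrastructure component above, a metadata repository entry can record a critical data element's definition, source system, and end-to-end lineage. The sketch below is a minimal, hypothetical example; the field names, system names, and report reference are invented for illustration, not a standard schema:

```python
from dataclasses import dataclass, field

@dataclass
class DataElement:
    """Hypothetical minimal data-dictionary entry for a critical data element."""
    name: str
    definition: str
    source_system: str
    lineage: list = field(default_factory=list)  # ordered hops from source to report

# Illustrative repository keyed by element name
dictionary = {
    "notional_amount": DataElement(
        name="notional_amount",
        definition="Contractual notional of the instrument in USD",
        source_system="trade_capture",
        lineage=["trade_capture", "risk_aggregator", "FR Y-9C line item"],
    )
}

def trace(element_name: str) -> str:
    """Return the end-to-end lineage of a critical data element."""
    element = dictionary[element_name]
    return " -> ".join(element.lineage)
```

Even a sketch like this makes the "origins of the data" question answerable: lineage for any registered element can be produced on demand rather than reconstructed manually.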
As outlined in the 2019 Banking Regulatory Outlook, data owners—including business lines and oversight functions, the chief financial officer, the chief data officer, and the chief risk officer—should consider asking the following questions:
- Is the quality of data fit-for-purpose?
- Are the origins of the data clear and well documented?
- Are data definitions and standards established and consistent across the firm?
Producing high-quality, fit-for-purpose data is a firm-wide activity with shared accountability across the three lines of defense. Thus, regulatory expectations focus on strengthening governance and oversight, building data competencies across the firm, and establishing an integrated approach to data.
Effective data governance and oversight requires a cultural shift supported by senior management and the board of directors. A capable governance structure supports accountability for all stakeholders, including business lines, in a way that can be evidenced and measured. Measurement and monitoring are critical to the success of the governance structure and require data quality to be measured both quantitatively and qualitatively. How well internal controls operate cannot be assessed without appropriate monitoring capabilities and measures. Therefore, a centrally managed data quality issue tracking and resolution process establishes program discipline and can help ensure that stakeholders are held accountable.
Effective internal controls over reporting data require an enterprise data QA program that resides in all three lines of defense:
- The first line of defense (LOD) consists of the report production and data aggregation area, as well as data origination (largely owned by the lines of business, or 'LOBs') and transformation performed by LOBs and various corporate functions. To ensure data integrity and the accuracy and completeness of regulatory reports, the first line is responsible for implementing various quality controls. These controls include data quality checks at transaction/instrument origination and at each transformation point across the reporting data flow.
In addition, the first line controls are represented by cross-data set reconciliations, quantitative business rules, and qualitative analysis. These activities are—in turn—supported by various levels of management review and attestation and comprehensive supporting documentation and analytical capabilities and tools.
- An independent QA function plays the role of the second LOD (it may also reside in a corporate function, such as finance, in which case it is sometimes referred to as the '1.5 line'). The QA function performs detective controls that are independent and objective, providing an effective challenge to report accuracy and completeness through testing and data integrity validation.
- Internal audit, as the third LOD for reporting, is responsible for validating the work of the first and second lines of defense by implementing an independent program that evaluates the integrity and accuracy of reporting, evidenced by controls and transaction testing. Internal audit should include in its audit program the evaluation of the QA function’s scope and activities.
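The first-line controls described above (data quality checks at transformation points and cross-data set reconciliations) can be sketched in simplified form. The record layout, field names, and tolerance below are illustrative assumptions, not a specific firm's control design:

```python
def completeness_check(records, required_fields):
    """Basic first-line DQ check: flag records missing any required field."""
    failures = []
    for index, record in enumerate(records):
        missing = [f for f in required_fields if record.get(f) in (None, "")]
        if missing:
            failures.append((index, missing))
    return failures

def reconcile_totals(source_total, report_total, tolerance=0.0):
    """Cross-data set reconciliation: source aggregate vs. reported aggregate."""
    return abs(source_total - report_total) <= tolerance

# Illustrative transaction records; T2 is missing its notional
records = [
    {"trade_id": "T1", "notional": 1_000_000, "currency": "USD"},
    {"trade_id": "T2", "notional": None, "currency": "USD"},
]
failures = completeness_check(records, ["trade_id", "notional", "currency"])
# failures -> [(1, ["notional"])]
```

In practice, checks like these would run at each transformation point in the data flow, with failures logged for the issue tracking process rather than returned in memory.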
QA function considerations
Mandate and scope
The primary role of the QA function is to act as an independent and objective control over data quality and, through its activities, to provide insight into the effectiveness of a firm’s data integrity controls. Therefore, the QA program is expected to cover data at the corporate level (report owner) and at the lines of business level (data owner).
It is important to note that the QA function does not provide a “seal of approval” with zero defects, nor does it guarantee the completeness and accuracy of regulatory reports. Rather, it serves as a control that is complementary to, but independent from, the existing first-line processes and controls related to report preparation.
The approach to planning an annual QA program needs to prioritize efforts by assessing the impact of the data and the risks to the reporting processes. In enhancing the QA planning process, forward-looking analytics can be leveraged to anticipate key risk areas. Testing methodology is a crucial consideration in planning and scoping.
Historically, independent QA functions were first established within the controllership area of the finance organization, given finance’s role in regulatory reporting and the control-related activities performed under the Sarbanes-Oxley program. As the industry evolved and the focus shifted to data integrity and quality, some financial services organizations formed lines of business-centric QA teams. Other organizations established data integrity teams with the objective of testing reporting data against the data quality requirements linked to regulatory reporting. Finally, in an effort to maintain "true" independence from reporting processes, several firms moved QA responsibilities to a specific area of the risk function.
While the QA team usually has a direct reporting line to a corporate function, it can also report and escalate periodically to certain oversight bodies (e.g., steering committees). The QA function should also coordinate its activities with internal audit to avoid redundant coverage and simultaneous demands on report production teams.
From a team composition perspective, the QA function includes individuals with business and data analytical skills and knowledge of regulatory reporting requirements and the key products offered by the entity.
QA activities typically include transaction testing and internal controls testing. Transaction testing is intended to determine proper classification/mapping of underlying transactions or instruments and whether the data in the source system reflects the underlying economic transaction and conforms to the data definitions for reporting. Internal control testing activities focus on design and operating effectiveness of report preparation controls, including data quality checks, business rules, and reconciliations, as well as governance controls including policies, procedures, standards, and training.
In addition to these core activities, the QA function can also be involved in validating results of regulatory report risk assessment or prioritization done by the first LOD, monitoring issue resolution or other remediation activities, performing unit acceptance testing, or report deconstruction as part of new report implementation or automation.
The testing process includes scoping, sample selection, testing execution, and reporting of results. The QA function aggregates and compiles test results into errors and issues, which are then presented to key stakeholders, such as report owners, functional or LOB stakeholders, and governance bodies, for resolution.
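Sample selection, one step of the testing process above, might look like the following simplified sketch. Real QA programs typically use risk-based or stratified selection; this example shows only a reproducible random sample, and the fixed seed is an assumption made so the selection can be evidenced and re-run:

```python
import random

def select_sample(population, sample_size, seed=42):
    """Simple random sample for QA testing. The fixed seed makes the
    selection reproducible, so the same sample can be re-derived for audit."""
    rng = random.Random(seed)
    if sample_size >= len(population):
        return list(population)
    return rng.sample(population, sample_size)

# Illustrative population of transaction IDs
transactions = [f"T{i}" for i in range(1, 101)]
picked = select_sample(transactions, 5)
```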
Product-based testing and report-based testing are the two most common testing methodologies. The product-based approach relies on the ability to extract data from product capture or processing systems. It requires a detailed understanding of the report production process including mapping of products from source systems all the way to the final report. Since products can cross many reports, a product-based testing approach can enable added efficiency to the testing process by applying the test across multiple reports.
Alternatively, the report-based approach focuses on front-end inputs and resulting report outputs. It also includes an assessment of the appropriateness of key data attributes and accuracy of static data to document whether transactions or trade data are appropriately classified and meet the reporting requirements.
Use of data analytics and automation
Data analytics plays an important role in QA processes. Data profiling of the underlying record-level data is used to identify data quality issues. Data analytics activities have been especially effective for high-risk reports with a history of data quality and interpretation issues and for reports requiring aggregation of large volumes of instrument- or transaction-level data.
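A minimal sketch of record-level data profiling, assuming simple dictionary records and a handful of standard statistics (null rate, distinct count, most frequent values):

```python
from collections import Counter

def profile_column(rows, column):
    """Basic record-level profiling of one column: null rate, distinct
    values, and top values -- the kind of statistics used to surface
    data quality issues before full testing begins."""
    values = [row.get(column) for row in rows]
    nulls = sum(1 for v in values if v in (None, ""))
    non_null = [v for v in values if v not in (None, "")]
    return {
        "count": len(values),
        "null_rate": nulls / len(values) if values else 0.0,
        "distinct": len(set(non_null)),
        "top_values": Counter(non_null).most_common(3),
    }

# Illustrative records with one missing currency
rows = [{"ccy": "USD"}, {"ccy": "USD"}, {"ccy": None}, {"ccy": "EUR"}]
stats = profile_column(rows, "ccy")
# stats["null_rate"] -> 0.25; stats["distinct"] -> 2
```

An unexpectedly high null rate or distinct count on a key attribute is the kind of signal that would direct QA testing toward a specific report or source system.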
Several technologies can enhance the quality of regulatory reporting and the testing process. Test case management solutions and rules engines can automate test cases on a real-time basis. Automated test cases can help monitor data quality continuously; exceptions can feed into an exception-management workflow and be published in dashboards for effective governance.
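A rules engine of the kind described can be sketched as a set of rule predicates evaluated over each record, with failures emitted as exceptions for the exception-management workflow. The rule IDs, predicates, and record layout below are hypothetical:

```python
# Hypothetical rule definitions: (rule_id, predicate). A record that fails
# a predicate becomes an exception routed to the workflow queue.
RULES = [
    ("R1_positive_notional", lambda r: r["notional"] > 0),
    ("R2_valid_ccy", lambda r: r["currency"] in {"USD", "EUR", "GBP"}),
]

def run_rules(records):
    """Evaluate every rule against every record; return the exceptions
    that would feed the exception-management workflow and dashboards."""
    exceptions = []
    for record in records:
        for rule_id, predicate in RULES:
            if not predicate(record):
                exceptions.append({"record": record["id"], "rule": rule_id})
    return exceptions

# Illustrative batch: T2 fails both rules
batch = [
    {"id": "T1", "notional": 500, "currency": "USD"},
    {"id": "T2", "notional": -10, "currency": "JPY"},
]
# run_rules(batch) -> two exceptions, both for T2
```

Because each exception carries a record identifier and a rule identifier, the output can be aggregated directly into the dashboards and change management processes described below.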
Workflow management tools offer an opportunity to track reports through the QA process and manage exceptions. Those exceptions can also feed into a change management process and resolution. Using technology to assist in this process can enable better controls over completeness and timely resolution. Transparency into the process also enables effective governance.
Data from the testing and workflow can be published in a dashboard to diagnose issues faster and to allocate resources. Progress reporting also enables the governance function to track the book of work to completion.
Emerging QA needs
While the traditional QA program targets established or matured regulatory reporting processes, there is an increasing need to set up a second LOD to cover many other reporting obligations across the banking organization. These reporting needs include ad-hoc requests from local or foreign regulators, non-financial regulatory reports, and data collection related to regulatory exams with on-site visits.
Ad hoc reporting requires a much more fluid review process; in some cases, submissions are due within days of the original regulatory request. Therefore, the QA team needs to be proactive and adaptive to differing requests in both planning and execution.
The scope and activities of the QA function continue to evolve in response to increased demands for more frequent, high-volume reporting with very granular data at the product and transaction levels. Continued investment in QA teams and related infrastructure can help institutions ensure the accuracy and completeness of their regulatory reports and provide a foundation for strong governance and oversight of the reporting framework.
This publication contains general information only and Deloitte is not, by means of this publication, rendering accounting, business, financial, investment, legal, tax, or other professional advice or services. This publication is not a substitute for such professional advice or services, nor should it be used as a basis for any decision or action that may affect your business. Before making any decision or taking any action that may affect your business, you should consult a qualified professional advisor.
Deloitte shall not be responsible for any loss sustained by any person who relies on this publication.