
IFRS 17 is a Challenging Test

Insights from the IFRS 17 Trenches – July 2021

We are in the second half of 2021 and, from our interaction with insurers on implementation projects, it is clear that they now understand the requirements of IFRS 17: Insurance Contracts (referred to below as “the Standard”). They also have varying degrees of knowledge about how to test a new solution. However, we are seeing that the combination of the two, IFRS 17 and the testing of a solution, is proving to be a challenge. Why is this?

In simple terms, the purpose of testing is to evaluate whether a solution meets the requirements set out prior to the build or implementation phase. The objective is to find gaps, errors, or missing functionality and to rectify these to ensure the delivery of a solution that produces accurate and complete results when the solution goes live.

The requirements for IFRS 17 are complex, multi-faceted and interdependent. Implementation teams are focusing their energy on ensuring that the data, systems and processes are appropriate for their businesses. We do however see a risk that teams are possibly not dedicating adequate time to testing (strategy, planning and execution), which will lead to issues during parallel runs and live reporting.

The central challenge is that IFRS 17 is new and fundamentally changes the presentation of insurance results in the income statement and balance sheet. There are no existing results or processes to use as a benchmark; there are no reference points to compare against. The situation is further complicated by the fact that it takes a team of actuaries, accountants, and IT and data professionals to deliver a viable IFRS 17 end-to-end solution (i.e. from data source to production of IFRS 17 compliant financial statements). All three disciplines approach the situation from diverse perspectives and use different methodologies and techniques, especially when it comes to testing.

IFRS 17 has long been labelled a data intensive standard and requires considerable effort to access data at the required level of granularity. This is causing a delay in the execution of testing as representative data, in terms of both content and volume, is not readily available.

In this article I will share further details around the challenges that we are witnessing in the testing of IFRS 17 solutions. These can be summarised into five key topics:

• No existing system to test against

• A testing framework

• Data readiness

• Timing

• Testing team requirements

It is obvious that IFRS 17 is a new financial reporting standard, but what some underestimate is the requirement for interpretation and decision-making, a new operating model, processes, systems, and data. Since so many aspects of the current reporting process require simultaneous change, validating and confirming the results is extremely difficult since there are no existing IFRS 17 results to compare against.

While vendor solutions may come with a pre-configured test plan, insurers who have selected a vendor solution for the calculation engine and sub-ledger have not removed the need to test. Vendors have approached the requirements of the Standard differently, so insurers need to be sure that the vendor’s approach aligns with their own. In addition, each time a new version of the vendor software is released, the results should be retested to confirm that they remain in line with expectations.

To validate the output from the new IFRS 17 components (including processes and models), many teams are using an IFRS 17 prototype, often developed in Excel, as a comparison. These prototypes give teams some confidence in the accuracy of their results without having to derive the numbers from first principles. The challenge is that prototypes generally lack sophistication and incorporate simplifying assumptions and sample data to shorten the time to deployment. The results are therefore not necessarily representative of those that would be presented in the Annual Financial Statements. The comparison leaves some residual uncertainty in the results, but it remains a useful way to accelerate the testing process.

For those who have attempted to start testing early, the challenge remains that implementations are delayed, aspects of the new finance and actuarial processes have not yet been defined and the correct data is not available. How can you be sure that your results are valid?

As mentioned above, considerable time has been invested in understanding the Standard and in designing a solution. We are seeing that the time remaining for test planning and execution is narrowing as implementations run into delays and additional build tasks are identified, such as the refinement of the expense allocation engine, the process to segregate investment funds and the integration of the reinsurance data. Few organisations have invested adequate time to develop a robust testing strategy and plan containing details of the testing phases, objectives, resources (who is accountable), management plan and timelines. The following diagram illustrates the aspects to be considered in testing.

The concern is not only about the lack of planning, but we also see that the scenarios, usually referred to as test cases, are not adequately defined in terms of required input data, scenarios being tested (e.g. CSM), testing procedure (e.g. steps to follow to calculate CSM), and expected results (e.g. actual CSM). Test cases should be defined for each requirement specified for the solution and at a level that is granular enough for the functionality to be tested and signed off. The risk of missing this detail is that less obvious scenarios will not be sufficiently tested and, as a result, there may be insufficient time for remediation, or the scenario will only emerge when the system is live.
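As an illustration, a test case of the kind described above (required input data, scenario, testing procedure, expected result) can be captured in a structured, repeatable form. The sketch below is hypothetical: the field names and the drastically simplified CSM roll-forward are for illustration only and do not represent an actual IFRS 17 calculation or any vendor's data model.

```python
from dataclasses import dataclass

@dataclass
class Ifrs17TestCase:
    """One granular test case: inputs, scenario, and a pre-agreed expected result.

    Field names are illustrative, not taken from any standard or vendor solution.
    """
    case_id: str
    scenario: str       # e.g. "CSM roll-forward, profitable group"
    inputs: dict        # required input data for the procedure
    expected_csm: float # expected result, agreed before execution
    tolerance: float = 0.01  # acceptable absolute difference

def csm_roll_forward(opening_csm, interest_rate, new_business_csm, csm_release):
    """Highly simplified CSM roll-forward (illustration only): accrete interest
    on the opening balance, add new business, release a portion to P&L."""
    accreted = opening_csm * (1 + interest_rate)
    return accreted + new_business_csm - csm_release

def run_case(case):
    """Execute the testing procedure and compare actual vs expected."""
    actual = csm_roll_forward(**case.inputs)
    passed = abs(actual - case.expected_csm) <= case.tolerance
    return actual, passed

case = Ifrs17TestCase(
    case_id="CSM-001",
    scenario="CSM roll-forward, profitable group, no onerous contracts",
    inputs=dict(opening_csm=1000.0, interest_rate=0.02,
                new_business_csm=150.0, csm_release=120.0),
    expected_csm=1050.0,
)
actual, passed = run_case(case)
print(case.case_id, actual, "PASS" if passed else "FAIL")  # CSM-001 1050.0 PASS
```

The value of this structure is not the calculation itself but the discipline: every case names its inputs, its procedure, and its expected result up front, so a failure points directly at the functionality that needs remediation.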

Best practice for testing requires that the team evaluating the functionality of the end-to-end solution should be different to those who designed and built the solution. The risk of not segregating teams is that the build team are often overly optimistic and take the view that “it should work” and therefore do not test every aspect. In essence, they are effectively marking their own homework which is far from ideal.

Independent testing teams bring an impartial view as they do not assume that certain functionality exists or that a component or model works in a certain way. However, the reality is that there are a limited number of experienced IFRS 17 resources globally and therefore it may be difficult to form an experienced testing team. Hence any independent testing team will most likely lack the IFRS 17 knowledge and context around complexities and nuances of the solution. This may result in extended testing timelines and solutions deployed into production with bugs and flaws that are only detected in the live environment. For this reason, many are leveraging a hybrid team with independent testers and IFRS 17 experts.

Prior to embarking on parallel runs in 2022, it is vital that sufficient time is allowed for business users to learn, understand, and familiarise themselves with the newly defined reporting process. It is not enough that the design and build team know the system intimately; the users need to be comfortable with its functionality and usability. We are concerned that end users are occupied with their business-as-usual commitments and have little bandwidth to learn about and focus on the new financial operations and financial close process that will be needed under IFRS 17.

The readiness of data required to feed the IFRS 17 solution is causing a delay in the testing process. It is ideal to conduct testing with data that is representative of a live environment, but the data is often not ready due to availability, quality, and granularity issues. Considerable time is being spent in remediating poor-quality data. We also see that clients are having issues in finalising the data specifications from the actuarial teams and in accessing the required data from source systems.

For these reasons, many insurers are using manufactured or derived data, which takes time to produce and is typically constructed as an idealised version of the data rather than a replication of reality; this approach has its own flaws. Dummy data can assist in the testing of individual systems, but not in the end-to-end testing of the solution.

To cater for the lack of available data, testing approaches are also being amended to leverage restricted volumes and selected ranges of data only. This means that the testing of performance and scalability of the solution to manage substantial volumes of data is being left for much later in the programmes. The question remains, at what point will testing be conducted on real data at the anticipated volumes and complexity and what does this mean for the first reporting deadlines? What impact will the data have on the length of time it takes to run the end-to-end process to ensure that reporting deadlines are met?

One of the requirements of IFRS 17 is to ensure that the finance close and reporting process can be completed within a set timeframe, particularly for listed insurers. To ensure that this is achievable, the end-to-end process must be tested with as close to real-life conditions as possible. These include business as usual demands, representative data volumes, group consolidation tasks and auditor involvement.

If adequate time is not allowed for this level of testing then bottlenecks in the close process relating to the new processes, operating model, and hand-off points will not be identified and resolved prior to the go live date.

As the diagram illustrates, there are specific tasks to be completed before going live to ensure that the business users understand the new solution; that the reporting team are well rehearsed and that the provisional numbers are as expected and signed off by relevant stakeholders.

In preparation for the IFRS 17 finance runs, validation of the transition numbers must be completed. This is a once-off exercise to confirm that the opening balances are correct as at 1 January 2022 (assuming December year-end).

Dry runs are a tempting task to remove when timelines are tight as some believe that the parallel runs will provide the opportunity to interrogate the new financial results. However, dry runs provide the opportunity to run real data through the solution and to assess the result of the numerous design decisions taken during implementation. If the results are not as expected, then the team can alter some of the decisions such as portfolio definitions and reporting aggregation to improve reporting figure quality. Dry runs offer the opportunity for a dress rehearsal.

Parallel runs are to be conducted in 2022 and are not intended as a test run but rather an opportunity to produce the first set of IFRS 17 results that can be socialised, reconciled, and explained to various stakeholders and shareholders. The numbers may be very different to those produced under IFRS 4 and hence many of the assumptions and decisions may require justification.

It requires a multi-disciplinary team to implement the components of an IFRS 17 solution. Actuaries, accountants, and IT and data professionals approach testing differently; the jargon used by each discipline is unique, the objectives are nuanced, and the execution is dissimilar.

If you are an actuary, you generally test the output of your models by comparing it to a replication model. It is an iterative process comprising variable changes, calibrations, and scenarios as well as scaling from unit to bulk testing.

If you are in data and IT, then you typically test the system against the functional and non-functional requirements, ensure that the components of the system integrate appropriately, and confirm that the solution scales for performance and efficiency. Terms such as system integration testing and user acceptance testing are commonly used.

If you are in finance, you typically test the output by reconciling to the input from the respective source systems, with a focus on the completeness, accuracy and validity of the results, as well as whether the logic generates the appropriate accounting entries.
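The finance-style reconciliation described above can be sketched in code. The example below is a minimal, hypothetical check: it sums amounts by portfolio on both the source-system side and the sub-ledger side and reports any differences, flagging completeness or accuracy issues for investigation. Column names and the tolerance are assumptions for illustration.

```python
def reconcile(source_rows, subledger_rows, key="portfolio", amount="amount"):
    """Sum amounts by key on both sides; return source-minus-subledger
    differences per key. A non-zero difference is a break to investigate."""
    def totals(rows):
        out = {}
        for r in rows:
            out[r[key]] = out.get(r[key], 0.0) + r[amount]
        return out
    src, sub = totals(source_rows), totals(subledger_rows)
    return {k: src.get(k, 0.0) - sub.get(k, 0.0) for k in set(src) | set(sub)}

# Illustrative extracts: one record is missing 10.0 on the sub-ledger side.
source = [{"portfolio": "P1", "amount": 500.0},
          {"portfolio": "P2", "amount": 300.0}]
subledger = [{"portfolio": "P1", "amount": 500.0},
             {"portfolio": "P2", "amount": 290.0}]

diffs = reconcile(source, subledger)
breaks = {k: v for k, v in diffs.items() if abs(v) > 0.005}
print(breaks)  # {'P2': 10.0}
```

In practice a reconciliation would run at the granularity the accounting entries are posted at, but the principle is the same: completeness and accuracy are demonstrated by tying the solution's output back to its inputs.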

Considering this diversity, it is not surprising to witness confusion as to who is responsible for defining and managing a testing strategy and who will drive the integration of the teams.

Never before in insurance have three such distinct groups had to work together to deliver a complex solution. Actuaries, finance, and IT and data professionals have not typically collaborated to bridge their differences in perspective, jargon, and underlying knowledge. These relationships are further complicated by external parties, such as technology vendors and implementation partners, who are often added to the mix for an IFRS 17 implementation.

Without a concerted effort to break down the silos, there is a considerable risk of wasting valuable time through confusion and misinterpretation in communication, false assumptions around alignment of approach and tasks falling through the gaps due to confusion around responsibility and accountability.

The testing phases, especially end-to-end solution testing, depend heavily on the successful collaboration of the various disciplines; it is the proof of the pudding!

Testing of IFRS 17 solutions is not simple; new functionality, processes, calculations, systems, and data all require validation. Sufficient time and planning must be factored into IFRS 17 programmes now to ensure that accurate and complete financial results can be delivered in 2023.

Teams cannot simply leverage typical IT systems testing approaches to ensure correct results; a tailored testing strategy is required, one that considers the finance, actuarial, data and IT methodologies for confirming functionality and producing valid, accurate and complete financial results.

Teams must use 2022 to uncover bugs and errors, to expose weaknesses in the reporting process, remediate issues and to demonstrate to business that the numbers produced are fit for sign off.

The requirements to interpret IFRS 17 and to design and implement a suitable solution have forced teams to break down the traditional silos, to effectively communicate requirements and to overcome misunderstandings to find suitable solutions. Effective testing of the new IFRS 17 processes, operating models and systems demand similar effort, planning and execution. It is hard but necessary!
