Posted: 18 May 2022 | 8 min read

EU Digital Services Act: Are you ready for audit?

The EU Digital Services Act is expected to come into force in 2024 and will be directly applicable across all Member States to intermediary service providers, whether established in or outside the EU, that provide services to recipients established or resident in the Union. The Act imposes new transparency and accountability requirements on providers of intermediary services, with additional specific annual audit requirements for very large platforms.

The Digital Services Act was agreed on 23 April 2022 by the European Parliament and the Council of the EU. The Act builds on the existing E-Commerce Directive, adding further requirements on content moderation, algorithm reviews, reporting, complaint handling and the policing of goods and services. Whilst most rules under the Act will apply to all intermediary service providers, some further obligations will apply specifically to very large platforms: those catering to 45 million or more monthly active users.

This article focuses on summarising the aspects of the Act that are subject to the annual independent audit requirement for very large online platforms, as set out in Article 28. The annual audit requirement is divided into two parts:

  1. Review of the platform’s compliance with Chapter III; and
  2. Specific review of any commitments to the codes of conduct and crisis protocols outlined in Articles 35-37 of Section 5 of Chapter III.

Auditors are required to express their findings in the form of an “opinion”, which must be given as “positive”, “positive with comments” or “negative”.

The Act further mandates that, where an audit opinion is not positive, operational recommendations on specific measures to achieve compliance must be included. Very large online platforms shall, within one month of receiving those recommendations, adopt an audit implementation report setting out those measures.

Overview of Chapter III requirements relevant to the independent audit

Chapter III of the Digital Services Act is split into five sections, each addressing different aspects of the transparency and accountability requirements.

Sections 1 to 3 apply to all online service providers, regardless of size, whilst Section 4 requirements are specific to very large platforms. Section 5 addresses the future standards, codes of conduct and crisis protocols in this area, which platforms may have to report on regularly once these are established.

More specifically, Section 1 mandates a number of detailed requirements for intermediary service providers, with limited scope for interpretation. These include, amongst others, the establishment of a single point of contact and of legal representatives in the Member States. Additionally, intermediaries will have to publish their Terms and Conditions of use, including information on the policies, procedures, measures and tools used for content moderation, algorithmic decision-making and human review. Section 1 further requires organisations to publish annual reports detailing the number of complaints received and the decisions taken to remediate identified issues.

Section 2 covers the tracking and addressing of notices flagged to hosting service providers and online platforms, warning of illegal content being shared on their platforms. Unlike the more specific terms of Section 1, Section 2 requirements leave more scope for ambiguity and interpretation. Section 2 requires hosting service providers to have in place “user-friendly” and “easy-to-access” mechanisms for notifying them of illegal activities. Subsequently, hosting service providers and online platforms are required to process the notices in a “timely and objective” manner. The Act does not clarify what “timely” means, opening up scope for inconsistent interpretation across the industry.

Section 3 delves into additional provisions for internal complaint-handling systems and the use of out-of-court dispute resolution bodies (certified by the Digital Services Coordinators) for online platforms. The regulator requires online platforms to deal with complaints in a “timely, diligent and objective manner” and to inform complainants of their decision without “undue delay”. Comparable provisions were previously introduced in the financial services industry through MiFID II, where “timely” was interpreted by the FCA in the UK as eight weeks [1]. It remains to be seen, however, whether financial services legislation can be used as guidance for the terms used in the Digital Services Act or whether regulators will tailor specific rules to the digital services industry.

Measures against “misuse” are further mandated under Section 3, requiring online platforms to suspend access to services for a “reasonable time period” for those recipients who “frequently” provide illegal content. Firms will be expected to define what these terms mean for their operations, and to implement their approach consistently.

Section 4 applies to very large online platforms, defined as those whose average monthly active users amount to approximately 10% of the EU population, and mandates heightened transparency and accountability requirements. These platforms will be further required to perform annual risk assessments, implement effective risk mitigation plans, and appoint one or more compliance officers with appropriate qualifications, knowledge and experience in the field.
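For context, the 45 million user threshold referenced earlier and the 10% population figure describe the same cut-off; a rough check, assuming an EU population of approximately 447 million (2022 estimate):

\[
\frac{45\ \text{million users}}{447\ \text{million EU residents}} \approx 0.10 \approx 10\%
\]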

Finally, Section 5 of Chapter III introduces standards, codes of conduct and crisis protocols that firms may be required to commit to in the future. The European Commission and the European Board for Digital Services will facilitate the drafting, implementation and promotion of these within online platforms. There will be an expectation for external auditors to review firms’ commitments to these standards, codes of conduct and crisis protocols. It is also expected that independent opinions on hosting service providers’ compliance with these standards will become part of their broader annual reports.

Next steps

Firms will be required to comply with the Digital Services Act requirements when they come into force in 2024. The Act does not set out the specific parameters or audit methodology required to produce the independent audit opinion, so firms and their chosen auditors will need to consider the format, approach and detailed methodology needed to meet these requirements ahead of the audit execution.

Deloitte has been tracking the development of digital services regulation for a number of years. We have extensive experience in algorithm and AI system assurance, including supporting firms in enhancing their existing control frameworks and assessing conformance with regulatory requirements.

You will find our previous blogs and our insights packs on algorithmic and AI assurance here. If you would like to discuss this topic further with us, don’t hesitate to get in touch with any of the authors.

 

  [1] FCA Handbook, DISP 1.6: Complaints time limit rules

Key Contacts

Mark Cankett

Partner

Mark is a Partner in our Regulatory Assurance team. He is our AI Assurance, Internet Regulation and Global Algorithm Assurance Leader, with 20 years of experience across financial services audit and assurance, regulatory compliance, regulatory investigations and disputes. He has led the development of our assurance practice as it relates to our approach to assisting firms to gain confidence over their algorithmic and AI systems and processes. He has a particular sub-sector specialism in algorithmic trading, with varied experience supporting firms in enhancing their governance and control environments, as well as in investigating and validating such systems. More recently he has supported and led our work across a number of emerging AI assurance related engagements.

Barry Liddy

Director

Barry is a Director at Deloitte UK, where he leads our Algorithm, AI and Internet Regulation Assurance team. He is a recognised Subject Matter Expert (SME) in AI regulation and has a proven track record of guiding clients in strengthening their AI control frameworks to align with industry best practices and regulatory expectations. Barry’s expertise extends to Generative AI, where he supports firms in safely adopting this technology and navigating the risks associated with these complex foundation models. Barry also leads our Digital Services Act (DSA) and Digital Markets Act (DMA) audit team, providing independent assurance over designated online platforms’ compliance with these internet regulations. As part of this role, Barry oversees our firm’s assessment of controls encompassing crucial areas such as consumer profiling techniques, recommender systems and content moderation algorithms. Barry’s team also specialises in algorithmic trading risks and controls, and he has led several projects focused on ensuring compliance with relevant regulations in this space.

Lenka Rueda Molins

Senior Manager

Lenka is a Senior Manager in our Algorithm and AI Assurance team in London. She is a US-qualified lawyer with nine years’ experience across financial services and digital services regulatory compliance, investigations, audit and assurance. Lenka’s focus is on algorithms and AI in capital markets, and she leads the development of our EU Digital Services Act assurance framework.