
What are the characteristics of an effective governance program?

The CLO’s guide to Generative AI risks and opportunities

While the risks associated with Generative AI (GenAI) are numerous, they should not deter organizations from realizing the opportunities the technology offers. Many of these risks can be mitigated through a robust, centralized GenAI governance framework.

Effective governance programs are comprehensive and adaptable, integrate a variety of measures, and share the following characteristics:

Self-sustaining. It can function, adapt, and thrive after implementation.

Strategically driven. It is informed by an organization’s vision for AI use and its broader strategic priorities.

Risk-informed. It implements controls, monitoring, and oversight that are tailored to the degree of risk associated with the use case, striking an appropriate balance with strategic priorities.

Values-aligned. It is consistent with, and supportive of, the organization’s mission and values.

Agile. It consists of policies and controls that are flexible and adaptable to rapid changes in emerging technologies and governing legal regimes.

Proactive. It contains cohesive workflows and chains of responsibility to maintain a consistent, proactive approach to AI development and implementation.

Management of risk through corporate governance should be a continuous cycle of mapping, measuring, and managing.


The unique risks posed by AI require broad assessment from relevant stakeholders. Mapping, measuring, and managing risk allows an organization to engage in oversight and assign roles to ensure compliance. That oversight should include issue spotting by cross-functional groups or committees representing key business and compliance functions, such as legal (transactional and litigation), information technology, data security/information security, procurement, finance, sales, marketing, and quality. Other common elements of an AI governance program include, but are not limited to:

  1. Flexible but effective controls – Leading practice controls necessitate a flexible approach aligned with the levels of risk involved. They should be structured to encourage and support innovation without compromising the integrity of operations. Effective control systems are top-down, with increasing specificity from enterprise value statements and policies to procedures at the unit or department level. This top-down approach enables clear communication of expectations and allows a degree of specificity that can be adjusted to the needs of different teams, promoting both compliance and operational efficiency.
  2. Targeted testing – Testing protocols should be designed to meet the distinctive requirements of each use case. By benchmarking against prevailing industry standards, these testing protocols aim to help ensure that an organization’s practices are on par with regulatory requirements and industry norms. The adaptability and evolutionary nature of these tests are key, allowing for the incorporation of new regulatory requirements and shifts in industry benchmarks. Clear testing protocols help organizations to safeguard against emerging risks and maintain the highest standards of quality and reliability.
  3. Routine monitoring – Implementing a robust monitoring framework is important for overseeing AI applications, particularly those categorized as high-risk post-deployment. This framework should encompass clear reporting mandates and structures capable of tracking performance and facilitating adherence to the dynamic regulatory and legal environment. An organization should undertake routine checks and balances to determine whether its AI deployments continue to abide by the latest standards, thereby mitigating potential risks and maintaining accountability.
  4. Employee AI training – Training programs should be tailored to the specific roles and responsibilities of personnel across management, compliance, business, and legal fields. Training should equip individuals with the proficiency to identify potential issues commensurate with their level of involvement, accurately assess varying degrees of risk, and recognize when situations warrant escalation.

Investing in such a governance program is not merely precautionary, but a strategic move to harness the transformative power of AI while maintaining uncompromised compliance and operational integrity.

Other Generative AI topics to explore

Learn about other areas of Generative AI and how it impacts CLOs and their teams. From the basics to the more complex challenges, these resources are designed to help you navigate GenAI’s legal implications and risks with ease.

Get in touch


Danny Tobey MD, JD

Partner
Chair | Americas AI & Data Analytics Practice
Co-Chair | Global AI & Data Analytics Practice
DLA Piper

danny.tobey@dlapiper.com


Barclay Blair

Senior Managing Director | AI Innovation Team
DLA Piper

barclay.blair@dlapiper.com


J. Donald Fancher

Principal | Deloitte Risk & Financial Advisory
Leader | Chief Legal Officer Program
Deloitte Financial Advisory Services LLP

dfancher@deloitte.com


Ashley Allen Carr

Partner
DLA Piper

ashley.carr@dlapiper.com


Sean Patrick Fulton

Counsel
DLA Piper

sean.fulton@dlapiper.com


Jon Foster

Managing Director | Deloitte Risk & Financial Advisory
Deloitte Transactions and Business Analytics LLP

jonfoster@deloitte.com


Erin Hess

Research and Insights Manager | Chief Legal Officer Program
Manager | Deloitte Risk & Financial Advisory
Deloitte Transactions and Business Analytics LLP

erhess@deloitte.com

This document contains general information only and the authors are not, by means of this document, rendering accounting, business, financial, investment, legal, tax, or other professional advice or services. This document is not a substitute for such professional advice or services, nor should it be used as a basis for any decision or action that may affect your business. Before making any decision or taking any action that may affect your business, you should consult a qualified professional advisor.

The authors shall not be responsible for any loss sustained by any person who relies on this document.

As used in this document, “Deloitte” means Deloitte Financial Advisory Services LLP, which provides risk and financial advisory services, including forensic and dispute services; and Deloitte Transactions and Business Analytics LLP, which provides risk and financial advisory services, including eDiscovery and analytics services. Deloitte Transactions and Business Analytics LLP is not a certified public accounting firm. These entities are separate subsidiaries of Deloitte LLP. Please see www.deloitte.com/us/about for a detailed description of our legal structure. Certain services may not be available to attest clients under the rules and regulations of public accounting. Deloitte does not provide legal services and will not provide any legal advice or address any questions of law.

Copyright © 2025 Deloitte Development LLC. All rights reserved.

Copyright © 2025 DLA Piper. All rights reserved.