Are your ethics aligned with your digitization plans?
CFO Insights
Introduction
The most digitally mature companies aren’t distinguished only by their advanced approach to acquiring and implementing cutting-edge technology. What they also have in common is a distinct and refined set of ethics guidelines.
That finding stood out among the conclusions of the fifth annual study of digital businesses conducted by Deloitte and the MIT Sloan Management Review. The results are drawn from a global survey1 of more than 4,800 managers, executives, and analysts, as well as interviews with executives and thought leaders.
Such ethical guidelines are necessary as companies increasingly experiment with different approaches to generating innovation. For example, among survey respondents from digitally maturing businesses,2 83 percent report using cross-functional teams to innovate. A similar proportion, 80 percent, say their companies cultivate partnerships with other organizations to facilitate digital innovation.
No matter how loosely structured such collaborations may seem, management should remain acutely aware of the need to design appropriate governance policies, balancing autonomy and risk management. There should also be protections in place to maintain shared goals and guard against corporate vulnerabilities.
It’s especially critical for the finance function, which acts as an on-ramp for the influx of new technologies and the proliferation of data sets, to drive an ongoing awareness of the ethical implications of digital transformation. And in this issue of CFO Insights, we’ll discuss why companies committed to competing in the digital economy should make sure that employees have access to the training and resources to analyze ethical conundrums.
Developing strong work ethics
Nearly every company, once it has grown beyond the startup stage, has an employee handbook that at least begins to spell out expectations regarding employee behavior. More mature organizations might even have an overarching values statement and carefully crafted ethics policies. But managers may be mistaken if they assume such legacy policies are sufficient.
Such guidelines, after all, haven’t been designed to help steer digital innovators through the ethical dilemmas they may encounter as they drive an organizational transformation. Companies that commit to greater flexibility in the service of boosting innovation are invariably empowering employees. And those employees should know how to respond—or where to turn for help responding—when novel ethical questions arise.
And questions surely will arise. If teams are encouraged to experiment, they are bound to encounter situations that the guidelines haven’t envisioned. Who, for instance, could have imagined the ethical quandaries social media would continue to present? Fast-developing technologies, such as artificial intelligence (AI) and robotic process automation (RPA), should enable companies to seize emerging opportunities ahead of their competitors. But these technologies present risks of their own (see 'Building trust in technology'). AI applications, for example, often process vast data sets—data sets that may end up in the hands of those who are ill-equipped to apply them to such activities as analytics or forecasting in an ethical and responsible fashion. When asked about their biggest concerns brought about by digital innovation, respondents to the Deloitte and MIT SMR survey ranked the unethical use of data among their top worries, along with cybersecurity, digital crime, and job replacement.
That said, companies are still actively working to align their ethics with their transformation efforts. For the purposes of the study, surveyed companies were classified into one of three digital maturity categories: early (24 percent), developing (44 percent), and maturing (32 percent). As it turns out, digitally maturing companies are more likely to have adopted policies to support their organizations’ ethical standards with regard to digital initiatives. The survey found that 76 percent of them had such policies in place, compared with 62 percent of developing companies and 43 percent of early-stage ones.
Still, only 35 percent of respondents across maturity levels say their companies are talking enough about the ethical implications of digital business. Not even half (46 percent) of CEOs say their companies are spending enough time on ethical matters—a notable finding, considering that they have the most control over their company’s agenda. Not surprisingly, respondents from digitally maturing companies are the most likely to say their leaders are doing enough. But even then, the percentage barely surpasses a majority, at 57 percent. Only 16 percent of respondents from early-stage companies share that sentiment.
Anticipating the unanticipated
As much as they see how digital initiatives can generate long-term value, companies are also keenly aware of the risks associated with transformation. They’ve seen how a data breach can affect a company’s reputation or how fast-acting algorithms can intensify issues relating to bias.
For CFOs and other C-suite leaders, there’s a tension between the competitive pressure to consistently deploy new digital services and the obligation to weigh the ethical issues involved. Still, in broad terms, there are areas where organizations should be prepared for the possibility of unanticipated consequences:
- Technology-driven innovation. In a 2018 Deloitte survey of 1,400 US executives knowledgeable about AI,3 56 percent said they expected cognitive technologies to "transform" their companies within three years. Among respondents, 32 percent ranked ethical issues as one of the top three risks of AI.
- Decision-making by algorithm. As companies use technology, such as RPA, to automate more—and increasingly complex—assessments, questions will arise as to who is responsible for misguided decisions that did not involve human judgment. In finance, for example, the day may come when algorithm-driven processes can make choices based on real-time data. But—not unlike human decision-makers—algorithms are imperfect. The responsibility for any error could reside with anyone involved, from the software designer to the CFO (see the sketch after this list for one way to make such decisions traceable).
- Weak governance structure. Given the abundance of risks that digital transformation can introduce, companies may want to clarify governance over technology-driven decisions. By establishing a committee to oversee the technology’s application, CFOs can also have a resource they can use to track evolving regulatory issues.
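To make the accountability question in "Decision-making by algorithm" concrete, one lightweight control is to wrap every automated decision in an audit record and route low-confidence cases to a human reviewer. The Python sketch below is illustrative only; the approve_invoice rule, the confidence threshold, and the field names are assumptions made for the example, not a prescribed approach.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

# Hypothetical confidence threshold below which a human must review the decision.
REVIEW_THRESHOLD = 0.90

@dataclass
class DecisionRecord:
    """Audit-trail entry: which model decided, on what inputs, with what confidence."""
    timestamp: str
    model_version: str
    inputs: dict
    decision: str
    confidence: float
    needs_human_review: bool

def approve_invoice(invoice: dict, model_version: str = "demo-0.1") -> DecisionRecord:
    """Toy scoring rule standing in for an RPA/AI model; real logic would differ."""
    confidence = 0.99 if invoice["amount"] < 10_000 else 0.75
    decision = "approve" if confidence >= REVIEW_THRESHOLD else "escalate"
    record = DecisionRecord(
        timestamp=datetime.now(timezone.utc).isoformat(),
        model_version=model_version,
        inputs=invoice,
        decision=decision,
        confidence=confidence,
        needs_human_review=confidence < REVIEW_THRESHOLD,
    )
    # Persist every decision so responsibility can be traced later.
    print(json.dumps(asdict(record)))
    return record

if __name__ == "__main__":
    approve_invoice({"invoice_id": "INV-123", "amount": 25_000})
```

The point is not the toy scoring logic but the record it leaves behind: when a misguided decision surfaces, the log shows which model version and inputs produced it and whether a human was asked to review.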
There's an ancillary benefit to taking such visible actions: It can help reassure employees, who may worry about how the digital transformation will affect them. In the Deloitte and MIT SMR survey, about three-quarters of respondents agreed that concerns over privacy have increased throughout their organization as a result of digital technologies.
Ethics as a competitive advantage
Some digitally mature companies have incorporated ethics oversight into their leadership structures—in one case, adding a "chief ethical and humane use officer." Others have created elaborate codes of conduct.
Such approaches can be helpful but may be insufficient in and of themselves. Without a culture in which ethics is embedded, a proclamation or a job description is unlikely to be enough. That’s especially true in an environment alive with innovative activity, where the issues that arise can be especially knotty and themselves transformative.
There’s also a more tangible reason to embrace ethics: It can be a competitive differentiator in a marketplace where reputation and values can help capture market share. But ethical considerations have to be internal priorities before they can become external advantages. In the midst of a digital transformation, organizations can trumpet their support for ethical behavior and institute guidelines, but individual employees may be the ones making the decisions. And there may not be a single “right” answer; who is to say, for example, when data collection has gone so far as to become invasive?
To develop their ethical muscles, leaders and employees alike need an environment that is conducive to exercising them. What should such an organization look like? Here are some steps that can help cultivate an ethical culture:4
- Learn to incorporate ethical considerations early. When evaluating proposed applications of emerging technologies in products and services, consider possible ethical, social, and cultural implications.
- Build a broad culture of responsibility. How? By training employees to integrate principles of fairness and ethics into their decisions and encouraging desirable behaviors through strong support from the top and aligned performance management systems.
- Make sure the organization’s values keep pace with its innovations. Present ethical behavior as an enabler of growth, rather than a constraint. By solidifying the importance of ethics, companies may have a better chance of spotting issues before they grow into problems.
- Advocate for making ethics a board-level issue. In a corporate context, an ethical mistake doesn’t just weigh down someone’s conscience; it can also affect an organization’s brand and its value. That should earn it the attention of the board, which can also enhance its visibility throughout the organization.
- Integrate ethics into the overall strategy. By creating an ethics board or appointing a group that works closely with business units and oversees transformation efforts, companies can help make sure technologies are applied in beneficial ways.
The technologies that serve as building blocks of digital transformation have the potential to drive organizations to higher levels of innovation and competitiveness. But the human dimension and corporate culture are at least as important as technology (see 'The technology fallacy: Embracing the human face of digital transformation,' CFO Insights, October 2019). As the rapid development of technology continues, its responsible use will help shape the digital marketplace.
Building trust in technology
AI, machine learning, blockchain, digital reality, and other emerging technologies are integrating into our everyday lives more quickly and deeply than ever. How can businesses create trust with the technologies their customers, partners, and employees are using? Here are a few considerations taken from Deloitte’s 11th annual Tech Trends 2020 report:4
Encode your company’s values. With technology ingrained in the business and machine learning driving decisions and actions, an organization’s values should be encoded and measured within its technology solutions. Digital systems can be designed to reduce bias and enable organizations to operate in line with their principles.5 Safeguards can promote stakeholder welfare by helping prevent users from engaging with technology in unhealthy or irresponsible ways. Examples include a company that imposes time and spending limits on habit-forming games, a content aggregator that prompts users to be skeptical about the veracity of crowdsourced information, and cloud computing providers that automatically issue alerts before customers go over budget.6
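As a small illustration of the budget-alert safeguard mentioned above, the sketch below checks spend against a customer-set limit and flags each warning level that has been crossed. The warning levels and the budget_alerts function are assumptions made for the example, not any particular provider’s API.

```python
# Minimal sketch of a spend-alert safeguard, assuming a customer-defined monthly budget.
WARNING_LEVELS = (0.5, 0.8, 1.0)  # notify at 50%, 80%, and 100% of budget

def budget_alerts(current_spend: float, monthly_budget: float) -> list[str]:
    """Return the alert messages that should be sent for the current spend level."""
    alerts = []
    for level in WARNING_LEVELS:
        if current_spend >= level * monthly_budget:
            alerts.append(
                f"Spend ${current_spend:,.2f} has reached {level:.0%} "
                f"of the ${monthly_budget:,.2f} budget."
            )
    return alerts

print(budget_alerts(current_spend=850.0, monthly_budget=1000.0))
# ['Spend $850.00 has reached 50% of the $1,000.00 budget.',
#  'Spend $850.00 has reached 80% of the $1,000.00 budget.']
```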
Build a strong data foundation. Without methodically and consistently tracking what data you have, where it lives, and who can access it, you cannot create an environment of trust. A strong data foundation unifies stakeholders around a single vision of data accountability and delivers on secure technology that supports effective data management.7 Leaders should aim to give stakeholders some control over how their data will be used and delete data on demand unless it’s necessary to keep it for legal or regulatory purposes.
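One minimal way to picture such a foundation is a per-dataset inventory entry that records where the data lives, who may access it, and whether a legal hold blocks deletion. The DatasetRecord schema and field names below are illustrative assumptions, not a reference design.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetRecord:
    """One inventory entry: what the data is, where it lives, and who may touch it."""
    name: str
    location: str                 # e.g., "s3://finance-bucket/invoices/"
    owner: str
    authorized_roles: list[str] = field(default_factory=list)
    legal_hold: bool = False      # retention required for legal/regulatory reasons

def handle_deletion_request(record: DatasetRecord) -> str:
    """Honor a stakeholder's deletion request unless a legal hold requires retention."""
    if record.legal_hold:
        return f"Cannot delete '{record.name}': retained for legal/regulatory purposes."
    # In a real system, this would trigger the actual purge workflow.
    return f"Deletion of '{record.name}' scheduled."

inventory = [
    DatasetRecord("customer_emails", "s3://crm-bucket/emails/", "marketing",
                  ["marketing_analyst"], legal_hold=False),
    DatasetRecord("tax_filings", "s3://finance-bucket/tax/", "finance",
                  ["controller"], legal_hold=True),
]
for rec in inventory:
    print(handle_deletion_request(rec))
```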
Be transparent. Companies can build trust with stakeholders by proactively and transparently demonstrating good behavior. Transparency extends beyond policies explaining data collection and usage practices. For instance, rather than masquerade as humans, intelligent agents or chatbots should identify themselves as such.
Give employees a reason to trust. Much of the anxiety over AI and other advanced technologies stems from the fear of the displacement of labor. From an ethical perspective, this presents business leaders with a challenge: balancing the best interests of the business, the employees, and the wider community and society. It’s a task made more complex by the fact that advanced technology systems are not self-sufficient. While AI can replace some jobs, for example, it creates others that often require specialized skills and training.8 Companies can build trust with employees by advising them on how technology may affect their jobs in the future.
Teach them to fish. Training technologists to recognize their own biases, and to eliminate bias in the products they create, is an important step toward creating a culture that emphasizes trust. But it is only one step. Building awareness of how technology affects stakeholder trust among those not directly involved in or responsible for the technology, and creating the decision-making frameworks to go with it, are additional steps organizations should consider. Companies should also consider what resources may be needed to help their employees recognize ethical dilemmas, evaluate alternatives, and make (and test) ethical technology decisions.9
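One concrete exercise for building that awareness is to compare an automated decision’s approval rates across groups, a simple demographic-parity style check. The decision log and the comparison below are illustrative assumptions; a ratio well under 1 is a prompt to investigate, not a verdict on its own.

```python
from collections import defaultdict

# Hypothetical decision log: (group, approved?) pairs from an automated screening step.
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

def selection_rates(records):
    """Approval rate per group."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in records:
        totals[group] += 1
        approved[group] += ok
    return {g: approved[g] / totals[g] for g in totals}

rates = selection_rates(decisions)
worst, best = min(rates.values()), max(rates.values())
print(rates)                                          # {'group_a': 0.75, 'group_b': 0.25}
print(f"disparate impact ratio: {worst / best:.2f}")  # 0.33, a flag worth investigating
```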
Contact us
Nancy Albinson
Catherine Bannister
Natasha Buckley
Rameeta Chauhan
Yang Chu
Deborah Golden
Satish Iyengar
Vivek Katyal
Doug Palmer
Anh Nguyen Phillips
Michael Rohrig
David Schatsky
Cherian Thomas
CFO Insights, a biweekly thought leadership series, provides an easily digestible and regular stream of perspectives on the challenges you face.
View the CFO Insights library.
About Deloitte's CFO Program
The CFO Program brings together a multidisciplinary team of Deloitte leaders and subject-matter specialists to help CFOs stay ahead in the face of growing challenges and demands. The Program harnesses our organization’s broad capabilities to deliver forward-thinking and fresh insights for every stage of a CFO’s career—helping CFOs manage the complexities of their roles, tackle their company’s most compelling challenges, and adapt to strategic shifts in the market.
Learn more about Deloitte’s CFO Program.
Endnotes
1 G.C. Kane, D. Palmer, A.N. Phillips, D. Kiron, and N. Buckley, “Accelerating Digital Innovation Inside and Out,” MIT Sloan Management Review and Deloitte Insights, June 2019.
2 Ibid.
3 Deloitte Development LLC, State of AI in the Enterprise, 2nd Edition, 2018.
4 Deloitte Development LLC, Tech Trends 2020: Ethical technology and trust, January 2020.
5 Deloitte, "AI ethics: A new imperative for businesses, boards, and C-suites," accessed August 30, 2019.
6 Deloitte Development LLC, Tech Trends 2020.
7 Cynthia Dwork and Vitaly Feldman, “Privacy-preserving prediction,” Conference on Learning Theory, 2018; David J. Wu, “Fully homomorphic encryption: Cryptography’s holy grail,” March 27, 2015.
8 Deloitte, Ethics in the age of technological disruption: A discussion paper for the 2018 True North Conference, 2018.
9 Catherine Bannister, Brenna Sniderman, and Natasha Buckley, "Ethical tech: Making ethics a priority in today's digital organization," Deloitte Review, January 27, 2020.