Perspectives

Digital financial services: Can machines be designed to promote customer integrity?

QuickLook Blog

As customers increasingly interact with machines, such as chatbots and digital assistants in banking and other financial services, how will these interactions affect customer honesty and the integrity of the information customers share with their financial institutions?

August 8, 2018

A blog post by Val Srinivas, Banking & Capital Markets research leader, Deloitte Services LP.

I’ve been wondering about customer honesty in the digital age after coming across a recently published academic paper.1 Based on experimental research, the authors concluded that “individuals cheat more frequently when they interact with a machine rather than a person.”

If these research findings are generally true outside experimental settings, what are the implications of this behavior for financial institutions, as customers increasingly engage with digital platforms—and less with humans—across a range of services? The use of machines to serve customers has been prevalent for decades. For instance, Automated Teller Machines (ATMs) have existed for over 50 years now. A variety of digital platforms are common today, including interactive voice response systems, online portals, mobile apps, chatbots, and more recently, digital assistants.

Digital systems have not only improved operational efficiencies but have also enhanced customer experience in meaningful ways. An example is DBS’s Digibank, India’s first mobile-only bank that uses a chatbot powered by Kasisto’s Artificial Intelligence platform as a key aspect of the banking experience.2 Similar platforms are being adopted widely across the financial services industry as chatbots become better at deciphering, anticipating, and responding to customer queries.3 Such chatbots are expected to yield billions of dollars in savings to banks over the next decade and make up a significant portion of customer service interactions.4 Likewise, the use of digital assistants in banking is also expanding at an impressive pace.


As customers interact more with digital platforms, how will their behaviors, particularly the honest reporting of facts, change?

Customer fraud is well documented in a variety of industries. In the retail world, for instance, theft at self-checkout stations is becoming an alarming phenomenon.5 Fraud may comprise only a tiny proportion of total customer interactions, but such actions meaningfully hurt profitability and heighten reputational risk.

In financial services, there are many examples of customer fraud: stolen checks, forged documents, fraudulent loan applications, false insurance claims, payment card fraud, and identity theft. And no doubt, there are varying degrees of customer dishonesty, ranging from large-scale deception, such as money laundering, to “small” lies on credit card applications, such as inflating one’s income or misreporting employment status.

The unfortunate reality is that these types of customer fraud are unlikely to go away anytime soon. In fact, as technologies evolve, we will probably witness new types of fraud in the future. The tendency to engage in lies and deception is intrinsically human.

In the study I cited earlier, the researchers concluded that cheating occurs more frequently even when machines have human features, and that when interacting with people, individuals are less likely to be dishonest due to concerns about social image. A second experiment conducted by the researchers showed that dishonest individuals would rather engage with machines in situations where they have opportunities to cheat.

This experimental research did not, however, account for scenarios in which subjects know or understand that machines are effective at detecting dishonest behavior, and therefore curb their dishonest tendencies to avoid being caught cheating, or in which machines are actually better suited to eliciting honest information.


Designing machines for customer honesty

The question facing designers of customer-facing digital platforms is the following: 

How can machines be designed in such a way that customer honesty is maximized?

One approach suggested by the researchers is to insert humans in such interactions to minimize dishonesty in the hope that the presence of humans will help influence behavior in a positive way. But this is not practical or economical in many instances.

Imagine the countless scenarios in which customers report information to financial institutions, such as when applying for new products, whether loans, credit cards, or mortgages. It is simply impractical to staff humans in all these instances when machines can be employed more effectively and at minimal cost.

However, a more viable solution to deter customer dishonesty in digital interactions may be the machines themselves, which, as we all know, are getting smarter by the day.6 (Blockchain-based truth-verification systems may be another way to mitigate dishonesty.)

Although machine-learning techniques have been used to detect fraud in financial services for some time now, recent developments such as Machine Learning as a Service (MLaaS) and OpenAI are quickly democratizing artificial intelligence and enabling even smaller institutions to adopt machine learning for multiple applications.7

While more research might be required, my personal view is that by marshalling all the data at their disposal, intelligent machines will become powerful tools for encouraging customer honesty in a whole host of scenarios. For example, sensor technologies, in combination with the Internet of Things (IoT) and artificial intelligence, could be used to combat fraud. GPS data, telematics, and other technologies can give banks and insurance companies real insight into borrowers’ and policyholders’ behaviors, versus the often self-reported data they rely on now.
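To make this concrete, one simple form such a tool could take is a consistency check that flags applications where self-reported figures diverge sharply from independently observed estimates (for example, derived from transaction or telematics data). The following minimal sketch is purely illustrative: the field names, income figures, and tolerance threshold are hypothetical assumptions, not an actual bank system.

```python
# Illustrative sketch only: flag applications whose self-reported income
# deviates from an independently derived estimate by more than a tolerance.
# All field names and thresholds are hypothetical assumptions.

def flag_discrepancies(applications, tolerance=0.25):
    """Return IDs of applications where |reported - observed| / observed
    exceeds `tolerance` (expressed as a fraction)."""
    flagged = []
    for app in applications:
        reported = app["reported_income"]
        observed = app["estimated_income"]  # e.g., from transaction data
        if observed > 0 and abs(reported - observed) / observed > tolerance:
            flagged.append(app["id"])
    return flagged

applications = [
    {"id": "A1", "reported_income": 95000, "estimated_income": 60000},  # inflated
    {"id": "A2", "reported_income": 52000, "estimated_income": 50000},  # plausible
]
print(flag_discrepancies(applications))  # ['A1']
```

In practice, a production system would replace this fixed threshold with a trained model and would weigh many signals at once, but the underlying idea is the same: observed data narrows the room for self-reported dishonesty.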

What are your thoughts? 

Do you think that machines will get to the point where they can anticipate and head off people’s instinct to try to game the system?
Join the conversation on Twitter: @DeloitteFinSvcs.

1 Alain Cohn, Tobias Gesche and Michel Maréchal, Honesty in the digital age, ECON - Working Papers 280, Department of Economics - University of Zurich, February 2018. https://ideas.repec.org/p/zur/econwp/280.html
2 Signe Brewster, “Do your Banking with a Chatbot,” MIT Technology Review, May 17, 2016. https://www.technologyreview.com/s/601418/do-your-banking-with-a-chatbot/
3 Jim Marous, “Meet 11 of the Most Interesting Chatbots in Banking,” The Financial Brand; accessed July 23, 2018. https://thefinancialbrand.com/71251/chatbots-banking-trends-ai-cx/
4 Ibid.
5 Rene Chun, “The Banana Trick and Other Acts of Self-Checkout Thievery,” The Atlantic, March 2018; accessed July 24, 2018. https://www.theatlantic.com/magazine/archive/2018/03/stealing-from-self-checkout/550940/
6 Roberto Valerio, “Outwitting fraudsters with machine learning and AI,” The Payers, July 17, 2018; accessed July 23, 2018. https://www.thepaypers.com/thought-leader-insights/outwitting-fraudsters-with-machine-learning-and-ai/773993
7 Maria Yao, “12 Amazing Deep Learning Breakthroughs of 2017,” Forbes.com, February 5, 2018; accessed July 23, 2018. https://www.forbes.com/sites/mariyayao/2018/02/05/12-amazing-deep-learning-breakthroughs-of-2017/#dfe79d665dbd

QuickLook is a weekly blog from the Deloitte Center for Financial Services about technology, innovation, growth, regulation, and other challenges facing the industry. The views expressed in this blog are those of the blogger and not official statements by Deloitte or any of its affiliates or member firms.

