Research shows 53 percent of respondents would not use a company’s products if their data was sold. Build trust by protecting what stakeholders value while still serving their needs.
At the turn of the 20th century, meat-packing companies developed the first industrial assembly line. Lauded for its unmatched efficiency, the move was quickly duplicated by the emerging automotive industry. Then Upton Sinclair’s alarming novel, The Jungle, was published in 1906.1 Based on six months of research, the book told in uncompromising detail the horrors of the meat-packing industry: people working under extreme duress, unsanitary conditions, and spoiled meat covered up and packaged for sale. The world was appalled, regulations swiftly followed, and public trust in the industry disintegrated.
Over the next 100 years, the public elevated trust to a primary determinant of how it assesses brands. Today, brand trust is more important than ever for businesses, and it is all-encompassing. Customers, regulators, and the media expect brands to be open, honest, and consistent across all aspects of their business, from products and promotions to workforce culture and partner relationships. And in the era of connected technology and big data analytics, companies must wrestle with another level of complexity: building an infrastructure that systematically earns trust by protecting customer data from both external cyber threats and unethical internal data misuse (see the sidebar “The future of trust” for more information).
The digital era makes trust a complex issue, fraught with existential threats to the enterprise. Organizations can spend millions to safeguard their information, yet one person’s susceptibility to a phishing scheme can undermine the entire effort. Alternatively, the information may be safe from outside actors, but a product team may choose to exploit customer data in a manner that spawns public backlash. And even when designers have the best of intentions, their algorithms can produce unintended biases. For these reasons, and many others, organizations should proactively ensure that their processes, technology, and people work in concert to maintain the high level of trust their many stakeholders expect.
Digital transformation has changed how organizations should account for the issue of trust. Organizational trust is a two-way relationship between a business and its customers, workforce, partners, and governments. This means companies should build an infrastructure that protects what stakeholders value most while proactively detecting threats in the domains of cybersecurity, data protection, regulatory compliance, and reputation. Companies that don’t systematically safeguard these domains likely face existential threats across the business.
For this trend, we focus on two domains of trust that marketers often interact with most: customer data and AI. As customer experience champions, marketers are expected to act as trusted stewards of customer information. This refers to accessing and using customer data in a manner that maintains—and builds—trust with the customer. Similarly, as the marketing function continues to leverage AI to enhance customer experiences, marketers should ensure they are executing this in a manner that doesn’t threaten the trust of the entire organization.
Just like the early assembly lines, customer data and AI can be powerful differentiators, and the growth of the market around them makes that apparent. As of 2018, mobile and cloud data traffic, along with the analytical tools designed to extract value from these sources, was valued globally at US$169 billion and is estimated to reach US$274 billion by 2022, nearly equal to Finland’s annual GDP.2 We see these emerging technologies manifesting in our daily lives as well. Social media sites grant “free” usage of services in exchange for giving advertisers access to users’ personal social media information and activity. Grocery stores hand out customer discount cards to capture and leverage purchase history, often in partnership with outside vendors, and online retailers use, and sometimes sell, data to build more predictive recommendation engines.
In terms of maintaining and building customer trust, the paths organizations pursue through their data and AI strategies can be fraught with potential missteps. In our research, we polled 4,000 global consumers to better understand customer sentiment toward corporate data usage. Figure 1 shows that for a large number of people, trust quickly erodes if they believe organizations are directly profiting from their data. For instance, 53 percent said they would never use a company’s products if their data were sold for profit, and 40 percent believe none of an organization’s profits should be derived from selling data. However, 27 percent of respondents acknowledged that they never consider how a company uses their data when making purchase decisions (conversely, only 19 percent always consider it).
Taken together, these are important insights for marketers charged with leading the customer message: In addition to a strong aversion to companies profiting from the sale of consumer data, a substantial share of respondents are largely unaware of how pervasive the practice already is. In short, many may be surprisingly vulnerable to how companies have already deployed their personal information.
This may seem foreboding for companies trading in data, but leveraging customer data also creates numerous benefits for consumers and businesses alike. In one study, for instance, 86 percent of customers indicated they would be more likely to trust companies with their information if those companies explained how the data enables a better customer experience.3 For this reason, it’s increasingly important for brands to get their customer messaging right on how their data and AI strategies provide a fair exchange of value for customer data.
Currently, companies have a high degree of latitude in how they use consumer data. While certainly not recommended, in countries with relatively relaxed privacy regulations, companies can couch permission to sell and use consumer data in dense legal agreements. Or the agreement may be clear and explicit, but consumers have no choice but to agree if they wish to use a service. Both options expose companies to the real and well-precedented possibility of a public backlash. Alternatively, brands may be putting themselves at greater risk of disruption from a competitor with a strong purpose tied to data privacy (see our trend on purpose to learn more on how powerful purposes are creating competitive advantages for companies).
In this context, companies should choose their data usages thoughtfully, in a manner that builds, rather than erodes, public trust. A natural first step is to ensure that data capture and usage align with the core company mission. For instance, JD Wetherspoon, a pub company operating across the United Kingdom and Ireland, recently deleted over 656,000 customer email addresses because it viewed email marketing as an intrusive approach to customer interaction that provided little value.4 While this case might seem like an exception, it highlights the importance of not only aligning data collection and usage to company purpose, but also, by extension, supporting the brand’s relationship with the customer. As countries and regions implement data protection and privacy regulations such as the General Data Protection Regulation (GDPR) in Europe, brands have a chance to get ahead of the bureaucracy by reviewing their own consumer data processes.
Data policies become even more complicated given that brands often acquire data from second- and third-party vendors. In these cases, it can be difficult to fully understand how the data was acquired and what the end customer actually agreed to. Nonetheless, it’s important for brands to consider what the end user would want and expect from those possessing their data. Fortunately, solutions are coming to market to make this process more transparent and manageable. For instance, one company is piloting blockchain technology that lets brands easily track the explicit permissions consumers have granted (see sidebar, “Turn data into currency” for other examples).5 Further, many advertisers are turning to in-house services to ensure their data is managed and deployed in a manner that is congruent with the company mission.6
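To make the permission-tracking idea concrete, here is a minimal sketch of a tamper-evident consent ledger in Python, using a simple hash chain in place of a full blockchain. The ConsentLedger class, its fields, and the example values are illustrative assumptions, not the actual implementation of any vendor mentioned here.

```python
import hashlib
import json
import time

class ConsentLedger:
    """Append-only, hash-chained log of customer data permissions (illustrative only)."""

    def __init__(self):
        self.entries = []

    def record_consent(self, customer_id, data_type, purpose, granted):
        # Each entry embeds the hash of the previous one, so tampering with
        # an earlier permission breaks every later hash.
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {
            "customer_id": customer_id,
            "data_type": data_type,   # e.g., "purchase_history"
            "purpose": purpose,       # e.g., "personalized_offers"
            "granted": granted,
            "timestamp": time.time(),
            "prev_hash": prev_hash,
        }
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(entry)
        return entry

    def verify(self):
        # Recompute every hash to confirm no permission record was altered.
        prev_hash = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if body["prev_hash"] != prev_hash or recomputed != entry["hash"]:
                return False
            prev_hash = entry["hash"]
        return True

ledger = ConsentLedger()
ledger.record_consent("cust-123", "purchase_history", "personalized_offers", granted=True)
ledger.record_consent("cust-123", "email", "third_party_marketing", granted=False)
print(ledger.verify())  # True while the log is intact
```

A log structured this way makes it possible to show, after the fact, which permissions were on record when a given data use occurred.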
Customer data usage doesn’t need to be a one-sided affair that benefits only the business. Instead, it can be viewed and messaged as a mutually beneficial partnership.
Third-party companies are working to put the decision-making power directly into consumers’ hands. This means that people can “opt in” to selling their data in exchange for compensation. Some companies act as liaisons that provide customers with an offer to sell their data usage rights in exchange for cash while others even offer cryptocurrencies as payment. Such moves empower people to choose what data they are willing to share and to whom they sell it for a known price. Though these ideas are more “on the edge,” companies can pilot these methods to see what incentives resonate most with their customer base.
By formally and directly enlisting consumers to participate in the data usage/selling process, companies transparently reveal their intentions, taking the customer along on the data journey with their eyes wide open.
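As a purely hypothetical illustration of that opt-in exchange, the sketch below models an offer to license specific customer data for a known price, with nothing shared until the consumer explicitly accepts. The DataOffer structure and its field names are assumptions for illustration, not any particular vendor's API.

```python
from dataclasses import dataclass

@dataclass
class DataOffer:
    """A transparent, opt-in offer to license specific customer data (illustrative only)."""
    buyer: str                # who wants the data, e.g., "RetailCo Analytics"
    data_types: list          # exactly which data is covered
    purpose: str              # how the buyer may use it
    compensation_usd: float   # known price offered to the consumer
    accepted: bool = False    # nothing is shared until the consumer opts in

    def accept(self):
        self.accepted = True
        return self

    def decline(self):
        self.accepted = False
        return self

offer = DataOffer(
    buyer="RetailCo Analytics",
    data_types=["purchase_history"],
    purpose="demand forecasting",
    compensation_usd=5.00,
)
offer.accept()  # the consumer, not the company, makes the call
print(offer)
```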
With data as the foundation, brands are leveraging AI to identify and segment audiences, optimize performance, and create better experiences for the customer. But with the complexities involved, two main concerns are being raised around AI trustworthiness: whether automated decisions can be explained, and whether algorithms introduce unintended bias.
Regulations such as the GDPR incorporate clauses involving the use of AI7—such as the need to explain to consumers the logic behind automated decision-making. To help organizations maintain a high level of trust around their AI strategies, we recommend the following measures:
Partner in developing your AI strategy. As AI trust issues escalate, new institutions are forming to promote ethical practices and to help businesses follow them. For instance, the Algorithmic Justice League partnered with the Center on Privacy and Technology to launch the “Safe Face Pledge.” Organizations can use this platform to publicly commit to not abusing facial analysis technology.8 Further, the league offers to assess code to minimize the opportunity for bias while also providing instruction on inclusive algorithmic design.
Design for relevance, not personalization. Companies readily offer customers special incentives such as free products, discounts, or personalized and relevant services. Though many of these offerings are meant to be relevant, they can feel invasive (e.g., algorithms that detect whether a woman is pregnant based on her web traffic or purchase history). Instead, organizations can pivot their algorithms to provide relevant recommendations based on circumstance (e.g., offer an umbrella on a rainy day rather than an umbrella after someone buys a raincoat). By focusing on relevance rather than personalization, companies can make AI recommendations feel helpful rather than invasive (a simple sketch of this approach follows these measures).
Catalyze customer innovation. Companies can demonstrate how AI leads to better innovation and, therefore, better experiences for the customer. For instance, Amazon gleans insights from its purchase data to shape its supplier network to match consumer demand.9 Importantly, this data is used at an aggregate rather than an individual level. In effect, it gives customers a better fulfillment network that delivers their orders even faster.
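As a simple sketch of the “relevance, not personalization” measure above, the example below bases a recommendation on shared situational context (current weather and temperature) rather than on an individual’s history. The rules and product choices are illustrative assumptions only.

```python
def contextual_recommendation(weather: str, temperature_c: float) -> str:
    """Recommend based on shared, situational context rather than personal data (illustrative)."""
    # No customer identifiers or purchase history are used here,
    # only circumstances that apply equally to everyone in the moment.
    if weather == "rain":
        return "umbrella"
    if weather == "snow" or temperature_c < 0:
        return "winter gloves"
    if temperature_c > 28:
        return "sunscreen"
    return "store gift card"

print(contextual_recommendation("rain", 12.0))   # -> "umbrella"
print(contextual_recommendation("clear", 31.0))  # -> "sunscreen"
```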
Company trust, or the lack thereof, continues to make headlines. As data and AI continue to proliferate, so will their impact on brand trust. Companies are likely to feel increasing pressure to demonstrate how they are acting as good stewards of data on behalf of their customers. In this light, they would do well to build a high level of trust with their stakeholders by proactively and transparently demonstrating good behavior.