
Human experience platforms

Affective computing changes the rules of engagement

Computers have long struggled to recognize and account for human emotional factors. In the coming months, watch for companies to tap new AI-powered solutions aiming to help technology respond to humans more appropriately.

Tamara Cibenko
Nelson Kunkel

As you enter the last leg of a long drive home, a network of cameras, microphones, and sensors embedded throughout your car monitors your facial expressions, voice, and the way you are using the car’s functionality. Analyzing the inputs in real time, your car—using computer vision, voice recognition, and deep learning capabilities—determines that you are getting tired and distracted. In response, these AI-powered tools lower the thermostat and turn up the volume on the radio, and a conversational agent gently suggests you pull over or stop for a cup of coffee at a restaurant three miles ahead.1

These technologies are engaging you—a human driving a car—in human terms. Myriad technologies that detect physical states such as alertness are increasingly being used to infer emotional states such as happiness or sadness. Unlike their machine forebears that set rigid rules of engagement, these systems will follow your rules, reading your mood, intuiting your needs, and responding in contextually and emotionally appropriate ways.

Welcome to the next stage of human-machine interaction, in which a growing class of AI-powered solutions—referred to as “affective computing” or “emotion AI”—is redefining the way we experience technology. These experiences are hardly confined to automobiles. Retailers are integrating AI-powered bots with customer segmentation and CRM systems to personalize customer interactions while at the same time capturing valuable lead-nurturing data.2 Apps are designing custom drinks and fragrances for fashion-show attendees based on emotional quotient (EQ) inputs.3 A global restaurant chain is tailoring its drive-through experiences based on changes in the weather.4 The list goes on.

As part of the emerging human experience platforms trend, during the next 18 to 24 months more companies will ramp up their responses to a growing demand for technology to better understand humans and to respond to us more appropriately. System users increasingly expect the technologies they rely on to provide a greater sense of connection—an expectation that should not be ignored. In a recent Deloitte Digital survey of 800 consumers, 60 percent of long-term customers use emotional language to describe their connection to favored brands; likewise, 62 percent of consumers feel they have a relationship with a brand. Trustworthiness (83 percent), integrity (79 percent), and honesty (77 percent) are the emotional factors that consumers feel most align with their favorite brands.5

Historically, computers have been unable to correlate events with human emotions or emotional factors. But that is changing as innovators add an EQ to technology’s IQ, at scale. Using data and human-centered design (HCD) techniques—and technologies currently being used in neurological research to better understand human needs—affective systems will be able to recognize a system user’s emotional state and the context behind it, and then respond appropriately.

Early trend participants recognize that the stakes are high. The ability to leverage emotionally intelligent platforms to recognize and use emotional data at scale will be one of the biggest, most important opportunities for companies going forward. Deloitte Digital research reveals that companies focusing on the human experience were twice as likely to outperform their peers in revenue growth over a three-year period, achieving revenue growth 17 times faster than companies that did not.6 Moreover, inaction could lead to more “experience debt”7 and user alienation as AI applications make us all feel a bit less human. Chances are, your competitors are already working toward this goal. Research and Markets projects that the global affective computing market will grow from US$22 billion in 2019 to US$90 billion by 2024, a compound annual growth rate of 32.3 percent.8

Time to get started. How will you create emotionally insightful human experiences for your customers, employees, and business partners?

Knowing me, knowing you

In Tech Trends 2019, we examined how marketing teams—by adopting new approaches to data gathering, decisioning, and delivery—can create personalized, contextualized, dynamic experiences for individual customers. These data-driven experiences, embodying the latest techniques in HCD, can inspire deep emotional connections to products and brands, which in turn drive loyalty and business growth.9 The human experience platforms trend takes that same quest for deeper insights and connections to the next level by broadening its scope to include not only customers but employees, business partners, and suppliers—basically anyone with whom you interact.

In addition to data, human experience platforms leverage affective computing—which uses technologies such as natural language processing, facial expression recognition, eye tracking, and sentiment analysis algorithms—to recognize, understand, and respond to human emotion. Affective computing can help us achieve something truly disruptive: It makes it possible for us to be human at scale. What do we mean by that? Right now, true human connections are limited to the number of people we can fit into a room. Technologies such as phones or webcams connect us to other humans but remain only a conduit, and connections made through technology conduits are useful yet emotionally limited.

But what if technology itself could become more human? What if a bot appearing on the screen in front of our faces could engage us with the kind of emotional acuity and perceptive nuance that we expect from human-human interaction? Today, you may walk into a clothing store and barely notice the screens mounted on shop walls, displaying items currently on sale; the ads aren’t particularly relevant, so you don’t give them a second thought. But imagine if you could walk into that same space and a bot appearing on the screen recognizes you and addresses you by name.10 This bot has been observing you walk around the store and has identified jackets you might love based on your mood today and your purchasing history. In this moment, technology engages you as an individual, and as a result, you experience this store in a very different, more human way. AI and affective technologies have scaled an experience with very human-like qualities to encompass an entire business environment.

Designing for humans

The human experience platforms trend reverses traditional design approaches by starting with the human and emotion-led experience we want to achieve, and then determining which combination of affective and AI technologies can deliver it. The big challenge that companies will face is identifying the specific responses and behaviors that will resonate with—and elicit an emotional response from—a diverse group of customers, employees, and other stakeholders, and then developing the emotional technologies that can recognize and replicate those traits in an experience.

Think about the abilities comprising empathy—among them, the ability to relate to others, the ability to recognize ourselves in a storyline, and the ability to trust and feel complex emotions. As humans, we see these abilities in ourselves and, by using our senses, we can recognize them in others. Today a growing number of companies are exploring ways to develop a deeper understanding of the humans who will be using new technologies, and to incorporate these insights into technology designs. They include:

  • Neuroscientific research. This method moves beyond traditional “soft science” market research approaches (surveys, questionnaires, data analysis, etc.) by deploying a variety of sensory recognition technologies to measure brain activity, eye movement, and other physical responses to stimuli. Analysis of this data can give companies a deeper understanding of individuals’ unconscious and implicit decision-making processes. (See sidebar, “Neuroscientific methods for measuring thought processes.”)
  • Human-centered design. HCD brings the human being into focus. It starts with the premise that individuals’ beliefs, values, feelings, and ambitions are important because they form the foundation for who those individuals are and what they want from the organizations with which they engage. HCD involves using ethnographic research11 and neuroscience to better understand individuals’ unmet needs and using these insights to improve service design and delivery. Importantly, a design-led approach brings end users into the room with stakeholders to engage in rapid prototyping, testing, and iteration of solutions with the people for whom they are created.12
  • Removing bias and emphasizing values and ethics. For experiences to resonate, they must reflect human values, such as trustworthiness, integrity, and honesty—all emotional factors that humans feel about their favorite brands. But in the absence of ethical consensus on so many aspects of cognitive and affective technologies, individual companies on human experience journeys should factor ethical considerations—as well as their organization’s values—into the development of their own AI solutions. As you build human experiences for your customers, employees, and business partners, ask yourself: What does ethical technology mean? How do governance and ethics overlap? Do the algorithms we are creating align with our values and those of society in general? How can you build transparency into AI decision-making?13 And how can you reduce cognitive bias in the development process by having more diverse teams be part of the design?14 (Note: For a deeper dive into the ethical dimensions of technology development, check out the ethical technology and trust chapter of Tech Trends 2020.)

Neuroscientific methods for measuring thought processes

Two decades ago, neuroscience began investigating business-relevant questions by forming links with other disciplines such as economics and behavioral science. Today, this research plays a critical role in the human experience platforms trend. Using the following scientific methods—suggested by the Deloitte Neuroscience Institute—to measure conscious and unconscious human thoughts, organizations can gain valuable insight into individuals’ desires and emotions. They can also test the effectiveness of the sensing and analysis tools.

Electroencephalography (EEG). Measures electrical brain activity related to perception and thinking processes with high temporal resolution.

Eye tracking. Tracks eye movements and gaze in real time to monitor visual focus (mobile and fixed-to-screen versions).

Facial coding. Measures facial expressions to identify emotional reactions.

Galvanic skin response. Measures skin conductance to monitor physiological arousal in response to external events.

Implicit association testing. Reveals implicit beliefs and attitudes that respondents usually do not report in traditional explicit testing methods like interviews and surveys.


Implementing the experience

Once a targeted experience has been designed using a combination of neuroscience, HCD, and ethical guidelines and principles, it’s time to implement it. Companies need to leverage human experience platforms that combine AI, machine learning, natural language processing, visual recognition, and other technologies to make the experience come to life. For example, if an employee contacts an automated internal call center, AI-based voice recognition and natural language processing tools can recognize the nature of the employee’s query from a list of things about which employees typically call. These tools can also detect, based on the caller’s tone of voice, that she is agitated. With this information, an AI-powered customer service bot can deliver the scripted response most likely to defuse this specific situation. The script may direct the bot to express empathy. All scripted responses are designed to help AI systems engage the caller in a human way, but also to keep the technology from violating the caller’s or the organization’s values. And the AI needs to know when to pass the call off to a human operator. In this human experience, designers have set up operating parameters but have left it to affective tools and AI, working in tandem, to fill in the blank spaces with the optimum response.
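
To make this concrete, here is a minimal Python sketch of the triage flow just described: classify the caller’s intent, gauge agitation, pick a scripted response, and hand off to a human when stress runs high. The intents, script table, threshold values, and function names are illustrative assumptions, not any vendor’s actual implementation.

```python
# Illustrative sketch only: the intents, scripts, and thresholds below are
# invented for this example, not any vendor's production logic.

SCRIPTS = {
    ("billing", "agitated"): "I'm very sorry for the trouble. Let's sort out your bill together.",
    ("billing", "neutral"): "Happy to help with your bill. Could you confirm the invoice number?",
    ("benefits", "agitated"): "I understand this is frustrating. Let's walk through your benefits.",
}

ESCALATION_THRESHOLD = 0.8  # stress score above which a human takes the call


def respond(intent: str, stress_score: float) -> str:
    """Pick an empathy-first scripted response, or hand off to a human."""
    if stress_score > ESCALATION_THRESHOLD:
        return "TRANSFER_TO_HUMAN"  # the bot knows when to step aside
    mood = "agitated" if stress_score > 0.5 else "neutral"
    # Fall back to a safe generic line if no script covers this case
    return SCRIPTS.get((intent, mood), "Let me connect you with someone who can help.")


print(respond("billing", 0.6))  # empathetic billing script
print(respond("claims", 0.9))   # TRANSFER_TO_HUMAN
```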

This can pose a whole new set of challenges as choices must be made explicit. Organizations can hire a human call center agent, give them a discretionary budget for returns or waived fees, and assume that they will employ decades of good judgment to dispense it. A virtual agent must be instructed. Moreover, we expect our virtual agents to be unbiased—imagine if the virtual agent were discovered to be consistently waiving fees for one group and not another. Next, virtual agents need context and history. Maybe the first penalty for late payment should be waived, but how about the second? Or the fifth? Finally, AI-powered agents need a set of outcomes for which to optimize. If they optimize for customer happiness, the agents may waive all fees—an outcome that would surely please customers but might not be optimal for the business. Ultimately, building rules to mimic “basic” human intuition is a tall order.
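
The paragraph above is essentially a specification problem: a human agent’s implicit discretion must be written down as explicit, auditable rules. Below is a minimal sketch of what that might look like for the fee-waiver case, with invented limits rather than a recommended policy.

```python
from dataclasses import dataclass


@dataclass
class CustomerHistory:
    late_payments: int    # lifetime count of late-payment penalties
    waivers_granted: int  # fees already waived for this customer


MAX_WAIVERS = 1  # invented business constraint balancing goodwill and revenue


def should_waive_fee(history: CustomerHistory) -> bool:
    """Waive a first offense only; repeat offenses pay the fee.

    Applying one explicit rule to every customer is one simple way to keep
    the virtual agent's behavior consistent, unbiased, and auditable.
    """
    return history.late_payments <= 1 and history.waivers_granted < MAX_WAIVERS


print(should_waive_fee(CustomerHistory(late_payments=1, waivers_granted=0)))  # True
print(should_waive_fee(CustomerHistory(late_payments=5, waivers_granted=1)))  # False
```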

Sample AI technologies supporting human experience platforms

To support the ability to detect stress and emotion in human experience platforms, computers rely on a combination of text analytics, voice analytics, voice recognition and response, video analytics, and more. AI’s increasing ability to use video and voice to measure physical states and detect likely emotional states enables AI agents to respond more appropriately—mirroring mood, gestures, and tone.

Vision systems. Cameras and supporting algorithms to identify people, objects, surroundings, and extrasensory dimensions—thermal signatures, slow motion, ultra-zoom, long distance, and others.

Natural language generation. Generates appropriate responses and vocalizes them into human-like speech.

Natural language processing. Enables the processing of text to understand intent, questions, and queries.

Sentiment analysis. Analyzes text to detect overall sentiment toward a topic—positive, negative, or neutral (see the sketch following this list).

Voice recognition. Translates human speech into text for further processing.

Voice stress analysis. Measures relative stress levels to attempt to identify emotional reactions.
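
As a concrete example of the sentiment analysis capability listed above, the following sketch uses NLTK’s off-the-shelf VADER model, one of many available options; the cutoff values are VADER’s conventional defaults, and the sample sentences are invented.

```python
import nltk

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
from nltk.sentiment import SentimentIntensityAnalyzer

sia = SentimentIntensityAnalyzer()


def classify(text: str) -> str:
    """Label text positive/negative/neutral from VADER's compound score."""
    score = sia.polarity_scores(text)["compound"]  # ranges over [-1, 1]
    if score >= 0.05:  # conventional VADER cutoffs
        return "positive"
    if score <= -0.05:
        return "negative"
    return "neutral"


print(classify("I love how fast your support team responded!"))     # positive
print(classify("This is the third time my order arrived broken."))  # negative
```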


Now connecting

The work of making technology more human is hardly new. Voice assistants that only a few Christmases ago were the coolest gift under the tree are now ubiquitous, and the kiosk bots that engage mall shoppers in amusing ways today will soon be old news. And there are much bigger human experience initiatives underway. We’re already seeing advanced use cases emerge in the biopharmaceutical sector, exploring ways to use augmented reality and virtual reality in care management.15

In the coming months, we expect growing demand for technologies to become more human. We’ve reached a point in the digital revolution at which everyone’s connected to technology but not necessarily to each other. We are disintermediating processes and interactions and engaging directly with machines. It is unsurprising, then, that we crave what we are rapidly losing: meaningful connections. In response, we increasingly expect technology to treat us in more human—and humane—ways. Designing technologies that can meet this expectation will require deeper insights into human behavior, and new innovations that enhance our ability to anticipate and respond to human needs. But the incentive is there. In the near future, human experiences will likely deliver a durable and lasting competitive advantage.

Lessons from the front lines

Digital experience investment strengthens UBS client-adviser connections

In a business built around human interaction and high-touch advisers for high-net-worth clients, UBS seeks to balance the human experience with the digital experience. As part of its digital journey, UBS changed how its high-net-worth and ultra-high-net-worth clients invest and manage their finances, develop investment strategies, and engage with UBS financial advisers.

The investment firm’s primary goal with its recently launched mobile app was to create a digital experience that felt high-touch and human, according to Kraleigh Woodford, head of digital client experience at UBS Wealth Management USA.16 UBS also wanted the app to help deepen relationships between clients and financial advisers by using technology to deliver a more holistic wealth management experience.

UBS knew it needed to boost client-facing technology to incorporate personalized experiences and hands-on interactions. But at the heart of its wealth management business are financial advisers and their teams, who have cultivated deep, long-standing client relationships and connections. Recognizing that any technology solution should not disrupt the adviser-client bond, the firm sought to create a solution that would supplement and enhance rather than supplant these relationships.

To meet these objectives, UBS required a development process that was emotionally sensitive to the needs of its clients and advisers. The firm adopted an agile, business-led approach to product development, colocating business and technology teams to help ensure successful incorporation of both adviser and client feedback.

UBS leveraged its financial advisers’ deep reservoir of client knowledge, collecting and incorporating their feedback into the creation process. It also brought clients into the design process, seeking to recognize and adapt to customer behaviors and preferences through end-user testing and research. The design team examined investment strategies, buying patterns, personal aspirations, and consumer choices to understand client definitions of wealth, key financial goals, and important milestones and achievements.

At the heart of the app is an AI-driven personalization engine. To calibrate and customize the client experience, the app asks questions that allow clients to share information about their specific interests, concerns, and long-term needs. It also incorporates information about their philanthropic interests, individual goals, and important people and relationships. An algorithm identifies tailored wealth management content so that clients receive a curated selection of investment and financial information.
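
UBS has not published the inner workings of its personalization engine, so the following Python sketch is purely illustrative: it ranks tagged content by cosine similarity to a client’s stated interests, one simple way such curation can work. The topic vocabulary, article titles, and tags are all hypothetical.

```python
import numpy as np

TOPICS = ["philanthropy", "retirement", "education", "real_estate"]


def interest_vector(interests: set[str]) -> np.ndarray:
    """One-hot vector over the (hypothetical) topic vocabulary."""
    return np.array([1.0 if t in interests else 0.0 for t in TOPICS])


def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))


client = interest_vector({"philanthropy", "education"})
articles = {
    "Giving strategies for family foundations": interest_vector({"philanthropy"}),
    "529 plans and beyond": interest_vector({"education", "retirement"}),
    "Commercial property outlook": interest_vector({"real_estate"}),
}

# Rank content by similarity to the client's stated interests
for title, vec in sorted(articles.items(), key=lambda kv: -cosine(client, kv[1])):
    print(f"{cosine(client, vec):.2f}  {title}")
```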

Research indicated that clients wanted to share more data with advisers but didn’t know the best way—and didn’t want to take up too much of their advisers’ time. To address this challenge, client information and insights flow directly into adviser-facing systems. Using data from the app, advisers can initiate conversations about wealth-building and personal goals and strategies, thereby enhancing ongoing client-adviser relationships.

Woodford says that since launching the app in March 2019, half of the clients using it have shared their interests and concerns to create a more tailored experience; a quarter have shared important milestones in their investment journeys. And UBS has realized a significant uptick in the number of high-net-worth and ultra-high-net-worth clients using the app compared to the web portal. “Our motto was ‘People first, then product.’ This allowed us to balance practical needs such as accelerating growth with emotional needs such as supporting the client-adviser relationship,” Woodford says. “We will continue to focus on this lesson.”

Brain-computer interfaces improve wellness and performance by tracking emotions

During an extended shift, an air traffic controller is handling a high volume of passenger jets and private planes. The day has been chaotic and stressful, with a couple of unexpected police and medical helicopter landings and a drone that deviated from its intended flight path. The controller hasn’t taken a break in several hours and is feeling fatigued. She tries to focus on the radar system, but it sends her a message: “Christina, it’s time for a break. Let’s find someone to cover for you.”

The air traffic controller is wearing brain-sensing earbuds that contain electrodes measuring her brain’s electrical activity. Upon analyzing these electroencephalographic (EEG) signals—the same ones used by doctors to understand brain (dys)function—machine learning algorithms detected increased distraction and stress in the controller’s brain activity patterns and decided that she needed a pause.
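
As a simplified illustration of this kind of analysis, the sketch below computes a theta/beta band-power ratio, a heuristic long used in drowsiness research, on a synthetic EEG signal. The single channel, sampling rate, and threshold are assumptions for demonstration; production systems such as EMOTIV’s rely on trained classifiers over many features.

```python
import numpy as np
from scipy.signal import welch

FS = 256  # sampling rate in Hz (assumed)


def band_power(freqs: np.ndarray, psd: np.ndarray, lo: float, hi: float) -> float:
    """Approximate band power by summing PSD bins within [lo, hi) Hz."""
    mask = (freqs >= lo) & (freqs < hi)
    return float(psd[mask].sum())


def fatigue_ratio(eeg: np.ndarray) -> float:
    """Theta (4-8 Hz) to beta (13-30 Hz) power ratio; higher suggests drowsiness."""
    freqs, psd = welch(eeg, fs=FS, nperseg=FS * 2)
    return band_power(freqs, psd, 4, 8) / band_power(freqs, psd, 13, 30)


# Synthetic 10-second signal dominated by 6 Hz (theta) activity plus noise
t = np.arange(0, 10, 1 / FS)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 6 * t) + 0.3 * rng.standard_normal(t.size)

ratio = fatigue_ratio(eeg)
print(f"theta/beta ratio: {ratio:.1f}", "-> suggest a break" if ratio > 2.0 else "")
```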

A long-established medical and research tool, EEG allows physicians to establish medical diagnoses and enables researchers to understand the brain dynamics underlying human decision-making processes and behavior. It can also help individuals improve wellness and performance, says Professor Olivier Oullier, president of EMOTIV, a leading neurotechnology and bioinformatics company.17 EMOTIV develops EEG-based wearable brain-computer interface systems that can be used to monitor cognitive performance and emotional reactions to inform workplace wellness, learning, safety, and productivity, and to capture consumer insights.

EMOTIV has miniaturized wireless EEG systems and machine learning–based neurotechnology to develop a wearer-friendly form factor that detects brain activity as accurately as laboratory EEG caps that prevent mobility—and are neither stylish nor comfortable. The company’s MN8 device looks and functions like standard Bluetooth earbuds, but squeezed inside is a mobile EEG lab that can measure and analyze levels of stress and distraction and provide the wearer—or other connected systems—with feedback on how to optimize wellness and performance.

Digital EEG signals are interpreted immediately using real-time analysis; optionally, they can be sent to the cloud for more advanced analysis and storage of the data at scale. EMOTIV’s machine learning algorithms have been trained to identify and classify neuro-markers for different cognitive and affective states using a decade of EEG data sets. Data has been accumulated both through scientific studies involving thousands of volunteers who were taken through various experiences to prompt different levels of the desired brain state18 and from nearly 100,000 neuroheadset owners who volunteered to share their real-life data anonymously with the company.

Insight into exactly how people’s brains are reacting and evolving moment to moment, over time and in context of their actions, can be more valuable than self-reporting of emotions via written survey or verbal response. Self-reporting is important but doesn’t paint a complete picture, since answers represent only a single moment in time and are often influenced by what people think others want or expect them to say.

“Until recently,” Oullier says, “stress, focus, mental fatigue or cognitive load were challenging to measure scientifically and rigorously. In fact, they are cognitive and affective states that can now be detected by EEG neurotechnology. Quantifying these cognitive states in real time and in real-life situations such as in the workplace finally bridges the gap between perception and reality that exists when people self-report what they feel and experience.”

As in the case of the air traffic controller, organizations can leverage real-time analysis of cognitive data to improve individual employee wellness, performance, productivity, and safety by instructing workers to take breaks when they’re tired, changing the difficulty or the format of an interactive training or onboarding process when an employee is unfocused, or switching the employee to a less stressful task.

Companies can also mine the aggregated data to understand behavioral and work patterns. Taking these patterns into account, they can optimize workflows and procedures—for example, by building more breaks into workers’ schedules, changing shift hours to avoid stressful commute times, or moving the time of a meeting that requires high levels of employee attention. “Brain data can shed light on the types of environments and situations that allow employees to flourish so that workplaces can adjust,” Oullier says. “The purpose is to leverage neuroinformatics to personalize the work experience thanks to dynamic workplaces and systems that are more responsive to what employees feel.”

Smart, sensitive, and efficient: A cognitive agent even a curmudgeon can trust

Virtual agents are increasingly the first points of contact for customers or employees who need help or information. When communicating with a machine, few callers expect more than an efficient way to get a fast answer to a simple question. But expectations are changing as some companies look to combine a virtual agent’s efficiency with the problem-solving capabilities and emotional connections that a human agent can provide.

Increasingly, these companies are investing in sophisticated virtual support platforms that incorporate intelligent systems with affective computing—what some call “cognitive agents.”19 IPsoft says, “What’s valuable about a cognitive agent is that it can help build trust,” which encourages humans to use it for increasingly complex issues. The company sees three steps that cognitive agents need to perform effectively to establish trust: demonstrate understanding, classify the issue, and select appropriate next steps.

First, demonstrating understanding—especially when encountering human emotion—is one of the main use cases for cognitive agents. In many enterprise settings, human and cognitive agents are trained to follow a script when responding to questions and requests. Human agents tend to instinctively mirror callers by expressing a sentiment that demonstrates understanding, such as, “I’m very sorry to hear that” or, “Oh, that’s great!” After mirroring the emotion, the agent tries to move on to the next step in the script. Cognitive agents use advanced AI techniques such as sentiment analysis to detect and mirror emotion before moving on through the script.
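
This mirror-then-advance pattern can be expressed in a few lines of Python; the phrases, script steps, and function names below are hypothetical stand-ins rather than any vendor’s actual logic.

```python
MIRROR = {
    "negative": "I'm very sorry to hear that.",
    "positive": "Oh, that's great!",
}

SCRIPT = ["Can I have your order number?", "What outcome would you like?"]


def turn(detected_sentiment: str, step: int) -> list[str]:
    """One conversational turn: mirror the detected emotion, then advance."""
    reply = []
    if detected_sentiment in MIRROR:
        reply.append(MIRROR[detected_sentiment])  # acknowledge the emotion first
    if step < len(SCRIPT):
        reply.append(SCRIPT[step])  # then move the script forward
    return reply


print(turn("negative", 0))
# ["I'm very sorry to hear that.", 'Can I have your order number?']
```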

Second, cognitive agents can learn to automatically identify and classify issues using AI text analysis and natural language processing (NLP). Episodic memory enables cognitive agents to recall information that may be required later in the conversation, avoiding the need to assume information or repeat a question. Recent improvements in NLP will equip cognitive agents to handle new phrases, utterances, and colloquialisms. Together, these capabilities help cognitive agents better understand and classify issues.

Finally, as companies learn to trust the ability of cognitive agents to classify underlying issues and select appropriate next steps, fewer are transferring customers to human agents. Not only can a cognitive agent develop skills for handling negative emotions—it can develop the ability to identify ways to increase customer loyalty, such as selecting qualified and amenable customers for upsell and cross-sell opportunities. And cognitive agents can increasingly detect when a customer should be escalated to a human agent, whether based on the company’s policies and rules, the detection of extreme negative emotions, or issues requiring regulatory, audit, adjudication, or judgment calls.

IPsoft has incorporated mirroring into the text and voice recognition capabilities of its cognitive agent, Amelia. Now, the company is working on a next-generation appearance for video calls that will mirror emotions through sympathetic facial expressions. The company is also experimenting with voice and video biometrics to enhance Amelia’s ability to discern human emotions by comparing a person’s voice or expression against a normal baseline; when people feel tense or upset, their voices or expressions typically change.
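
One simple way to implement such baseline comparison is to flag feature values that deviate sharply from a speaker’s personal norm. The sketch below applies a z-score test to mean voice pitch; the feature choice, numbers, and cutoff are assumptions for illustration only, not IPsoft’s method.

```python
import numpy as np


def stress_flag(pitch_hz: np.ndarray, baseline_mean: float,
                baseline_std: float, z_cutoff: float = 2.0) -> bool:
    """True when mean pitch deviates more than z_cutoff std devs from baseline."""
    z = abs(pitch_hz.mean() - baseline_mean) / baseline_std
    return bool(z > z_cutoff)


calm = np.array([118.0, 121.0, 119.5])   # Hz, near this speaker's usual pitch
tense = np.array([142.0, 150.0, 147.0])  # noticeably elevated pitch

print(stress_flag(calm, baseline_mean=120.0, baseline_std=6.0))   # False
print(stress_flag(tense, baseline_mean=120.0, baseline_std=6.0))  # True
```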

IPsoft advises organizations that are planning to use a cognitive agent to observe and learn how their most effective human agents work with a diverse range of customers and situations, to study how they follow scripts and respond to customer emotions, and to use these insights to train models and create standard operating procedures for their cognitive agent. The company also suggests that organizations consider giving experienced human and cognitive agents more latitude to make judgment calls.

It’s possible that with appropriate training, a cognitive agent can perform as well as or, perhaps, even better than a human agent because it encounters and learns from many more customer situations than any human is likely to see. And cognitive agents are available 24/7 and can react to increases in demand by simply adding more computing power. The goal is not to fool customers into thinking they’re dealing with a human agent—it’s about providing faster, more efficient service that builds customer trust and loyalty.

My take

Anil Bhatt, Chief experience officer, Anthem

A few years ago, Anthem adopted a goal of providing world-class consumer experiences while improving members’ health and well-being. We also sought to deepen our relationships with members of our affiliated health plans.

To do this, we needed to provide frictionless, consumer-centric interactions, which required us to enhance and expand our engagement skills as well as increase the level of emotional intelligence with which we approached health plan members.

We’re leaning heavily on advanced predictive data analytics, cognitive technologies, and augmented reality to achieve these goals. For example, to create a stress-free service experience, we’ve made significant investments in digital assistant technologies, predictive modeling with voice recognition, natural language processing, voice pattern identification, and sentiment analysis to analyze and predict consumer emotion and feedback in real time. This allows us to improve customer service as we engage with plan members across their preferred channels, especially as the ongoing growth of digital voice assistants signals an increasingly voice-oriented future.

With access to detailed data from a consumer’s health insurance plan, our digital assistant provides personalized engagement and helps health plan members complete administrative tasks. The digital assistant technologies can identify when a person is growing frustrated, determine that it’s time to route the customer to a human customer service representative, and do just that. This effort has helped us achieve higher customer satisfaction scores and task completion rates on our various portals and mobile apps, and our customer service center has demonstrated measurable improvements in first call resolution and average handle time.

We’ve also sought to help health plan consumers lower their likelihood of future illness in part by deepening relationships with them. Historically, individuals contacted us primarily to ask administrative questions about plans, premiums, benefits, or claim status. But we have the experience and know-how to build more meaningful member relationships, expanding our role from that of a health care administrator to that of a health care adviser that can help plan members choose healthier lifestyle options and guide them toward appropriate preventative care.

We do this by understanding their current health concerns and challenges and proactively designing and delivering tailored programs that build on our long-standing use of data-driven insights and cognitive technologies. Using predictive data models, we can identify individuals at risk for negative health outcomes and create personalized coaching and intervention programs—including follow-up care, social support, positive messaging, education programs, and health care advice—that help them better manage a disease. We can also turn the case over to a care provider who can work with the person to lead a healthier life.

In addition, augmented reality technology is helping us simplify the consumer experience. We get it: Nobody enjoys reviewing or completing applications and enrollment forms or insurance claims. So we’re testing an AR-based mobile app to reduce some of the apprehension associated with reviewing and completing complex documents and forms. Using a mobile device camera, the app converts insurance-centric language into more commonly used phrases and terms, helping consumers quickly identify important information and visualize where to sign or initial forms. This has helped speed the process of reviewing and completing forms, reducing member frustration.

As Anthem transforms to a digital-first organization focused on enhancing the consumer experience, we’ll continue to explore innovative ways to reduce the stress and frustration in consumer interactions and develop more meaningful relationships with health plan members. By leveraging advanced data analytics, cognitive technologies, and AR to deliver valuable, frictionless experiences and interactions, we can become a trusted health care partner that enables healthier lifestyles.

Executive perspectives

STRATEGY

CEOs and other strategy leaders immerse themselves in every aspect of the client experience to become the ultimate client champions and end-user ethnographers. What are customers’ most subtle habits, desires, and subconscious concerns? With a nuanced understanding of these factors, CEOs can champion intuitive human experiences by pushing data to its maximum limit and setting the highest standards for execution. While machine learning and AI promise better signal detection, they are far from a complete solution to this challenge. Human experience platforms, by combining context (status of current account/order/payment), emotional state (inferred from sentiment detection, voice stress analysis, facial expressions, and more), and interaction proclivities (inferred from customer history), can help optimize the end-user experience across channels and interactions. This, in turn, can help create a more consistent human experience for all.

FINANCE

Targeted investments in technologies that continually improve the user experience can offer a clear value proposition. AI-based technologies are improving the detection of human emotion through sentiment analysis, voice stress measurement, and facial expression detection. And machine learning can help identify the likely cause for user contact—or even suggest a proactive outreach. With these changes, intuitive bots can increasingly handle user contacts that traditionally required human agents. Consider funding exploratory efforts or insisting that IT help other leaders identify potential use cases, benefits, and ROI. Expect use cases to proliferate as enterprise functions recognize opportunities to reposition products and services by making user experiences more emotionally intuitive and context-appropriate. Some of these opportunities will likely involve transforming existing business strategies and value streams. At this stage, the value propositions that human experience investments offer become more complex—and more compelling.

RISK

The kind of intuitive, emotionally intelligent human experiences to which leading organizations are shifting will affect ongoing risk management efforts in areas such as operations, marketing, finance, and management. The big difference going forward will be data—enormous volumes of highly personal data that reveal people’s emotional states, the real-time contexts of their interactions, and their life stories. In this environment, the potential for fraud and identity theft may grow. How can cyber and risk leaders adequately protect myriad types of data that organizations have never captured before, such as information generated by eye-tracking platforms and exoskeleton gait analysis? Likewise, human experience data introduces a web of ethical issues. Human experience data will be harvested, analyzed, aggregated, and used in various ways to support differing enterprise strategies. Are there kinds of data that companies will not collect? Are there limits to the ways in which companies will use collected data? And who owns aggregated data? Expect the scope and complexity of risk management to grow as leaders try to answer these and similar questions.

Are you ready?

  1. What experience do you want your customers, employees, and partners to have when they engage with your organization? What company values do your experiences convey?
  2. Which of your existing customer digital interactions could be pilots for demonstrating emotional and contextual understanding with affective computing?
  3. How are you piloting human-centered design, ethical technology, and neuroscientific research capabilities to shape the development of your human experience platforms?

Bottom line

The human experience platforms trend is fueled by a growing demand from system users that technologies engage us in more meaningful, human-like ways. In the coming years, we expect this demand to become a nonnegotiable expectation. Today, trend pioneers are integrating affective computing, AI, and neuroscientific research into their strategies and systems to transform the rules of user engagement. In the near future, “emotionally intelligent” technologies and tactics will likely give rise to new business models and ways of working. When that day comes, companies that didn’t get around to developing their own human experience platforms could find themselves at a significant competitive disadvantage.


Learn more

  1. Emotion-driven engagement: Learn why the ability to use emotional data at scale represents one of today’s biggest business opportunities.
  2. AI & cognitive technologies: Explore how cognitive technologies can help leaders make wise strategy and technology choices.
  3. Paying down the experience debt: Read how leading brands use their values to elevate the human experience.

Senior contributors

Human experience platforms

Scott Mager
Principal
Deloitte Consulting LLP

Steve Rayment
Partner
Deloitte Touche Tohmatsu

Robbie Robertson
Partner
Deloitte Touche Tohmatsu

Tânia Conceição
Manager
Deloitte & Associados, SROC S.A.

Executive perspectives authors

Strategy

Benjamin Finzi
Managing director
Deloitte Consulting LLP

Finance

Ajit Kambil
Managing director
Deloitte LLP

Moe Qualander
Principal
Deloitte & Touche LLP

Risk

Deborah Golden
Principal
Deloitte & Touche LLP


Cover image by: Vasava

  1. Affectiva.com, “Emotion AI overview: What is it and how does it work?”; Rana el Kaliouby, “Building emotionally aware cars on the path to full autonomy,” VentureBeat, February 11, 2017.
  2. Laduram Vishnoi, “How AI changed customer service in the IT industry,” Entrepreneur Magazine, February 17, 2018.
  3. Julia Muro, “At New York Fashion Week, this hi-tech experience steals the show,” Forbes, September 4, 2019.
  4. Anthony Ha, “McDonald’s is acquiring Dynamic Yield to create a more customized drive-thru,” TechCrunch, March 25, 2019.
  5. Tim Greulich et al., Exploring the value of emotion-driven engagement, Deloitte Digital, May 8, 2019.
  6. Amelia Dunlop et al., We’re only human: Exploring and quantifying the human experience, Deloitte Digital, August 7, 2019.
  7. Ibid.
  8. Research and Markets, “Affective computing market by technology, component, vertical, and region, global forecast to 2024,” November 2019.
  9. Angel Vaccaro et al., Beyond marketing: Experience reimagined, Deloitte Insights, January 16, 2019.
  10. Companies have used some elements of biometric advertising for several years. Shawn Patrick, “Twelve years later, ‘Minority Report’ advertising is here,” Recode, April 28, 2014.
  11. Kelly Moran, “An ethnographic approach to software,” Methods and Tools, Fall 2015.
  12. Tiffany Fishman et al., Elevating the human experience, Deloitte Insights, October 30, 2019.
  13. Nitin Mittal, Dave Kuder, and Samir Hans, AI-fueled organizations, Deloitte Insights, January 16, 2019.
  14. Kavitha Prabhakar, Kristi Lamar, and Anjali Shaikh, Innovating for all: How CIOs can leverage diverse teams to foster innovation and ethical tech, Deloitte Insights, November 18, 2019.
  15. Alex Keown, “Could augmented reality benefit patient experience in managing healthcare?,” BioSpace, July 5, 2018.
  16. Kraleigh Woodford (head of digital client experience, UBS Wealth Management USA), phone interview, October 9, 2019.
  17. Olivier Oullier (president of EMOTIV), phone interview, November 20, 2019.
  18. EMOTIV, “The science behind our technology,” accessed December 9, 2019.
  19. Chris Butler (chief product architect, IPsoft), interview, November 27, 2019.
