Affective computing: Emotion-aware technology and the evolution of human-centered design

Affective computing can improve public services, but government leaders should put in place guardrails to address ethical concerns

Alan Holden

United States

Ben Szuhaj

United States

In the government service center of a bustling city, Lisa nervously approaches a kiosk to apply for a housing aid program. She feels her anxiety spike. She has always found the complex forms and rigidity of processes like these overwhelming and confusing. But today, she notices something different: an interactive interface glowing with a warmth that feels almost human.

As Lisa begins her interaction, the system senses the stress in her voice and hesitation in her responses. It immediately simplifies the language of its content, slows its pace, and offers words of encouragement.

“Don’t worry, Lisa,” the machine reassures her, in a soothing voice imbued with empathetic undertones. “We’re going to make this as simple as possible, and I’ll be here to guide you through the entire process.”

As Lisa engages with the kiosk, data on her emotional responses is captured and combined with data from similar interactions occurring at kiosks across the city. An analytics platform identifies trends in the data, which government policymakers examine to shape future policies and improve government websites and mobile applications.

These are not scenes from a far-off, utopian future. They’re the budding realities of a not-too-distant tomorrow, one in which government harnesses technology that captures and analyzes human emotional data to build empathetic digital interfaces, create two-way responsive policy feedback mechanisms, and transform the citizen-government relationship.

Welcome to the world of affective computing—an interdisciplinary field that combines technology, computer science, psychology, and cognitive science to interpret, analyze, and simulate human emotions. Affective computing is evolving rapidly, and it’s a field that offers opportunities to help enhance public sector service delivery. However, government leaders should be mindful of the risks and ethical challenges that come with it.

Bridging the emotional gap: Why affective computing matters now

In December 2021, US President Joe Biden issued an executive order focused on improving citizens’ customer experience with the federal government.1 The directive requires federal agencies such as Health and Human Services, Veterans Affairs, Education, and Homeland Security to improve the customer experience on platforms used for services such as tax filing, travel, retirement, health, and disaster management.

One of the primary methods public sector leaders have traditionally used to better understand their stakeholders’ needs and improve product and service delivery is human-centered design (HCD). However, while HCD approaches—such as user research, ethnography, usability testing, and rapid prototyping—can provide rich, qualitative insights, they also have limitations. For example, they are often time-consuming, expensive, and difficult to quantify or conduct at scale. They can also be subject to bias, relying both on stakeholders’ ability to articulate what they want and on researchers’ ability to consistently and accurately translate qualitative data points into actionable insights.

HCD can also struggle to accurately capture subtle changes in users’ emotional states throughout an interaction with a platform, application, or experience, particularly when those interactions are highly automated or involve little human-to-human interaction. The growing use of artificial intelligence in government systems underscores the need for these systems to be not only efficient but also emotionally intelligent.

Taken together, the rise of AI, the need for context-aware technologies, and government efforts to improve service delivery are contributing to an increased need to capture more dynamic information about stakeholders’ experiences at a greater scale than ever before.

And this is where affective computing can be a game-changer.


“Machines can’t respond intelligently to human emotions unless they can perceive those emotions and know how to react to them. And responding intelligently to human emotions drives a variety of desirable benefits, including better physical and mental health, communication, productivity, effectiveness, and decision-making.”2

—Dr. Rosalind Picard, director of affective computing research, Massachusetts Institute of Technology Media Lab


Affective computing technologies may represent the next evolution of human-centered design, capable of capturing exponentially more data points about an individual’s emotional state while reducing both the time required and the need for human-to-human interaction. When used strategically, they can help governments better understand citizens’ emotional needs and make public service delivery more empathetic and effective. And when used in tandem with AI, they can help AI agents provide more human-like, emotion-aware interactions that can result in higher degrees of user satisfaction.

How affective computing works

An affective computing system uses sensors to gather data from inputs such as facial expressions, tone of voice, and body language, which are then processed using advanced machine learning algorithms to derive insights about the user’s emotional state.

Affective computing as a category includes technologies with four overarching capabilities:

  • Emotion data capture involves collecting data that reflects an individual’s emotional state. Sources include facial coding and eye tracking for recording where a viewer’s eyes focus and for how long, galvanic skin response (GSR) for measuring skin conductivity and excitement, electroencephalogram (EEG) caps for recording brain waves, electrocardiogram (ECG) for recording heart rate, and voice analysis.
  • Emotion analytics is the process of analyzing this data to understand the individual’s emotions.
  • Emotion response involves systems responding appropriately to an individual’s emotions.
  • Emotion simulation involves systems mimicking human emotions.
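To make these four capabilities concrete, the sketch below strings the first three into a minimal pipeline. It is purely illustrative: the `EmotionReading` signals, the weights, and the stress threshold are invented for this example, and a production system would use trained machine learning models rather than a hand-tuned formula.

```python
from dataclasses import dataclass

@dataclass
class EmotionReading:
    """One sample of multimodal emotion data (the capture step).
    Signal names and 0-1 normalization are invented for illustration."""
    voice_pitch_variance: float  # from voice analysis
    skin_conductance: float      # from a GSR sensor
    response_latency_s: float    # hesitation between prompts, in seconds

def infer_stress(reading: EmotionReading) -> float:
    """Analytics step: collapse raw signals into a single 0-1 stress score.
    A real system would use a trained model; this weighted average is a stand-in."""
    latency = min(reading.response_latency_s / 10.0, 1.0)
    return (0.5 * reading.skin_conductance
            + 0.3 * reading.voice_pitch_variance
            + 0.2 * latency)

def adapt_interface(stress: float) -> dict:
    """Response step: adjust pacing and language to the inferred state."""
    if stress > 0.6:
        return {"reading_level": "simple", "pace": "slow", "offer_help": True}
    return {"reading_level": "standard", "pace": "normal", "offer_help": False}

# A stressed user: high skin conductance and long hesitations.
anxious = EmotionReading(voice_pitch_variance=0.8, skin_conductance=0.9,
                         response_latency_s=8.0)
print(adapt_interface(infer_stress(anxious)))
# → {'reading_level': 'simple', 'pace': 'slow', 'offer_help': True}
```

The fourth capability, emotion simulation, would sit on top of this loop, for example by rendering the kiosk’s responses in an empathetic voice.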

Affective computing technologies can understand a broad range of human emotions, including basic emotions like happiness, sadness, anger, fear, surprise, and disgust, and more nuanced emotional states and mood changes.3 Driven by advancements in sensors and processing, and the rise of devices like smartphones and wearables with emotion-sensing capabilities, the affective computing market has seen a boom. Valued at US$28.6 billion in 2020, it is projected to reach US$140.0 billion by 2025.4 The demand for emotion-aware user experiences, which affective computing can deliver, is an important factor in this growth.

Current and future use cases of affective computing

Affective computing is already transforming how many organizations engage with their customers and manage their workforces. Consider the advertising industry. A study of 100 advertisements revealed that ads with above-average emotional responses generated a 23% lift in sales, and a 2022 study found “emotional attachment” was the biggest driver of value across 59% of customer groups.5 This could be why a company that uses emotion AI technology to measure ad effectiveness claims its technology is used by 70% of the world’s largest advertisers to understand viewers’ reactions to content and experiences, maximizing brand return on investment.6

Governments have similarly started exploring how affective computing solutions can drive improved mission outcomes. For example, when the United States Special Operations Command needed to vet hundreds of Afghan commando recruits without jeopardizing the safety of US personnel, it worked with a third party to use affective computing–based voice analytics software to screen 715 recruits in just 20 hours. The solution had an accuracy rate of more than 95%, with 2.4% false positives and no false negatives.7 In 2020, the National Institutes of Health’s National Library of Medicine published a study on assessing the severity of depression in adolescents based on vocal and facial modalities. Such use cases can help agencies tackle mental health issues.8

Looking to the next five years and beyond, the use cases for affective computing in the public sector are expected to continue to grow and potentially include:

  • Personalizing interactions and experiences: The private sector has harnessed affective computing to create personalized shopping experiences. Government service centers can do something similar. The Department of Motor Vehicles, for example, could use this technology to customize individual visits, reduce wait times, and improve perceptions by adjusting procedures based on the visitor’s emotional state. Taking this a step further, government websites could dynamically adapt content and services to each citizen’s needs, providing a truly bespoke digital experience that evolves with societal trends and individual feedback. Soon, affective computing could be integrated with AI-powered virtual assistants to create highly personalized, emotionally intelligent interactions where the unique needs of each citizen could be anticipated and addressed.
  • Driving increased compliance: Government agencies could use affective computing to gain insights into citizens’ emotional states to tailor communications and foster trust. For instance, the Internal Revenue Service could employ this technology to analyze taxpayers’ emotional responses to different outreach initiatives, then dynamically adjust messages to enhance compliance, and simplify the tax submission process. In the future, affective computing could be integrated with predictive analytics to proactively identify and address potential noncompliance triggers, preemptively addressing issues before they escalate.
  • Designing new products, policies, and services: Government agencies could harness affective computing to design and test public services or infrastructure projects, helping them resonate positively with citizens and meet their emotional and functional needs. As simulation technologies advance, governments could use affective computing in virtual environments to prototype, test, and iterate on large-scale urban planning projects, policies, and services before implementing them in the real world. In a few years, affective computing could be used to create digital twins of entire communities, allowing governments to simulate the emotional and behavioral impacts of policy decisions, urban developments, and emergency response scenarios, optimizing for citizen well-being and community resilience.
  • Preventing and detecting fraud: Affective computing’s ability to analyze vocal and facial cues can be a powerful tool in the fight against fraud. Agencies like the United States Social Security Administration could integrate this technology to verify identities and detect deceptive behaviors during online applications or interviews, helping to enhance security and reducing fraudulent claims. As AI and machine learning capabilities advance, affective computing could even predict and prevent complex fraud schemes by analyzing behavioral patterns and emotional signals across multiple channels in real time. In the days to come, affective computing could be used to help proactively monitor government systems and transactions, flagging anomalies and high-risk activities for immediate investigation, revolutionizing the way agencies combat fraud and safeguard public funds.
  • Optimizing workforce readiness: In high-stress professions like law enforcement, defense, and emergency response, affective computing can play a role in monitoring emotional and cognitive states, helping to ensure optimal performance, increasing readiness, and promoting mental health. With the help of affective computing–enabled sensors and Internet of Things devices, organizations can create citywide or nationwide networks that support the mental readiness of their critical workforce, adapting in real time to ensure maximum efficiency and safety, or monitor the mental health of individuals at risk of self-harm. Over the next five years, integrating affective computing with advanced simulation and virtual reality environments may become possible, allowing government agencies to assess the emotional and cognitive impacts of high-stress scenarios and develop targeted interventions to enhance resilience and decision-making skills.
  • Transforming training, learning, and development: Affective computing can revolutionize government training programs, providing customized learning paths that adapt to the emotional and cognitive states of employees, thereby improving efficiency and job satisfaction. Furthermore, the integration of affective computing with augmented and virtual reality could create immersive, emotionally responsive training environments, enhancing skill acquisition and performance in complex tasks. In the near future, we may see AI-powered “emotional coaches” that can leverage affective computing to provide personalized guidance and support to government employees, helping them navigate the challenges of high-stress roles and fostering a more resilient and empathetic workforce.

Pairing affective computing with other emerging technologies

Affective computing’s potential to revolutionize the public sector multiplies when the technology is combined with other emerging technologies, driving sweeping changes in government services and ushering in a new era of more empathetic, responsive, and human-centric public service delivery.

Generative AI: Generative AI can create new content from the patterns it learns from input data.9 But what if that input data included actual emotional responses? In the immediate future, the fusion of affective computing and generative AI could facilitate deeper processing of emotional undertones in citizen communications, propelling government services toward not only efficiency but also emotional connectivity, which could decrease response times and elevate citizen satisfaction and engagement. In the long term, AI models integrated with affective computing could revolutionize public service delivery, dynamically adapting to the emotional state and situational context of each citizen and crafting personalized interaction pathways, ushering in an era of “predictive service delivery” in which the government anticipates citizens’ needs and seamlessly provides the required support.
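One way to picture this fusion in practice: an inferred emotional state becomes part of the context handed to a generative model. The sketch below is a hypothetical illustration; the `build_prompt` helper, its thresholds, and its tone rules are invented for this example, and no particular model or API is assumed.

```python
def build_prompt(citizen_message: str, stress_level: float) -> str:
    """Fold an inferred emotional state (0-1) into a generative model's
    instructions. The thresholds and tone rules are invented for illustration."""
    if stress_level > 0.7:
        tone = "Use short sentences, plain language, and a reassuring tone."
    elif stress_level > 0.4:
        tone = "Be clear and patient, and offer to break tasks into steps."
    else:
        tone = "Be concise and informative."
    return ("You are a government service assistant. "
            f"{tone}\n\nCitizen: {citizen_message}")

# High inferred stress steers the model toward simpler, reassuring language.
print(build_prompt("I don't understand this housing aid form.", stress_level=0.85))
```

The same principle scales beyond tone: the emotional context could also shape which services the model proactively offers.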

Mixed reality: In the future, government agencies could potentially blend augmented reality (AR) and virtual reality (VR) with affective computing to transform their approaches to training, educational initiatives, and citizen interaction. By capturing emotional data, organizations could create training environments that evoke realistic stress and emotional responses to enhance learning outcomes and improve preparedness by providing a safe space to experience and react to stress. Additionally, imagine a scenario where citizens use virtual reality headsets to virtually explore and emotionally interact with proposed urban development plans. Their emotional responses, captured and analyzed through affective computing, could provide city planners with valuable insights, helping to ensure that public spaces are designed to foster positive emotional experiences and meet the community’s needs.

Digital twins: Digital twins—specific virtual representations of physical things or experiences—could, when integrated with affective computing, significantly impact urban planning and public service delivery. If digital replicas of cities or public spaces were infused with real-time emotional data from citizens, planners and policymakers could simulate and analyze how changes in the environment affect public sentiment. This integration could enable a more nuanced understanding of the public’s emotional response to urban changes, leading to design decisions that promote positive emotional well-being and public satisfaction. Looking ahead, governments that shape digital twins of entire cities using affective data could build comprehensive models that dynamically adapt urban environments and services to optimize public well-being. For instance, lighting, public transit schedules, and even the layouts of parks and public spaces could be adjusted in real time based on the collective emotional data of the city’s residents.

These cutting-edge integrations represent the next wave of technological innovation in the public sector, offering governments the opportunity to provide more empathetic, responsive, and human-centric services. But this may require public sector organizations to invest in robust emotional data analytics infrastructure, develop ethical AI frameworks that prioritize emotional intelligence, and foster AI systems that can seamlessly integrate with existing digital services.

Affective computing risks, considerations, and ethical concerns

Despite its promise, affective computing also raises ethical concerns that require careful thought and strategic action. Prominent among them are privacy and surveillance issues, as these technologies inherently involve collecting and analyzing personal emotional data. Potential biases in emotion recognition algorithms also pose significant concerns, given the potential for these biases to propagate discrimination or unfair treatment.10 Furthermore, the risk of overreliance on automation, thereby diminishing the human element in decision-making and personal interaction, should be carefully studied.

These and other concerns have already prompted policymakers to act. The European Union Artificial Intelligence Act, set to take effect in 2024, introduces a sliding scale of regulations based on an AI system’s risk, with applications that might infringe on privacy or manipulate behavior facing strict control.11 As governments explore affective computing’s potential, they should uphold privacy and security standards that enable trust in affective computing applications. Some guiding principles that government leaders can consider to mitigate perceptions of manipulation when deploying affective computing technologies include:

  • Transparency: Government organizations should clearly communicate to users when and how their emotional data is being collected and used. Transparency can help build trust and prevent perceptions of manipulation.
  • Consent: They should ensure explicit user consent is obtained before collecting emotional data. Users should have the option to opt out at any stage.
  • Purpose limitation: Agencies should use emotional data only for the specific purposes stated to the user and avoid repurposing data for unintended uses.
  • Bias mitigation: They should continuously monitor and address biases in emotion recognition algorithms to ensure fair and equitable treatment of all users.
  • Human supervision: Agencies should adopt a human-in-the-loop approach to decision-making processes influenced by affective computing, proactively identifying the critical decision points where human judgment is essential so that ethical judgment and personal interaction are preserved.
  • Ethical guidelines: They should develop and adhere to ethical guidelines that prioritize user well-being, privacy, and autonomy over technological capabilities.
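As a rough illustration of how the consent and purpose-limitation principles above might be enforced in software, consider the following sketch. The `ConsentRecord` type, the purpose names, and the `may_process` gate are all hypothetical, invented for this example; a real deployment would pair such a gate with audit logging and policy review.

```python
from dataclasses import dataclass, field

# Purposes disclosed to users up front; anything outside this set is repurposing.
DISCLOSED_PURPOSES = {"service_adaptation", "accessibility"}

@dataclass
class ConsentRecord:
    user_id: str
    purposes: set = field(default_factory=set)  # purposes the user opted into
    revoked: bool = False                       # supports opt-out at any stage

def may_process(consent: ConsentRecord, purpose: str) -> bool:
    """Gate every use of emotional data on explicit, unrevoked consent
    and on the purpose being one that was disclosed (purpose limitation)."""
    return (not consent.revoked
            and purpose in DISCLOSED_PURPOSES
            and purpose in consent.purposes)

user = ConsentRecord("user-123", purposes={"service_adaptation"})
print(may_process(user, "service_adaptation"))  # True: consented, disclosed purpose
print(may_process(user, "marketing"))           # False: repurposing is blocked
user.revoked = True
print(may_process(user, "service_adaptation"))  # False: opt-out is honored
```

Routing every downstream use of emotional data through a single gate like this makes transparency and consent checks enforceable rather than aspirational.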

In addition to adhering to these principles, public sector organizations should consider initiating pilot projects to evaluate the effectiveness, ethical implications, and societal benefits of affective computing, as well as investing in training and collaborating with experts in the field. While affective computing can present promising opportunities to enhance public services, it also introduces ethical complexities. Government agencies should ensure responsible use of this technology and focus on harnessing its benefits while managing its risks.

The dawn of affective computing in government: Get ready for what’s coming

As the sun rises in her city, Lisa strolls through the local park that hosts a series of interactive kiosks, like the one she first encountered at the government service center. As she passes by one, it greets her warmly: “Good morning, Lisa.” Later in the day, at city hall, government officials review the sentiment analysis from social media and direct feedback from the park’s kiosks. They’ve already seen significant improvement in public sentiment toward the park. Its redesigned features, such as the closure of the main thoroughfare to cars at select hours and the addition of mood-driven lighting, have fostered a sense of safety and enjoyment. The park, once underutilized, now buzzes with community activity from dawn to dusk.

With advances in affective computing, it’s possible to imagine this scenario playing out sooner rather than later. OpenAI released ChatGPT in November 2022, and it received one million visitors in its first five days. In less than a year, it hit 100 million weekly users with over two million developers working on its application programming interface.12 It’s a testament to the rapid advancement of AI that there is now a shift from purely transactional interactions with machines to more conversational, and even relationship-based, exchanges.13

In the next three to five years, model interaction design is expected to be a pivotal field of study and an essential organizational capability. Why? Because it seems likely that just as modern organizations have distinct-yet-collaborative units, such as human resources or finance, the organizations of the future could have specialized, distinct-yet-collaborative AI models to help them achieve their missions. Of course, it’s possible that organizations completely restructure given the rise of AI, but in the medium term, it may be more likely that the deployment of AI capabilities across an organization (which is already taking place) is informed by familiar organizational structures, ones that made sense prior to AI but that will likely require redesign now.

In this future, affective computing could be embedded in specialized AI models across organizations to create more comprehensive, predictive, and personalized outcomes. For example, in a future military application, this approach could enhance training and operational readiness. Real-time emotional data from soldiers in the field could be analyzed to manage stress and optimize performance, triggering immediate interventions like mission adjustments or support resources if certain thresholds are met. This affective data could also inform the organization’s AI model responsible for generating training scenarios. The model could tailor training simulations to replicate and address field stressors more effectively, creating a feedback loop that can continually refine training programs. This interconnected AI system could not only enhance soldier welfare but also help ensure continuous improvement in mission preparedness and execution.

To embrace this affective computing–enabled future and effectively integrate this technology into public services, government leaders can consider taking these steps.

  • Initiate small-scale pilot programs to test and gather data on affective computing applications.
  • Provide training for employees on these technologies and their ethical considerations.
  • Collaborate with tech companies and academic institutions to stay updated on advancements and best practices.
  • Create robust ethical guidelines focusing on privacy, consent, and bias mitigation.
  • Incorporate affective computing capabilities into existing digital platforms and services.

Affective computing has the power to be more than a tool; it can be a partner, helping us create a world that is not only more empathetic, responsive, and inclusive, but also safer and more enjoyable for all.


Endnotes

  1. The White House, “Executive order on transforming federal customer experience and service delivery to rebuild trust in government,” December 13, 2021. 

  2. Rosalind W. Picard, Affective Computing (Cambridge, Massachusetts: The MIT Press, 2000); the authors interviewed Dr. Picard for this piece and are grateful for her contributions to their conversation and, moreover, to the field.

  3. Guanxiong Pei, Haiying Li, Yandi Lu, Yanlei Wang, Shizhen Hua, and Taihao Li, “Affective computing: Recent advances, challenges, and future trends,” Intelligent Computing 3 (2024): p. 0076.

  4. MarketsandMarkets, “Affective computing market by technology (touch-based and touchless), component (software (speech recognition and gesture recognition) and hardware (sensors, cameras, and storage devices and processors)), vertical, and region – global forecast to 2025,” June 2020.

  5. Nielsen, “We’re ruled by our emotions, and so are the ads we watch,” January 2016; Shep Hyken, “Guest post: The biggest value driver that is not on your journey map,” accessed July 25, 2024.

  6. Affectiva, “Affectiva introduces new functionality to enhance media analytics insight,” accessed July 25, 2024.

  7. Clearspeed, “United States Special Operations Command, Afghanistan,” 2020; results are compared to the existing analytical process.

  8. Michal Muszynski, Jamie Zelazny, Jeffrey M. Girard, and Louis-Philippe Morency, “Depression severity assessment for adolescents at high risk of mental disorders,” in Proceedings of the 2020 International Conference on Multimodal Interaction (2020), pp. 70–78.

  9. Alan Holden, Ben Szuhaj, Joe Mariani, and Tasha Austin, “Designing for the public sector with generative AI,” Deloitte, April 2023.

  10. Joy Buolamwini and Timnit Gebru, “Gender shades: Intersectional accuracy disparities in commercial gender classification,” in Conference on Fairness, Accountability and Transparency (2018), pp. 77–91.

  11. European Parliament, “EU AI Act: First regulation on artificial intelligence,” June 18, 2024.

  12. John Porter, “ChatGPT continues to be one of the fastest-growing services ever,” The Verge, November 6, 2023.

  13. Abhijit Gangoli, “How AI is transforming relationship intelligence in sales today,” Demand Farm, accessed July 2023.


Acknowledgments

Cover image by: Sonya Vasilieff