How nudge theory and design thinking can help your government IT project succeed
Technology adoption is ultimately about human behavior. Learn how organizations can use a "people-first" approach and infuse behavioral insights into the mix to improve outcomes.
Technology makes our lives better. Except when it doesn’t.
Of course, a new technology needs to offer an improvement in its own right. But even genuinely useful digital technologies introduced into our lives—or our workplaces—sometimes don’t stick. In the end, no matter what the theoretical benefits, if a new technology isn’t adopted in a way that actually improves efficiency and effectiveness in the organization, it represents an IT implementation failure.
In most cases, adopting a new technology requires that people change their habits and behaviors—something that isn’t always easy to accomplish. Government, in particular, struggles to ensure that new applications, once built, actually get used. “How do we adopt new technology?” asks Massachusetts CIO Bill Oates. “Painfully.”1 Too often, the result is a host of all-too-familiar issues: manual workarounds, unrealized benefits, schedule delays, budget overruns, and delayed or revisited design decisions.
Whether in new systems built for public employees or in citizen-facing applications intended for the general public, there are far too many examples of government IT failures that resulted in whole or in part from poor adoption. The intended users, in essence, reject the new technology: They either ignore the new system and stick to existing workflows, or otherwise “work around” it.
Consider the popular example of computers in the classroom: The promise was that computers would enable teachers to offer students new, more dynamic learning experiences. However, a study from the Organisation for Economic Co-operation and Development (OECD) shows that classroom computers have made no difference in improving students’ mastery of reading, mathematics, or science.2 Another study found that UK schools spend £450 million a year on educational technology that goes virtually unused.3 These two studies provide evidence of technological “tissue rejection” in today’s classrooms.
Simply introducing new technologies is not enough. Teachers still need guidance on how to properly integrate these tools into how they teach. Further, the benefits (over, say, using textbooks) need to be clearly demonstrated. Absent these considerations, newly integrated technologies can simply end up neglected.
Public agencies and private companies around the world face similar struggles implementing large-scale changes. Approximately 70 percent of all large-scale changes, many of which involve integrating new technologies, fall short of their long-term objectives.4 That means for every Netflix, which successfully moved much of the world toward streaming television, there are more than two computers-in-the-classroom examples that fail to achieve their intended outcomes.
To make matters more complicated, the public sector has even more hurdles to overcome than the private sector does.5 Relatively few government leaders have experience driving large, technologically intensive change efforts, and government organizations can be constrained by additional red tape around procurement, personnel, and budgeting.
In an effort to better understand why so many technology initiatives fail to stick, we researched and analyzed a diverse set of case studies from the public sector.6 Interestingly, both the successful and unsuccessful cases apply many of the classical change management practices we have come to know: They weigh the costs and benefits of using the technology, develop project milestones, communicate with employees and end users, and rely on a series of extrinsic motivators (carrots and sticks) to ensure adoption.7 Nevertheless, many of these projects still fail to meet their goals.
What’s the missing piece?
Many government projects aiming to digitize operations and services struggle with user adoption because they don’t adequately take into account user needs throughout the development process. In particular, they often fail to consider how people actually think and act.
So we turned to the field of behavioral science for answers. Behavioral science combines insights from psychology, economics, and neuroscience to explain how people choose to act under specific circumstances. More than six decades of research in the field reveals that people often act irrationally, despite their best intentions. This holds true for program designers as well: By not putting the end user first, many programs are designed in a manner that fails to resonate with how the human mind works.
Throughout our case explorations, we observed three major behavioral science themes that contribute to technology rejection among government employees and the citizens and businesses they are trying to serve: misaligned intrinsic motivations, “black box” opacity around why a change is happening, and choice environments that make the desired behavior harder than the status quo.
These key reasons behind failed digital adoption have little to do with the technology itself and more to do with the behavioral hurdles that prevent people from willingly undertaking new action.9 But these hurdles can be overcome. Government program administrators can kindle buy-in by leveraging behavioral science-based design principles that put the human before the technology.
In the following case examples, we highlight how effective digital enablement was a product of program designers better aligning to people’s intrinsic motivations, providing greater transparency, and/or applying core tenets of choice architecture (that is, how we organize options) to make the desired outcome the easiest choice. (See the sidebar, “Three ways to make technology adoption stick” for more information.)
Intrinsic motivation. Motivating people goes beyond a carrots-and-sticks mentality. People are often intrinsically motivated—they do things because they want to rather than needing an external reward to change behavior. This is the same reason that people learn to play an instrument or spend a Saturday helping a family member move (even if pizza and beer are offered). Government employees who go the extra mile can be motivated by the knowledge that they are helping the citizens they serve. A social worker will likely be more willing to invest in learning to use a new case management system if she is convinced it will help her to better serve the families she works with. Or a citizen may choose to reduce littering out of a sense of civic pride.
Transparency. Related to the issue of black boxes, people want to understand the “why” behind an action. Giving employees and end users a line of sight into what decision makers are thinking, and communicating the progress achieved, can engender buy-in, trust, and goodwill. For example, for police officers to support wearing body cameras, they will want not only to know the rationale for the cameras but also to see evidence that the cameras make a real difference. Another illustration comes from an experiment involving travel sites “searching” for the best airfare prices.10 People trusted the algorithm more when they could see which sites were actively being analyzed for the best fares than when they were simply shown a list of the best prices (even when the transparent sites took slightly longer to produce results).
Choice architecture. Creating easier choice environments involves understanding the intentional and unintentional barriers that direct behavior. Seemingly mundane decisions like room design can create small moments of “friction” in undertaking the right course of action. In their book Nudge, Richard Thaler and Cass Sunstein discuss how the layout of a cafeteria can greatly influence people’s eating decisions.11 These factors range from where fruits and vegetables are positioned to the size of the plates (which subtly sways portion sizes). In a similar vein, they highlight how states that make organ donation the default have significantly higher rates of participation (while still giving citizens the freedom to opt out of the program).
For many medical professionals, patient record-keeping is an important but rarely discussed topic. Electronic health records (EHRs) offer medical practitioners an integrated view of a patient’s history—comprehensive, convenient, and (hopefully) more accurate than the status quo of handwritten entries. A vital part of that medical history involves vaccinations.
In the past, medical practitioners entered vaccine product information into the patient’s chart by hand and, more recently, manually into an EHR system. However, relevant details such as lot number, expiration date, and product ID are printed in a small font that can be difficult for practitioners to read and interpret.12 Considering everything else that practitioners need to balance in their daily work, it is not surprising that these manual entries are time-consuming and prone to data entry errors. But this information can be critical: During a vaccine recall or disease outbreak, the ability to quickly and accurately identify these vaccines—and who received them—is paramount to patients’ health. Additionally, many EHRs have integrated warning messages to alert practitioners when they may be about to administer the incorrect vaccine; these features work only if the data entered is accurate.
Two-dimensional (2D) barcoding technologies have become a ubiquitous means of making this data entry more seamless and accurate. Compared to linear barcodes, 2D barcodes can capture more pertinent vaccine data, and they can be scanned to load that information directly into the EHR. The Centers for Disease Control and Prevention (CDC) has partnered with a variety of health facilities since 2011 to pilot the implementation of 2D barcode scanning. In these pilots, scanning 2D barcodes on vaccines saved practitioners time and significantly improved data quality and completeness compared to manual entry.13
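To illustrate what scanning replaces, vaccine 2D barcodes typically encode the product ID, expiration date, and lot number in a single GS1-style string of application identifiers. The sketch below parses such a payload into EHR fields; the payload value and the simplified field layout are illustrative assumptions (real GS1 parsing must also handle group separators and other variable-length fields), not the CDC's actual implementation.

```python
# Minimal sketch of turning a scanned GS1-style payload into EHR fields.
# Simplified layout assumed here: (01) product ID / GTIN, 14 digits;
# (17) expiration date, YYMMDD, 6 digits; (10) lot number, last element.

def parse_vaccine_barcode(payload: str) -> dict:
    """Extract product ID, expiration date, and lot number."""
    fields = {}
    i = 0
    while i < len(payload):
        ai = payload[i:i + 2]                # two-digit application identifier
        if ai == "01":                       # GTIN: fixed 14 digits
            fields["product_id"] = payload[i + 2:i + 16]
            i += 16
        elif ai == "17":                     # expiration date: fixed YYMMDD
            yy, mm, dd = payload[i + 2:i + 4], payload[i + 4:i + 6], payload[i + 6:i + 8]
            fields["expiration"] = f"20{yy}-{mm}-{dd}"
            i += 8
        elif ai == "10":                     # lot number: variable, runs to end here
            fields["lot"] = payload[i + 2:]
            break
        else:
            raise ValueError(f"Unrecognized application identifier: {ai}")
    return fields

# Hypothetical payload: GTIN 00358160901211, expires 2026-11-15, lot 12345AB.
record = parse_vaccine_barcode("0100358160901211172611151012345AB")
```

One scan replaces three error-prone manual entries, which is where the pilots' time savings and data-quality gains came from.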
Despite these benefits, the technology was not fully utilized during early pilots.14 Even when vaccines had the 2D barcode printed on the vial or syringe, many practitioners still resorted to manually entering the information. In one early pilot, 2D scanning compliance was frequently less than 25 percent.15
Why was this? Through a series of interviews, surveys, and observations, several barriers became evident: The combination of an imperfect process at the time and a limited understanding of the benefits (or the priority for change) likely reduced the use of this technology.16
In order to maximize utilization of 2D barcode scanning to enter vaccine data, the CDC recently partnered with a large health care system and included four behavioral science-inspired adherence strategies.17
First, every pilot site received training on how to use the scanners and why scanning was important to incorporate into their work. In taking the time to train practitioners and discuss implications for inclusion of this technology, the team went beyond the “how” and provided transparency into “why” scanning was important. As a comparison group, approximately one-quarter of the pilot sites received the training with no other adherence strategies.
Nearly one-half of the pilot sites also incorporated a “commitment card” component at the end of the training. The commitment card is a behaviorally inspired device that speaks to people’s intrinsic motivations: When people are asked to articulate a plan, they are more likely to follow through with it.18 In this case, practitioners signed a card pledging their commitment to scan vaccines in order to protect the safety of their patients, and explained why they considered scanning important. (See figure 1 for an example of the commitment card, which also included a freeform text box.)
Another strategy appealed to intrinsic motivations by providing reports of scanning rates for individual practitioners and their sites overall. While these reports provided greater transparency, behavioral research also suggests that peer comparisons, in the form of social proof, can be highly motivating because few would want to appear “behind” their peers.19 In addition to the training, approximately one-quarter of the pilot sites received these scanning rate reports, while another one-quarter received both the commitment cards and scanning reports.
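The social-proof mechanic of those reports can be illustrated with a toy sketch; the practitioner names and scan counts below are entirely hypothetical, and the real reports were certainly richer than this.

```python
# Illustrative sketch of a peer-comparison scanning-rate report:
# each practitioner sees their own rate next to the site average.

scans = {"dr_a": (45, 50), "dr_b": (20, 40)}   # (vaccines scanned, administered)

def scanning_report(scans):
    """Return one line per practitioner comparing their rate to the site average."""
    site_rate = sum(s for s, _ in scans.values()) / sum(a for _, a in scans.values())
    lines = []
    for name, (scanned, administered) in sorted(scans.items()):
        rate = scanned / administered
        flag = "above" if rate >= site_rate else "below"
        lines.append(f"{name}: {rate:.0%} scanned ({flag} the site average of {site_rate:.0%})")
    return lines

report = scanning_report(scans)
```

Surfacing the comparison, not just the raw rate, is what makes the report a nudge: few practitioners want to see themselves "below" their peers.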
This pilot produced compelling results: Overall, participants scanned at much higher rates than in previous pilots (94 percent of nearly 68,000 vaccines administered during the pilot were scanned), and the adherence strategies maximized use further. The commitment cards, the scanning reports, and the combination of the two all significantly improved scanning rates compared with training alone.20 These high adherence rates indicate that practitioners are not merely going through the motions but are forming new habits that can be sustained beyond the timeframe of the pilot.
Other behavioral insights were also gleaned from the pilot. For example, the importance of the choice architecture became clear. On-site pilot visits and practitioner feedback revealed a number of unique barriers created by the location of the scanner and the ensuing protocol for usage. In turn, scanners that were physically located in a space that aligned with a practitioner’s process (such as their walking path to obtain vaccines) provided a better, lower-friction design that increased buy-in and adherence.
The availability of 2D barcodes on nearly all vaccines administered during the pilot period also reduced cognitive load: Practitioners came to expect a barcode on every vial or syringe, eliminating the extra step of checking for one. This helped make 2D scanning the habitual vaccine entry process (with rare exceptions when a barcode was not present). Leadership buy-in was also critically important. When leaders visited a facility and demonstrated their own commitment to vaccine 2D scanning, higher scanning rates followed. This cultivated a culture of acceptance—scanning became the norm and expectation for all practitioners.
Additional, critically important benefits realized during the pilot included improved accuracy and completeness of vaccine records, time savings, and practitioner safety and satisfaction.21
Benefits of such technology can only be realized when that technology is actually used. Implementing these design improvements created positive and impactful change that boosted adoption rates.
Most states provide several avenues for their citizens to apply for state-sponsored benefits programs. For programs ranging from unemployment insurance to the Supplemental Nutrition Assistance Program (SNAP), applicants in need of these benefits rely on quick and timely processing to determine their eligibility.
While applications can be completed online, many people prefer to fill out paper applications. In these cases, state employees need to convert paper applications into electronic information. But with such a large variety of benefits programs available, along with a diverse makeup of households applying, entering this information can be a cognitively intensive task for those charged with this work.
With more than a dozen different programs that require eligibility determination, a western state human-services agency accumulated a six-month backlog of paper applications that needed to be entered electronically.22 The backlog stemmed from three issues: unpredictable swings in application volume, the cognitive overload of juggling so many distinct programs, and an incentive system that nudged workers toward the easiest cases.
While these issues have less to do with technology itself, they provide key lessons in program adoption and change management that can be applied on a wider scale.
State officials recognized that these backlogs needed to be resolved quickly. To accomplish this goal, a cross-functional group of program leaders formed to take a proactive approach to these common hurdles to backlog remediation.23 The approach combined analytical insights with leading research on positive psychology and behavioral nudges.
Using predictive algorithms, the leadership team could anticipate how many backlogged applications to expect in the coming weeks, days, and even hours, adjusting staff accordingly and, when needed, supplementing with temporary workers to lessen the load. To address the cognitive overload issue, they built profiles that highlighted what each worker was most efficient at resolving, then rearranged the choice architecture of the employee work environment—for example, by routing each worker a predetermined list of cases matched to their strengths.
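That profile-based routing might be sketched as follows. The workers, programs, and per-item timings are hypothetical, and the greedy matching below is a toy stand-in for whatever algorithm the agency actually used.

```python
# Illustrative sketch: build each worker's "predetermined list" by routing
# backlog items to whoever resolves that program type fastest, while
# balancing the minutes of work already assigned to each person.

from collections import defaultdict

# Assumed profile data: average minutes each worker takes per program type.
profiles = {
    "ana": {"SNAP": 6, "unemployment": 12},
    "ben": {"SNAP": 10, "unemployment": 7},
}

backlog = ["SNAP", "unemployment", "SNAP", "unemployment"]

def build_work_lists(backlog, profiles):
    """Assign each application to the worker with the lowest projected
    finish time for it, producing per-worker predetermined lists."""
    load = {w: 0 for w in profiles}          # minutes already assigned
    lists = defaultdict(list)
    for program in backlog:
        worker = min(profiles, key=lambda w: load[w] + profiles[w][program])
        lists[worker].append(program)
        load[worker] += profiles[worker][program]
    return dict(lists)

assignments = build_work_lists(backlog, profiles)
```

Each worker ends up with the case types they handle best, which is the choice-architecture point: the easiest path for the employee is also the most productive one.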
Connecting employees’ work to the people they serve can be a powerful intrinsic motivator. To make data entry feel less like a thankless task, the state applied behavioral nudges grounded in the University of Pennsylvania’s research on positive psychology. In one study, Wharton professor Adam Grant demonstrated how call-center employees raised more money for college scholarships after meeting with scholarship students for only five minutes.24 After these short meetings, callers spent twice as much time on the phone and averaged $317 more a week in fundraising.
In this agency’s case, program leaders rearranged the incentive system to place value on a more meaningful metric. Where rewards were previously based on the number of backlogged items resolved, leaders now tracked and rewarded employees on the number of citizens helped, a measure incorporated directly into the predetermined work lists. This helped in two ways: It removed the temptation to work only on the easier cases, and it more directly connected employees to the people they serve. Leaders also used data analytics to produce large aggregate dashboards, updated in real time, showing how many citizens employees had helped in each program.
These changes created a more agile and transparent environment for employees, one where they could see the value they brought to their state in their work. Within 10 weeks, the state was able to completely eliminate six months of backlogs while also increasing employee morale and providing a greater sense of purpose.
After the Affordable Care Act (ACA) was passed, 14 states, Washington, DC, and the US federal government began to develop health care exchanges where citizens could compare and purchase health insurance. The most well-known of these exchanges is the national platform, Healthcare.gov.
One of the biggest challenges of this undertaking was that the exchanges were expected to go live on October 1, 2013, but production could not start until the Supreme Court upheld the ACA on June 28, 2012. Jim Wadleigh, the former CIO and now CEO of Connecticut’s exchange, explained that they had 10 months to complete what was essentially a three-year technology project.25
As millions of Americans were relying on these exchanges to work effectively, every exchange was under the microscope of the public eye. To achieve success, the implementation teams had a number of dimensions to consider.
Connecticut was one of the 14 states that elected to develop its own exchange, Access Health CT.27 A primary advantage Connecticut had was top talent: Hartford, Connecticut is home to one of the largest professional insurance networks in the world. As a result, the Access Health CT leadership team was able to hire not only masters of policy but also talent from across the insurance industry, which helped them accomplish several key objectives.
These leaders saw the project as a large IT rollout first and a “policy project” second.28 In assembling their team, they wanted to counter a very common behavioral phenomenon, confirmation bias: the unconscious tendency to overvalue information that supports one’s existing beliefs and to disregard data that contradicts them. They hoped to cancel out these natural biases by bringing in an eclectic group of experts and having them work together.
To eliminate black boxes in policy decisions, they scaled the requirements down to the bare minimum. They also removed complex features and deferred up to 40 percent of the project to make sure the exchange could run as efficiently as possible for the end user.
Of course, this raises the question, “What if they eliminated too much?” To minimize this possibility, they ran a battery of user experience tests two months before launch to ensure they had the right choice architecture in place to make the user experience easy. These included user testing, stakeholder testing, and various forms of war-gaming to uncover what could go wrong. One important finding was that requiring login information before people could explore eligibility and pricing options was problematic, so they eliminated this step. In hindsight, this proved wise: Other states that required a login found the feature to be a major bottleneck.
In the weeks that followed the launch on October 1, it became clear that Connecticut’s approach was the right one. Day one had 45,000 unique visitors. And while other states were struggling with enrollment, Connecticut had one of the highest per capita sign-up rates with 198,000 citizens purchasing their insurance through the exchange within six months.
This success is credited to leaders who emphasized user design first and the technology second.29
Many citizens, especially in large cities, can feel like silent bystanders regarding local government issues and problems that need fixing. Many would avoid calling in a small issue, such as a street light being out, wondering, “What’s the point?” Or they wouldn’t act for fear of bureaucratic red tape: calling city hall, getting transferred twice, and waiting on hold for five minutes for something that may never get resolved.
For many years, this was typical of how issues were resolved in the city of Boston. A 10-digit phone number was the primary means of reporting issues, and public confidence in the system was low.30
For many citizens, government work feels like a black box of activity. Tax dollars go in, ambiguous results come out. Even when people use specific government services and programs, they are generally unaware of doing so. In one study, people were asked if they ever used a government social program. More than half (57 percent) said they never did. But after reviewing 21 unique programs such as social security, Medicaid, and Pell grants, it turned out that 96 percent of participants had taken advantage of a government program at some point in their lives.31
At the local level, this same effect can manifest when it seems like issues are inadequately addressed. Harvard Business School Professor Michael I. Norton explains that people usually only notice things when they go awry; a driver who has traveled on perfectly paved roads for miles hits one pothole and is left with the overall impression the roads are in poor condition.32
Inertia also plays a role. Many people don’t engage with government because they don’t believe their report will matter or get any attention—so they don’t report issues at all.
What would it take to make government services feel more accessible to residents and restore their confidence that work is being done on their behalf?
In 2009, Boston looked for a better way to connect residents’ nonemergency requests (such as filling potholes, removing graffiti, and fixing broken lights) to the work that the city does on their behalf.
To achieve this outcome, the city launched a free app called Citizens Connect, which utilized two change-by-design concepts: a better choice architecture and greater transparency. Rather than dialing a 10-digit telephone number, residents could use the app to easily make nonemergency requests. Here’s how it worked: If a resident came across a pothole, they could take a picture of it, add a short note explaining the problem, and submit both to the city via the app. In the spirit of transparency, the city employee who later handled the job would take a picture of the fixed pothole and upload it to the app.
Citizens Connect also provided transparency at the macro level. Anyone with the app could view all open tickets pinned on a map and see how many were resolved each day. These visuals provided a window into all of the work Boston was conducting on its citizens’ behalf.
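The ticket lifecycle just described can be sketched with a tiny data model; this is a hypothetical schema for illustration, not the actual Citizens Connect implementation, and the field names and file names are invented.

```python
# Illustrative sketch of the Citizens Connect loop: a resident submits a
# photo and note, a city worker closes the ticket with proof of the fix,
# and anyone can count resolved tickets at the macro level.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Ticket:
    issue: str
    note: str
    photo: str                           # resident's photo of the problem
    status: str = "open"
    resolution_photo: Optional[str] = None

    def resolve(self, after_photo: str) -> None:
        """A city worker closes the loop with proof of the fix."""
        self.resolution_photo = after_photo
        self.status = "resolved"

tickets = [Ticket("pothole", "Deep pothole on Main St", "before.jpg")]
tickets[0].resolve("after.jpg")

# Macro-level transparency: count how many reported issues were resolved.
resolved_count = sum(t.status == "resolved" for t in tickets)
```

The before-and-after photos are the transparency nudge: the resident sees their specific report turned into specific work, not a request vanishing into a black box.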
Fast-forward to 2015: Citizens Connect had become Boston 311, a multi-platform service through which citizens can voice issues via the app, online, through social media, or by phone. More features were added to make government interactions easier. For those who still preferred to call in issues, the 10-digit phone number was replaced with the shorter 311 nonemergency number.
The app has changed how citizens interact with their local government. More than 1.3 million issues have been reported through the app and more than 95 percent have been addressed by local government service providers.33
The recurring theme throughout all four cases has little to do with technology. Instead, it is about putting people first. Effective change management of digital enablement means incorporating the user experience throughout the entire process, underpinned by a sound understanding of how behavior is shaped at the individual, team, and organizational levels. Without user adoption, the technology itself is useless.
The individuals most impacted by a new technology implementation tend to be less concerned with the latest and greatest technology available. In each of these cases, the organizations or government agencies achieved success by using behavioral-science tenets as the foundation for understanding how people best handle change, and then designing experiences for end users that reflected those lessons.
To emulate this success in your organization, consider these three questions before your next technology implementation: Does the design appeal to users’ intrinsic motivations? Does it give them transparency into why the change matters and what progress is being made? And does the choice architecture make the desired behavior the easiest one to undertake?
Real change happens through designs that make life easier, fill the end user with a greater sense of purpose, and offer a better line of sight into why new behaviors should be adopted in the first place. For public sector leaders, great design can circumvent political red tape; often, it involves a shift in focus to a more human-centric approach for programs that are already underway.