How higher education can realize the potential of Generative AI

Higher-ed leaders need to foster a culture of change—one that guides students, faculty, and staff to embrace the transformative power of gen AI over its perceived threats

Tamara Askew, Roy Mathew, Tiffany Fishman, Danylle Kunkel, and Bob Caron, Sc.D.

United States

Generative AI represents a seismic shift for higher education institutions, ushering in a level of change comparable to that of the dawn of the internet. Indeed, the technology is so potent—with numerous potential impacts on teaching and learning, operations, and far more—that many administrators and faculty have been more apprehensive than excited about its potential effects on campuses and classrooms. But few disagree that when used effectively, gen AI can be a powerful and intelligent collaborator for students, faculty, and staff.

Given the rapid innovation curve of gen AI, it is important for higher education leaders to understand and begin using the technology. Paul LeBlanc, president emeritus of Southern New Hampshire University, sees the knowledge economy “going through a radical reinvention, which means our graduates will soon (‘soon’ as in now) need different skills in different areas of work, and universities will have to rapidly remake themselves for that new reality.”1 The question is whether leaders are ready to take advantage of the opportunities.

For colleges and universities, staking out a leadership position is critical, both for their own survival and for the sector’s success beyond its deepest-pocketed elite institutions.2 Schools are competing for resources and students not only against each other but also against a declining perception of higher education’s social and economic value.3 To maintain global leadership, postsecondary education in the United States should embrace pedagogical transformation—with a boost from emerging technologies.

For a sector that has traditionally been resistant to change, transformation will not come easily. Faculty reactions to gen AI vary widely, from resistance due to its potential impact on academic integrity to excitement about the potential advancements it could bring to teaching and research. Even where faculty enthusiasm outweighs legitimate concerns, reshaping pedagogy requires considerable effort, and some faculty members may struggle to keep pace with rapid adoption of new technologies. Reshaping behavior may be even harder at the organizational level, where the prevailing culture and decades-old processes and systems reinforce old ways of working, no matter how urgent the need for change.4

While classroom engagement with gen AI has inspired any number of headlines, the technology can have a far deeper impact on higher education as part of a broad organizational transformation.5 As in every sector, leaders need to work to advance smart tech while always keeping in mind how proposed changes might affect administrators, faculty, staff, and students as individuals, and what it might take—psychologically, emotionally, or politically—to make the changes stick.

The emergence of gen AI has challenged leaders everywhere to embrace its potential benefits while preserving and enhancing human potential. Higher education can play a unique role in envisioning a future in which society trusts humans and machines to work together while leveraging technology to make education more accessible, affordable, and rewarding. 

This article explores how postsecondary institutions can safely navigate this journey and effectively use gen AI to support administrators, staff, faculty, and students in easing their burdens and enhancing their work.

Adopt a use case–driven approach to identify appropriate AI tools

Forward-thinking colleges and universities are looking for ways to incorporate and embrace gen AI and traditional AI tools in ways that can enhance efficiency and improve outcomes across the academic enterprise, from operations and research to student services and learning (figure 1).6

Colleges and universities can harness the potential of gen AI by pinpointing existing processes and use cases where this technology can be integrated. By identifying the right gen AI and traditional AI tools and embedding them within current systems and processes, institutions can enhance and optimize their operational workflows.

Different automation tools have different strengths and weaknesses—and gen AI is no exception

Achieving more efficient and effective operations with gen AI requires leaders to understand which functions are best performed by people, technology, or a combination thereof. Gen AI combined with more traditional machine learning can efficiently process, analyze, and summarize content and large data sets to identify patterns and key insights. This may include generating reports, analyzing student sentiment, and even predicting student success or future enrollment. Human judgment is still necessary for tasks with high variability or social components—for example, managing tasks that require adaptability, empathy, and ethical reasoning.

Figure 2 illustrates some of gen AI’s strengths and weaknesses as it exists today. Moving from right to left on the creative difficulty axis spotlights creative tasks, such as preparing complete reports, that previous generations of AI tools could not handle but gen AI can. On the other hand, moving from bottom to top on the accuracy axis shows gen AI’s shortcoming: The technology will offer an answer to nearly any question, but it may not always be correct—an unacceptable outcome when results need to be accurate.

The following categorizations can help higher education leaders make informed, strategic decisions about how to implement gen AI in their institutions.

  • Tasks with moderately high creative difficulty, moderate context variability, and moderate accuracy could be good candidates for gen AI—for example, creating marketing content, organizing brainstorming and innovation sessions, generating user stories, recording regulatory compliance, and summarizing reports, among many others.
  • Tasks with high accuracy and low context variability, such as data entry, are likely appropriate for other forms of automation, ranging from robotic process automation to physical robots to other machine learning–based applications.
  • Humans outperform AI at tasks with high context variability, especially activities with a strong social aspect (tasks that require emotional intelligence, building relationships, or providing support such as coaching and mentoring students); complex decision-making in ambiguous situations (such as identifying a new academic program); strategic planning; and creative problem-solving (like increasing student engagement).
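As a rough illustration, the triage described above could be sketched as a simple decision helper. The axis names, rating scale, and ordering of the rules here are illustrative assumptions for demonstration, not a published rubric:

```python
# Illustrative sketch only: a toy triage helper reflecting the task
# categories above. Axis values and rule ordering are assumptions,
# not a formal framework.

def triage_task(creative_difficulty: str,
                context_variability: str,
                accuracy_need: str) -> str:
    """Suggest an approach for a task; each axis rated 'low', 'moderate', or 'high'."""
    if context_variability == "high":
        # Empathy, ambiguity, strategy: people outperform AI here
        return "human"
    if accuracy_need == "high" and context_variability == "low":
        # e.g., data entry: robotic process automation or rules-based ML
        return "traditional automation"
    if creative_difficulty in ("moderate", "high"):
        # e.g., drafting marketing content, summarizing reports
        return "gen AI"
    # Default to human oversight when no category clearly applies
    return "human review"

# Example: drafting marketing copy vs. bulk data entry
print(triage_task("moderate", "moderate", "moderate"))  # gen AI
print(triage_task("low", "low", "high"))                # traditional automation
```

In practice, most real work activities bundle several such tasks, so a triage like this would be applied task by task rather than to a role as a whole.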

Higher education leaders should strategically orient their AI investments toward the broader outcomes they wish to achieve

It’s important to remember that most work activities involve more than one task—a fact especially visible in a field in which most faculty are simultaneously teachers, advisors, and researchers. Work activities that create value for the institution are likely to involve multiple tasks, and even different types of tasks, each amenable to different automation tools.

Leaders can align their AI investments toward three broad outcomes by using a task-level analysis, which can inform how colleges and universities deliver value.

1. Efficiency: Automating individual tasks can help improve overall efficiency. Leading research universities are increasingly looking to automation tools to help with grant applications and administration, beginning with AI tools locating and sifting through available grant and funding opportunities to identify appropriate and likely prospects. Robotic process automation coupled with generative AI can complete the first draft of data entry on application forms, while intelligent optical character recognition can quickly decipher dense sections of award notices. This approach reduces the time and effort required for data entry, allowing principal investigators to focus on more strategic tasks.

2. Effectiveness: Adapting workflows to incorporate a suite of automation tools, each taking on the tasks to which it is best suited, can boost institutions’ ability to accomplish their mission. Take accessibility, for example: Advancements in educational technology, particularly those incorporating gen AI applications, have the potential to improve accessibility for the 20% of undergraduates and 11% of graduate students who report a disability, while traditional modifications—tweaking systems not designed to accommodate exceptions—may not fully eliminate either existing barriers or those that become visible later.7

The transformative impact of generative AI on higher education, particularly through AI-based tutoring systems, is garnering recognition from prominent educational bodies, including the US Department of Education.8 These advanced systems offer personalized, step-by-step guidance and feedback, enabling a tailored learning experience that adjusts to the unique needs of each student. Historically, educational models required students to conform to a standard method of instruction, which was not always effective for every individual. Generative AI tools, by contrast, allow for a more inclusive approach in which students can engage with material in ways that best suit their learning styles, significantly increasing the likelihood of academic success.

Beyond individualized instruction, AI applications can enhance collaborative learning and empower educators to integrate teaching strategies informed by cognitive science. For many instructors, this will require a transition, one many are already making: shifting from the traditional lecture-and-listen model toward a more dynamic, interactive approach.9

Making education more accessible for diverse learning styles

Supporting instructional excellence. Integrating advanced language models into learning systems can considerably facilitate the incorporation of Universal Design for Learning principles.10 These models help educators adapt course materials, assessments, and learning tasks to accommodate diverse student needs, thereby promoting a more inclusive learning environment. AI applications can assist in formulating course policies that offer students flexibility in demonstrating their learning, fostering intrinsic motivation, and developing self-assessment skills in various ways.

Enhancing learner engagement. Gen AI tools play an important role in transforming the learning experience for all students by simplifying complex language or generating customized practice problems. These tools can facilitate a more engaging and supportive educational environment by introducing adaptive feedback mechanisms that allow students to learn at their own pace and in ways that best suit their individual learning styles. While these advancements benefit the entire student body, they are particularly effective for neurodivergent students and those with cognitive or learning disabilities.11 For these students, AI-driven tools can significantly reduce barriers to learning, ensuring that education is accessible and tailored to meet a wide range of learner needs.

3. Efficacy: Beyond making current processes more efficient and effective, colleges and universities can use gen AI tools to address thorny problems such as loneliness and student mental health.12

In recent years, loneliness on college campuses has emerged as a pressing issue, with US Surgeon General Vivek Murthy targeting college environments on a 2023 nationwide tour that underscored the vital importance of social bonds and brought to light a paradox: Numerous students feel deeply isolated despite being part of a bustling campus community. Many struggle to express their authentic selves and forge meaningful relationships.13 With loneliness associated with significant health risks as well as disengagement, educational institutions are aiming to foster supportive, community-centric environments.

By helping students connect with peers who share interests or challenges, AI-based technologies can alleviate feelings of isolation and nurture a sense of community. Duke University’s tech-enabled QuadEx residential housing system, for example, aims to support students in maintaining a stable community network, particularly during transitional periods that can intensify feelings of loneliness.14 An AI-driven platform can also interact with a student directly, introducing them to existing like-minded groups or facilitating the creation of new ones, such as study circles, wellness and exercise communities, and business networks.

The significance of change management in AI implementation

Notwithstanding the potential benefits of AI implementation, apprehension is widespread among administrators, staff, and faculty.15 Concerns thus far have centered around job displacement, ethical challenges, and the breakneck pace of technological evolution.

Staff fears of job displacement are fueled by studies predicting substantial risks of automation in roles such as data management, student advising, and grading.16 This anxiety is compounded by media narratives and academic discourse speculating that leaders will turn to AI-based technologies to make staff positions redundant.17

Beyond the fear of job loss, AI use carries profound ethical and privacy implications. Integrating the technology involves handling vast amounts of data, including sensitive student information, which raises questions about data security and the risk of privacy breaches. The potential for inherent biases in AI algorithms could lead to discriminatory practices, underscoring the need for stringent ethical standards in AI deployments. Concurrently, the rapid pace of AI development highlights a troubling skills gap, with many staff members feeling ill-equipped to adapt to new technologies. This scenario exacerbates fears of obsolescence and underscores the challenges in maintaining ethical governance over AI, as it outstrips existing policies and guidelines (see “A framework for trustworthy AI”).

A framework for trustworthy AI

Gen AI has powerful capabilities, but with the benefits come implications for risk, trust, and AI governance. As higher education leaders identify gen AI applications that enhance existing AI programs or lead to novel use cases, they will need to take a closer look at the legal and ethical implications gen AI can create. The challenge is to identify what is known about AI risk and governance and how it can be used to guide the trustworthy application of gen AI. What are the concerns around output accuracy, reliability, and hallucinations? How can intellectual property and sensitive data be protected? What will the impact of regulations be, and how can institutions align people, processes, and technologies to prepare for them?

 

Trustworthy AI results from how people, processes, and technologies function together across multiple dimensions of trust depicted in figure 3. 


Organizational transformations of this magnitude necessitate equally significant changes in habits and behaviors among faculty, staff, and students. This transition is not always straightforward, as seen in various initiatives aimed at digitizing operations, which often struggle with adoption due to a lack of consideration for user needs throughout the development process. Specifically, these projects frequently overlook users’ real-world cognitive and behavioral patterns, leading to resistance and cognitive overload.

Key behavioral challenges in adopting AI in higher education

The academic environment is dynamic and demanding, and introducing AI technologies into it can lead to cognitive overload, where too many new requirements cause users—whether faculty, staff, or students—to inadvertently skip essential steps or tasks. Leaders can also unwittingly foster resistance to change by not clearly communicating the rationale and benefits behind the adoption of new technologies. In higher education, with faculty and staff deeply entrenched in methods that have historically been successful for them, unclear communication about AI tools’ benefits and workings may hinder their acceptance. Open and transparent communication that explains the transformative power of AI can help mitigate these concerns. This includes clearly articulating the goals and expected outcomes, and making these goals relevant and tangible for stakeholders.

Leaders should not underestimate the power of inertia. Behavioral science suggests that individuals generally prefer the path of least resistance, adhering to familiar habits and practices. In the context of higher education, shifting from traditional methods to AI-driven approaches can seem daunting. As behavioral economist Richard Thaler suggests, making the transition as effortless as possible can significantly enhance adoption rates.18

Enhancing AI adoption through strategic change management

Given the disruptions caused by AI integration in higher education, effective change management is crucial to navigate the transition smoothly and ethically. As AI reshapes job roles and operational paradigms, educational institutions must consider how to proactively manage these changes to mitigate employee fears and resistance. This entails not only keeping staff and faculty informed and involved in the transformation process but also providing necessary training and support to bridge the skills gap. A strategic change management approach should also focus on fostering a culture that views gen AI as an augmentative tool rather than a replacement. By prioritizing transparency and maintaining open lines of communication, leaders can cultivate a more resilient and adaptable educational environment (figure 4).

Incorporating a “behavior-first change” strategy—integrating insights from anthropology, behavioral economics, neuroscience, and psychology—can significantly enhance an institution’s change management approach.19 This understanding helps leaders navigate the complexities of human behavior, ensuring that strategies are not only theoretically sound but also practically effective in encouraging the adoption of new AI tools.

By focusing on intrinsic motivators such as a sense of purpose, autonomy, and mastery, leaders can drive deeper engagement and acceptance of AI tools. These motivators encourage staff to embrace AI to enhance their capabilities and achieve greater professional fulfillment, rather than viewing the technology as a threat to their livelihoods.

Enhancing communication and transparency

Prioritizing transparency and maintaining open lines of communication are essential in cultivating a more resilient and adaptable educational environment. By keeping all stakeholders, including faculty and students, informed and actively involved in the AI integration process, leaders can manage expectations and build trust—both critical to the successful adoption of new technologies.

As AI redefines job roles and rewrites the rulebook on operational efficiency, colleges and universities may find themselves at a crossroads. To make traditional AI and gen AI work fully across all parts of the institution, leaders must not only navigate these turbulent changes but also champion them. This calls for a strategic, inclusive approach to change management that prepares the entire workforce for a new era of digitization and actively involves them in the journey.

By understanding and addressing these behavioral and cognitive challenges, higher education institutions can enhance the successful adoption of AI technologies. This approach not only facilitates smoother transitions but also maximizes AI’s potential benefits, creating more innovative and efficient educational environments.

Tactical strategies for effective change management in AI implementation

Personalizing AI tools. Allowing educators and administrators to tailor AI technologies to fit with their existing workflows can significantly enhance initial acceptance and long-term integration. Personalization helps in aligning new tools with current practices, making the change feel less disruptive.

 

Deploying pilot programs. Before a full-scale rollout, offering pilot programs can help in adjusting the technology according to real-time feedback and reducing hesitancy among potential users. These trials serve as a practical demonstration of benefits and allow users to experience the advantages firsthand without the commitment of a complete overhaul.

 

Minimizing perceived risks. Addressing potential risks associated with AI, whether they are performance-related, financial, safety concerns, or psychological impacts, is important. Clear communication about support structures, training programs, and the availability of resources to assist with the transition can alleviate fears and build confidence among stakeholders.


Making change management work

Create a narrative that resonates. Leaders should aim to clearly articulate the expected outcomes of AI implementations and illustrate how these align with the institution’s educational and research missions. This level of transparency is fundamental to mitigating concerns related to gen AI adoption. Leaders should explain how these technologies function, their implementation plans, and their potential practical impacts on various academic and administrative roles. Citing previous successful technological integrations can build confidence in the institution’s ability to manage technological change and set realistic expectations for what new AI tools might achieve.

Indeed, transparency is critical: Leaders should have open discussions about the strengths and weaknesses of specific tools being considered. This includes addressing the tools’ technical capabilities, integration challenges, and potential risks, and how they compare to traditional methods. Stakeholders need a clear vision of how gen AI tools might specifically benefit academic work or administrative tasks. For instance, leaders can explain how adopting an AI tool can automate routine data collection and analysis, enabling researchers to focus on complex analysis and interpretation.

Promote a culture of innovation. Fostering a culture of innovation is essential in enhancing stakeholder readiness. This involves promoting an open environment in which stakeholders are encouraged to experiment with AI tools and voice their ideas along with their concerns about particular applications. To cultivate this culture, leaders should visibly support AI initiatives, celebrate AI-driven achievements, and facilitate cross-departmental collaborations on AI projects. For example, leaders can empower faculty who are eager to adopt gen AI in their classrooms to pilot and test tools and approaches, and then scale them to the broader academic community.

Adapt to feedback. Readiness also requires establishing mechanisms for continuous feedback from all stakeholder groups. Leaders should actively solicit this feedback through surveys, focus groups, and feedback sessions to monitor the adoption process and gauge the ongoing alignment of AI implementations with stakeholder needs. Colleges and universities should be prepared to adapt their AI strategies based on this feedback to overcome barriers to adoption and enhance the effectiveness of AI applications in meeting educational and operational goals.

By prioritizing human factors and embedding change management principles throughout the gen AI implementation process, administrators can overcome the natural human tendency to resist change and enhance overall acceptance and success.

Road map for getting started

In crafting an AI strategy, higher education leaders should lay out a short- and long-term road map, beginning with a clearly articulated vision for how AI will add value to their campuses in the future. For leaders considering committing their college or university to more extensive gen AI applications, it is essential to start with a foundational step: clearly defining the challenges and goals gen AI is meant to address. This involves convening key stakeholders—faculty leaders, administrators, and IT experts—to articulate a vision for how gen AI can serve the institution’s educational mission and objectives.

Integrating TritonGPT: UC San Diego’s strategic approach

TritonGPT is an AI-driven support system developed by UC San Diego. Running on NVIDIA H100 clusters at the institution’s supercomputer center and leveraging Llama 3.1 and other open-source models, it provides personalized assistance for all 37,000 employees and enhances critical business processes across the university’s research, administrative, and educational activities.20

 

UC San Diego’s implementation of TritonGPT exemplifies a strategic and inclusive change management approach. The university engaged stakeholders early, ensuring input from faculty, staff, and students. Clear communication through town halls, newsletters, and a dedicated web portal kept everyone informed.

 

To facilitate adoption, UC San Diego offered comprehensive training and support, including workshops and one-on-one coaching. A phased rollout began with pilot programs, allowing for real-time feedback and iterative improvements. Robust feedback mechanisms and strong leadership ensured continuous refinement and alignment with strategic goals. By balancing innovation with cultural sensitivity and ongoing evaluation, UC San Diego effectively integrated TritonGPT, enhancing the educational experience and operational efficiency.


To help ensure broad support, visioning workshops can be instrumental. These sessions should include top academic and administrative leaders, focusing on how AI applications can enhance learning outcomes, streamline administrative processes, and enrich the overall educational experience. The goal is to secure buy-in by linking gen AI initiatives directly to the institution’s strategic goals.

As colleges and universities move forward, it is important to gauge the institution’s readiness for AI integration. This might involve conducting internal reviews, engaging with other colleges and universities that have successfully implemented gen AI tools, and evaluating the current technological infrastructure, as well as the institution’s ability to upskill its workforce to utilize these tools. Understanding these elements helps in planning realistic AI integration strategies.

With a clear vision and readiness assessment in place, the next step is to identify specific AI applications through a use case–driven design process. Collaboration across academic departments and administrative units is vital to pinpoint areas in which gen AI can quickly add significant value, guided by the established vision and goals. This phase should also include developing a detailed timeline, outlining the steps toward gen AI adoption, including establishing a process of ongoing evaluation and adaptation: auditing technological capabilities, reassessing roles, and ensuring that robust change management processes are in place to support ongoing gen AI integration.

Identifying areas where gen AI can be quickly and effectively implemented—quick wins—can demonstrate early benefits and help sustain momentum. It is also important to establish clear metrics and goals to track progress and measure the impact of gen AI initiatives across student success, operational efficiency, and research. Testing assumptions and potential applications through prototypes, too, allows institutions to refine their focus, prioritize the most effective uses, and prepare for broader implementation. This step is necessary to ensure that the AI solutions developed are practical and effective in real-world educational settings. And all insights and plans should be consolidated into a compelling gen AI business case. This document is needed for gaining the support of trustees, faculty representatives, and other decision-makers—and for ensuring that all stakeholders understand the potential benefits and implications of gen AI initiatives.

Arizona State University’s strategic integration of generative AI

Institutions must strategically parse the onslaught of available and emerging solutions and tools that address both the academic and administrative sides of the academy. This involves a comprehensive assessment of the specific needs and objectives of different departments, identifying the most relevant and impactful technologies. Arizona State University has stood up a sensing team that plays a role in this process by evaluating the capabilities and potential applications of various AI tools, ensuring that they align with the institution’s goals and objectives.21 Additionally, the team facilitates training and skill development programs to equip faculty and staff with the necessary skills to effectively leverage these technologies, thereby enhancing both academic and administrative efficiencies.


It is important to note that making existing processes more efficient will only take institutions so far. Realizing the full value gen AI offers likely requires a wholesale reevaluation of how institutions accomplish their mission. Starting with the end in mind, institutions should assess if gen AI and traditional AI tools provide a better path for reaching their end goals. Pursuing both incremental improvements and broader-scale changes to how institutions deliver value is necessary for transformation. Institutions are not optimized machines, but rather complex, adaptive ecosystems. When one area makes a change, other areas throughout the institution may also have to change in order to keep the whole system operating smoothly. But institutions cannot change without convincing their people to also change and adopt behaviors that will support the larger organizational transformational efforts. And small, quick wins are critical to convincing them.

As the institution develops and implements gen AI solutions, it is paramount to maintain a trustworthy framework focused on transparency, privacy, security, reliability, and other essential components. This involves training teams on leading practices and safe, ethical AI use, and embedding these principles in the development process. Collaborations with human resources, shared governance, and other administrative departments are essential to help ensure that gen AI initiatives align with the institution’s values and culture. Similarly, security, legal, and privacy teams should be brought in to help safeguard data and ensure it is used as intended.

If you want to go fast, go alone; if you want to go far, go together: The need for cross-sector collaboration

The wide array of opportunities and challenges that accompany gen AI transformation calls for a collaborative approach. By pooling resources and expertise for greater impact, colleges and universities can codevelop gen AI policies, practices, and solutions that benefit multiple institutions and allow for a more unified advancement in educational technologies.

 

To facilitate innovation that can scale across the sector, a variety of federal funding opportunities are available. North Carolina State University leveraged a US$1 million ExLENT grant to expand its AI Academy apprenticeship program. Similarly, the National Science Foundation promotes cross-institutional partnerships through its EPIIC grant program, encouraging universities to form consortia that leverage shared expertise to advance AI research and applications.22 The TRAILS initiative secured a US$20 million National Science Foundation grant to combine multidisciplinary expertise from the University of Maryland, Cornell University, George Washington University, and Morgan State University to help determine what trust in AI looks like, how to develop trustworthy technical solutions for AI, and which policy models can help to build and sustain public trust.

From the outset, leaders should consider how AI tools can be scaled across the campus using existing technological infrastructures. This approach allows for the effective expansion of AI applications, enhancing educational and administrative functions across the institution.

Checklist for getting started

With endless applications and use cases to consider, institutions should first take a systematic approach to identifying the needs and opportunities that fit their culture and technological landscape.

  • Develop an AI strategy. Before investing in point solutions and applications, first define the priority problems to be solved and the desired outcomes. Bring key stakeholders together to define a vision, goals, and guiding principles around the use of AI and what it should achieve for the institution.
  • Define a use case–driven design process. Colleges and universities will need to think critically about how AI will be used across the institution. Work across departments to establish specific use cases and applications for AI that align with the defined vision, guiding principles, and desired outcomes. From there, leaders should engage functional teams and departments across the institution to document a tactical road map and timeline to define the journey toward AI adoption and integration. Leaders should consider the following activities to support this step.
  • Experiment with prototypes. Pressure-test assumptions, quick-win ideas, and use cases with prototypes. This can help an institution refine its focus on a Phase 1 AI implementation and further prioritize the most meaningful applications.
  • Build with confidence. Employ an ethical framework to help ensure inclusion and accessibility during the configuration and build processes. Embed these principles in training materials that provide clear explanations, leading practices, and policies around the safe and ethical use of AI. Partner with human resources, shared governance, and other administrative departments across the university that can advise on how AI will align with and support the organization’s culture.
  • Scale for enterprise deployment. When implementing AI, consider how the new application can be deployed across the academic enterprise using an existing cloud infrastructure. Thinking bigger (even if you are starting small) allows you to scale AI across the institution as more use cases are folded into the strategy.
  • Drive sustainable outcomes. As AI becomes an embedded feature across the enterprise, institutions may need to take a closer look at how they enable, grow, and maintain AI systems over the long term.

Looking ahead

We are entering a new era defined by the augmentation of human intelligence with gen AI—one that will be consequential for the trajectory of higher education in the United States. When thoughtfully paired with complementary AI tools and human judgment, gen AI can open transformative possibilities for colleges and universities across the entire academic enterprise. 

The path to adopting this new technology can be daunting and the risks associated with it significant, and yet, for many colleges and universities, standing still means falling behind. As renowned Silicon Valley analyst Mary Meeker observes, “The university of the future will not look like the university of today ... universities will find that AI can be a market share tailwind or a headwind—some will rise to the occasion, others will not.”23

Higher education leaders cannot successfully lead their institutions into this new era without fostering a culture of change for their people and adopting behaviors that will support the institution’s broader transformation efforts.


Endnotes

  1. Paul LeBlanc, “When everything changes: AI and the university,” The Pie, June 18, 2024.
  2. Jon Marcus, “Colleges are now closing at a pace of one a week. What happens to the students?,” The Washington Post, April 26, 2024.
  3. “Universities need to find, create, and sustain their differentiators—their best-in-class programs and advantages that attract students—or risk losing market share in an increasingly transparent and AI-enabled world.” Mary Meeker, “AI & universities,” Bond, July 2024.
  4. Clark Dana, Burke Soffe, Jeff Shipley, Frank Licari, Ross Larsen, Kenneth Plummer, Seth Bybee, and Jamie Jensen, “Why do faculty resist change?,” MedEdPublish, April 8, 2021; Jill Anderson, “Higher education’s resistance to change,” Harvard Graduate School of Education, Nov. 9, 2023.
  5. Susan D’Agostino, “Why professors are polarized on AI,” Inside Higher Ed, Sept. 13, 2023.
  6. Taylor Swaak, “AI will shake up higher ed. Are colleges ready?,” Chronicle of Higher Education, Feb. 26, 2024.
  7. National Center for Education Statistics, “Students with disabilities,” 2023; Cornell University Center for Teaching Innovation, “AI & accessibility,” accessed June 21, 2024.
  8. US Office of Educational Technology, “Artificial intelligence and the future of teaching and learning,” May 2023.
  9. Anthony Hernandez, “How AI can make classrooms more accessible,” Google, Oct. 18, 2023.
  10. CAST, “The UDL guidelines,” accessed July 25, 2024.
  11. Anne-Laure Le Cunff, Vincent Giampietro, and Eleanor Dommett, “Neurodiversity and cognitive load in online learning: A focus group study,” PLoS One 19, no. 4 (2024): e0301932.
  12. Gallup and Lumina Foundation, State of Higher Education 2024, May 8, 2024.
  13. A Gallup poll targeting college students found stress to be their predominant concern, with loneliness not far behind: Thirty-nine percent of students reported feeling isolated. US Surgeon General, “Our epidemic of loneliness and isolation,” May 2, 2023; Johanna Alonso, “The new plague on campus: loneliness,” Inside Higher Ed, Nov. 8, 2023.
  14. Duke University, “Student affairs,” accessed Sept. 16, 2024.
  15. Lauraine Langreo, “No, AI won’t destroy education. But we should be skeptical,” EducationWeek, Aug. 31, 2023.
  16. Chris Vallance, “AI could replace equivalent of 300 million jobs – report,” BBC News, March 28, 2023.
  17. A. Brown, “Organizational change in higher education,” Journal of University Management 9, no. 3 (2020): pp. 56–67.
  18. Athanasios Katsikidis, “Embracing new technology and the ‘nudge’,” Kathimerini, July 25, 2023.
  19. Bruce Chew et al., “Behavior-first government transformation,” Deloitte, Aug. 25, 2020.
  20. UC San Diego’s Blink, “TritonGPT: AI powered support,” Aug. 21, 2024.
  21. Annie Davis, “A new collaboration with OpenAI charts the future of AI in higher education,” Arizona State University News, Jan. 18, 2024.
  22. Emily Groff, “Cornell partners in new $20M NSF institute for trustworthy AI,” Cornell Chronicle, May 4, 2023.
  23. “Universities need to find, create, and sustain their differentiators—their best-in-class programs and advantages that attract students—or risk losing market share in an increasingly transparent and AI-enabled world.” Mary Meeker, “AI & universities.”

Acknowledgments

The authors would like to thank Cole Clark, Amanda Charney, Carly Gordon, Madeline Pongor, Justin Freeman, and Mary McCall Leland for their contributions to this report.

Cover image by: Sofia Sergi