
7 missteps university leaders must avoid in their AI approach

June 13, 2023, By Afia Tasneem, Senior Director, Strategic Research

Every historic innovation that has shaped the way people access and use knowledge, from the printing press to the internet, has brought transformative change to universities. AI is the latest of these breakthroughs, and among the most revolutionary, making it a strategic concern that demands leadership attention.

In recent conversations with presidents, provosts, chief strategy officers, chief business officers, and chief information officers, we’ve identified seven missteps university leaders are making in their AI approach.

1. Dismissing AI as just another hype cycle

Many higher education leaders view AI as a passing trend, often drawing parallels to the hype surrounding Massive Open Online Courses (MOOCs) in the 2010s. Despite much initial excitement, MOOCs ultimately had only a modest impact on the education ecosystem.

However, it’s critical to recognize that AI is unlikely to follow a similar trajectory, especially given its extensive reach and diverse applications. Much like the internet and the personal computing revolution, AI’s influence on higher education is poised to be disruptive and enduring.

Several factors position AI as a particularly potent technological innovation, including:

Unprecedented user growth

ChatGPT’s rapid and diverse user growth has been the fastest in tech history. Only two months after launch, ChatGPT reached an impressive 100 million active users—a feat that took Facebook four and a half years to achieve. Now, the tool boasts 1.5 billion monthly visits to its webpage.

What’s even more notable is the sustained usage after users’ initial exposure to the platform. A recent EDUCAUSE QuickPoll revealed that while 67% of higher education staff members are already using ChatGPT for their work, a remarkable 96% of that group foresee ongoing use in the future. Anecdotally, student usage is even higher. The fact that students, faculty, and staff alike are (secretly) using ChatGPT speaks to the usefulness of AI.

Broad applications

AI holds the potential to transform nearly every aspect of campus life, ranging from teaching and learning to student services, research, enrollment and marketing, advancement, HR, finance, facilities management, and IT. It can enable personalization at scale, develop intelligent and accessible systems, and boost efficiency in administrative tasks for faculty, staff, and students. For more information on early opportunities with AI on campus, check out our blog post.

Integration with popular applications

Companies are rapidly integrating AI language models into various tools and platforms—including Microsoft Office Suite, YouTube, Kayak, and Instacart—and revolutionizing the way we order food, accomplish tasks at work, purchase travel tickets, and seek entertainment. The ubiquitous nature of AI is making it impossible to ignore and redefining expectations for customer service, business workflows, and personal productivity.

Unlocking personalized learning

AI offers both democratization and personalization, meeting every student where they are in their learning journey. In April 2023, Khan Academy announced “Khanmigo,” an AI-powered tutor that provides personalized one-on-one tutoring and real-time guidance to students as they work on tasks ranging from writing papers to coding software.

2. Thinking we can distinguish between AI-generated and human-generated work

Concerned about academic integrity, many colleges have turned to detection services to flag AI-generated content in student assignments. Despite these measures, savvy students often find ways to evade these tools. And as AI continues to advance, differentiating between AI-generated and human-generated content will only become harder over time.

The endeavor to detect AI use can also result in false positives, as demonstrated by a faculty member at Texas A&M who erroneously failed his students based on his AI detection results, even temporarily withholding graduation diplomas. Academic integrity policies that penalize AI use will also entrench biases, such as unfairly targeting non-native English speakers, whose writing is more likely to be flagged, at a time when everyone is using AI.

Rather than chasing this unachievable goal of stopping student AI usage, institutions must redesign assessments (e.g., oral examinations to evaluate foundational skills) as well as use new kinds of assignments that foster higher-order skills. Realizing this potential, however, will require instructors to rethink learning objectives and understand the capabilities and limitations of AI.

The Sentient Syllabus Project, an academic collaborative exploring the use of AI in higher education, challenges instructors and students alike to observe the principle that “an AI cannot pass a course.” For instructors, this means devising assignments that incorporate AI but push beyond its limits. For students, the challenge is to provide insights that AI cannot.

For example, AIs might generate sample texts that students fact-check, critique, and improve or tutor students in ways adapted to individual needs. An intelligent agent might even take part in a group project and offer viewpoints that students haven’t considered. In short, using AI may prove to be the best way to teach students to surpass AI—a critical skill for all in an AI-driven world.

3. Not preparing students for the AI-fluent workforce

The overemphasis on preventing “AI cheating” is also leading schools to ignore the broader impact of AI on the workforce. AI tools are already a staple in many workplaces, and AI’s presence will only grow over time. The CEO of a private company recently shared that they now expect quicker turnarounds from junior-level staff because of the efficiency gains that come with AI use, especially for relatively simple tasks.

This is also true in higher ed institutions. Southern New Hampshire University’s creative team used ChatGPT to compose a 30-second commercial and determine the shot selection for its production. According to their chief marketing officer, the script required minimal adjustments, and eight of the ten suggested shots aligned perfectly with what her team had in mind. Remarkably, ChatGPT completed both tasks within a couple of minutes.

Since students must learn how to use AI to prepare for the workforce, educators must adapt their teaching and evaluation methods to accommodate AI. Institutions should teach students how to critically evaluate the accuracy, relevance, and potential biases of AI-generated content while leveraging the benefits of AI tools such as efficiency, personalized learning experiences, and enhanced research capabilities.

4. Waiting for vendors before making an institutional plan to integrate AI into operations

Many vendors expect to build AI-based features within the next 12 months. And new vendors are entering the market in droves. Some senior leaders point to vendor adoption of AI as the holdup to broader utilization. However, institutions should avoid solely relying on vendor pitches for AI solutions. This can lead to redundant or suboptimal purchases on campus. After all, university strategy must guide product purchases, not product features or vendor pitches.

Instead, higher education leaders should proactively develop an institutional AI plan tailored to their unique needs and goals. This can start with initiating conversations with major technology partners (particularly those involved in enterprise resource planning, learning management systems, and business intelligence) and understanding their roadmaps. But ideally, institutions should also clearly articulate their needs to inform vendor product development or even co-build products with vendors. As one CIO put it: “Vendors will build anything we want, but we need to tell them what we want.”

5. Adopting a piecemeal approach to AI

Many universities are forming taskforces and working groups, led by faculty or staff members, to address specific aspects of AI, such as its impact on writing assignments. While these smaller taskforces are praiseworthy and provide an excellent starting point, it is essential to also establish president-led, university-wide taskforces that can systematically uncover AI’s impacts across campus. Universities must ensure that these taskforces include diverse representation from faculty, staff, and students who can contribute various perspectives on AI. A comprehensive plan can also prevent siloed thinking and redundant and unnecessary product purchases throughout the campus.

In fact, considering the potential for AI’s rapid and extensive influence across campus, it could be advantageous for universities to pursue system-wide or consortium-based collaborations. Such collaborations can help address common challenges, reduce costs, and foster a broader ecosystem for AI innovation in higher education. For example, the State Board of North Dakota asked the heads of all institutions within the North Dakota University System (NDUS) to form a taskforce to evaluate the risks of AI and discuss potential applications on campus. The taskforce is led by President Andrew Armacost of the University of North Dakota, who has planned a few seminars to discuss AI with other NDUS presidents.

Early movers have also started to allocate resources, such as a dedicated AI fund (e.g., the $10 million fund initiated by the University of Southern California in March 2023), to support AI-related projects, research, and infrastructure. These strategic investments can help universities establish themselves as leaders in AI adoption and innovation as well as enable the development of a comprehensive approach to AI on campus.

6. Assuming leaders need a formalized strategy before discussing AI with the campus community

It’s a near certainty that many people are already using AI on campus to do their work. But many senior leaders have been silent on the subject. One CBO even reported as much, sharing that not a single person was discussing the impact of AI on his campus. If this is truly the case, leaders must understand why. Staff and faculty may not be talking about AI because they fear losing their jobs. Students may be worried about accusations of plagiarism. By not initiating the conversation on AI, campus leaders could inadvertently drive usage underground.

Progressive leaders are fostering AI conversations through workshops, focus group sessions, and events such as town halls organized by the president’s office. At Georgia Tech, students showcased their AI projects to the board of trustees. Open dialogues can also help identify power users and AI experimenters on campus who can inform the university’s AI strategy. This could be a starting point for a more formal strategy, or it could help leaders spotlight campus innovations.

7. Failing to raise campus awareness about AI risks, especially with public tools like ChatGPT

AI models often rely on vast amounts of data to learn and generate content. This raises concerns about privacy and data security, as sensitive information can be inadvertently shared or exposed.

AI models can also generate responses that are inappropriate, offensive, or biased, leading to ethical issues. For example, there have been instances where AI models have acted like a “racist, misogynist creep,” led to wrongful arrests, and amplified bias in staff recruiting.

It is important to be clear-eyed about the fact that these risks will not prevent campus communities from using a product that serves as a personal assistant in their daily lives. But training and awareness campaigns can mitigate these risks substantially. When possible, institutions should also create avenues for leveraging AI capabilities safely. For example, institutions can partner with vendors or other institutions to develop their own instances of AI chatbots with built-in guardrails. Organizations like Morgan Stanley and PwC have followed this approach, creating proprietary AI chatbots for internal use.

Afia Tasneem

Senior Director, Strategic Research

