Three Approaches to Academia’s Challenge of Developing Workable AI Policies
As the population of traditional college students continues to dwindle due to numerous factors (demographics, politics, alternative learning opportunities, etc.), lifelong learning has become more than a slogan for tertiary-level institutions that struggle to maintain viable cohorts of learners for the range of disciplines they house. A Google Scholar search for “lifelong learning” produces almost a million results, pulled from such publications as the International Journal of Lifelong Education, the Journal of Adult and Continuing Education, and Adult Education Quarterly.
For my part, I joined the ranks of nontraditional students this past spring, when I began an online, full-time graduate degree program in business analytics offered by an accredited US university while also working full-time as the vice president of academic affairs and provost at an open-enrollment, government-funded tertiary institution in the Caribbean. When I began that program, I had no idea that come May, I would transition to a staff position at the University of North Carolina (UNC) at Charlotte as the director of university accreditation and assessment systems management. This unexpected career detour has given me a unique, insider’s view of how three different tertiary institutions have approached the seismic introduction of generative AI.
A full professor with nearly three decades of teaching experience at the undergraduate and graduate levels in the US and overseas, I am no stranger to the classroom and the struggles educators face when combating academic dishonesty. And although OpenAI launched ChatGPT in November 2022, it wasn’t until the following semester that the first anguished cries arose from my humanities faculty, when they began encountering assignments that clearly weren’t written by the students who submitted them but also didn’t include the usual telltales of internet-assisted plagiarism.
As provost, I wanted to balance sympathy for my faculty’s pain with the understanding that this new technology was undeniably a game-changer for both education and the workplace. Even in the Caribbean, politicians and employers had begun raising concerns about higher education’s relevance. I have always believed that preparing our students for the working world is central to our role as educators. And I had witnessed education’s many adaptations to changing technology during my decades as an educator. So, I encouraged my faculty to find ways to alter their teaching to embrace and leverage generative AI, and I did so by sharing links to articles and webinars and inviting dialog on the topic. By then, however, I knew I would be leaving the institution, and developing the policies that would officially guide the students and faculty in their use of these new computing tools would fall to my successor.
Meanwhile, my transition from the Caribbean to UNC Charlotte was completed in May 2023. One of the first communications I encountered was a directive from the university’s provost, Alicia L. Bertone, declaring, “These are exciting times! AI tools like [ChatGPT] are here to stay and will rapidly evolve in the coming months and years.” Dr. Bertone’s enthusiasm for the new technology was tempered by recognition of the importance of educators’ good judgment: “Faculty may include language in their syllabi addressing if, how and when the use of generative AI tools such as ChatGPT are permitted.” Far from dictating a blanket policy, the provost welcomed advancements in technology while supporting instructors’ academic freedom. I encourage you to read Dr. Bertone’s full message.
UNC Charlotte’s Division of Academic Affairs compiled one set of AI resources for faculty and another for students to aid each segment in their integration of AI. The faculty resources included suggested language to include in syllabi, guides for supporting AI-based student learning, and even a sample email for starting a conversation about a student’s suspected academic misconduct. The student materials included UNC Charlotte’s Code of Student Academic Integrity and a link to the Writing Resource Center.
Then on May 18, UNC Charlotte’s Center for Teaching and Learning Innovation (CTL) hosted a full-day, in-person program that gathered “a community of educators to explore how AI can help us meet our student success goals.” This forum for discussion and the exchange of ideas was open to faculty, staff, and administrators. The event attracted a capacity crowd and offered hands-on labs, lightning talks on a variety of subjects, remarks from the provost, and presentations by various academic thought leaders. Slides, session descriptions, and speaker bios from the event remain available online.
In July 2023, to extend its impact beyond the UNC Charlotte campus, UNC Charlotte’s School of Professional Studies offered a professional certificate in Next Generation Learning with Generative AI Tools. One hundred fifty participants representing nine countries registered for this five-week online program. The interactive workshops included such titles as “Prompt Engineering with ChatGPT,” “Supporting Self-Directed Learning with ChatGPT,” and “Ethical Implications of Generative AI.” Participants interested in earning a digital badge could complete a capstone project that gave them experience in identifying and proposing a generative AI initiative for their institutions.
Naturally, I was excited to see how the ethical application of generative AI might enrich my graduate studies in business analytics, and in July I had a good discussion during an online class period with my instructor and fellow classmates. All of us were enthusiastic about the benefits and opportunities of AI and how it would certainly change the practice of business analytics. We each felt honored to be at the cutting edge of this new technology and looked forward to exploring it together. But within a week of this lively discussion, that instructor shared an official zero-tolerance directive from the university prohibiting the use of AI tools. The directive specifically named ChatGPT and described consequences up to and including expulsion from the institution.
After the instructor sent that directive to the class, all discussion of generative AI screeched to a halt.
My first reaction was shock, which transformed rapidly into righteous indignation. I certainly understood the need to ensure that students are actually completing their own coursework, but to imply that neither students nor educators could effectively navigate opportunities to learn about and incorporate such an important technology—particularly at the graduate level—felt like a personal affront. I sent letters of protest to my instructor, my academic advisor, and that institution’s provost, expressing my dismay at the blanket policy and my concern about being deprived of a chance to explore this technology. I received no written replies.
So, in the course of a few months, I’ve witnessed three different responses to the advent of generative AI. The first institution is in a holding pattern as its educators work out for themselves how they will deal with the opportunities and challenges that generative AI is raising in their individual classrooms. The second institution is actively embracing the new technology and has even established itself as a resource for a global audience of educators. The third institution has effectively renounced the technology’s use. Each approach carries important implications for students and educators that will shape how employers, politicians, and the general public view the effectiveness and relevance of higher education.
Meanwhile, a wave of new tools, apps, and integrations is embedding AI deeper into the fabric of computing, work, and education. With search engines building generative AI features into every query, it may not even be possible for students to avoid AI interactions as they complete their studies. At what point will zero-tolerance policies become unenforceable and devolve into absurdity?
In Plato’s Phaedrus, Socrates says of writing, “This invention will produce forgetfulness in the minds of those who learn to use it, because they will not practice their memory.” While Socrates thought writing made his pupils too lazy to remember facts and concepts, we now consider it an essential skill for every educated individual. Each generation brings its challenges to academia. As educational leaders, we must all strive to learn from the trailblazers and develop innovative policies that prepare our students and educators to respond to rapid technological change. If we don’t, we risk reinforcing a growing sentiment that higher education is becoming increasingly irrelevant and archaic.
J. D. Mosley-Matchett, PhD, is director of university accreditation and assessment systems management at the University of North Carolina at Charlotte. Prior to joining Charlotte’s Office of Assessment and Accreditation team in May 2023, Dr. Mosley-Matchett was vice president of academic affairs and provost at the University College of the Cayman Islands.