How We Can Turn AI into an Opportunity Rather Than a Threat

In dozens of discussions I’ve had about artificial intelligence this year, faculty members have offered variations of a single lament:

I wish someone would just tell us what we need to do.

They don’t really mean that, of course, but their uncertainty reflects the larger concerns that all of us in higher education face. Generative AI has placed yet another burden on change-weary faculty members who were already struggling with pandemic fatigue and with challenges brought on by online and hybrid classes, mobile technology, shifting student needs, and instant access to information.

Unfortunately, the need for change will only grow as technology, jobs, disciplines, society, and the needs of students evolve. Seen through that lens, generative AI is really just a messenger, and its message is clear: A 19th-century educational structure is ill-suited to handle changes brought on by 21st-century technology. We can either move from crisis to crisis, or we can rethink the way we approach teaching and learning, courses, curricula, faculty roles, and institutions.

In a forthcoming article in Change magazine, several colleagues from the Bay View Alliance and I offer steps institutions can take to integrate generative AI into teaching and learning. Those steps, from accepting AI as a tool for learning to developing education-focused AI tools, form a framework for bringing generative AI into courses and curricula. They are only a beginning, though. Academic leaders must create strategies for significant change if our institutions are to thrive amid rapid social and technological upheaval. They must also speak up and help frame the integration of generative AI into teaching and learning as a much-needed opportunity to make our systems more flexible.

Academic leaders should speak up about generative AI

Generative AI can seem like something beamed in from science fiction, leaving faculty, staff, and administrators feeling lost. Teaching centers have offered many workshops, examples, and ideas on how to proceed. Online communities have sprung up to share approaches to teaching with generative AI. Some universities have also provided frameworks for using AI. All of that has been helpful, but it’s not enough.

Leaders need to break the silence and weigh in on generative AI. They shouldn’t issue edicts, but they should help guide conversations and reassure instructors. Faculty and departments need opportunities and time to explore generative AI and to discuss how it is changing their disciplines and how curricula might change in response. Leaders should also work to build consensus around common disciplinary guidelines on the use of generative AI in classes. A patchwork of individual course policies has created confusion and uncertainty among students and faculty, and that confusion won’t go away on its own.

Identify faculty leaders to help

Faculty are hungry for examples of ways to use generative AI effectively in teaching and learning. Ideas abound, but faculty lack the time to keep up with AI-related material. To help with that, departments, schools, and colleges should identify faculty members who feel comfortable with generative AI and can help colleagues better understand how to use it. Offer them time or other compensation to delve deeper into AI and to act as guides. Faculty are more likely to listen to advice that comes from colleagues than from administrators or outside sources, so identifying faculty AI leaders can help promote discussions. Provide time at meetings not only for updates but also for sharing examples of effective use of generative AI and ideas on how curricula might need to change.

Don’t try to do this alone

None of us has the expertise to address every aspect of generative AI, so tap into the many resources available on and off campus.

  • Representatives of teaching centers can guide adaptation of assignments and curricula.
  • Staff members from writing centers and libraries can provide insights from helping students navigate a confusing collection of course policies on generative AI and from hearing how students have been using AI.
  • Instructional designers and educational technology specialists can provide advice on how to adapt online and hybrid courses and how to use learning management systems effectively.
  • Organizations like the POD Network; UNESCO; the Modern Language Association and the Conference on College Composition and Communication; and the Center for Innovation, Design, and Digital Learning have created working groups or issued reports about the use of generative AI. Similarly, most educational conferences have added sessions related to generative AI.
  • Students are also an important constituency in discussions about AI and learning, curriculum, and academic integrity. In August, a group of students primarily from the University of Illinois Urbana-Champaign organized the AI x Education Conference to explore the impact of AI on education. More recently, Stony Brook, the University of Wisconsin–Whitewater, and Georgia Tech, among other institutions, have included students in panel discussions about AI. Those types of discussions are important because we need students’ perspectives on finding an effective way forward.

Build in key events to help create a climate of trust

AI detectors may sound like a good idea, but like plagiarism checkers, they treat symptoms rather than the underlying problem. Instead of spending countless hours tracking down academic misconduct, we need to take a hard look at why students are drawn to generative AI and how we can create an atmosphere of honesty and trust. The motivations to cheat are diffuse and complex: Students feel intense pressure to maintain high grades. They often don’t see the relevance of courses. They have jobs and family obligations. They increasingly see a degree as a consumer product rather than as a challenging process of learning. Rarely do we talk with students about any of that. Only by building a sense of community, belonging, and trust can we encourage students to avoid shortcuts and focus on the long-term value of learning and integrity. That work can begin at orientation, in first-year experience courses, and in introductory courses within majors. Helping instructors adopt inclusive, flexible practices and helping students feel part of a broader academic community will be crucial to building trust.

Provide safe, equitable tools

Surveys suggest that far more students than faculty are using generative AI. Many faculty members I talk with would like to integrate AI into their teaching, but institutions have been slow to provide tools that meet privacy standards. Institutions can’t prevent students from using new technology. They can commit to providing equitable access to technology in a timely way, though, so that faculty aren’t perpetually several steps behind students. University-created and university-vetted tools also ensure that all students have access to the technology, not just those who can afford to pay. Ideally, universities should also recruit faculty to experiment with new software and hardware and to identify digital tools that can improve teaching and learning, save faculty time, and engage students in meaningful ways. EDUCAUSE has advocated a variation of that approach with AI. The Technology Innovation in Educational Research and Design initiative at the University of Illinois Urbana-Champaign is a good example of an interdisciplinary approach to experimenting with technology for teaching. That sort of forward-looking experimentation can help universities better harness technology and perhaps temper the sky-is-falling mentality that generative AI has brought on.


Doug Ward, PhD, is associate director of the Center for Teaching Excellence and an associate professor of journalism and mass communication at the University of Kansas.
