Usable Knowledge

Academic Resilience in a World of Artificial Intelligence

Guidance for educators as generative AI changes the classroom and the world at large

The emergence of generative artificial intelligence (AI) such as ChatGPT and other large language models (LLMs) has brought a flurry of headlines and concerns about its impact across a wide array of fields, including education.

While some have moved to embrace the breakthroughs of AI in the classroom, there remains necessary skepticism about its use and the potential pitfalls associated with its proliferation. As consumer interest in AI has grown, its users have found ways to make it write essays and answer questions based on its available data. But the writing and answers are far from perfect, and concerns about misinformation, data privacy, and AI “thinking” replacing human learning are top of mind for teachers and administrators.

For most educators, the response to AI’s emergence must fall somewhere between an outright ban and unrestricted use across all areas of learning. The Harvard Graduate School of Education released its own preliminary guidelines for dealing with AI in the classroom this summer, outlining potential pitfalls and offering suggestions to its faculty and students to help them understand how best to use AI as a teaching tool.

“We didn’t want to be so restrictive that we would prevent students from gaining experience using these tools, or not let them use them in ways that can enhance learning rather than detract from it,” says Professor Martin West, academic dean at HGSE. “But we also wanted to convey to students that some uses of generative AI can undermine their learning. Particularly, when the tools are used to do the cognitive work of thinking for students rather than to support their learning.”

Not every school district or company has guidelines in place ahead of the new school year. Best practices will continue to change as the technology — and its limits and uses — continues to evolve. As educators begin to think about guidelines for their schools and classrooms, experts have suggested some starting points for understanding what AI has already changed about educating, and how best to avoid problems to make AI an aid, not a hindrance, in the classroom.

Learn how AI actually works to demystify its impact and potential.

HGSE professor Chris Dede and postdoctoral fellow Lydia Cao, both of Project Zero’s Next Level Lab, recently published Navigating A World Of Generative AI: Suggestions For Educators. In it, the writers noted that the product of AI is different from human thought, “lacking essential qualities such as comprehension, self-awareness, emotions, embodiment, ethics, values, and culture.”

“As an analogy, we can think of AI as moonlight and human accomplishments as sunlight. As the moon reflects the radiance of the sun, AI reflects what humans are capable of — both truthful insights and biased misinformation,” wrote Dede and Cao. “AI is trained using existing data from the worldwide web, which leads to the potential problem of ‘garbage in, garbage out,’ as well as pervasive issues of AI ‘hallucinations’ where it generates responses that sound plausible but are factually incorrect, such as fabrication of citations of research articles that do not exist.”

Create a curriculum that is process-oriented, not product-oriented.

AI should not be used to replace the “thinking” or ability to recall facts that product-oriented learning often asks of students. Rather, Dede and Cao suggest developing process-oriented learning approaches that encourage students to find answers with their own logic and reasoning.

“We need to equip students for a world full of uncertainties and challenges and prepare them to do the complex work of problem-solving,” they write. “Embracing a messy process rather than becoming good at following ‘cognitive recipes.’”

Make the curriculum more resilient to AI’s advances.

This could mean shifting back to more oral exams in the classroom or pivoting lessons to feature more in-class written work. But the researchers say teaching students the current limits of AI can also be an important lesson in both digital literacy and how learning actually works in humans. Understanding the “hallucinations” and misinformation that come from AI can help students better use the technology as a learning tool and understand how they themselves learn.

“This provides an opportunity for learners to hone their critical thinking and judgment skills by evaluating AI-generated content through reflection and discussion,” write Cao and Dede. “In this way, AI is not doing the thinking for the learners, but supporting them to think better.”

Clarify policies and practice.

In the short term, institutions need to let teachers and students know what is and isn’t allowed when it comes to AI usage. Policies should address major points like:

  • Whether using AI as part of coursework violates academic policies
  • How to cite AI usage in work, if allowed
  • Risks of incorrect information, “hallucinations” or biases in AI-generated work
  • Intellectual property or copyright concerns
  • Risks of sharing sensitive information, which may be added to LLM databases

“Our first step was to provide guidelines that clarify what’s appropriate in using these tools in academic work,” said West of HGSE. “And at the center of that is, when in doubt, ask the instructor. Because uses will emerge that we hadn’t been able to contemplate when writing those guidelines.”

Be malleable.

Education is driven by the needs of its students, who will need to understand how to use AI across a variety of sectors in the future, says West. So educators need to focus on teaching students how to use it, what to avoid, and ultimately remind them of what AI cannot yet replace.

“The more productive way forward is for educators to focus on demystifying AI, emphasizing the learning process over the final product, honoring learner agency, orchestrating multiple sources of motivation, cultivating skills that AI cannot easily replicate, and fostering intelligence augmentation (IA) through building human-AI partnerships,” write Dede and Cao. “Through these approaches, educators can harness the benefits of AI while nurturing the unique abilities of humans to tackle big challenges in the 21st century.”
