Russell P. Johnson is Assistant Director of the Undergraduate Religious Studies Program and Core Sequence in the University of Chicago Divinity School and a CCTL Associate Pedagogy Fellow. His teaching includes courses on nonviolent direct action, argumentation and epistemology, and religion and film, including “Villains: Evil in Philosophy, Religion, and Film,” “Truth, Half-Truth, and Post-Truth,” and “Star Wars and Religion.” He has contributed an essay, “First Impressions: Expectations and Tone in Syllabus Construction,” to the CCTL column Teaching Matters, and “On ChatGPT: A Letter to My Students” to the Divinity School digital magazine Sightings.
Have generative AI tools made you think differently about how you approach teaching?
Generative AI has helped me be more explicit with students and with myself about the goals of my assignments. If you had asked me three or four years ago, “Why do you give students five-page papers?” I would have said something similar to what I say now, but without really having thought about it. The availability of AI has forced me to reflect on the aims of assignments and explain to students why these exercises are useful for their learning.
Has AI changed how you interact with your students?
I have a spiel that I go through on the first day with students where I talk to them about the goal of written papers. A big part of what I’m trying to do is humanize the writing and feedback process, to remind students that there’s a human being who’s dedicating time to giving constructive feedback on their papers. And if their paper is written by an LLM, that’s not a “victimless crime.” I care about them and their learning, which is why I give them these assignments, and I want to provide feedback that helps them grow as persuasive writers and critical thinkers. If they use AI when they are not supposed to, they’re cheating themselves out of a learning opportunity, and they’re also making me provide feedback without a purpose. I explain to students that it sucks to have questions in the back of my mind as I’m reading their papers: Was this written by an AI? Am I wasting my time? It has changed the grading process by introducing a whole new element of doubt.
How are you addressing the use of AI with your students? Are there tools or resources that you've found useful?
We talk about why they have writing assignments, what the goals are, what I’m hoping they’ll take away from this experience, and what skills are being incorporated into the writing process. I let them know that five years from now, what’s important is not that they remember the Bhagavad Gita, for example, but that they know how to bring clarity to a text like it. To some extent we practice that skill together in the classroom during our discussions, but learning to do so on one's own is important too. I think of myself like a personal trainer: I’m making you do difficult things, providing accountability, asking you to trust me, and giving you feedback on your performance. A few years ago, I wouldn’t have shared that with students, but I do now because I want them to buy into the process. I don’t want them to use AI to bypass their learning.
What is the biggest challenge that you've experienced when trying to integrate AI into your teaching?
The biggest way it impacts my teaching is in the feedback process. It is challenging to decide how to give feedback on what students are doing and how they can improve. I also struggle with how to raise any suspicions I have of AI use, and with what to do if I suspect that a student didn’t do the work themselves.
What are students learning about their own learning and about AI in your class?
I always say, “it’s not like I don’t have enough five-page papers to read.” What I value is not the final product of a written assignment as much as the process students go through to complete it. I’m having to be more explicit about why we’re doing these exercises, whether it’s papers or otherwise. Because of the temptation of AI and the shortcuts it offers, students have to be more reflective about their learning, too. They already have to prioritize how they invest their time in each class, and now they have to weigh possible shortcuts as well. There seems to be more reflection among students about learning and writing, and about the advantages of generating content yourself versus having a machine generate it for you. I hope that students are thinking more about creativity and skill-building than they were before.
What advice would you give or what resources would you recommend for those interested in using AI in their teaching?
The CCTL has been a place to have conversations about AI’s impacts on teaching and on student learning, and it’s been helpful to think together with them about alternative assignments and navigating questions around AI.
If you decide that AI contributes to student learning in a particular assignment, then by all means include it. But talk with students to differentiate the process from the product, and AI’s role in each. Some students will use it to complete assignments that feel like busywork, or that aren’t meaningful to them, so make sure your students know the reasons behind the work and understand the goals of each assignment. I’ll be interested to read future research comparing assignments where students use AI with those where they don’t, and the impact on learning outcomes, so we can continue conversations about the relationship between AI and student learning.