A Generative AI Guide for Teachers
Bryan Hiatt, Assistant Professor at Frederick Community College
With the release of ChatGPT in late 2022, it’s fair to say that teachers of writing face new challenges. This is perhaps an understatement of epic proportions. These new challenges are ENORMOUS, given the power and pull of generative AI on students. It’s also fair to say that students have been using the tool and that teachers are moving toward establishing strategies for acceptable uses in their classes. As part of this process, I came across the work of Andrew Piper on Twitter. He is a writing professor, director of .txtlab at McGill University in Canada, and author of Enumerations: Data and Literary Study.
His list of 11 points (all in bold), published on Twitter, provides a useful starting point for instructors. I provide additional commentary/response after each point, as it relates to instructors at Frederick Community College. All of Piper’s content is quoted and in bold for emphasis.
“1. Make #GPT part of the classroom. Students know about it, students are using it, bring it into the discussion. Don’t pretend it doesn’t exist!” (Piper).
Bryan: I love this point because it allows teachers to be active and say, “hey, I understand what’s going on and I want to work with you” in this new world order of generative AI. To pretend that it doesn’t exist and carry on like it’s business as usual is a missed opportunity.
“2. Take GPT for a test run on your material. How does GPT do answering your questions? Test it on class discussion questions, tests, essays. Get comfortable with what it sounds like but also how your assignments fare in a GPT world” (Piper).
Bryan: This has proved to be a very interesting experience on a few different levels for me. 1) I’ve used GPT to generate discussion questions; 2) I’ve created post-discussion in-class quizzes that we answer together, and I let students know that they came from GPT; 3) I’ve used GPT to answer journal and discussion questions, and I’ve discovered that some of these generic prompts need revision. Engaging in this exercise is important for understanding how students might use the tool. It also helps us (as teachers) clarify what we really want students to learn from a specific assignment.
“3. Assess GPT for accuracy with your students. Students are often unaware of the hallucination problem, i.e. that GPT makes up information (including citations). Show them when it goes off the rails and when to trust it (if ever)” (Piper).
Bryan: Another important point. I’m considering discussing two examples in class: the lawyer who used GPT to write a legal brief and was called out because much of the information was wrong (the judge was not happy), and the college professor who asked GPT whether the essays in his class were generated by AI. The answer from GPT was yes, according to reports, and the instructor failed many students in the class. But it turns out GPT was incorrect in several cases, since it’s not designed to make such judgments. So the hallucination factor is real.
“4. Use GPT as a tool of critical reflection. Sometimes I ask GPT questions and then we analyze the answers together. It can be productive to look at someone else’s writing (even if someone else is an AI)” (Piper).
Bryan: This advice gets into the idea of modeling behavior. In this case students would be using GPT as a reflective tool to make decisions, and it would allow for groups of students to weigh in on content.
“5. Decide on your AI writing policy. (See mine at the end of this thread.) AI can be used for brainstorming, writing, and editing. Decide how you feel about the use of AI for each of these activities” (Piper).
LINK: https://twitter.com/_akpiper/status/1665751815072219139/photo/1
Bryan: In the summer of 2023, after a term of using and understanding GPT, having an AI writing policy is critical moving forward. This lets students know about their responsibilities as members of a writing community. It also defines our responsibilities as teachers, working with and guiding students through the ethical ways of researching and writing for ENGL101 and ENGL102. Our job as teachers has ALWAYS been about helping students be deliberate/exacting in their use of sources. This is no different. I have some thoughts about this in the LINKED DOCUMENT HERE.
“6. Treat AI writing like any other form of plagiarism. AI generated text is citable discourse, i.e. students are not allowed to include it in their assignments without proper citation. It doesn’t have to be complicated. If you’re concerned, then just do more in class writing” (Piper).
Bryan: This idea resonates with me on many levels, especially the “it doesn’t have to be complicated” part. Since we are talking about “citable discourse,” this is something that we can quantify and connect back to the idea of how GPT may sometimes hallucinate and get things wrong. We can begin to place it within a toolbox of potential uses for students.
“7. Remember there are a spectrum of opinions out there. Some say we should be training students to utilize AI to solve problems. Thus, they should be learning with AI in all of their assignments, including writing” (Piper).
Bryan: Since points 7-8 go together, comments are after point 8.
“8. Others feel that AI pollutes the growth and development of student thinking. We need to keep it out of their hands until they’ve demonstrated proficiency in a given area. We don’t have data yet to support either of these positions. For now keep an open mind in both directions!” (Piper).
Bryan: I think this point is about having an open mind about GPT. Since we are in the first year of its widespread use, there’s so much we don’t know yet. It’s important, then, to keep learning how our students are using GPT, to explore how we can use it as a learning tool in the classroom, and to begin generating best practices for students to use in their own writing processes.
“9. Modularize your assignments so you can see student work at every stage. This is more work but break down writing into discrete parts so they can engage in more long term reflective and constructive thinking (i.e. the hallmark of writing).” (Piper).
Bryan: What Piper describes here is ENGL101 and 102, and the idea of breaking a larger assignment down into component parts should not be new to instructors of our courses. In my own class, each essay has 4-5 parts that we complete over several weeks. By doing this, I’m seeing students’ work from the start and providing feedback progressively. And if there are issues, it’s easier to address them in a minor assignment where the stakes are lower.
“10. Share your experiences. Everyone is in the dark on this. We’re all affected. Share successful and unsuccessful exercises and share when you’ve had concerns about the authenticity of student work. What led to it and what did you do?” (Piper).
Bryan: This point has been important for me personally as a teacher: having conversations with colleagues to sound out ideas and ask about their experiences is a way to piece together strategies for moving forward. I’ve been teaching at FCC since 2006, and many ENGL101 faculty I’ve worked with have been doing it longer. I can’t recall a time in this span when we’ve encountered a new technology that has put us all in the dark so quickly, as Piper suggests. So work it out with your colleagues; talk to them as a way to understand more broadly how we’re collectively experiencing this.
“11. Be sympathetic towards the ambiguity of the situation! Students face a very uncertain intellectual & disciplinary landscape when it comes to #AI (teachers too!). Give them the benefit of the doubt. You are helping them (and you) become more comfortable with this technology.” (Piper).
Bryan: The idea of being sympathetic to students with AI aligns with our department’s goals on Culturally Responsive Teaching and Learning. Since this is also new, it is good advice to give students the benefit of the doubt, and to find ways to work with them when AI use falls outside of the categories you’ve established with students.