This article helps faculty develop an approach to managing generative AI in their courses, including guidelines for writing a generative AI policy.
Five Steps to Managing Generative AI in Your Classroom
Step 1: Refine your teaching and learning goals and learn about Generative AI
If you suspect that your current assignments can be easily completed using Generative AI, consider revisiting core aspects of your teaching. Start with the big picture and ask: What do you REALLY want students to get out of your class? What do you want them to know, or be able to do, in one, five, or ten years? If you'd like a sounding board,
book a meeting with a member of the Learning Design and Digital Innovation Team. You may find that by refining your goals, and the structures that support them, you have already begun to address the challenges of generative AI. If, at this point, you still know very little about what generative AI is capable of, spend time with the
AI Pedagogy Project’s AI Guide. It includes an embedded ChatGPT 4.0 page so that you can try it out yourself. For further experimentation, prioritize Google's
Gemini App (using your Union.edu account), or the Gemini-powered
NotebookLM, both of which have enterprise-grade data protection. Gemini and NotebookLM are the only genAI apps that ITS supports.
Step 2: Test your assessments in a genAI tool
Copy and paste your assignments into one of the genAI tools (keeping in mind that students may use a variety of them: Gemini, ChatGPT, Claude, etc.) and look at the results. Would work produced in this fashion meet your desired learning goals? If a student were to use the same tool, would they be tempted to submit the results as their own? Would you be able to tell it was generated by genAI?
Warning: Generative AI Checkers DO NOT WORK
After some testing, you might wonder: Aren't there tools to detect AI-generated assignments? Yes, but they are unreliable and reproduce bias. Avoid them. Instead, spend your time reshaping assessments and maintaining an open dialogue with your students about these tools.
Step 3: Decide how to define appropriate usage of generative AI (with student input)
Based on the results of Step 2, think about your position on student use of genAI for these assignments. Consider setting aside part of a class to request student input on the use of Generative AI and develop a policy as a group. This kind of transparency can build trust and encourage students to be responsible for their learning.
Using the stoplight as a metaphor, the following categories can help guide your thinking:
- Red: No genAI in my course—it would hurt student learning too much.
- Yellow: genAI may be used in certain ways for some assignments, but not others.
- Green: Students can use genAI in any way they want.
Step 4: Develop assessments and a genAI policy
At this point you will likely need to revise current assessments or develop new ones in addition to defining your AI policy.
The genAI policy for your class should clearly define what is allowed and what is not, and your rationale (and/or the rationale proposed by students). Below are links to three sample statements in separate Google Docs, along with guidelines for revising and developing assessments, that you may copy and customize as you like. Keep in mind that when you include students in the process of developing these statements, the language may need to change.
Note: Unclear policies create problems for the Honor Council if there is any concern about an honor violation. For more information, read the Union College Honor Council Guidance on AI Generated Content and Academic Integrity.
Step 5: Talk about genAI with your students
Regardless of your position about generative AI, your students are thinking about it and most likely trying it out. AI researcher Ethan Mollick reports that 82% of college undergraduates have used genAI. That number has surely grown.
You don’t need to be an expert in genAI to model how to think critically about emerging topics in our society. What questions are you asking? What questions is your discipline asking? Are there meaningful examples of how it is being used that you could examine in class? Do students understand the privacy and ethical implications of genAI? Consider sharing this guide produced jointly by Elon University and AAC&U and facilitating a discussion.
Finally, a genAI policy in your syllabus is not enough. For students to fully understand it, talk about the policy before they begin work on assessments.
Privacy Considerations, Specifically with ChatGPT
When designing learning activities for courses, keep in mind that the only genAI apps ITS supports are Google's Gemini App and the Gemini-powered NotebookLM, both of which have enterprise-grade data protection.
ITS does not currently have a contract with OpenAI (or other entities) to license ChatGPT for the College due to its prohibitive enterprise cost. When entering into contracts with vendors, ITS normally negotiates protections such as safeguarding FERPA data; without such a contract, the various terms and policies on OpenAI's website apply to faculty AND students on an individual basis. If faculty want to require students to create a ChatGPT account (or an account on a similar platform), they should inform students not only of the limitations of such platforms (i.e., they can produce inaccurate or biased information, fabricated quotes, etc.) but also of potential data privacy concerns. Everyone who chooses to create an account should review the terms and privacy policy to fully understand the provisions permitting the sharing of information. Note that OpenAI does not recognize when potential FERPA data is entered into the system, so it takes no precautions to keep that data confidential or to flag it as such.
ChatGPT itself has noted the following privacy concerns that students, along with faculty, need to be aware of:
- Data Collection: ChatGPT collects and stores information about the user's interactions, which could include sensitive information such as personal details or confidential academic information. ChatGPT ignores "Do Not Track" settings.
- Data usage: ChatGPT may use the data collected for research, analysis or commercial purposes.
- Data security: There is always a risk of data breaches and unauthorized access to the information stored by ChatGPT.
- Data retention: ChatGPT may retain data for an indefinite period of time, which could lead to privacy issues in the future.
Creating a ChatGPT account also requires entering a phone number (able to receive SMS) for verification, which some people may not be comfortable providing.
Instructional Design Perspective: Data Privacy
If faculty would like students to use a generative AI chat tool, the College encourages the use of Google's Gemini App, which ITS provides to students through their MyApps login with enterprise-grade data protection. If faculty decide to use ChatGPT (or a similar "free" generative AI tool), they should intentionally design an alternative way of completing the assignment so that creating a personal account is not required for a grade. Students should be informed of the privacy considerations mentioned above and given the agency to decide whether they feel comfortable creating the account.