Teaching and Learning Resources on Generative AI

This article helps faculty develop an approach to managing generative AI in their courses, including guidelines for writing a generative AI policy for students.

Five Steps to Managing Generative AI in Your Classroom


Step 1: Learn more about genAI, teaching, and learning.
If you have already used genAI tools such as ChatGPT and are familiar with their capabilities, limitations, and the concerns they raise in education, you can skip this step. If you could benefit from more background, the AI Pedagogy Project’s AI Guide can get you up to speed on how genAI works and what educators should know. The guide includes an embedded ChatGPT 4.0 page so that you can try it out yourself.


Step 2: Test your assessments in a genAI tool.
Copy and paste each of your assignments into one of the genAI tools (ChatGPT, Gemini, etc.) and look at the results. Would your students be tempted to submit results like these as their own? Would you be able to tell the work was generated by genAI? Would work produced in this fashion meet your desired learning goals?
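
If you have many assignments to test and are comfortable with a bit of scripting, you can automate this check rather than pasting prompts by hand. The sketch below is a minimal, optional example in Python, assuming the openai package is installed and an OPENAI_API_KEY environment variable is set; the assignment prompts and the model name ("gpt-4o") are placeholders to replace with your own. Pasting each assignment into the chat interface works just as well.

    # A minimal sketch (not an official workflow): send each assignment prompt to a
    # genAI model and save the responses so you can review what a student could
    # produce with no extra effort. Assumes the openai Python package and an
    # OPENAI_API_KEY environment variable.
    from pathlib import Path
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Placeholder prompts -- replace with the actual text of your assignments.
    assignments = {
        "essay_prompt": "Write a 500-word essay on ...",
        "problem_set_3": "Solve the following problems ...",
    }

    for name, prompt in assignments.items():
        response = client.chat.completions.create(
            model="gpt-4o",  # assumed model name; substitute whichever model you want to test
            messages=[{"role": "user", "content": prompt}],
        )
        # Save each response to a text file for review against your learning goals.
        Path(f"{name}_ai_response.txt").write_text(response.choices[0].message.content)
        print(f"Saved AI response for {name}")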


Step 3: Decide if using genAI is OK with you.
Based on the results of Step 2, decide whether it is acceptable for students to use genAI on these assignments. Using a stoplight metaphor, which best describes your thinking?
 
  • Red: No genAI in my course—it would hurt student learning too much.
  • Yellow: genAI may be used for some tasks, but not for others.
  • Green: Students can use genAI in any way they want.


Step 4: Develop assessments and a genAI policy.

Based on your thinking about how genAI can help or hurt student learning, you will likely need to revise current assessments or develop new ones. 

The genAI policy stated in your syllabus should clearly define what is allowed, what is not, and your rationale. Below are links to three sample statements in separate Google Docs, along with guidelines for revising and developing assessments, that you may copy and customize as you like:

Note: Unclear policies create problems for the Honor Council if an honor violation is suspected. For more information, read the 2024-2025 Union College Honor Council Guidance on AI Generated Content and Academic Integrity.


Step 5: Talk about genAI with your students.

Regardless of your position on how genAI can influence student learning, your students are thinking about it and most likely trying it out. In a recent post, AI researcher Ethan Mollick reports that 82% of college undergraduates have used genAI, and that number has surely grown since.

You don’t need to be an expert in genAI to model how to think critically about emerging topics in our society. What questions are you asking? What questions is your discipline asking? Are there examples of how it is being used well that you could examine in class? Do students understand the privacy and ethical implications of genAI? Consider sharing this guide produced jointly by Elon University and AAC&U and facilitating a discussion. 

Finally, a genAI policy in your syllabus is not enough. For students to truly understand it, talk about the policy with them before they begin work on assessments.


Warning: Generative AI Checkers DO NOT WORK 

These tools are unreliable and reproduce bias; do not count on them to identify AI-generated work. Instead, spend your time reshaping assessments and maintaining an open dialogue with your students about generative AI.


Privacy Considerations, Specifically with ChatGPT

When designing learning activities for your courses, keep in mind that ITS does not currently have a contract with OpenAI (or other vendors) to license ChatGPT for the College, because the enterprise cost is prohibitive. ITS normally handles protections such as safeguarding FERPA data as part of entering into contracts with vendors; without such a contract, the terms and policies on OpenAI's website apply to faculty and students on an individual basis. If faculty want to require students to create a ChatGPT account (or an account on a similar platform), they should inform students not only of the platform's limitations (e.g., it can produce inaccurate or biased information, fabricated quotes, etc.) but also of potential data privacy concerns. Everyone who chooses to create an account should review the terms and privacy policy to understand the provisions that permit sharing of information. Note that the system does not recognize potential FERPA data entered into it, so no precautions are taken to keep that data confidential or to identify it as such.

ChatGPT itself has noted the following privacy concerns that students (and faculty) need to be aware of:

  • Data collection: ChatGPT collects and stores information about the user's interactions, which could include sensitive information such as personal details or confidential academic information. ChatGPT ignores "Do Not Track" settings.
  • Data usage: ChatGPT may use the data collected for research, analysis, or commercial purposes.
  • Data security: There is always a risk of data breaches and unauthorized access to the information stored by ChatGPT.
  • Data retention: ChatGPT may retain data for an indefinite period, which could lead to privacy issues in the future.

If you haven't created a ChatGPT account yet, note that you are required to enter a phone number that can receive SMS messages for verification, which some people may not be comfortable providing.

An Instructional Design Perspective on Data Privacy

If faculty decide they would like students to use ChatGPT (or a similar, "free" generative AI tool), they should intentionally design an alternative way of completing the assignment so that creating a personal account is not a requirement for a grade. Students should be informed of the privacy considerations mentioned above and given the agency to decide whether they feel comfortable creating an account.