Teaching and Learning Resources on Generative AI

This article helps faculty develop an approach to managing generative AI in their courses, including guidelines for developing a generative AI policy.

  1. Five Steps to Managing Generative AI in Your Classroom
  2. Beyond Policy to Rethinking Assessment
  3. Faculty Pause: Your Use of GenAI in Teaching
  4. Privacy Considerations, Specifically with ChatGPT

Five Steps to Managing Generative AI in Your Classroom

The rise of generative AI tools (like ChatGPT, Claude, and Midjourney) requires intentional policy choices. Use these five steps to help integrate AI use thoughtfully and effectively into your course design.

Step 1: Define learning goals

Before setting a policy, articulate the specific knowledge, skills, and values you want students to master in your course. Ask yourself:

  • What skills are essential for this course (e.g., critical analysis, primary source reading, professional writing, quantitative problem-solving)?

  • Does the use of AI enhance or undermine the practice of these essential skills?

If you suspect that your current assignments can be easily completed using generative AI, consider revisiting core aspects of your teaching. Start with the big picture and ask: What do you REALLY want students to get out of your class? What do you want them to know, or be able to do, in one, five, or ten years? If you'd like a sounding board, book a meeting with a member of the Learning Design and Digital Innovation Team. You may find that by refining your goals, and the structures that support them, you have already begun to address the challenges of generative AI.

If at this point you still know very little about what generative AI is capable of, spend time with the AI Pedagogy Project's AI Guide. It includes an embedded ChatGPT 5.0 page so that you can try it out yourself. For further experimentation, prioritize Google's Gemini app (using your Union.edu account) or the Gemini-powered Gems and NotebookLM, both of which have enterprise-grade data protection. Gemini, Gems, and NotebookLM are the only genAI apps that ITS supports.

 

Step 2: Review existing assignments (test your assessments in a genAI tool).
Evaluate your current assessments against your learning goals (from Step 1). Assessments that primarily test recall or basic application may need to be redesigned to focus on higher-order thinking, synthesis, and analysis, which are harder for current AI models to replicate effectively.
Suggestion: copy and paste your assignments into one of the genAI tools (keeping in mind that students may use a variety of them: Gemini, ChatGPT, Claude, etc.) and look at the results. Would work produced in this fashion meet your desired learning goals? If a student were to use the same tool, would they be tempted to submit the results as their own? Would you be able to tell it was generated by genAI? 

Warning: Generative AI Checkers DO NOT WORK 

After some testing, you might think: Aren't there tools to detect AI-generated assignments? Yes, but they are inaccurate and they reproduce bias. Avoid them. Instead, spend your time reshaping assessments and maintaining an open dialogue with your students about these tools.

 

Step 3: Decide how to define appropriate usage of generative AI (with student input) 
Based on the results of Step 2, think about your position on student use of genAI for these assignments. Consider setting aside part of a class to request student input on the use of Generative AI and develop a policy as a group. This kind of transparency can build trust and encourage students to be responsible for their learning. 
 
Using the stoplight as a metaphor, the following categories can help guide your thinking:
 
  • Red: No genAI in my course—it would hurt student learning too much.
  • Yellow: genAI may be used in certain ways for some assignments, but not others. 
  • Green: Students can use genAI in any way they want.

Your choice should be rooted in your learning goals and disciplinary context.

Note on disciplinary context: Our metaphor aligns with the guidance offered by the Association for Writing Across the Curriculum (AWAC) in their "Decision Making Framework" (see Appendix C). The key is to know your reasons for adopting a specific approach, as the choice is fundamentally pedagogical.

  • Red Light Policy (AWAC: The Restrictive Approach): Minimal or no AI use allowed. This approach is appropriate when the assignment's core learning goal is mastery of a foundational process (e.g., source citation, traditional drafting, complex problem-solving steps) that would be compromised by AI assistance.
  • Yellow Light Policy (AWAC: The Critical Engagement Approach): AI use allowed under specific conditions. This approach uses AI as a site of inquiry. It allows students to use AI for low-stakes tasks (e.g., outlining, brainstorming, revising) and requires them to critique and evaluate the AI's output, cite its use, and demonstrate their critical intervention.
  • Green Light Policy (AWAC: The Productive Integration Approach): AI use encouraged or required. This approach treats AI as a professional tool. Assignments shift focus to asking higher-order questions, refining AI output, and integrating it into a larger, complex project where AI serves as a powerful starting point or collaborator.

 

Step 4: Develop assessments and a genAI policy.

At this point you will likely need to revise current assessments or develop new ones in addition to defining your AI policy. 

The genAI policy for your class should clearly define what is allowed and what is not, and your rationale (and/or the rationale proposed by students). Below are links to three sample statements in separate Google Docs, along with guidelines for revising and developing assessments, that you may copy and customize as you like. Keep in mind that when you include students in the process of developing these statements, the language may need to change. 

Note: Unclear policies create problems for the Honor Council if there is any concern about an honor violation. For more information, read the Union College Honor Council Guidance on AI Generated Content and Academic Integrity.

 

Step 5: Practice, refine, and talk about genAI with your students.

Your AI policy is not set in stone.

  • Incorporate Practice: If using a Yellow or Green Light approach, build in low-stakes assignments that teach students how to use AI ethically and effectively as a tool in your discipline.

  • Share Your Findings: We encourage you to share your results, assignment designs, and challenges with the Learning Design and Digital Innovation group so we can continue to refine our guidance across the college.

Regardless of your position about generative AI, your students are thinking about it and most likely trying it out. AI researcher Ethan Mollick reports that 82% of college undergraduates have used genAI. That number has surely grown.  

You don’t need to be an expert in genAI to model how to think critically about emerging topics in our society. What questions are you asking? What questions is your discipline asking? Are there meaningful examples of how it is being used that you could examine in class? Do students understand the privacy and ethical implications of genAI? Consider sharing this guide produced jointly by Elon University and AAC&U and facilitating a discussion. 

Finally, a genAI policy in your syllabus is not enough. For students to really understand it, discuss the policy in class before they begin work on assessments.

 

Additional Resource for WAC Faculty

If you are teaching a course aligned with Writing Across the Curriculum (WAC) principles, you may wish to review the full document that informed this framework: AWAC Statement on AI and WAC

Beyond Policy to Rethinking Assessment

Having a course AI policy is a good start. You also may have found that assessments you've relied on for years are no longer valid in an age of AI. If you have not spent time thinking about the assessments in your course in the past few years, now is the time!

Begin by reading:

Then, browse strategies:

  1. Rethinking assessment strategies in the age of artificial intelligence (AI) (Charles Sturt University)
  2. Course and Assignment (Re-)Design (University of Michigan)
  3. Integrating AI into Assignments to Support Student Learning (Derek Bruff)
  4. Key Features of High Impact Practices (George Kuh) | High Quality Impact Practices (Peter Felten)

 

Faculty Pause: Your Use of GenAI in Teaching

As genAI tools become more available, it’s important for faculty to reflect on how and why they use these tools in their own teaching practice. Whether creating lessons, grading, or designing whole courses, thoughtful use helps preserve the human-centered values of teaching and learning.

Begin by exploring:

Then, go through the following reflection prompts:

  1.  Where in my teaching/assessment workflow do I currently use (or plan to use) generative AI?
    → e.g., drafting lesson slides, creating quizzes, grading short responses, generating rubrics.

  2. What am I hoping to achieve by using AI in that space (e.g., save time, increase consistency, enhance creativity)?
    → List benefits

  3. What might I be risking by outsourcing or automating that part (e.g., loss of personal feedback, decreased student-instructor connection, bias or unintended messages about the value of student work)?
    → Reflect on how you’ll monitor for those risks.

  4. What guardrails will I put in place to ensure the human element remains central?
    Examples: human review of all AI outputs, explicit disclosure to students, iterative drafts with instructor feedback, reflection prompts for students about how AI was used.

Privacy Considerations, Specifically with ChatGPT

When designing learning activities for courses, keep in mind that the only genAI apps ITS supports are Google's Gemini app and the Gemini-powered NotebookLM and Gems; all three have enterprise-grade data protection.

ITS does not currently have a contract with OpenAI (or similar vendors) to license ChatGPT for the College because of its prohibitive enterprise cost. When ITS enters into a vendor contract, it normally negotiates protections such as FERPA safeguards; without such a contract, the various terms and policies on OpenAI's website apply to faculty AND students on an individual basis. If faculty want to require students to create a ChatGPT account (or an account on a similar platform), they should inform students not only of the limitations of such platforms (e.g., inaccurate or biased information, fabricated quotes) but also of potential data privacy concerns. Everyone who chooses to create an account should review the terms and privacy policy to fully understand the provisions permitting the sharing of information. Importantly, these platforms do not recognize that FERPA-protected data may be entered into the system, so no precautions are taken to keep that data confidential or to identify it as such.

ChatGPT itself has noted the following privacy concerns that students (and faculty) need to be aware of:

  • Data collection: ChatGPT collects and stores information about the user's interactions, which could include sensitive information such as personal details or confidential academic information. ChatGPT also ignores "Do Not Track" settings.
  • Data usage: ChatGPT may use the data collected for research, analysis, or commercial purposes.
  • Data security: There is always a risk of data breaches and unauthorized access to the information stored by ChatGPT.
  • Data retention: ChatGPT may retain data for an indefinite period of time, which could lead to privacy issues in the future.

Creating a ChatGPT account also requires entering a phone number that can receive SMS for verification, which some people may not be comfortable providing.

Instructional Design Perspective on Data Privacy

If faculty decide they'd like students to use a generative AI chat tool, the College encourages Google's Gemini app, which ITS provides to students with enterprise-grade data protection through their MyApps login. Faculty who choose ChatGPT (or a similar "free" generative AI tool) should intentionally design an alternative way of completing the assignment so that creating a personal account is never required as part of a student's grade. Students should be informed of the privacy considerations mentioned above and given the agency to decide whether they feel comfortable creating the account.

 


Related Articles (2)

As part of the College’s ongoing effort to harness the power of AI to improve productivity, the Google Gemini app (gemini.google.com) is available for the Union community. Gemini provides access to Google AI models, which will allow you to chat with the Gemini app to brainstorm ideas, answer questions, create and summarize content, get feedback and more.
NotebookLM is an AI-driven platform that transforms your course documents into interactive knowledge bases. It allows both you and your students to generate targeted summaries, ask nuanced questions about the content, and discover meaningful connections between texts.