Emerging artificial intelligence tools are presenting opportunities and challenges in higher education (and beyond) through their increasing ability to produce new "original" works of text, art, code, and more. This article aims to help faculty and students understand what these tools are and how we can responsibly approach their use in a teaching and learning context.
WATCH
Advances in artificial intelligence raise new ethics concerns, PBS News Hour
This short, eight-minute news report from January 10, 2023, does a good job of unpacking the newest developments in artificial intelligence tools and how they are changing the way we live and work. As you watch, think about how these advancements present both opportunities and challenges for the way we teach and learn.
READ
What are Generative Artificial Intelligence tools and more specifically ChatGPT?
New generative AI tools are emerging that we know students everywhere are starting to use in their education. This section aims to provide guidance, support, and an opportunity to think through how to navigate these tools.
Recently, a new set of tools has become more widely available that uses artificial intelligence to generate entire written and visual works from a prompt. Best known among them is ChatGPT, which lets users enter a prompt (a question, a string of words, a description) and generates a largely clear and coherent original work in response (an essay, short story, poem, annotated bibliography, code, image, etc.). Read Avanti Khare, Union College student and Sci-Tech Editor for the Concordiensis, explain more in her January 12, 2023 article, Behind the viral AI writer ChatGPT.

What do Generative AI Tools mean for teaching and learning?
These tools open up many possibilities for education and work. We prompted ChatGPT to see how well it could explain how ChatGPT is affecting teaching and learning:

However, it is not surprising that these tools can disrupt students' ability to demonstrate learning effectively, and they are raising concerns among educators. Again, we turned to ChatGPT to see how well it could articulate what is troubling teachers:

Privacy Considerations, Specifically with ChatGPT
When designing learning activities for courses, faculty should keep in mind that ITS will not be entering into a contract with OpenAI to license ChatGPT for the College; OpenAI currently does not extend that option to colleges and universities. When ITS enters into contracts with vendors, it normally covers things like protecting FERPA data. Without such a contract, the various terms and policies on OpenAI's website apply to faculty AND students on an individual basis. Therefore, if faculty want to require students to create a ChatGPT account (or an account on a similar platform), we encourage them to inform students not only of the limitations of such platforms (i.e., they can provide inaccurate or biased information, fabricated quotes, etc.) but also of the potential data privacy concerns. Everyone who chooses to create an account should review the terms and privacy policy to fully understand the provisions permitting the sharing of information. It is important to note that OpenAI makes no provision for FERPA data being entered into the system; no precautions are taken to keep that data confidential or to identify it as protected.
ChatGPT itself has noted the following privacy concerns that students (along with faculty) need to be aware of:
- Data collection: ChatGPT collects and stores information about the user's interactions, which could include sensitive information such as personal details or confidential academic information. ChatGPT ignores "Do Not Track" settings.
- Data usage: ChatGPT may use the data collected for research, analysis, or commercial purposes.
- Data security: There is always a risk of data breaches and unauthorized access to the information stored by ChatGPT.
- Data retention: ChatGPT may retain data for an indefinite period of time, which could lead to privacy issues in the future.
If you haven't created a ChatGPT account yet, note that individuals are required to provide a phone number for verification (one that can receive SMS), which some people may not be comfortable doing.
INSTRUCTIONAL DESIGN PERSPECTIVE RE: DATA PRIVACY
If faculty decide they'd like students to use ChatGPT (or a similar, "free" generative AI tool), they should intentionally design an alternative way of completing the assignment that removes any requirement that students create a personal account as part of their grade. Students should be informed of the privacy considerations mentioned above and given the agency to decide whether they feel comfortable creating the account.
Example Statements for Syllabus
To that end, you may want to revisit your syllabus to include some mention of your own course’s considerations around the use of these tools. Here are some example statements you may adjust to serve what is best for your students and the desired learning goals for your class. If you add a ChatGPT (or generative AI tool) policy to your syllabus and don't mind sharing it with others, please send it to snyderd2@union.edu and we will add it to the list below.
Pedagogical Advice
Below you will find some ideas for how to think about these tools, as well as concrete assessment ideas from your colleagues at Union College (click the double arrow to expand each section). If you have ideas you are willing to share with others, please send them to snyderd2@union.edu and we will add them to this section.
Marianna Ganapini, Assistant Professor in Philosophy

Is ChatGPT for everyone? Seeing beyond the hype toward responsible use in education
Montreal AI Ethics Institute article by Marianna Ganapini, Pamela Lirio, and Andrea Pedeferri.
Jan. 3, 2023
ChatGPT is the latest OpenAI chatbot, able to interact with human agents conversationally. You can ask questions, many of which will be answered in seconds. Syntactically this chatbot writes like a pro: its sentences are usually well-structured and grammatically correct. The tone of its writing sounds – for the most part – professional, courteous, and well-polished. Often, the answers generated sound legitimate: it feels like ChatGPT knows what it’s talking about!
But is this AI ethical? Can it be used responsibly? What harm might it generate? Click here to continue reading the full article.
Mark Dallas, Associate Professor of Political Science

I've been trying to keep up with this space, but it changes so fast! Beware: GPT-4 will be released in the next 3 to 6 months and people are saying that it will be SUBSTANTIALLY better. This is just the beginning.
Very briefly, these are a few headline points that I've garnered from a variety of different sources:
- Faculty could go to the [ChatGPT] website and enter their take-home exam or essay questions to familiarize themselves with the type of answers it returns, which can help determine how easily it can answer the questions and how good those answers are.
- Talk to students about it instead of pretending it does not exist, and discuss your expectations and do's and don'ts so they are aware of any rules you have.
- Possible methods to counteract the technology:
- The more specific the question, the harder it is for the AI to generate meaningful responses.
- As Peter Bedford suggested, require citations (but keep an eye on them: I've read articles showing that ChatGPT can return citations, yet it often makes them up out of thin air, pointing to journal articles that do not exist while making it look like they do).
- It has trouble with questions involving pictures, graphs, etc. [This might help in art history or economics, etc.]
- Consider in-class exams, oral exams, or assessments where progress is tracked over time across multiple assignments (sort of like WAC classes, in which writing assignments build on prior ones). [This requires a larger overhaul of the course]
- It is trained on information before 2021, so it will have a weaker response to more recent events. This could change quickly, however.
- Possible methods to incorporate the technology: plug your essay questions into ChatGPT, get a result, and then give the result to students. Then:
- ask students to annotate the ChatGPT essay with their thoughts on its strengths/weaknesses.
- ask students to "correct" the essay for errors, or to interpret the essay.
- ask students to write a rebuttal (if argumentative in nature), or expand upon the various thoughts in the essay.
These methods will focus students on critical thinking.
[WARNING: students could potentially plug some of these back into ChatGPT, so you may have to check this also]
I'm less familiar with whether these methods will work for quantitative work or coding in CS, but I've heard some of them can.
Hope some of this helps for Winter term at least! If GPT-4 is released in Spring term, this may need to be reworked. I'm guessing that educators will eventually have to move toward "incorporating the technology," but this will be controversial. IMO, it is going to transform education, and for the better, after a period of adjustment. But that's just my opinion and too complex to explain here.
Joe Johnson, Director of Writing Programs

Joe Johnson has provided the Union Community with a January 30, 2023 statement from the Association for Writing Across the Curriculum* (AWAC) Executive Committee on Artificial Intelligence Writing Tools in Writing Across the Curriculum Settings.
For over a decade, researchers and entrepreneurs have been developing Artificial Intelligence text generators. In recent years, tools such as OpenAI’s GPT-2 and GPT-3 or ChatGPT have become sophisticated enough to produce texts that some readers find difficult to distinguish from texts produced by human writers. This development raises practical, pedagogical, and ethical concerns, including in academic settings.
A fundamental tenet of Writing Across the Curriculum is that writing is a mode of learning. Students develop understanding and insights through the act of writing. Rather than writing simply being a matter of presenting existing information or furnishing products for the purpose of testing or grading, writing is a fundamental means to create deep learning and foster cognitive development. Learning to write within a field or major is also one of the most critical ways that emerging scholars and professionals become enculturated in a discourse community. We are concerned that relying on AI text generators limits student learning and enculturation.
Our Position
As scholars in the discipline of writing studies more fully explore the practical and ethical implications of AI language generators in classroom and other settings, we underscore this: Writing to learn is an intellectual activity that is crucial to the cognitive and social development of learners and writers. This vital activity cannot be replaced by AI language generators.
That said, we understand that institutions, departments, and faculty will have to decide locally what role AI text generators should play in their situations. Some learning communities might reject these technologies outright, including them, for example, in campus policies about plagiarism. Other communities might find productive pedagogical roles for this technology; indeed, some writing teachers are having students explore and experiment, in a critical fashion, with AI writing: its potential for aspects of the writing process, its limitations, its ethics, its costs.
Furthermore, in some professional fields, AI tools have been available for years, and professors in those fields have incorporated attention to them in teaching.
Context, Past and Future
The history of writing is marked by changes in technologies that have shaped how people write and what writing can accomplish: from clay tablets to papyrus, quill pens to pencils, handwriting to typing to texting, words to image to design to multimodality, physical library to the world wide web. On one hand, AI text generators are yet another technology with potential uses in various invention, drafting, and editing processes. On the other hand, their potential autonomy from human writers makes them qualitatively different from previous technologies.
While exclusively having AIs generate writing does not engage students in an essential mode of learning, it is also clear that writing scholars and WAC faculty should explore whether–and, if so, how–AI text generation tools might be integrated into writing pedagogy. The WAC Clearinghouse hosts a page of useful resources: AI Text Generators and Teaching Writing: Starting Points for Inquiry. We might pose these research questions: Might the acts of critiquing, rewriting, or discussing AI-generated text foster growth? Are there scenarios where student writing might productively be complemented, supplemented, or assisted by AI language generators? Can this happen in ways that do not preempt student learning?
It is premature to provide answers to such questions, which need thoughtful investigation. We look forward to that research.
Reaffirming Best Practices
Current AI discussions remind us, yet again, of long-established best practices in Writing Across the Curriculum, grounded in research and extant for decades: designing meaningful and specific assignments that foster learning and develop skills; focusing on processes and practices such as peer-response and revision; encouraging writing in multiple genres, including ones connected to specific disciplinary practices.
We recommend fostering the kind of deep learning and cognitive development that students gain through writing to learn and through learning to write in specific situations.
*About AWAC and this Document
The Association for Writing Across the Curriculum (AWAC) is an international organization that brings together the intellectual, human, political, and economic capital of the Writing Across the Curriculum (WAC) community to grow WAC as a global intellectual and pedagogical movement. AWAC promotes initiatives that support students’ writing across their academic careers, faculty development related to student writing and writing pedagogy, and research into writing across domains (e.g. disciplines, professions, communities, and academic levels) and transnationally.
Joining the Executive Committee (Doug Hesse, Justin Rademaekers, Ann Blakeslee, Laurie Britt-Smith, Karen Moroski-Rigney, Sherri Craig, and Paula Rosinski) in drafting this statement was Stacey Sheriff. We sent a draft version to AWAC members in mid-January, and their thoughtful comments informed revisions. Contact: Doug Hesse, AWAC Chair, at dhesse@du.edu.
Denise Snyder, Director of Learning Design and Digital Innovation

“We do not learn from experience... we learn from reflecting on experience.”
John Dewey
The rise of generative AI tools is an opportunity for faculty to take a step back and examine the instructional design of their courses, and more specifically, their assessments. I encourage faculty to ask themselves: am I structuring assessments OF understanding, or am I structuring assessments FOR understanding? It is a slight yet powerful tweak in perspective that can yield learning activities that AI generation tools find difficult, if not impossible, to complete (at least on their own). What I am really talking about is going back to the basics of what good instructional design looks like. Well-designed assessments can promote learning as well as measure it.
What does that mean? Well, assessment isn't always simply evaluating the learning that has already happened; it can also help learning happen—this is what is meant by “assessment FOR learning.” Part of good instructional design is knowing how to design both assessments FOR learning and OF learning.
This is where faculty can get creative, going beyond the traditional research paper and/or exam to create multiple, engaging, formative opportunities for students to actively make their thinking visible, while also including space for students to reflect on, revisit, and revise their thinking. Requiring metacognitive reflection not only helps defeat the use of generative AI tools but, more importantly, makes learning "stick." Attending to metacognitive thinking in your assessments can improve transfer of understanding to authentic, real-world scenarios well beyond the walls of the classroom.
The Learning Design and Digital Innovation team is here to help faculty rethink assessments at their convenience. We also encourage faculty to apply to our summer FDI course incubator, where they will receive support from colleagues, instructional designers, librarians, learning technologists, and more to help (re)design and develop courses.
If you are interested in reading pedagogical advice ChatGPT came up with, take a look:

Curated Resources
We will continue to compile a list of articles and resources to share with the Union College community about this topic on this Google Document. If you come across items you'd like to add, please send them to snyderd2@union.edu.
If you are having difficulty or you have unanswered questions, please contact the Help Desk through the ITS Service Catalog or call (518) 388-6400.