by Adam Lenz
Why Using A.I. Is Complicated to Talk About
In my own work supporting students and student employees, trying to decide how and when to talk about the use of artificial intelligence (A.I.) and generative A.I. tools like Gemini, Copilot, or, of course, ChatGPT never feels easy. Besides critical questions like “What is your faculty member’s perspective on A.I. use for this assignment?” or “Is this tool trained on and utilizing copyrighted or otherwise personally protected materials without consent?”, there are other considerations to keep in mind as well: the environmental impacts of large language model systems, the intended skills a task is meant to develop that A.I. tool use may circumvent or detract from, and whether a given prompt may produce outputs laden with problematic biases disguised as content-neutral or verified information. When my audience is also wary of potential academic repercussions and may not be any more familiar with the options available than I am, it is easy to feel like mentioning A.I. at all is more stress than it is worth.
Even when I focus on questions that carry perhaps less immediate ethical weight, responses that meaningfully support a student or fellow staff member who came to me with questions can still feel hard to provide. A response that hits the sweet spot of empowering the other person to grow their own autonomy while still feeling like comforting support must land somewhere between a directive statement like “Here is what you do…” and an outright unhelpful one like “I have no idea, you’re on your own.” For example, a common question a student may ask me is something like, “Here is what I need to do with this assignment or task. Which A.I. tool will best provide what I am hoping for?” Questions about HOW to use A.I. tools require me to stay familiar with the range of options available, how easy each tool is to learn and use, the degree of validity versus hallucination in the outputs each provides, and the accessibility of the features on offer. All of this is further complicated by the fact that new products hit the market seemingly every week and often come with both free and premium-tier features that essentially produce two or even three kinds of product under the same label (e.g., GPT-3.5 vs. GPT-4o).
If only there were repositories of useful, curated strategies, articles, and tools to help the weary educational professional hoping to provide thoughtful mentoring for students on the overwhelming issue that is “How to approach the use of A.I.” in higher education…
Oh wait! There are!
Useful A.I. Resources
One of the great joys of doing my best to keep up with the rush of new A.I.-related topics has been seeing the brilliant innovations and offerings that the community of scholars and academic support professionals has begun to feverishly put together in order to support not only their students but one another as well. I’d like to share some resources below that I have found very helpful when I am considering using A.I. in an academic context, or when I am having a conversation with a student about their use of A.I. and hoping to give them the resources and perspectives to decide critically whether any given tool is right for them:
How to Use AI to Do Stuff: An Opinionated Guide by Ethan Mollick
A quick article to turn to when trying to figure out which A.I. tools can do which tasks, helpfully (and snarkily) broken down into the three broad categories of “write stuff,” “make images,” and “come up with ideas.” It also covers which tools are free and which are not.
The Prompt Library for Educators and the 5S Framework for Educational Prompts by A.I. for Education
I’m including these two resources in tandem because they both come from the same group, A.I. for Education, and together they provide a solid base for finding, elaborating on, and eventually creating your own prompts for nearly any question you may want to pose to an A.I. tool like ChatGPT. The Prompt Library is a remarkable time saver full of useful prompts to use in your own work or to offer students on assignments where A.I. use is permissible, and the 5S Framework is a list of important reminders for keeping an A.I. tool firmly in the seat of assistant rather than driver of a learning experience.
Generative A.I. Ethics in Higher Education Scenarios – Discussion Prompts by TechnoEthics
Often when I think about sharing A.I. with students or fellow professional staff, I want to know whether they have considered the various ethical ramifications and concerns that come with using these tools. To this end, I have found this series of scenarios by TechnoEthics to be a great conversation starter for team or one-on-one meetings where we have time to reflect and discuss. Several scenarios delve into the accessibility features A.I. tools may provide (sometimes at the risk of linguistic oversimplification), disproportionate access to resources in a classroom, and other challenges worth discussing at the very least before creating a norm or policy regarding A.I. use in your classroom, program, or administrative unit.
OSU ECampus AI Tools and Decision Tree
I would be remiss if I did not include Oregon State University’s own excellent webpage on the considerations faculty and student support professionals should keep in mind when working with A.I. tools. From sample syllabus statements on A.I. policy that can be adapted, to ethical questions to weigh when building out a specific assignment, OSU ECampus has done a remarkable job combining practical pedagogical frameworks with the role A.I. can play in a student’s learning experience. The list of assignment redesign ideas under their ‘Practical Strategies’ section is particularly useful, in my opinion, as many of the suggestions can also be restructured into prompts that students can use in class or while studying alone.