But How Do I Do That? Where to Find (Some) Guidance About A.I. Use for Higher Education

by Adam Lenz

Why Using A.I. Is Complicated to Talk About

In my own work supporting students and student employees, trying to decide how and when to talk about the use of artificial intelligence (A.I.) and generative A.I. tools like Gemini, Copilot, or, of course, ChatGPT never feels easy. Beyond critical questions like “What is your faculty member’s perspective on A.I. use for this assignment?” or “Is this tool trained on and utilizing copyrighted or otherwise personally protected materials without consent?,” there are other considerations to keep in mind: the environmental impacts of large language model systems, the skills a task is meant to develop that A.I. tool use may circumvent or detract from, and whether a given prompt may produce outputs laden with problematic biases obfuscated as content-neutral or verified information. When my audience is also wary of potential academic repercussions and may not be any more familiar with the options available than I am, it is easy to feel like mentioning A.I. at all is more stress than it is worth.

Even when I focus on questions that carry less immediate ethical weight, it can still be hard to give responses that meaningfully support the student or fellow staff member who came to me with them. A response that hits the sweet spot of empowering the other person to grow their own autonomy while still offering comforting support must land somewhere between a directive statement like “Here is what you do…” and an outright unhelpful one like “I have no idea, you’re on your own.” For example, a common question a student may ask me is something like, “Here is what I need to do with this assignment or task. Which A.I. tool will best provide what I am hoping for?” Questions about HOW to use A.I. tools require me to maintain familiarity with the range of potential options available, how easy each tool is to learn and use, the degree of validity versus hallucination in the outputs provided, and the accessibility of the features on offer. All these questions are further complicated by the fact that new products hit the market seemingly every week and often come with both free and premium-tier features that essentially produce two or even three kinds of product under the same label (e.g., ChatGPT 3.5 vs. ChatGPT 4o).

If only there were repositories of useful, curated strategies, articles, and tools to help the weary educational professional hoping to provide thoughtful mentoring for students on the overwhelming issue that is “How to approach the use of A.I.” in higher education…

Oh wait! There are!

Useful A.I. Resources

One of the great joys that has come from doing my best to keep up with the rush of new A.I.-related topics is to see the brilliant innovations and offerings that the community of scholars and academic support professionals have begun to feverishly put together in order to best support not only their students but one another as well. I’d like to share some resources below that I have found to be very helpful when I am considering using A.I. in an academic context or having a conversation with a student about their use of A.I. and hoping to give them resources and perspectives to most critically decide if any given tool is right for them:

How to Use AI to Do Stuff: An Opinionated Guide by Ethan Mollick

A quick article to turn to when trying to figure out which A.I. tools are able to do which tasks, helpfully (and snarkily) broken down into the three broad categories of “write stuff,” “make images,” and “come up with ideas” that also covers which are free and which are not.

The Prompt Library for Educators and the 5S Framework for Educational Prompts by A.I. for Education

I’m including these two resources in tandem because they both come from the same group, A.I. for Education, and also provide a solid base for finding, elaborating on, and eventually creating your own kind of prompt to use for nearly any question you may want to pose to an A.I. tool like ChatGPT. The Prompt Library is a remarkable time saver of useful prompts to use in your own work or offer to students on assignments where A.I. use is permissible, and the 5S framework is a list of important reminders to ground an A.I. tool firmly in the seat of assistant rather than driver of a learning experience.

Generative A.I. Ethics in Higher Education Scenarios – Discussion Prompts by TechnoEthics

Often when I think about sharing A.I. with students or fellow professional staff, I want to know if they have considered the various ethical ramifications and concerns that come with using these tools. To this end, I have found this series of scenarios by TechnoEthics a great conversation starter for team or one-on-one meetings where we have time to reflect and discuss. Several scenarios here delve into questions about accessibility features that A.I. tools may provide (albeit at the risk of linguistic oversimplification), disproportionate access to resources within a classroom, and other challenges that are worth discussing at the very least before creating a norm or policy regarding A.I. use in your classroom, program, or administrative unit.

OSU ECampus AI Tools and Decision Tree

I would be remiss if I did not include Oregon State University’s own excellent webpage about the important considerations faculty and student support professionals should keep in mind when working with A.I. tools. From a list of sample syllabus statements about A.I. policy ready to modify, to ethical considerations to keep in mind when building out a specific assignment, OSU ECampus has done a remarkable job combining practical pedagogical frameworks with the role A.I. can play in a student’s learning experience. The section on assignment redesign ideas under their ‘Practical Strategies’ heading is particularly useful in my opinion, as many of these suggestions can also be helpfully restructured into prompts for students to use in class or while studying alone.

Student Voices About the Use of AI – Concerns and New Learning Strategies for Independent Studying

by Dr. Adam Lenz, Coordinator of Supplemental Instruction

As professional staff and faculty around the Oregon State University campus look to elevate conversations about the use and regulation of artificial intelligence (AI) programs like ChatGPT and Google’s Bard in course design and student work, another conversation is also taking place on campus among our students. I believe their voices and experiences are necessary for assessing what has been, is being, and could be attempted regarding changes to OSU’s policies, and I view their insights as valuable for faculty when considering what is and is not deemed appropriate use of these technologies in their courses.

The use of AI looks different than it did a few years ago, when the pandemic sent many educators and students scrambling to find new and innovative solutions for learning challenging content without the degree of support historically available in pre-COVID times. The development of AI tools is also advancing into new iterations at an increasing rate, leaving many feeling under- or misinformed about what is possible and considered ethical in the use of these technologies. In my own work as Coordinator for the Supplemental Instruction (SI) program within the Office of Academic Support (OAS), I have the privilege of overhearing many such conversations between students and have compiled a few salient takeaways to share below.

AI Use from the Student Perspective – What SI Table Leaders are Hearing 

Our SI Leaders have been hearing accounts of students relying on AI to help them accomplish coursework and study for exams for a few years now, increasing in frequency over the last twelve months. Perhaps not surprisingly, one of the most common examples SI Leaders overhear is students discussing how they used AI to help them write lab reports and essays, as well as to solve math equations and science homework that relies on fact-checking. However, the extent to which OSU students rely on AI as part of their routine work processes seems to be highly variable. For example, one student in the Fall of 2023 told their SI Leader, “I like having the AI tools as a backup in case I really can’t figure something out or I’m worried I made a mistake.” Another reported that they only used AI when they felt ‘stuck’ or ‘trapped’ by tight deadlines and limited opportunities to receive other forms of support that felt safe and available in a timely manner.

This kind of ‘safety net’ that AI can offer seems valuable to some, but there is also a growing fear among others that the use of AI leaves students open to potential repercussions without warning. A frustration point for many students right now is that faculty have the authority to decide on an individual basis when AI is allowed for homework and studying, meaning that what is permissible in one course may carry heavy penalties (including risk of being reported for academic misconduct) in another. As an example, an email came to our program in the fall with a request to unregister from a study table; the student (paraphrasing) told us, “I’m dropping this course because the faculty thinks I cheated and I don’t know what else to do.” Another student reported asking AI to generate practice exam questions, only for the program to produce actual exam questions drawn from older copies of the exam uploaded elsewhere on the internet. They reported this to their SI Leader, who was unsure whether the faculty member knew about the leak of their content or whether they would prefer students to use old practice exams as study materials in the first place. Out of fear, the student requested that the SI Leader not disclose this to the faculty directly and instead make an anonymous report.

Student confusion about the issue of AI use is understandable given the relatively recent introduction of this challenging but opportunity-rich technology to the wider landscape of Higher Education. The University of Oregon recently updated their guidelines and resources for faculty on their Teaching Support and Innovation website, while the University of Michigan just completed an internal campus-wide review on the effects of AI on students, faculty, and staff. The executive summary of the report stated that AI would have a ‘significant’ impact on UM’s campus in Fall 2023 and ‘can not be ignored.’ The United States Department of Education’s Office of Educational Technology released a report in May 2023 urging educators to consider not only plagiarism but also student privacy and transparency in their use of AI when designing new activities. The report warns that many students do not fully understand how AI algorithms actually compose responses, creating an urgent need to support students in simply learning how to use these tools in the first place as a key part of future learning outcomes and workforce preparedness.

Oregon State University is also addressing the challenges that AI tools present for educators hoping to support students in developing their own critical thinking and reasoning skills. An advisory group designated by the Office of the Provost has brought together representatives from student support services, ASOSU’s Office of Advocacy, members of the Faculty Senate, and student members of ASOSU to create guiding principles for using AI-assisted learning strategies. A second coordinating team has also been assembled to look for opportunities for OSU to invest in or collaborate with other AI-related programs and services, both on OSU’s campus and beyond. An OSU ‘AI Day’ is planned for later in Spring of 2024, with opportunities for campus innovators to share the unique work they have been developing related to AI and for students and faculty to ask questions about AI use in studying and learning.

Students’ Creative Uses of AI – Ideas and Strategies to Suggest 

While longer term campus-wide policies take careful thought and time to develop, there are unique and relatively safe ways students can use AI that we as educators might be able to offer as we work to help them come to terms with this ubiquitous new tool. Faculty and learning support staff in particular may benefit from some examples we’ve collected of students using AI in ways that did NOT break the rules of their instructors and which we saw as particularly effective or innovative. These included asking AI tools to:

  1. Suggest other online learning resources about a topic, such as government websites, journal articles, and virtual lab activities.
  2. Explain why they missed a particular question on graded assignments.
  3. Summarize a list of important topics about a subject to build the outline of a study guide they then fill in using course materials.
  4. Recommend different ways to study for exams, including asking for a schedule to follow in terms of hours spent and when to begin studying.
  5. Describe a concept in multiple ways so that students can check where they may hold misconceptions or missed ideas.
  6. Ask the student questions about a given topic that they then attempt to answer using course materials as a form of practice.

As always, if you are going to provide suggestions about the use of AI to a student, please urge them to check their course syllabus or contact their instructor before engaging in AI-assisted study. Oregon State University has launched a new website that includes thoughtful suggestions and resources for faculty, including ways to connect on research or course design, and it can be a great resource to share with students who have questions as well. At the end of the day, what is most important is that we help our students make sense of and succeed in the careers available to them, and show them that asking questions and looking for support is itself a vital learning skill to practice. If you have questions or concerns about how AI might be used by students in your course, I encourage you to contact the Center for Teaching and Learning at CTL@oregonstate.edu. Their offices can provide suggestions through email or set you up with a one-on-one consultation to help you strategize effective ways to incorporate language and strategies regarding AI into your syllabus and curriculum.