The golden rule of link accessibility: links should be descriptive! (For foundational information on the why and the how, see OSU Digital Accessibility – Links.) Let’s dig deeper into a few common questions:
Can I use “click here” or “this” for my link text?
This practice is not ideal, and it’s best to avoid it. While WCAG does permit it when surrounding context provides enough information, you would not be creating a good experience for your audience. That type of text is not descriptive enough to show the user where the link will go, and it’s especially problematic if this text appears multiple times! Think of people skimming the content – whether visually or via assistive technologies. It’s much more helpful when the text clearly conveys the link’s function or destination. See an example below.
Can I link an image?
Yes, you can use an image directly as a link or button. But! If the image serves as a link on its own, make sure to write alt text that describes the action initiated by the link. The example image below is linked to an interactive lesson about cat behavior, so you would use the alt text “Cat Behavior Interactive Lesson” rather than a description of the image. See more explanations and examples on the W3C WAI Functional Images page.
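In markup terms, a functional image link looks something like the sketch below (the URL and filename are placeholders, not real course assets): the alt text names the destination, not the picture.

```html
<!-- Illustrative sketch: the image is the only content of the link,
     so its alt text describes where the link goes, not what the
     picture shows. URL and filename are placeholders. -->
<a href="https://example.edu/lessons/cat-behavior">
  <img src="cat-on-keyboard.jpg" alt="Cat Behavior Interactive Lesson">
</a>
```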
Proper citations include URLs. How do we make those accessible?
Citation styles may be strict, but they do allow some flexibility for online-only resources and materials outside of formal papers. The recommended practice is to link the work title and ditch the DOI or URL, like in the example below. Check out more examples and explanations for APA and for MLA.
Is it ok to repeat a link multiple times on a page?
Canvas is flagging some links that don’t seem to exist!
You may have noticed, on occasion, “ghost links” in Canvas. The link validator or accessibility checker says there’s a broken or duplicate link, but when you look at the text, there’s nothing there. However, if you switch to the HTML editor, you’ll find the link lurking underneath. In the example below, you can see that there are actually two links instead of one: the Assignment 1 link was not completely deleted when I replaced it with Assignment 2.
What happens is that sometimes, if you delete text without unlinking first, the link may persist. To avoid this situation, make sure to remove the links before deleting or pasting in text.
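In the HTML editor, a ghost link typically looks something like the sketch below (the URLs are placeholders, not real Canvas paths): an empty anchor left behind beside the new one.

```html
<!-- The Rich Content Editor shows only one link ("Assignment 2"),
     but the HTML reveals an empty "ghost" anchor left over from the
     deleted Assignment 1 link. URLs are placeholders. -->
<p>
  <a href="https://canvas.example.edu/courses/101/assignments/1"></a>
  <a href="https://canvas.example.edu/courses/101/assignments/2">Assignment 2</a>
</p>
<!-- Fix: delete the empty <a>...</a> element entirely. -->
```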
BONUS link-related tip: Don’t underline regular text
Links are usually underlined, and most people assume underlined text is a link. Readers may be confused when they try to click underlined text and nothing happens. In addition, underlining is simply not a good way to highlight information. For more information, see the article and video from Boise State University: Underlined text.
These practices make your course more readable, easier to navigate, and, overall, more enjoyable for your students!
Research on rubrics has often focused on validity and reliability (Matshedisho, 2020), but more recent work explores how students actually interpret and use rubrics (Brookhart, 2015; Matshedisho, 2020; Taylor, 2024; Tessier, 2021). This emerging scholarship consistently shows a gap between instructor intention and student interpretation. For example, Matshedisho (2020) found that “students expected procedural and declarative guidance, while instructors expected conceptual, reflective work” (p. 175).
If students understand rubrics differently than we intend, rubrics cannot fully support learning. Below are key reasons this mismatch occurs—and strategies to close the gap.
Tacit Knowledge and Language
Students bring varied backgrounds, disciplinary exposure, and assumptions to their learning (Brookhart, 2015; Matshedisho, 2020). Many do not enter college knowing what a rubric is or how to apply one (Tessier, 2021).
Key issues include:
Unfamiliar terms or disciplinary jargon: Early-year students may lack field-specific language. In Matshedisho’s (2020) study, first-year medical students struggled with the sociology-specific criteria required for a reflective assignment.
Different meanings across disciplines: Terms like “concept,” “analysis,” or “argument” shift across fields, confusing students taking multiple general-education courses.
Ambiguous or subjective labels: Students struggle to distinguish between words like good and very good, and terms such as “critical analysis” can feel subjective (Taylor, 2024).
Minimal differentiation between performance levels: When descriptors are too similar, students cannot discern differences between the ratings or see how to progress.
How Students Use Rubrics
Students often approach rubrics differently than instructors expect:
They treat the rubric as separate from course content, starting with the criteria column and reading each cell in isolation (Matshedisho, 2020).
They search for procedural instructions, expecting the rubric to tell them how to complete the assignment (Matshedisho, 2020; Taylor, 2024; Tessier, 2021).
Many prefer hard‑copy rubrics over digital versions (Tessier, 2021; Panadero, 2025).
Bridging the Gap Through Instruction
Rubrics only support learning when students understand them as instructors intend (Brookhart, 2015). Effective strategies include:
Build Shared Understanding
Explain key terms and check for tacit knowledge—especially discipline‑specific language (Taylor, 2024).
Explicitly teach what a rubric is and how to use one; don’t assume prior knowledge (Tessier, 2021).
Calibrate expectations by discussing examples and rating sample work with students (Taylor, 2024).
Integrate Rubrics Into the Course
Refer to the rubric during lectures and discussions (Tessier, 2021).
Provide feedback that directly connects to rubric criteria (Matshedisho, 2020; Taylor, 2024; Tessier, 2021).
Celebrate or reinforce active rubric use (Tessier, 2021).
Provide hard copies of the rubric whenever possible (Tessier, 2021; Panadero, 2025).
Support Instructors
Offer training in rubric design and student‑centered implementation (Brookhart, 2015; Taylor, 2024).
Use shared rubrics for multi‑section courses to support consistency.
Meet as a teaching team to create and calibrate the common rubric.
Recognize limitations of online rubric platforms; include clarifying hyperlinks or exemplars when possible (Panadero, 2025).
Clarify Task Expectations
Students often want a checklist. Provide procedural instructions separately, and use the rubric for conceptual evaluation (Matshedisho, 2020; Taylor, 2024; Tessier, 2021).
Conclusion
Research on rubric validity and reliability is generally encouraging, but when the focus shifts to how students interact with, understand, and apply rubrics, it is clear we still have a long way to go. The suggestions above can get you started on the road to better creation and application of your rubrics.
References
Brookhart, S. M. (2015). The quality and effectiveness of descriptive rubrics. Educational Review, 67(3), 343–368. doi:10.1080/00131911.2014.929565
Matshedisho, K. R. (2020). Straddling rows and columns: Students’ (mis)conceptions of an assessment rubric. Assessment & Evaluation in Higher Education, 169–179. doi:10.1080/02602938.2019.1616671
Panadero, E. O. (2025). Analysis of online rubric platforms: Advancing toward erubrics. Assessment & Evaluation in Higher Education, 31–49. doi:10.1080/02602938.2024.2345657
Taylor, B. K. (2024). Rubrics in higher education: An exploration of undergraduate students’ understanding and perspectives. Assessment & Evaluation in Higher Education, 799–809. doi:10.1080/02602938.2023.2299330
Tessier, L. (2021). Listening to student perspectives of rubrics: Perceptions, uses, and grades. Journal on Excellence in College Teaching, 32(3), 133–168.
OER, or open educational resources, are openly licensed educational materials. What makes them different from other educational materials is that they carry a Creative Commons (CC) license. This means the person who created the OER (which could be a textbook, assessments, media, a course syllabus, etc.) has made it possible for others to reuse, revise, remix, redistribute, and retain the work without needing to ask for permission. And, even better, OERs are FREE!

How does this work in practice? Here’s an example. A professor at OSU writes a textbook on cell biology specifically for the course and gives it a Creative Commons license. Their students now have access to a free cell biology textbook, tailored to the course, saving them hundreds of dollars. The students can keep it as long as they want (no rental returns or use limits). A professor at another university can take that same textbook and, without worrying about copyright violations or fair use evaluations, reorder its contents to better fit their course syllabus. They can add new, updated content, like a recent discovery in gene therapy, or remove content that does not meet their course needs. Then they can release this work under a Creative Commons license, providing their students with a free textbook (also saving them oodles of money). It’s a win-win.

Here at Oregon State University, since 2019, our students have saved more than $20 million thanks to OSU faculty who use free textbooks or other free and low-cost learning materials in their classes.
Why is this important?
Students have access to their course materials on day one and everyone has equal access to the course content.
Students don’t have to choose between buying textbooks and paying for rent or food, and they don’t have to take fewer courses because they can’t afford the course materials.
Students report feeling less stressed and a stronger sense of belonging when they don’t have to worry about affording their course materials.
Faculty can customize the course materials, aligning them with course learning outcomes, and making them more relevant to local circumstances or current events.
Faculty can support students as active creators of knowledge by having them contribute to and even create OER materials (open pedagogy).
Faculty can increase their own teaching impact by creating OER that are used across the globe.
Studies have shown that students using OER course materials achieve the same or better learning outcomes as students using commercial course materials.
In a 2022 survey of Oregon State University students, 61% reported not purchasing at least one textbook because of its high cost. By using low-cost ($40 or less) and no-cost resources like OERs, you can have a huge impact on our students. Instead of deciding between food, rent, and a textbook, students have immediate access to the texts for their class, which is significant in a 10-week term. This often leads to better performance, because students have their textbook and aren’t trying to “get by” without it. Students can also take the number of credits they need to stay on track with their degree completion goals, because textbook affordability is no longer a concern.
Where do I start?
Oregon State University has a growing collection of open, free-to-use textbooks across several disciplines. Check out the Oregon State University OER Commons to see if there’s a resource you could use. If you don’t find what you’re looking for there, many more resources exist; start with the OER Commons main site. But wait, there’s more!
Open Education Week is an annual celebration that raises awareness about OERs. Past years have featured success stories, highlighted useful tools, and shown how to get involved in adopting or adapting OERs for use in classes.
Keep an eye out for more details about Oregon State University’s activities during Open Ed Week 2026, happening March 2–6, 2026. Whether you’re a faculty member curious about open textbooks or a student interested in more affordable learning materials, there will be plenty of ways to participate and learn more.
Once the term begins, you and your students are swept up in the full motion of course activities, getting connected with one another and moving along the educational journey together. Then, suddenly, it’s the end of the term! I have heard many instructors say things like “I can’t believe how fast this term has gone!” and “It’s already week 10, and I don’t know where the time went!” The term’s conclusion is an opportunity to debrief, reflect, and take time for self-kindness, for both instructors and instructional designers.
Debrief
A debrief is an activity that helps close out the course development project. It can help instructors discuss more intentionally how the course development process worked in a particular course, identify the challenges that arose while teaching, and outline future improvements and more effective course design approaches (Chatterjee, Juvale, & Jaramillo Cherrez, 2023). If you are an instructor who worked with an instructional designer to develop the course you just finished teaching, it is important to meet with them and discuss how the course went, what worked well, what presented challenges for students and instructors (so that immediate changes or improvements can be addressed while they are fresh in mind), and what major updates or changes are required before the course is taught again. These debriefs can take place during the last weeks of the term (e.g., finals week or the week after) and can be initiated by the instructional designer as a way to close out the course development project, or by the instructor to seek additional instructional design assistance for improvements.
Reflection
Why would you want to reflect as an instructor? Generally speaking, reflection is a mechanism to deliberately process and examine your actions, thoughts, and experiences in developing and teaching the course. For reflection after the term, we will focus on reflection-on-action: engaging in this deliberate process after the fact (Brookfield, 2017; Schön, 1987), that is, after you have taught the course. In reflecting on your course development and teaching experience at the end of the term, you have the opportunity not only to describe those experiences but also to question and evaluate design and teaching choices, identify additional challenges presented by the context of the course, and review student feedback to better understand which instructional design decisions succeeded and which failed to accomplish your goals and the goals of the course. Reflection can be part of the debrief, but it can also be a regular practice of looking back at the course development and teaching experience for future improvements.
Self-Kindness
Self-kindness is not a new concept, but it may well be new in the context of education. Applying it to your online course development and teaching experience means engaging in kind actions toward yourself: actions that treat yourself with care, compassion, and consideration (Denial, 2023). At the end of the term, as you debrief and/or reflect, think about the teaching actions that went well and consider how they made you feel. Give yourself grace and compassion because you are a human being capable of many great things, while acknowledging that context and experience shape us in multiple ways, and because you have created an excellent online course whose quality your teaching presence has elevated. In exercising self-kindness, you may feel vulnerable as you start recognizing the challenges and struggles in your academic and personal lives. Consider giving yourself the same compassion you would give a loved one or a close friend, recognizing that challenges, struggles, and failures are part of the human experience, even in teaching. Self-kindness is a way to direct your attention and actions away from judgments and shortcomings. Take care.
I’m curious, how do you conclude a term? Are there specific self-care actions that you take besides grading and submitting final grades?
References
Brookfield, S. (2017). Becoming a critically reflective teacher (2nd ed.). Jossey-Bass.
Chatterjee, R., Juvale, D., & Jaramillo Cherrez, N. (2023). What the debriefs unfold: A multicase study of the experiences of higher education faculty in designing and teaching their asynchronous online courses. The Quarterly Review of Distance Education, 24(1), 25–41.
Schön, D. A. (1987). Educating the reflective practitioner. Jossey-Bass.
Special Edition: Guest Blog by Assistant Professor of Practice (Urban Forestry), Jennifer Killian
When I was asked to create a new course for Oregon State University’s Ecampus program, my first reaction was a mix of sheer excitement… and, well, a little terror. I’ve built workshops, presentations, and even all-day trainings, but assembling ten weeks of graduate-level content from scratch? That felt like wandering through a haunted house to me. Dark, empty, and full of unknowns. Adding to the surrealness, I realized that thirteen years ago, I was a graduate student here, taking several Ecampus courses myself, including an early version of the very class I would now be teaching. The idea that I could bring my professional experience back to this institution and shape this course? Thrilling, humbling… and, yes, definitely a little spooky.
The course, FES 454/554: Forestry in the Wildland-Urban Interface, explores the complex challenges of managing forests where communities and wildlands meet. Students dive into forest health, urban forestry, land-use planning, wildfire, and natural resource management through social, ecological, economic, and political lenses. It’s a “slash course,” meaning both undergraduates and graduate students can enroll, so I knew the content needed to speak to a broad spectrum of learners. And I had to build it all from the ground up.
Enter the magical world of Ecampus Instructional Design. My Instructional Design partner was way more than support. To me, she was a friendly ghost guiding me through every room of this haunted course house. There were moments when I was convinced I had hit a dead-end, only to have a creative solution appear almost instantly. From turning complex assignments into clear, engaging experiences to keeping me on track and motivated, the team transformed my raw ideas into a cohesive, polished course. I honestly cannot say enough about the skill, creativity, and dedication they bring to the table.
One lesson I carried from my own hiking adventures literally proved invaluable during the course build. Years ago, I was struggling up a 14,000-foot peak in Colorado, staring at the distant summit, more than ready to quit. My hiking buddy simply said, “Don’t look at the summit. Pick a rock a few feet ahead and walk to that. Then take a break, and pick another rock.” That became my metaphor for course development. Instead of being paralyzed by the enormity of a ten-week course, I focused on the next “rock.” Some of my rocks included simply finishing the syllabus, creating the first assignment, securing a guest lecture, or finding a key reading. By breaking the work into manageable pieces, the haunted hallways of that blank course shell became far less intimidating and actually surprisingly rewarding.
Another highlight of building this course was connecting students with the people shaping forestry in the field. Reaching out to industry professionals for guest lectures and insights brought the material to life and grounded it in practice. It also reminded me how much real-world perspectives enrich student learning. Two colleagues from my department contributed individual weeks of material, which helped broaden the course and gave students a chance to see the WUI topic through multiple professional lenses. I was grateful for their contributions too! Seeing the course evolve into a bridge between theory and practice was incredibly rewarding, and it reinforced a key principle I’d learned over the years through my various roles: that collaboration amplifies impact. Never has this resonated more with me!
For anyone stepping into a course development role for the first time, my advice is simple: lean on the resources around you. The Ecampus team offers an incredible array of tools, templates, and guidance. Don’t hesitate to ask questions, tap into expertise, and stick to timelines. Above all, remember the “next rock” approach: the mountain is climbed one step at a time. Celebrate small wins along the way because they add up faster than you think.
Looking back, building this course has been a career highlight. From the panic of staring at a totally blank syllabus to the thrill of seeing assignments, discussions, and modules come alive, I’ve learned that teaching online is truly a team sport. The course may be called Forestry in the Wildland-Urban Interface, but what I really learned was how humans, collaboration, and thoughtful design intersect to create something extraordinary. I hope my story encourages other first-time developers to embrace the process, trust their teams, and find joy in the climb. After all, even a haunted course house is easier to navigate when you have friendly ghosts guiding the way and every “next rock” brings you closer to the summit. And as the crisp autumn air settles in and the leaves turn, I’m reminded that even the spookiest, most intimidating challenges can reveal unexpected magic when you face them step-by-step.
“You won’t always have a calculator in your pocket!”
How we laugh now, with calculators first arriving in our pockets and, eventually, smartphones putting one in our hands at all times.
I have seen a lot of comparisons across the Internet between artificial intelligence (AI) and these mathematics classes of yesteryear. The idea is that AI is but the newest embodiment of this same concern, which ended up being overblown.
But is this an apt comparison to make? After all, we did not replace math lessons and teachers with pocket calculators, nor even with smartphones. The kindergarten student is not simply given a Casio and told to figure it out. The quote we all remember has a deeper meaning, hidden among the exasperated response to the question so often asked by students: “Why are we learning this?”
The response
It was never about the calculator itself, but about knowing how, when, and why to use it. A calculator speeds up the arithmetic, but the core cognitive process remains the same. The key distinction is between pressing the = button and understanding the result of the = button. A student who can set up the equation, interpret the answer, and explain the steps behind the screen will retain the mathematical insight long after the device is switched off.
The new situation – Enter AI
Scenario
Pressed for time and juggling multiple commitments, a student turns to an AI tool to help finish an essay they might otherwise have written on their own. The result is a polished, well-structured piece that earns them a strong grade. On the surface, it looks like a success, but because the heavy lifting was outsourced, the student misses out on the deeper process of grappling with ideas, making connections, and building understanding.
This kind of situation highlights a broader concern: while AI can provide short-term relief for students under pressure, it also risks creating long-term gaps in learning. The issue is not simply that these tools exist, but that uncritical use of them can still produce passing grades without the student engaging in meaningful reflection gained by prior cohorts. Additionally, when AI-generated content contains inaccuracies or outright hallucinations, a student’s grade can suffer, revealing the importance of reviewing and verifying the material themselves. This rapid, widespread uptake stresses the need to move beyond use alone and toward cultivating the critical habits that ensure AI supports, rather than supplants, genuine learning.
Employing multivariate regression analysis, we find that students using GenAI tools score on average 6.71 (out of 100) points lower than non-users. While GenAI may offer benefits for learning and engagement, the way students actually use it correlates with diminished exam outcomes.
Another study (Ju, 2023) found that:
After adjusting for background knowledge and demographic factors, complete reliance on AI for writing tasks led to a 25.1% reduction in accuracy. In contrast, AI-assisted reading resulted in a 12% decline.
In this same study, Ju (2023) noted that while using AI to summarize texts improved both quality and output of comprehension, those who had a ‘robust background in the reading topic and superior reading/writing skills’ benefited the most.
Ironically, the students who would benefit most from critical reflection on AI use are often the ones using it most heavily, which demonstrates the importance of embedding AI literacy into the curriculum. For example, a recent Wall Street Journal article by Heidi Mitchell (Mitchell, 2025) cites a study showing that the “less you know about AI, the more you are likely to use it,” describing AI as seemingly “magical to those with low AI literacy.”
Finally, Kosmyna et al. (2025), testing how LLM usage affects cognitive processes and neural engagement in essay writing, assembled groups of LLM users, search engine users, and those without these tools (dubbed “brain-only” users). The authors recorded weaker performance in students with AI assistance over time, a lower sense of ownership of work with inability to recall work, and even seemingly reduced neural connectivity in LLM users compared to the brain-only group, which scored better in all of the above.
The takeaways from these studies are that unstructured AI use acts as a shortcut that erodes retention. While AI-assistance can be beneficial, outright replacement of thinking with it is harmful. In other words, AI amplifies existing competence but rarely builds it from scratch.
Undetected
Many people believe themselves to be fully capable of detecting AI usage:
Most of the writing professors I spoke to told me that it’s abundantly clear when their students use AI. Sometimes there’s a smoothness to the language, a flattened syntax; other times, it’s clumsy and mechanical. The arguments are too evenhanded — counterpoints tend to be presented just as rigorously as the paper’s central thesis. Words like multifaceted and context pop up more than they might normally. On occasion, the evidence is more obvious, as when last year a teacher reported reading a paper that opened with “As an AI, I have been programmed …” Usually, though, the evidence is more subtle, which makes nailing an AI plagiarist harder than identifying the deed. (Walsh, 2025).
In the same NY Mag article, however, Walsh (2025) cites another study, showing that it might not be as clear who is using AI and who is not (emphasis added):
[…] while professors may think they are good at detecting AI-generated writing, studies have found they’re actually not. One, published in June 2024, used fake student profiles to slip 100 percent AI-generated work into professors’ grading piles at a U.K. university. The professors failed to flag 97 percent.
The two quotes are not contradictory; they describe different layers of the same phenomenon. Teachers feel they can spot AI because memorable extremes stick in their minds, yet systematic testing proves that intuition alone misses the overwhelming majority of AI‑generated work. This should not be surprising though, as most faculty have never been taught systematic ways to audit AI‑generated text (e.g., checking provenance metadata, probing for factual inconsistencies, or using stylometric analysis). Nor do most people, let alone faculty grading hundreds of papers per week, have the time to audit every student. Without a shared, college-wide rubric of sorts, detection remains an ad‑hoc, intuition‑driven activity. Faulty detection risks causing undue stress to students, and can foster a climate of mistrust by assuming that AI use is constant or inherently dishonest rather than an occasional tool in the learning process. Even with a rubric, instructors must weigh practical caveats: large-enrollment courses cannot sustain intensive auditing, some students may resist AI-required tasks, and disparities in access to tools raise equity concerns. For such approaches to work, they must be lightweight, flexible, and clearly framed as supporting learning rather than policing it.
This nuance is especially important when considering how widespread AI adoption has been. Walsh (2025) observed that “just two months after OpenAI launched ChatGPT, a survey of 1,000 college students found that nearly 90 percent of them had used the chatbot to help with homework assignments.” While this figure might seem to justify the use of AI detectors, it could simply reflect the novelty of the tool at the time rather than widespread intent to circumvent learning. In other words, high usage does not automatically equal cheating, showing the importance of measured, thoughtful approaches to AI in education rather than reactionary ones.
What to do…?
The main issue here is not that AI is magically writing better essays than humans can muster; it is that students are slipping past the very moments where they would normally grapple with concepts, evaluate evidence, and argue a position. Many institutions are now taking a proactive role rather than a reactive one, and I want to offer such a suggestion going forward.
Embracing the situation: The reflective AI honor log
It is a fact that large language models have become ubiquitous. They are embedded in web browsers, word processors, and even mobile keyboards. Trying to ban them outright creates a cat‑and‑mouse game; it also sends the message that the classroom is out of sync with the outside world.
Instead of fighting against a technology that is already embedded in our lives, invite students to declare when they use it and to reflect on what they learned from that interaction.
For this post, I recommend using an “AI Honor-Log Document” and embedding it deeply into courses, with the goal of increasing AI literacy.
What is it?
As assignments vary across departments and even within courses, a one-size-fits-all approach is unlikely to be effective. To support thoughtful AI use without creating extra work for students, faculty could select an approach that best aligns with their course design:
Built-in reflection: Students note when and how they used AI, paired with brief reflections integrated into their normal workflow.
Optional, just-in-time logging: Students quickly log AI use and jot a short note only when it feels helpful, requiring minimal time.
Embedded in assignments: Reflection is incorporated directly into the work, so students engage with it as part of the regular writing or research process.
Low-effort annotations: Students add brief notes alongside tasks they are already completing, making reflection simple and natural.
These options aim to cultivate critical thinking around AI without imposing additional burdens or creating the perception of punishment, particularly for students who may not be using AI at all.
AI literacy is a massive topic, so let’s address only a few things here:
Mechanics Awareness: Ability to explain the model architecture, training data, limits, and known biases.
Critical Evaluation: Requiring fact-checking, citation retrieval, and bias spotting.
Orchestration Skills: Understanding how to craft precise prompts, edit outputs, and add original analysis.
Note: you might want to go further and incorporate these into an assignment-level learning outcome. Something like “Identifies at least two potential biases in AI-generated text” could be enough on a rubric to gather interesting student responses.
Log layout example
| # | Assignment/Activity | Date | AI Model | Exact Prompt | AI Output | What You Changed/Added | Why You Edited | Confidence (1-5) | Link to Final Submission |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | Essay #2 – Digital-privacy law | 2025-09-14 | GPT-5 | “Write a 250-word overview of GDPR’s extraterritorial reach and give two recent cases” | [pastes AI text] | Added citation to 2023 policy ruling; re-phrased a vague sentence. | AI omitted the latest case; needed up-to-date reference | 4 | https://canvas.oregonstate.edu/…… |
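For instructors who prefer a structured export over a shared spreadsheet, the layout above can also be captured programmatically. Below is a minimal sketch in Python; the column names and sample values come from the example entry, while the filename and helper function are assumptions for illustration:

```python
import csv
import os

# Column headers matching the honor-log layout above
FIELDS = [
    "#", "Assignment/Activity", "Date", "AI Model", "Exact Prompt",
    "AI Output", "What You Changed/Added", "Why You Edited",
    "Confidence (1-5)", "Link to Final Submission",
]

def append_entry(path, entry):
    """Append one honor-log entry (a dict keyed by FIELDS) to a CSV file,
    writing the header row first if the file does not exist yet."""
    new_file = not os.path.exists(path)
    with open(path, "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(entry)

# Sample entry mirroring the example table row (hypothetical file name)
append_entry("ai_honor_log.csv", {
    "#": 1,
    "Assignment/Activity": "Essay #2 - Digital-privacy law",
    "Date": "2025-09-14",
    "AI Model": "GPT-5",
    "Exact Prompt": "Write a 250-word overview of GDPR's extraterritorial reach...",
    "AI Output": "[pasted AI text]",
    "What You Changed/Added": "Added citation; re-phrased a vague sentence.",
    "Why You Edited": "AI omitted the latest case",
    "Confidence (1-5)": 4,
    "Link to Final Submission": "https://canvas.oregonstate.edu/...",
})
```

A CSV like this opens directly in Excel or Google Sheets, so students who are uncomfortable with spreadsheets can still work in whichever tool they prefer.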
Potential deployment tasks (and things to look out for)
It need not take much time to model this for students or deploy it in your course. That said, there are practical and pedagogical limits depending on course size, discipline, and student attitudes toward AI. The notes below highlight possible issues and ways to adjust.
Introduce the three reasons above (either text form or video, if you have more time and want to make a multimedia item). Caveat: Some students may be skeptical of AI-required work. Solution: Frame this as a reflection skill that can also be done without AI, offering an alternative if needed.
Distribute the template to students: post a Google-Sheet link (or similar) in the LMS. Caveat: Students with limited internet access or comfort with spreadsheets may struggle. Solution: Provide a simple Word/PDF version or allow handwritten reflections as a backup.
Model the process in the first week: Submit a sample log entry like the one above but related to your class and required assignment reflection type. Caveat: In large-enrollment courses, individualized modeling is difficult. Solution: Share one well-designed example for the whole class, or record a short screencast that students can revisit.
Require the link with each AI-assisted assignment (or as and when you believe AI will be used). Caveat: Students may feel burdened by repeated uploads or object to mandatory AI use. Solution: Keep the log lightweight (one or two lines per assignment) and permit opt-outs where students reflect without AI.
Provide periodic feedback: scan the logs, highlight common hallucinations or errors that students surface, and give a “spot the error” mini lecture/check-in/office hour. Caveat: In large classes, it’s not realistic to read every log closely. Solution: Sample a subset of entries for themes, then share aggregated insights with the whole class during office hours, or post in weekly announcements or discussion boards designed for this kind of two-way feedback.
(Optional) Student sharing session in a discussion board: allow volunteers or require class to submit sanitized prompts (i.e., any personal data removed) and edits for peer learning. Caveat: Privacy concerns or reluctance to share work may arise. Solution: Keep sharing optional, encourage anonymization, and provide opt-outs to respect comfort levels.
Important considerations when planning AI-tasks
Faculty should be aware of several practical and pedagogical considerations when implementing AI-reflective logs. Large-enrollment courses may make detailed feedback or close monitoring of every log infeasible, requiring sampling or aggregated feedback. Some students may object to AI-required assignments for ethical, accessibility, or personal reasons, so alternatives should be available (i.e. the option to declare that a student did not use AI should be present). Unequal access to AI tools or internet connectivity can create equity concerns, and privacy issues may arise when students share prompts or work publicly. To address these challenges, any approach should remain lightweight, flexible, and clearly framed as a tool to support learning rather than as a policing mechanism.
Conclusion
While some students may feel tempted to rely on AI, passing an assignment in this manner can also pass over the critical thinking, analytical reasoning, and reflective judgment that go beyond content mastery to true intellectual growth. Incorporating a reflective AI-usage log based not on assumption of cheating, but on the ubiquitous availability of this now-common tool, reintroduces one of the evidence-based steps for learning and mastery that has fallen out of favor in the last 2-3 years. By encouraging students to pause, articulate, and evaluate their process, reflection helps them internalize knowledge, spot errors, and build the judgment skills that AI alone cannot provide.
Fu, Y., & Hiniker, A. (2025). Supporting Students’ Reading and Cognition with AI. In Proceedings of Workshop on Tools for Thought (CHI ’25 Workshop on Tools for Thought). ACM, New York, NY, USA, 5 pages. https://arxiv.org/pdf/2504.13900v1
Kosmyna, N., Hauptmann, E., Yuan, Y. T., Situ, J., Liao, X-H., Beresnitzky, A. V., Braunstein, I., & Maes, P. (2025). Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task. https://arxiv.org/abs/2506.08872
Fall Term is just around the corner, bringing with it new opportunities, fresh faces, and the chance to make a lasting impact on your students. Whether they’re logging in for the first time or for their final term, setting a welcoming and engaging tone from day one helps create a foundation for everyone’s success, yours included.
Here are a few ways to kick things off and set the stage for a smooth, successful term:
Start with a warm welcome
Post a welcome announcement and introduce yourself to your students.
Use a warm and welcoming tone in your message to help students feel encouraged, supported, and comfortable as they enter the course.
Personalize it with a photo or short video; it goes a long way in making connections.
Open your course early
If possible, open your course before the official start date. This gives students a chance to explore, order materials, and introduce themselves.
Open modules at least two weeks ahead. Many students juggle full-time jobs, families, and other commitments, so maximum flexibility is appreciated.
Keep communication open
Set up a Q&A discussion forum, and check it regularly. This allows you to answer common questions once and ensures everyone sees the response.
Encourage students to post questions in this forum and let students know when and how they can expect replies.
Be responsive to messages and follow up with students if needed.
Model engagement
Join discussion boards and post regularly. Ask guiding questions, offer feedback, or simply cheer students on; show them you’re present and engaged.
Think about how you’d engage in a face-to-face class and bring that energy to your online space too.
Be accessible
Hold regular office hours or offer flexible scheduling options. Creating the time and space for students to connect with you makes a difference.
Grade consistently and give meaningful feedback
Timely, constructive feedback helps students grow. The effort you put in early pays off in improved work later in the term.
Stay organized
Block out time in your calendar each week for class check-ins and grading. A little planning now can prevent overwhelm and burnout later.
Take care of yourself
Don’t forget to breathe. Support your students by also supporting yourself.
Be kind to yourself and set boundaries to attend to personal commitments, too.
Here’s to a strong, successful Fall Term — you’ve got this!
This post was written in collaboration with Mary Ellen Dello Stritto, Director of Ecampus Research Unit.
Quality Matters standards are supported by extensive research on effective learning. Oregon State University’s own Ecampus Essentials build upon these standards, incorporating OSU-specific quality criteria for ongoing course development. But what do students themselves think about the elements that constitute a well-designed online course?
The Study
The Ecampus Research Unit took part in a national research study with Penn State and Boise State universities that sought student insight into which elements of design and course management contribute to quality in an online course. Data were collected from six universities across the US, including Oregon State, in Fall 2024. Students who chose to participate completed a 73-item online survey that asked about course design elements from the updated version of the Quality Matters Rubric. Students responded to each item on the following scale: 0 = Not Important, 1 = Important, 2 = Very Important, 3 = Essential. A total of 124 students completed the survey, including 15 OSU Ecampus students. The findings reveal a remarkable alignment between research-based best practices and student preferences, validating the approach taken in OSU’s Ecampus Essentials.
See the findings in data visualization form below, followed by a detailed description.
What Students Consider Most Important
Students clearly value practical, research-backed features that make online courses easier to navigate, more accessible, and more supportive of learning. The following items received the most ratings of “Essential” + “Very Important”:
Accessibility and Usability (QM Standards 8.2, 8.3, 8.4, 8.5, 8.6): Every OSU student rated course readability and accessible text as “Very Important” or “Essential” (100%). Nationally, this was also a top priority (96% and 91%, respectively). Accessibility of multimedia—like captions and user-friendly video/audio—was also highly rated (100% OSU, 90% nationally).
Related Ecampus Essentials: Text in the course site is accessible. Images in the course are accessible (e.g., alt text or long description for images). The course design facilitates readability. All video content is accurately captioned.
Clear Navigation and Getting Started (QM Standards 1.1, 8.1): 93% of OSU students and 94% of the national sample rated easy navigation highly, while 89% of OSU students and 96% nationally said clear instructions for how to get started and where to find things were essential.
Related Ecampus Essentials: Course is structured into intuitive sections (weeks, units, etc.) with all materials for each section housed within that section (e.g., one page with that week’s learning materials rather than a long list of files in the module). Course is organized with student-centered navigation, and it is clear to students how to get started in the course.
Meaningful Feedback and Instructor Presence (QM Standards 3.5, 5.3): Students placed high importance on receiving detailed feedback that connects directly to course content (100% OSU, 94% nationally). The ability to ask questions of instructors was also essential (100% OSU, 96% nationally).
Related Ecampus Essentials: Assessments are sequenced in a way to give students an opportunity to build knowledge and learn from instructor feedback. The instructor’s plan for regular interaction with students in substantive ways during the course is clearly articulated. Information about student support specific to the course (e.g., links to the Writing Center in a writing course, information about TA open office hours, etc.) is provided.
Clear Grading Criteria (QM Standards 3.2, 3.3): 93% of OSU students and the full sample found clear, detailed grading rules to be essential.
Related Ecampus Essentials: Specific and descriptive grading information for each assessment is provided (e.g., detailed grading criteria and/or rubrics).
Instructional Materials (QM Standard 4.1): All OSU students and 92% nationally rated high-quality materials that support learning outcomes as very important or essential.
Related Ecampus Essentials: Instructional materials align with the course and weekly outcomes. A variety of instructional materials are used to appeal to many learning preferences (readings, audio, visual, multimedia, etc.). When pre-recorded lectures are utilized, content is brief and integrated into course learning activities, such as with interactive components, discussion questions, or quiz questions. Longer lectures should be shortened to less than 20 min. chunks.
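The percentages reported above combine the top two scale points (“Very Important” and “Essential”, i.e., ratings of 2 and 3 on the study’s 0–3 scale). As a quick illustration of that aggregation, here is a small Python sketch; the sample ratings are hypothetical, not data from the study:

```python
# Survey scale from the study: 0 = Not Important, 1 = Important,
# 2 = Very Important, 3 = Essential.
def pct_top_two(ratings):
    """Percent of respondents rating an item Very Important (2) or Essential (3)."""
    top = sum(1 for r in ratings if r >= 2)
    return round(100 * top / len(ratings))

# Hypothetical ratings for one survey item (not real study data)
sample = [3, 3, 2, 1, 2, 3, 0, 2, 3, 2]
print(pct_top_two(sample))  # 8 of 10 ratings are 2 or 3 -> 80
```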
What Students Consider Less Important
The study also revealed areas where students expressed less enthusiasm:
Self-Introductions (QM Standard 1.9): Over half of OSU students (56%) and a third nationally (33%) rated opportunities to introduce themselves as “Not Important”.
Related Ecampus Essentials: No specific Ecampus Essential addresses this.
Peer Interaction (QM Standard 5.2): Students were lukewarm about peer-to-peer learning activities. Nearly half said that working in small groups is not important (47% OSU, 46% nationally). About a quarter didn’t value sharing ideas in public forums (27% OSU, 24% nationally) or having learning activities that encourage them to interact with other students (27% OSU, 23% nationally).
Related Ecampus Essentials: Three forms of interaction are present, in some form, in the course (student/content, student/instructor, student/student).
Technology Variety and Data Privacy Info (QM Standards 6.3, 6.4): Some students questioned the value of using a variety of tech tools (20% OSU, 23% nationally rated this as “Not Important”) or being given info about protecting personal data (20% OSU, 22% nationally).
Related Ecampus Essentials: Privacy policies for any tools used outside of Canvas are provided.
Student Comments
Here are a few comments from Ecampus students that illustrate their opinions on what makes a quality course:
“Accessible instructional staff who will speak to students in synchronous environments. Staff who will guide students toward the answer rather than either treating it like cheating to ask for help at all or simply giving out the answer.”
“A lack of communication/response from teachers and no sense of community” was cited as a barrier.
“Mild reliance on e-book/publisher content, out-weighed by individual faculty created content that matches student deliverables. In particular, short video content guiding through the material in short, digestible amounts (not more than 20 minutes at a go).”
“When there aren’t a variety of materials, it makes it hard to successfully understand the materials. For example, I prefer there to be lectures or videos associated with readings so that I understand the material to the professor’s standards. When I only have reading materials, I can sometimes misinterpret the information.”
“Knock it off with the discussion boards, and the ‘reply to 2 other posts’ business. This is not how effective discourse takes place, nor is it how collaborative learning/learning community is built.”
Conclusion and Recommendations
The takeaways? This research shows that students recognize and value the same quality elements emphasized in OSU’s Ecampus Essentials:
Student preferences align with research-based standards – Students consistently value accessibility, clear structure, meaningful feedback, and purposeful content.
Universal design benefits everyone – Students’ strong preference for accessible, well-designed courses supports the universal design principles embedded in the Ecampus Essentials.
However, there is always room for improvement, and these data provide some hints. Many students don’t immediately see value in peer interactions and collaborative activities, even though extensive educational research shows these are among the most effective learning strategies. Collaborative learning is recognized as a High Impact Practice that significantly improves student outcomes and critical thinking. This disconnect suggests we need to design these experiences more thoughtfully to help students recognize their benefits. Here are some suggestions:
Frame introductions purposefully: Instead of generic “tell us about yourself” posts, connect introductions to course content (“Introduce yourself and share an experience related to the topic of this course”).
Design meaningful group work: Create projects that genuinely require collaboration and produce something students couldn’t create alone.
Show the connection: Explicitly explain how peer interactions help students learn and retain information better, and the value of teamwork for their future jobs.
Start small: Begin with low-stakes peer activities before moving to more complex collaborations.
Over the past few years, Higher Education (HE) has been called to action in response to the rise of Generative Artificial Intelligence (GenAI) tools. As Artificial Intelligence (AI) becomes more autonomous and capable, proactive steps are needed to preserve academic and learning integrity. This article will explore tangible strategies educators can apply to their unique program and course contexts. Only slight adjustments may be necessary to support learning processes and capture evidence of learning, as changes will build upon the excellent work that is already occurring.
Initially, the focus in HE was on understanding the potential impact these tools would have on teaching and learning, and educators took great care to build awareness of GenAI capabilities, limitations, and risks. Today, the tools are being tested, and educators are envisioning how to use them for various purposes (e.g., productivity, creativity). Integration of these tools has begun with the aim of supplementing and enhancing human learning. As we move forward, concerns about academic and learning integrity become increasingly prominent.
Meet Agentic AI
Recently, I had the opportunity to attend the Quality Matters Quality in Action conference, where I joined the session Ensuring Academic Integrity and Quality Course Design in the Age of AI. The presenter, Robert Gibson, Director of Instructional Design at WSU Tech, shared an AI innovation now available to the public (and our students): meet Agentic AI!
Your new Agentic AI assistant no longer requires you to be an expert prompt engineer. These tools are designed to achieve specific, clearly defined goals with minimal human supervision or oversight, engaging autonomously in complex reasoning, decision making, problem solving, learning from new information, and adapting to their environments (Gibson, 2025; Schroeder, 2025; Marr, 2025). These new Agentic AIs can even work together to form what is known as an Orchestrated AI. Think of this as an AI team working collaboratively to accomplish complex tasks. Agentic AI has already demonstrated the capability to create and complete online courses. What does this mean for Higher Education?
Now more than ever, we need to come together to collectively reinforce academic and learning integrity in online and hybrid courses. Preserving the quality of our institutional products and credentials is essential. Equally important are the students who will apply their OSU-acquired knowledge and skills in the real world. The time to be proactive is now.
Where and how should I start?
A good starting point is to evaluate assessments that AI can complete. Running an assignment through a GenAI tool to see if it can complete the task, with relative accuracy, can produce helpful insights. Next, consider modifications to pedagogical approaches and assessment methods. The goal is to design assessments to produce and capture evidence that learning is taking place. This could include assessments that are process-oriented, focus on skill mastery, are personalized, incorporate visual demonstrations (e.g., video), and/or integrate real-time engagement (Gibson, 2025).
What might a reimagined activity look like?
For example, say an instructor uses case-based learning in their course, and small groups discuss real-world scenarios on a discussion board. This activity could be reimagined by having students meet virtually and record their discussion. During their real-time interaction, they examine a real-world scenario, identify associated evidence, present examples, and share their lived experiences, much as students conduct group presentations currently. This approach could be enhanced by shifting the focus to the learning process, such as arriving at ideas and cultivating perspectives (i.e., learning, growth, development), in lieu of having students find a right or wrong answer (Gibson, 2025). It encourages students to engage substantively, co-construct knowledge, and work together to demonstrate learning. After participating in the activity, each student could create an individual video presentation to synthesize their learning. A synthesis video could cover their initial perspectives (Where did I start? – prior knowledge activation), how those perspectives evolved (What was my cognitive process? – metacognition), what new knowledge is needed (gap analysis), and how their perspectives and knowledge changed (learning reflection). This method reinforces academic and learning integrity by validating that students are learning and achieving outcomes (Bertram Gallant, 2017).
Reflect! Take a moment to reflect on how you know that students are learning in your course(s). What evidence do you have?
While the potential for academic dishonesty cannot be entirely controlled, we should not assume that students will use these tools in their coursework just because they are available. Take a moment to examine the Ecampus Research Unit’s study, “Online Students’ Perceptions of Generative AI” (Dello Stritto, Underhill, & Aguiar, 2024), which explores online students’ perceptions, understanding, and use of GenAI tools. The study found that most students had not been using GenAI tools in their courses; rather, they were primarily using them in professional contexts. Students understood that using AI in their careers would be necessary, but they articulated strong concerns about inaccuracies, biases, lack of reliability, propagation of misinformation, and misalignment of these tools with their personal values and ethics.
How can academic and learning integrity be reinforced?
Educators can foster academic integrity in a way that drives students’ internal motivation, self-determination, and desire to demonstrate their learning because they value the work they are doing. A multifaceted, developmental approach that fosters a culture of academic integrity using various strategies in concert with one another is key (Bertram Gallant & Rettinger, 2025), as no single approach can serve as a definitive solution.
Integrity teaching – Taking on the role of an active guide during course delivery and meeting students where they are developmentally is essential. This may include teaching students how to engage in critical thinking around the use of AI tools, how to connect the value of academic and learning integrity to their future profession, how to make well-informed decisions, and how to leverage metacognitive strategies when engaging with AI.
Integrity messaging – This approach is one that can be most effective if holistically integrated into a course. The content communicates that integrity, values, and ethics are normative within the course and will be held at the forefront of the learning community. Staged and timed messaging can be most helpful when targeted at different points in a course and as the complexity of academic work increases.
Transformative real time experiential learning – Transformative experiential learning involves designing opportunities that generate new ideas for action, which can be applied to other experiences. These activities may include, but are not limited to, service learning, internships, hands-on collaborative activities (e.g., role play, point-counterpoint discussions), and demonstrations. By focusing on real-time engagement, this approach demonstrates learning and thereby reinforces academic and learning integrity.
Deep learning – Learning opportunities focused on skill mastery and demonstration through staged attempts. This approach may necessitate a pedagogical shift toward development and growth (Bertram Gallant & Rettinger, 2025).
Agentic AI brings exciting opportunities for the world but tangible challenges for HE. By intentionally designing assessments that lead students to demonstrate evidence of their learning and using facilitation strategies that foster a culture of academic integrity, we can harness the potential of AI to supplement learning. What is the end goal? To ensure that educational opportunities are designed to preserve and enhance learners’ critical skills and knowledge needed to thrive in their professional pursuits. Will you accept this challenge?
Trying to decide when and how to incorporate AI into your work? Take a look at the AI Decision Tree!
Need a few quick, practical strategies to get started? These recommendations aim to improve learning for both teachers and students.
Are you ready to evaluate and enhance the resiliency (i.e., flexibility, adaptability) of your course within the context of AI? Check out the new Course AI Resilience Tracker (CART), an interactive tool that helps you reflect on various course elements and shares personalized resources to help you get started.
Review Bloom’s Taxonomy Revisited to explore how to emphasize distinctive human skills and/or integrate AI tools to supplement the learning process.
Explore our AI Assessment Examples Library for assessment ideas designed to incorporate AI tools and strategies in your course and/or create more human-centric assessments.
Bertram Gallant, T. & Rettinger, D. (2025, March 11). The Opposite of Cheating: Teaching for Integrity in the Age of AI. University of Oklahoma Press.
Dello Stritto, M. E., Underhill G. R., & Aguiar, N. R. (2024). Online Students’ Perceptions of Generative AI. Oregon State University Ecampus Research Unit. https://ecampus.oregonstate.edu/research/publications/
Gibson, R. (2025, April 10). Ensuring Academic Integrity and Quality Course Design in the Age of AI [Conference presentation]. Quality Matters Quality in Action 2025. Virtual.
Ashlee M. C. Foster, MSEd, is a seasoned Instructional Designer with the Oregon State University Ecampus Course Development and Training Team. With a profound commitment to supporting faculty and students in online teaching and learning, Ashlee’s mission is to design high-quality and innovative educational opportunities that foster transformational learning, development, and growth. Ashlee’s learning design approaches are grounded in research-based insights, foundational learning theories, and the thoughtful integration of industry-led practices. This ensures that each educational experience is not only effective but also engaging and relevant.
This is a guest post by Winter 2025 Ecampus Instructional Design Intern Terrence Scott.
Creating Online Learning Spaces Where Adult Learners Belong
Today’s college students are increasingly adults returning to education to pursue career shifts, personal growth, or new credentials. Yet, this return often brings discomfort. Adult learners find themselves in a liminal space, caught between who they were and who they are becoming as students. This “in-between” state is a psychological and social threshold where identity and belonging are in flux (Maksimović, 2023; Turner, 1969).
Liminal space is “characterized by the questioning and reexamination of one’s identity, often as a result of transitional moments in an individual’s life such as separation, loss, and conflicts” (Maksimović, 2023). Rather than a single moment, adult learners experience the entire educational journey—from enrollment to graduation—as a liminal space. Supporting learners through this journey requires intentional course design that centers on inclusion and belonging.
Online Learning as a Threshold
As Johnson (2022) and Maksimović (2023) describe, adult learners often navigate identity shifts as they move from familiar roles in work or family life into the unfamiliar space of studenthood. For some, prior negative school experiences further intensify feelings of isolation during this transition.
Adult learners in liminal space often struggle with:
Imposter Syndrome: “Am I really capable of doing this?”
Identity Conflict: “Am I a student now, or still just a working professional?”
Social Isolation: “Do I belong here, or am I too different from my classmates?”
Fear of Failure: “What if I don’t succeed and let myself or my family down?”
Without a strong sense of belonging, these feelings can lead to disengagement or dropout. But when courses are designed to recognize this liminal space, learners are more likely to persist and thrive (Mezirow, 1991).
The Role of Belonging in Adult Learning
Belonging is a powerful driver of student success, especially for those from nontraditional backgrounds. It’s not just about showing up—it’s about feeling seen, respected, and included. When learners experience psychological safety and validation, motivation and commitment grow (Strayhorn, 2019).
How Does Belonging Develop?
Representation: Course content should reflect diverse identities and lived experiences.
Identity Validation: Recognize the knowledge adult learners bring with them.
Connection: Encourage interaction through group work, discussion forums, or mentorship.
Flexibility: Design with life responsibilities in mind—multiple paths to participation and success.
These elements help learners cross the threshold from “outsider” to “insider,” evolving from questioning their role in higher education to fully embracing it.
UDL 3.0: Designing for Inclusion and Belonging
Universal Design for Learning offers a framework for inclusive online course design. The latest iteration, UDL 3.0, centers identity, belonging, and engagement more explicitly than ever (CAST, 2024). It urges instructors to create spaces where students feel welcomed and recognized, not just accommodated.
How UDL Supports Adult Learners in Liminal Spaces
Engagement: Make content relevant with real-world examples, reflection exercises, and collaborative activities.
Representation: Use varied media—text, video, podcasts, interactive tools—and include voices that reflect learners’ identities.
Action & Expression: Offer multiple ways to demonstrate understanding with flexible formats, low-stakes practice, and accommodations for life’s demands.
When courses reflect these principles, adult learners gain the confidence to move through uncertainty and emerge with a stronger academic identity.
Conclusion
Liminal spaces—the uncertain, transitional moments in adult learning—can be both challenging and transformative. While some learners struggle with identity shifts, imposter syndrome, or social isolation, institutions that prioritize belonging and inclusive design can help them navigate these transitions successfully.
Higher education can foster a sense of belonging and empowerment for adult learners by integrating UDL 3.0 principles into course design and student support services. Welcoming students, valuing their diverse experiences, and establishing supportive learning environments are essential to addressing students’ unique needs and ensuring their success.
References
CAST (2024). UDL Guidelines 3.0: Universal Design for Learning.
De Abreu, K. (2023, August 7). Extreme coming of age rituals. ExplorersWeb.
Johnson, K. (2022). Beginning, Becoming and Belonging: Using Liminal Spaces to Explore How Part-Time Adult Learners Negotiate Emergent Identities. Widening Participation and Lifelong Learning, 24(2).
Maksimović, M. (2023). Insights from Liminality: Navigating the Space of Transition and Learning. Sisyphus–Journal of Education, 11(1).
Mezirow, J. (1991). Transformative dimensions of adult learning. Jossey-Bass.
Turner, V. (1969). The ritual process: Structure and anti-structure. Cornell University Press.
Strayhorn, T. L. (2019). College Students’ Sense of Belonging: A Key to Educational Success for All Students (2nd ed.). Routledge.