Creating equitable learning environments is at the core of inclusive practices. Many educators argue that such environments are essential for student success, but what does this mean in practice? Inclusive practices build on the premise that design and teaching should adapt to support students' unique needs, fostering student agency. Student-centered approaches prioritize equity over equality, and the distinction matters: equity tailors resources and opportunities to individual needs, whereas equality assumes that giving every student the same treatment is fair for all.

Equity refers to the "removal of systemic barriers and biases (e.g., policies, processes, outcomes), enabling all individuals to have equal opportunity to access and benefit from resources and opportunities" (University of Waterloo, n.d.). In online learning, equity means removing barriers to participation, especially for underrepresented groups, first-generation students, and those with different learning styles. Those barriers can involve accessing materials, completing assignments, or interacting with peers and instructors. Applying an equity lens to online and hybrid design and facilitation involves many factors; among these, structure, flexibility, and feedback are particularly critical. In this post, we will explore these elements at the level of the course as a whole, setting the stage for a deeper examination of the same elements at the assessment level in my next blog post. Stay tuned!

Structure

  • Provide a short description of each learning material and its value in the learning process. What do students gain from the required reading or viewing, and how does it prepare them for the remaining activities?
  • Include a purpose statement in each assignment that explains how it contributes to learning and achieving the outcomes. How are the assessments connected to the overall goal of the course?
  • Design assessments that promote active learning, higher-order thinking, and student agency. How are students involved in the learning process? How do students apply concepts? Do assessments reflect meaningful personal experiences?   
  • Make the module content, format, and requirements consistent. How are students expected to participate in discussions? Are there activities in which students can benefit from peer learning and interaction? For example, create spaces for students to collaborate and support each other beyond the traditional discussion boards.
  • Build in multimodalities for content and assessments. What skills is the assignment actually asking for (e.g., polished writing or deeper conceptual understanding)? Are there other ways in which students can demonstrate their learning? For example, rather than a written paper, consider allowing students to submit an audio recording, a multimedia presentation, a collage, etc.
  • Provide a clear course schedule with regular milestones and check-in points to support learning. For instance, scaffold course activities and assessments with low-stakes, formative assessments.

Flexibility

While a clear and robust course structure is essential for guiding students through the learning process, it's equally important to recognize the role flexibility plays in supporting diverse learners. Flexibility does not mean a lack of academic integrity or rigor. Flexibility can mean many things to many people; therefore, it is important to clarify its intention, meaning, and place in the course. The most common use of flexibility in the online classroom is for extensions on assignments, which can help reduce instructor bias and increase student engagement and agency (Ruesch & Sarvary, 2024). How else can flexibility be incorporated into an online course? Below are a few ideas and questions to guide decisions about flexibility:

  • Consider updating late submission policies. Ensure that students know what to expect if unforeseen circumstances prevent them from submitting assignments by the due date. Be cognizant that life happens to everyone, and we need to offer kindness and empathy to students. How can an “automatic” late assignment policy work within the nature and scope of the course?
  • In selecting materials, identify multiple formats students can use to gain knowledge. Are there media-based materials that provide the textbook content in an alternative way (e.g., audiobook, ebook)?
  • Design assignments that include choice, letting students select the format or topic of their preference. Do all assignments need to be written? Where can students choose their own topic for a project?

Feedback

While structure and flexibility are essential components of an inclusive learning environment, they alone are insufficient. Research suggests that instructor presence is fundamental to developing a sense of belonging and connection in the online environment, and instructor feedback is as critical as presence in promoting learning. Feedback helps students identify areas to improve and helps instructors identify additional resources to support students. In that sense, providing feedback is mutually beneficial: students receive actionable guidance on their progress, and instructors learn what works in the course and what to improve. Let's explore some ideas for feedback as a framework to build a connection with students:

  • Create feedback guidelines that communicate to students what to expect from you and when. When will assignments be graded and grades reported? How soon will you respond to email questions?
  • Consider offering feedback in multiple formats, such as audio or video in addition to text. Which activities might benefit from the added context and personal connection that audio or video feedback provides?
  • Prepare rubrics or grading guidelines that clearly indicate to students how the assignments will be graded. Ensure the rubrics are connected to the purpose and expectations of the assignments.
  • Give students actionable feedback that shows their learning progress and guides them on how to improve.
  • Design the course modules to include feedback and revision steps. This approach will help students see how all course components are connected and contribute to meeting the course outcomes.
  • Include peer feedback (peer review) that 1) gives students guidance on how to conduct a peer review, and 2) enhances their critical thinking and broadens their perspectives as they read peers' work.

Educators can move away from a one-size-fits-all approach by intentionally combining structure, flexibility, and feedback. This creates an environment that addresses the diverse and unique needs of all students and ensures every student has an equal opportunity to succeed, regardless of where they start.

References

  • Eddy, S. L., & Hogan, K. A. (2014). Getting under the hood: How and for whom does increasing course structure work? CBE Life Sciences Education, 13(3), 453–468. https://doi.org/10.1187/cbe.14-03-0050
  • Ruesch, J. M., & Sarvary, M. A. (2024). Structure and flexibility: Systemic and explicit assignment extensions foster an inclusive learning environment. Frontiers in Education, 9, 1324506.
  • University of Waterloo. (n.d.). Humanizing Virtual Learning. https://ecampusontario.pressbooks.pub/humanizinglearningonline/



I’d like to share a recent experience highlighting the crucial role of collecting and using feedback to enhance our online course materials. As faculty course developers and instructional designers, we understand the importance of well-designed courses. However, even minor errors can diminish the quality of an otherwise outstanding online course.

This layered paper art depicts the Heceta Head Lighthouse amid colorful hills, a flowing river, and tall green trees. Whimsical clouds and birds add depth, creating a vibrant and detailed handcrafted scene.
A lighthouse on the Oregon coast, where student feedback and technological tools act as the guiding light. Image generated with Midjourney.

A Student’s Perspective

Recently, feedback submitted by an online student enrolled in a course I had helped develop was forwarded to me.

He praised the overall design of the courses and the instructors’ responsiveness, but he pointed out some typographic and grammatical errors that caused confusion. He mentioned issues like quiz answers not matching the questions and contradictory examples.

What stood out to me was his statement:

“These courses are well-designed and enjoyable. Their instructors are great. They deserve written material to match.”

Proactive Steps for Quality Improvement

This feedback got me thinking about how we can proactively address such concerns and ensure our course materials meet the high standards our students deserve. Here are a few ideas that might help:

Implement a Feedback Mechanism

Incentivize students to hunt for flaws. Reward sharp eyes for spotting typos and grammar slips. Bonus points could spark enthusiasm, turning proofreading into a game of linguistic detective work. For example:

  • Weekly Surveys: Add a question to the weekly surveys asking students to report any errors they encounter, specifying the location (e.g., page number, section, or assignment).
    • “Did you encounter any typographic or grammatical errors in the course materials this week? If so, please describe them here, including the specific location (e.g., page number, section, or assignment).”
  • Assignment Feedback: Include a text-field option for students to report errors alongside their file uploads in each assignment submission.

Utilize Technology Tools

Consider using technology tools to streamline the review process and help identify typographic, grammatical, or factual errors.

AI tools

The latest advanced AI tools can assist in identifying grammatical errors, suggesting more precise phrasing, and improving overall readability. They can also highlight potential inconsistencies or areas needing clarification, ensuring the materials are more accessible to students. In addition, they can help format documents consistently, create summary points for complex topics, and even generate quiz questions based on the content.

(Oregon State University employees and currently enrolled students have access to the Data Protected version of Copilot. By logging in with their OSU credentials, users can use Copilot with commercial data protection, ensuring their conversations are secure and that Microsoft cannot access any customer data.)

Many powerful AI tools exist, but always verify their output for accuracy and use them as a helper, not your only guide. AI tools complement human judgment; they cannot replace it. Your oversight is essential: it ensures that AI-suggested changes align with the learning goals, and it preserves your voice and expertise.

Tools for content help

Some tools can be used to target different areas of content improvement:

  • Grammar and Style Checkers:
  • Fact-Checking Tools:
    • Google Scholar: This can be used to verify academic sources and find citations and references.
    • Snopes.com: Checks common misconceptions and urban legends
    • FactCheck.org: Verifies political claims and statements
  • Language Translation Tools:
    • Google Translate: Offers quick translations for various languages
    • DeepL: Provides accurate translations for multi-language content
  • Text-to-Speech and Proofreading:
  • Collaborative Editing Platforms:
    • Google Docs: Allows real-time collaboration and suggesting mode
    • Microsoft Word (with Track Changes): Enables collaborative editing

Request Targeted Assistance

If specific content requires a closer review, ask for help from other subject matter experts (SMEs), your instructional designer, colleagues, or even students. Collaboration can provide fresh perspectives and help catch errors that might have been overlooked.

Encourage Open Communication

Foster an environment where students feel comfortable reporting errors and providing feedback. Make it clear that their input is valued and will be used to improve the course.

Embrace Constructive Criticism

It’s natural to feel defensive when receiving critical feedback (I always do!), but view it as an opportunity for potential improvement. By addressing these concerns, you can enhance the quality of your course materials and ultimately improve our students’ learning experience.

This post is adapted from a panel talk for AI Week, Empowering OSU: Stories of Harnessing Generative AI for Impact in Staff and Faculty Work.

This past spring marked one year in my role as an instructional designer for Ecampus. Like many of our readers, I started conversing with AI in the early months of 2023, following OpenAI’s rollout of ChatGPT. Or as one colleague noted in recapping news of the past year, “generative AI happened.” Later, I wrote a couple of posts for this blog on AI and media literacy. A few things became clear from this work. Perhaps most significantly, in the words of research professor Ethan Mollick: “You will need to check it all.”

As the range of courses I support began to expand, so did my everyday use of LLM-powered tools. Here are some of my prompts to ChatGPT from last year, edited for clarity:

  • What is the total listening time of the Phish album Sigma Oasis?
    • Answer: 66 minutes and 57 seconds
  • How many lines are in the following list of special education acronyms (ranging from Section 504 – the Rehabilitation Act – to TBI – Traumatic Brain Injury)?
    • Answer: 27 lines
  • Where is the ancient city of Carthage today?
    • Answer: Today, Carthage is an archaeological site and historical attraction in the suburbs of the Tunisian capital, Tunis.
  • What is the name of the Roman equivalent of the Greek god Zeus?
    • Answer: Jupiter, king of the gods and the god of the sky and thunder
  • What’s the difference between colors D73F09 and DC4405?
    • Answer: In terms of appearance, … 09 will likely have a slightly darker, more orange-red hue compared to … 05, which might appear brighter. (Readers might also know these hues as variations on Beaver Orange.)

And almost every day:

  • Please create an (APA or MLA) citation of the following …

The answers were often on point but always in need of fact checking or another iteration of the prompt. Early LLMs were infamously prone to hallucinations. Factual errors and tendencies toward bias are still not uncommon.
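That habit of verification doesn't always require another round with a chatbot. The color-comparison answer above, for instance, can be checked with a few lines of Python. This is just a minimal sketch: the hex values are the ones from my prompt, and the helper function is purely illustrative.

```python
def hex_to_rgb(code: str) -> tuple[int, int, int]:
    """Convert a hex color string such as 'D73F09' into an (R, G, B) tuple."""
    return tuple(int(code[i:i + 2], 16) for i in (0, 2, 4))

beaver_orange = hex_to_rgb("D73F09")  # (215, 63, 9)
variant = hex_to_rgb("DC4405")        # (220, 68, 5)

# Channel-by-channel difference: the second color is a touch higher in red and
# green and lower in blue, which is why it reads as slightly brighter.
print([b - a for a, b in zip(beaver_orange, variant)])  # [5, 5, -4]
```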

As you can sense from my early prompts, I was mostly using AI as either a kind of smart calculator or an uber-encyclopedia. But in recent months, my colleagues and I here at Course Development and Training (CDT)—along with other units in the Division of Educational Ventures (DEV)—have been using AI in more creative and collaborative ways. And that’s where I want to focus this post.

The Partnership

First, some context for the work we do at DEV. Online course development is both a journey and a partnership between the instructor or faculty member and any number of support staff, from training to multimedia and beyond. Anchoring this partnership is the instructor’s working relationship with the instructional designer—an expert in online pedagogy and educational technology, but also a creative partner in developing the online or hybrid course.

Infographic showing the online course development process, from set up, to terms 1-2 in collaboration with the instructional designer, to launch and refresh.
Fig. 1. Collaboration anchors the story of online course development at OSU (credit: Ecampus).

Ecampus now offers more than 1,800 courses in more than 100 subjects. Every course results from a custom build that must maintain our strong reputation for quality (see fig. 1). This post is focused on that big circle in the middle—collaboration with the instructional designer. That’s where I see incredible potential for support or “augmentation” from generative AI tools.

As Yong Bakos, a senior instructor with the College of Engineering, recently reminded Faculty Forum attendees, modern forms of this technology have been around since the 1940s, starting with the influence of programmable computers on World War II. But now, he added, in challenging faculty to use AI for rapid, personalized feedback for learners, "we speak the same language."

Through continued partnership, how do we make such processes more nimble, more efficient? What do augmentation and collaboration look like when we add tools like Copilot or a custom GPT? Many instructional designers have been wrestling with these questions as of late.

“Human Guided, but AI Assisted”

Here are a few answers from educators Wesley Kinsey and Page Durham at Germanna Community College in Virginia (see fig. 2). Generative AI—also known as GAI—is a powerful tool, says Kinsey. “But the real magic happens when it is paired with a framework that ensures course quality.”

Fig. 2. From a recent QM webinar on “unleashing” generative AI (CC BY-NC-ND).

Take this line of inquiry a little farther, and one starts to wonder: How might educators track or evaluate progress toward such use cases?

Funneling Toward Augmentation

As a thought experiment, I offer the following criteria and inventory—a kind of self-assessment of my own “human guided” journey through course development with generative AI (see fig. 3).

Criteria for Augmenting Development with Generative AI

  • ESTABLISHED – Regular, refined practice in course development
  • EMERGING – Irregular and/or unrefined practice, could be improved
  • ENVISION – Under consideration or imagined, not yet practiced

Faculty with experience teaching online may find my suggested criteria familiar; “established, emerging, envision” is adapted from an Ecampus checklist used in course redevelopment.

Funnel-shaped infographic with five augmentations: (1) From set up to intake; (2) Course content; (3) Suggested revisions; (4) Discussion, planning, and review; (5) Building and rebuilding
Fig. 3. Self-assessment of augmenting development with generative AI (CC BY-NC-SA).

Augmentation 1: From Set Up to Intake

Broadly speaking, I’m only starting to use chatbots in kicking off a course development—to capture a bulleted summary of an intake over Zoom, for example. Or with these kinds of level-setting prompts:

  • Remind me, what is linear regression analysis?
  • What fields are important to physical hydrology?
  • Explain to a college professor the migration of a social annotation learning tool from LTI 1.1 to 1.3.

Augmentation 2: Course Content

In my experience, instructors are only now beginning to envision how they might propose a course or develop its learning materials and activities with support from tools like Copilot, which is increasingly adept at helping us with this kind of iterative brainstorming work. The key here will be getting comfortable with practice, engaging in sustained conversations with defined parameters, often in scenarios that build on existing content. In recent practice with building assignments, I'm finding Claude 3 Sonnet helpful: it is more nuanced in its responses, it lets you upload brief documents at no cost, and it lets you revisit previous chats.

Screenshot of conversation with Copilot, starting with a request to create an MLA citation of a lecture by Liam Callanan at the Bread Loaf Writers' Conference
Fig. 4. From a “more precise” conversation on citation generation. Can you spot Copilot’s errors in applying MLA style?

Augmentation 3: Suggested Revisions

Once course content begins rolling in, I apply more established practices for augmentation. For building citations of learning materials, I’m using Copilot’s “more precise” mode for its more robust abilities to read the open web and draw on various style guides (see fig. 4). With activities, often the germ of an idea for interaction needs enlargement—a statement of purpose or more detailed instructions. Here are a few more examples from working with the School of Psychological Science, with prompts edited for brevity:

  • What would be the purpose of practicing rebus puzzles in a lower division course on general psychology?
  • Please analyze the content of the following exam study guide, excerpted in HTML. Then, suggest a two-sentence statement of purpose that should replace the phrase lorem ipsum.
  • How should college students think about exploring Rorschach tests with inkblots? Please suggest two prompts for reflection (see fig. 5).
Screenshot of Week 6 - Reflection Activity - Rorschach Inkblot Test, including a warning about the limitations of Rorschach tests and prompts for reflection
Fig. 5. From an augmented reflection activity in PSY 202H, General Psychology (credit: Juan Hu).

Augmentation 4: Discussion, Planning & Review

As with course planning, I'm not quite there yet with using generative AI to shape module templates and collect preferred settings for the building I do in Canvas. But by next year, armed perhaps with a desktop license for Copilot, I can imagine using AI to offer instructors custom templates or prompts to accelerate the design process. One more note on annotating augmentation: it's incredibly important to let my faculty partners know, with consistent labeling, when I'm suggesting course content adapted from a conversation with AI. Most often, I'm not the subject matter expert; they are. That rule of thumb from Ethan Mollick still holds true: "You will need to check it all."

Augmentation 5: Building & Rebuilding—More Efficiently

Finally, I look forward to exploring opportunities for more efficiently writing and revising the code behind everything we do with support from generative AI. Just imagine if the designer or instructor could ask a bot to suggest ways to strengthen module learning outcomes or update a task list, right there in Canvas.

Your Turn

With the above inventory in mind, let’s pause to reflect. To what extent are you comfortable using generative AI as a course developer? In what ways could this technology supplement new partnerships with instructional designers—or other colleagues involved in the discipline you teach? Together, how would you assess “augmentation” at each stage of the course development process?

Looking back on my own year of “human guidance with AI assistance,” I now turn more reflexively to AI for help with frontline design work—even as our team considers, for example, the ethical dimensions of asking chatbots to deliver custom graphics for illustrating weekly modules. In other stages, I’m still finding my footing in leveraging new tools, particularly during set up, refresh, and redesign. As we continue to partner with faculty, I remain open to navigating the evolving intersection of AI and course development.

(And now, for fun: Can you spot the augmentation? How much of that last sentence was crafted with support from a “creative” conversation with Copilot? Find the answer below.)

Resources, etc.

The following resources may be helpful in exploring generative AI tools, becoming more fluent with their applications, and considering their role in your teaching and learning practices.

This image is part of the Transformation Projects at the Ars Electronica Kepler's Garden at the JUK. The installation AI Truth Machine deals with the chances and challenges of finding truth through a machine.

All the buzz recently has been about Generative AI, and for good reason. These new tools are reshaping the way we learn and work. Within the many conversations about Artificial Intelligence in Higher Ed, a common thread has been appearing regarding the other AI: Academic Integrity. Creating and maintaining academic integrity in online courses is a crucial part of quality online education. It ensures that learners are held to ethical standards and encourages a fair, honest, and respectful learning environment. Here are some strategies to promote academic integrity and foster a culture of ethical behavior throughout your online courses, even in the age of generative AI.

Create an Academic Integrity Plan

Having a clear academic integrity plan is essential for any course. Create an instructor-only page within your course that details a clear strategy for maintaining academic integrity. This plan might include a schedule for revising exam question banks to prevent cheating, as well as specific measures to detect and address academic dishonesty (e.g., plagiarism detection or proctoring software). In this guide, make note of other assignments or places in the course where academic integrity is mentioned (in the syllabus and/or particular assignments), so these pages can be easily located and updated as needed. By having a plan, you can ensure a consistent approach across the course.

Exemplify Integrity Throughout the Course

It is important to weave academic integrity into the fabric of your course. Begin by introducing the concept in your Start Here module. Provide an overview of what integrity means in your course, including specific examples of acceptable and unacceptable behavior. This sets the tone for the rest of the course and establishes clear expectations. On this page, you might:

  • Offer resources and educational materials on academic integrity for learners, such as guides on proper citation and paraphrasing.
  • Include definitions of academic dishonesty, such as plagiarism, cheating, and falsification.
  • Provide guidance on how learners might use generative AI within the class, including what is and is not considered acceptable.
  • Add scenarios or case studies that allow learners to discuss and understand academic integrity issues, specifically related to the use of generative AI.
  • Connect academic integrity with ethical behavior in the larger field.
  • Provide a place for learners to reflect on what it means for them to participate in the course in a way that maximizes their learning while maintaining academic integrity.

Throughout the course, continue to reinforce these ideas. Reminders about academic integrity can be integrated into various lessons and modules. By articulating the integrity expectations at the activity and assignment level, you provide learners with a deeper understanding of how these principles apply to their work. 

Set Clear Expectations for Assignments

When designing assignments, it is important to be explicit about your expectations for academic integrity. Outline what learners should and should not do when completing the task. For instance, if you do not want them to collaborate on a particular assignment, state that clearly. Provide examples and resources to guide learners on how to properly cite sources or avoid plagiarism. Be specific with your expectations and share why you have specific policies in place. For instance, if you want to discourage the use of generative AI in particular assignments, call out the ways it can and cannot be used. As an example, you might tell learners they can use generative AI to help form an outline or check their grammar in their finished assignment, but not to generate the body text. Share the purpose behind the policy; in this case, it might be that a writing assignment is their opportunity to synthesize their learning and cement specific course concepts. This kind of transparency shows respect for the tools and the learning process, while also clearly outlining for learners what is acceptable.

Encourage Conversations About Integrity

Creating opportunities for learners to engage in discussions about academic integrity can help solidify these concepts in their minds. You can incorporate forums or discussion boards where learners can share their thoughts and experiences related to integrity. This also gives them a chance to ask questions and seek clarification on any concerns they may have. Encourage open dialogue between instructors and learners regarding academic integrity and any related concerns. These conversations can also extend beyond the classroom, exploring how integrity applies in your field or career paths. By connecting academic integrity to real-world scenarios, you help learners understand its relevance and importance in their professional lives.

Foster a Supportive Learning Environment

A supportive learning environment can help reinforce academic integrity by making learners feel comfortable asking questions and seeking guidance. Offer resources like definitions, guides, or access to mentors who can provide additional support. When learners know they have access to help, they are more likely to adhere to integrity standards. With generative AI in the learning landscape, we will inevitably encounter more “gray areas” in academic integrity. Be honest with your learners about your concerns and your hopes. Being open to conversations can only enhance the learning experience and the integrity in your courses.

We all play a role in cultivating a culture of academic integrity in online courses. By documenting a clear plan, weaving integrity into the course content, setting clear expectations, encouraging conversations, and providing support, you can create an environment where honesty and ethical behavior are valued and upheld. This not only benefits learners during their academic journey but also helps them develop skills and values that will serve them well in their future careers.

A few years ago, I was taking a Statistics class that was dreaded by most students in my graduate program. Upon starting, I discovered with pleasure that the instructor had introduced a new textbook, An Adventure in Statistics: The Reality Enigma by Andy Field. The book followed a storytelling format and featured an outlandish, science-fiction-type plot, humor, colorful graphics, and comic-book snippets.

The merits of storytelling have been widely discussed, and that’s not what I want to talk about here. Rather, I’d like to highlight a specific element that I believe made a great contribution to the book’s instructional value: most of the content is presented through the dialogue between the main character, Zach, who needs to learn statistics, and various mentors, in particular one professor-turned-cat. The mentors guide Zach through his learning journey by explaining concepts, answering his queries, and challenging him with thought-provoking points. This makes the content more approachable and easier to understand as we, the students, struggle, ask questions, and learn together with Zach.

I believe that using dialogues—in particular of the student-tutor type—instead of monologues in instructional materials is an underutilized method of making difficult concepts more accessible. It is not a topic that has been researched much, but I did encounter a few interesting references.

One term that is often used to refer to this type of learning—by observing others learn—is "vicarious learning". It was introduced in the 1960s by Bandura, who showed that learning can happen through observing others' behavior. Later, it was also used to talk about learning through the experiences of others or through storytelling (Roberts, 2010).

I was interested specifically in the effectiveness of student-tutor dialogue, which is a type of vicarious learning, and I found two articles that presented research on this topic.

Muller, Sharma, Eklund, and Reimann (2007) used instructional videos on quantum mechanics topics for second-year physics students. In one condition, the video was a regular presentation of the material. In the other, the video was a semi-authentic dialogue between a student and a tutor and incorporated alternative conceptions that physics students might hold, in combination with Socratic dialogue. The authors found significantly better outcomes on the post-test for the dialogue treatment.

Chi, Kang, and Yaghmourian (2017) conducted two studies that also featured physics concepts. They compared the effects of student-tutor dialogue videos versus lecture-style monologue videos, using the same tutors and the same supporting multimedia presentations. They, too, found increased learning for the students who watched the dialogue videos. They also found that students who watched the dialogue videos seemed to engage more in solving problems, generating substantive comments, and interacting constructively with their peers. The researchers offered some possible explanations for why this was the case: the incorrect statements and questions of the tutee triggered a more active engagement; tutees can serve as a model of learning; tutees make errors which are followed by tutor feedback – what they call “conflict episodes” that may motivate students to try harder.

Creating tutorial dialogue videos is time-consuming and more difficult than making regular lectures, so it is certainly not practical to use them on a large scale. However, it may be worth considering them for those areas where students struggle a lot.

Let us know if you’ve tried vicarious learning in any shape or form!

References:

Bandura, A., Ross, D., & Ross, S. (1963). Vicarious reinforcement and imitative learning. Journal of Abnormal and Social Psychology, 67(6), 601–607.

Chi, M. T., Kang, S., & Yaghmourian, D. L. (2017). Why students learn more from dialogue- than monologue-videos: Analyses of peer interactions. Journal of the Learning Sciences, 26(1), 10-50.

Muller, D. A., Sharma, M. D., Eklund, J., & Reimann, P. (2007). Conceptual change through vicarious learning in an authentic physics setting. Instructional Science, 35(6), 519–533. http://www.jstor.org/stable/41953754

Roberts, D. (2010). Vicarious learning: A review of the literature. Nurse Education in Practice, 10(1), 13-16.

I was recently reminded of a conference keynote that I attended a few years ago, and the beginning of an academic term seems like an appropriate time to revisit it on this blog.

In 2019, Dan Heath, a bestselling author and senior fellow at Duke University’s CASE Center, gave a presentation at InstructureCon, a conference for Canvas users, where he talked about how memories are formed. He explained that memories are composed of moments. Moments, according to Heath, are “mostly forgettable and occasionally remarkable.” To illustrate, most of what I’ve done today–dropping my kids off at spring break camp, replying to emails, going to a lunchtime yoga class, and writing this blog post–will largely be forgotten by next month. There is nothing remarkable about today. Unremarkable is often a desirable state because it means that an experience occurred without any hiccups or challenges.

Heath went on to describe what it is that makes great experiences memorable. His answer: Great experiences consist of “peaks,” and peaks consist of at least one of the following elements: elevation, insight, pride, or connection. He argued that we need to create more academic peaks in education. Creating peaks, he contends, will lead to more memorable learning experiences.

So, how do we create these peaks that will lead to memorable experiences? Let’s explore some ideas through the four approaches outlined by Heath.

Elevation. Elevation refers to moments that bring us joy and make us feel good. You might bring this element into your course by directly asking students to share what is bringing them joy, perhaps as an icebreaker. Sharing their experiences might also lead to connection, which is another way (see below) to create peaks that lead to memorable experiences. 

Insight. Insight occurs when new knowledge allows us to see something differently. Moments of insight are often sparked by reflection. You might consider making space for reflection in your courses. Creativity is another way to spark new insights. How might students engage with course concepts in new, creative ways? To list off a few ideas, perhaps students can create a meme, record a podcast, engage in a role play, or write a poem.

Pride. People often feel a sense of pride when their accomplishments are celebrated. To spark feelings of accomplishment in your students, I encourage you to go beyond offering positive feedback and consider sharing particularly strong examples of student work with the class (after getting permission, of course!). Showcasing the hard work of students can help them feel proud of their efforts and may even lead to moments of joyful elevation.

Connection. Connection refers to our ties with other people. Experiencing connection with others can feel deeply rewarding. As I mentioned above, asking students to share their experiences with peers is one way to foster connection. In Ecampus courses, we aim to foster student-student and student-teacher connection, but I encourage you to explore other opportunities for students to make meaningful connections. Perhaps students can get involved with their communities or with colleagues, if they happen to have a job outside of classes. Students could connect with their academic advisors or the writing center to support their work in a course. There are many ways to foster connections that support students in their learning!

It’s easy to focus on delivering content, especially in online courses. This was one of Heath’s overarching points. The key, however, to creating memorable learning experiences is to take a student-centered approach to designing and facilitating your course. 

I invite you to start the term off by asking yourself: How can I create more moments of elevation, insight, pride, and connection for my students? It might be easier than you think.

References:

Heath, D. (2019, July 10). Keynote. InstructureCon. Long Beach, CA.

What is a lecture? Seems like an easy question to answer, right? I might not give it a second thought.

Yet, as an online course developer, I sometimes find myself in conversations with co-developers where I realize I’ve been working under a different assumption about what a lecture is. And that’s fine, I kind of like having my preconceptions challenged. I wanted to share a little of that experience.

Our media development team has one of the stricter definitions of lecture: a specific kind of video recording. They have to; they are handling hundreds of videos every term. It's essential for them to be able to sort media into the most efficient pipeline. Makes sense.

However, when I am working with subject matter experts, instructors, co-developers, and others, I have found it useful to stay more flexible regarding many definitions. Sometimes errors in assumptions can open a door for discourse. It has certainly been a creative challenge. As an example, I'll reminisce a little about a couple of my favorite mistaken assumptions about lectures. Ah yes, I remember it like it happened just this last fall….

I helped develop an upper-division online course centered on technology for educators. My first mistaken assumption was going in all ready to talk about video lecturing. The instructor on this co-development was a podcaster and wanted to deliver lectures in that format. I’ve had other instructors who preferred podcast lectures, no worries there. Some instructors see podcasts as a more portable kind of lecture or an alternative way to access the content. Students can listen to lectures on the go or download the lecture for offline listening. We just had to make sure to include transcripts for accessibility instead of captioning.

Also, I got to design the following playback interfaces to make them look more ‘podcasty’.

As you've probably guessed, there was a second mistaken assumption on my part. I was thinking the podcasts were the lectures. In fact, the podcasts are pieces of a larger "discourse given before an audience or class especially for instruction" (Merriam-Webster: lecture and discourse). For our course, this discourse might be composed of multiple media element types.

The instructor wanted each 'lecture' to be a curated collection of learning elements focusing on specific topics: podcasts, video, reading, even Padlet posts. Part of the pedagogy here is to immerse them, as students, in a variety of technologies within the lectures that they may later be using as educators. Together, we collaborated to find the most effective way to present all of this material as discrete lectures. Below is what we came up with. Would you still consider these lectures?

Interestingly, I don't think we've strayed far from the Merriam-Webster definitions:

  • lecture: A discourse given before an audience or class especially for instruction
  • discourse: Formal and orderly and usually extended expression of thought on a subject. Connected speech or writing

While the course did include some interactive learning elements, these were not incorporated into lectures. It’s an interesting thought though.

  • How would you incorporate interactive exercises into lectures?
  • Does that still work within the definitions given above?

Maybe we can stretch the definition a little more. (Hmm. Perhaps in another blog post)

My takeaway here is that a lecture doesn’t have to be something given before a live class, or a simple narrated PowerPoint video online. As a course developer, my goal is to support my co-developer’s vision. But I am also serving the learning needs of students. As an online course developer, I have more flexibility about what a lecture can be. It makes sense to be open to more possibilities. I look forward to having more of these conversations with co-developers.

Stay flexible. Keep learning.

One of the most common concerns that instructors raise about teaching online is how to engage students in meaningful interactions. Online discussion boards are the default for simulating the types of conversations that take place in a classroom, although the online environment favors written communication in the form of posts and replies. These written posts may be the easiest way to communicate in online learning environments, offering students a less overwhelming experience and more opportunities for critical thinking and community building (see benefits of discussion boards). However, written communication is not the only way in which students can interact with one another: images, audio, or video can increase engagement and motivation. Still, these options are not intuitively built into online discussion forums.

To many students, the discussion board appears boring and demotivating; it sounds more like a chore than an activity where they build community and participate in the exchange of ideas and perspectives, where they grow intellectually and as individuals. Yet online discussions can turn into spaces for dialogue, debate, and community. How do we design these spaces so that students engage and interact more meaningfully? Well, let's explore a tiered approach to spark engagement in online discussions.

Tier 1: Revamp Discussion Boards

Consider the Community of Inquiry framework (CoI) in facilitating deep, engaging, and meaningful learning. The three elements of this framework can be used to design discussion boards: social presence, cognitive presence, and teaching presence. Ragupathi (2016) describes these presences in online courses as follows: "Social presence that will encourage students to present their individual personalities/profiles, help them identify with the community, communicate purposefully and function comfortably in a trusted environment; (2) Cognitive presence that will get students to introduce factual, conceptual, and theoretical knowledge into the discussion and be able to construct/confirm meaning through sustained reflection and discourse; and (3) Teaching presence to provide necessary facilitation of the learning process through effective discussion" (p. 4). Social presence in particular can be fostered through discussions (though they are not the only tool) to promote a sense of connection and community.

Apart from the strong foundation of connection and community that the CoI promotes, the structure of the discussion assignment plays an important role. To this end, "structure" and "why" are key.

Revise Structure and Format

  • Establish a clear purpose and add value to the participation/contribution:
    • Instructor-led: contextualize the outcomes, make expectations explicit
    • Student-led: ask students to share their takeaways from the discussion participation (e.g., reflection, embedded in assignments)
    • Connect the content to the discussion assignment (e.g., ask students to refer back or cite previous readings/videos completed in the weekly content)
  • Clearly set expectations for:
    • Grading criteria (e.g., provide a rubric or grading guidelines)
    • Timeframe
    • Resources (e.g., from the course or external)
    • What a "good post" looks like (e.g., provide an example, describe an example that does not meet expectations)
    • Clarify terminology (e.g., link to a glossary of terms)
  • Support continuity of engagement
  • Make discussion spaces manageable (students & faculty)


Tier 2: Augment the Discussion Boards

The next tier is to augment the opportunities that discussion boards offer. Structure and creativity intertwine in layers to turn discussions into collaborative spaces. Here, there is greater emphasis on community as a place where students take a more active role, embrace challenges, and own their role as active participants in building knowledge together.

  • Start with setting the discussion board as a place for a conversation:
    • Introductions: encourage students to use additional elements to introduce themselves to the class (e.g., images, videos, goals, expectations), with the caveat that these are optional so students feel comfortable choosing what and how to share.
  • Create discussion scenarios/questions/prompts that elicit more than one response:
    • Post first before you see previous posts
    • Students post follow-up questions and bring additional examples. Students reply to more than 2 peers who have not received replies yet
    • Encourage students to bring their experiences, outside readings, and additional resources to share
    • Encourage posts in different formats (e.g., video, images, infographics, mindmaps)
  • Student-facilitated discussions:
    • Create small groups and ask students to select a leader (rotate the leadership role); alternatively, randomly assign a leader
    • Student leaders post summaries of discussions in small groups and/or in whole-class discussions
    • Set expected participation: 
      • A minimum number of responses (1 post; 2 replies; number of posts in total)
      • Consider self-paced discussions and encourage students to post a certain number of posts throughout the term or week. (Caveat: the first few students who post might need to wait until others post)
  • Create a learning community for future assignments:
    • Students share initial drafts, outlines, and research topics and ask for comments/feedback. Alternatively, students post their initial work and share their goals and ideas about why it is relevant. Students are encouraged, but not required, to read the shared work.
    • Beyond the Question and Answer format (e.g., role plays, debates, WebQuests)
    • Set the discussion as a peer review assignment.

Tier 3: Beyond Discussion Boards*

The linearity of many discussion board platforms can make the interaction feel inauthentic, boring, and tedious to navigate. An alternative to a linear discussion is the concept of social annotations and collaborative spaces, where student interaction cuts across the content and incorporates multimodal elements.

  • Social Annotations: students can add comments, post questions, vote, and interact with peers over learning materials such as readings, videos, visuals, and websites. Students interact and collaborate based on interests and questions they have while studying the content. You can use social annotations as a learning tool.
  • Asynchronous conversations: increase the collaborative nature of group work with multimodality where students not only post and reply but also create their own content for others to comment on. Explore asynchronous conversations in VoiceThread.
  • Collaborative work: online discussions do not have to be about posts and replies only. Students can engage in meaningful conversations through collaborative work. For example, students can do collaborative assignments, interact synchronously or asynchronously, and comment on each other's contributions. Some web platforms you can explore include Microsoft Whiteboard and Miro.

Tier 4: Unleash the Discussion Boards

While discussion boards are mainly associated with asynchronous learning environments, discussions can play an important role in hybrid learning. You may be wondering why, given that one of the underlying features of hybrid learning is using class time for active learning, collaborative and team activities, increased participation, and social interaction. But these activities do not have to end when class time is over. Discussions can help keep students engaged in the class topics and activities after the in-person experience. Any of the tiered approaches described above could be integrated seamlessly into hybrid learning to give continuity to class conversations, prep for future in-person activities, foster metacognitive and reflection skills, and strengthen social presence.

*Note: The use of other tools outside of the Canvas learning management system will require a careful evaluation of accessibility and privacy policies.  


By: Julie Jacobs, Jana King, Dana Simionescu, Tianhong Shi

Overview

A recent scenario with our course development team challenged our existing practices with lecture media. Formerly, we had encouraged faculty to include only slides with narration in their lecture videos due to concerns about increasing learners’ cognitive load. Students voiced their hope for more instructor presence in courses, and some instructors started asking about including video of themselves inserted into their lectures. This prompted us to begin thinking about instructor presence in lecture videos more deeply: why were we discouraging faculty from including their faces in lecture videos? While our practices were informed by research-based media theory, we also recognized those theories might be outdated. 

We began to explore the latest research with the following question in mind: does visual instructor presence in lectures increase extraneous cognitive load in learners? We use the phrase “visual instructor presence” to refer to lecture videos where an instructor’s moving image is seen giving the lecture, composited together with their slides. This technique is also commonly referred to as “picture-in-picture”, as seen in the image below.

Image 1: Adam Vester, instructor in the College of Business, in his lecture design for BA 375 Applied Quantitative Methods.

A task force was created to review recent research on visual instructor presence and cognitive load, specifically in lecture-type videos. Our literature review included a look at leading multimedia learning scholar Richard E. Mayer’s newest group of principles. We also reviewed more than 20 other scholarly articles, many of which were focused on learner perception, motivation & engagement, and emotion. 

Findings

According to recent work in multimedia learning, research in this area should focus on three areas, namely learning outcomes (“what works/ what does not work?”), learning characteristics (“when does it work?”), and learning process (“how does it work?”) (Mayer, 2020). Below are our conclusions from the 23 research articles we reviewed regarding instructional videos, attempting to answer the above questions of “what works”, “when does it work”, and “how does it work”.  

  1. This review of recent literature shows no evidence that visual instructor presence increases extraneous cognitive load. 
  2. Students tend to prefer lectures with visual instructor presence – they report increased satisfaction and better perceived learning, which can boost motivation and engagement. 
  3. While some studies find no difference in performance outcomes when visual instructor presence is utilized, others found increased performance outcomes with visual instructor presence. Proposed explanations: embodiment techniques such as gestures, eye contact, and body movement foster generative processing (the cognitive processes required for making sense of the material); social cues can help direct the learners' attention; and increased motivation (as per point 2 above) contributes to better learning.
  4. The effects may depend on the specific type of visual instructor presence (e.g., small picture-in-picture, green-screen, or lightboard) and the characteristics of the content (complex/difficult vs simpler/easier). 

Recommendations

Based on these findings, our team has decided to remove the default discouragement of instructors wishing to use picture-in-picture in lectures. If an instructor is interested in having their visual presence in the lectures, we encourage them to discuss this option with their Instructional Designer and Lecture Media Coordinator to determine if this style is a good fit for them and their content.

Image 2: Bryony DuPont, associate professor of Mechanical Engineering, utilizing visual instructor presence in her lecture design for ME 382 Introduction to Design.

We recommend considering the following points:

  • What is their presentation style? Do they tend to spend a lot of time talking over a slide, or is there a lot of text or other action (e.g., a software demo) happening in the video? If there's a lot happening on the screen, perhaps it's better not to put their video on top of it (the instructor video could be placed only at the beginning and/or end instead).
  • What type of content? Is it simple or more complex? For more visually complex content, a lightboard or digital notation without picture-in-picture may work better, to take advantage of the dynamic drawing principle and the gaze guidance principle. 
  • Is it a foreign language course? If so, it’s likely helpful for the learners to see the instructor’s mouth and body language. 
  • Is the instructor comfortable with being on video? If they’re not comfortable with it, it may not add value. This being said, our multimedia professionals can help make instructors more comfortable in front of the camera and coach them on a high-embodied style of lecturing. 

Since implementing these guidelines and working with an increased number of lectures with visual instructor presence, we also noticed that it works best when the instructor does not look and sound like they’re reading. Therefore, for people who like working with a script, we recommend practicing in advance so they can sound more natural and are able to enhance their presentation with embodiment techniques.

We would love to hear about your opinions or experiences with this type of video. Share them in the comments!

For a detailed summary of our findings and full citation list, please see the full Literature Review.


Some form of group work is a common activity that I help design with faculty every term. Oftentimes, faculty ask how to account for the different levels of engagement from individual group members and how to assess group work, often in the form of a group grade. Improving group work in asynchronous courses and group contracts to promote accountability are some of many ways to guide students into collaborative work. However, collaborative work also requires offering equitable opportunities for all students to succeed. Based on the work of Feldman (2019), I'd like to outline some suggestions for assessment design through an equity lens.

Before jumping into assessing group work, consider Feldman's three pillars of equitable grades:

  1. “They are mathematically accurate, validly reflecting a student’s academic performance.
  2. They are bias-resistant, preventing biased subjectivity from infecting our grades.
  3. They motivate students to strive for academic success, persevere, accept struggles and setbacks, and to gain critical lifelong skills” (Feldman, p. 71).

With these three pillars in mind, let’s examine some potential issues with a group receiving one grade for their work.

  1. Accuracy: a collective group grade does not necessarily reflect an individual’s contribution to the group work or assess an individual student’s learning in terms of outcomes. For example, if a group splits up sections of a project into individual responsibilities, a student who did their assigned section very well may not have had an opportunity to gain new knowledge or build on their learning for aspects where they were struggling. And a group grade does not accurately capture their individual work or learning.
  2. Bias: Many times peer evaluations of group work come with some kind of group contract or accountability measure. However, there is a possibility for bias in how students evaluate their peers, especially if that evaluation is based on behaviors like turning things in on time and having strong social skills instead of learning. For example, maybe one of the group members had a job with a variable schedule from week to week, making it difficult to join regular group discussions and complete work at the same pace every week for the duration of the project. Other group members may perceive them as difficult to work with or inconsistent in their commitment and award them fewer points in a peer evaluation, especially if other group members did not have outside factors noticeably impacting their performance.
  3. Motivation: Group contracts and evaluations used to promote productivity are external motivators and do not instill a sense of internal relevance for students participating in group work. Instead, students may feel resentful that their peers may evaluate them harshly for things outside of their control, which can quickly snowball into a student disengaging from group work entirely.

“The purpose of group work is not to create some product in which all members participate, but for each student to learn specific skills or content through the group’s work together.”

Feldman, p. 104

So how do we assess this learning? Individually. If we can reimagine group work as a journey toward an individual reaching a learning outcome, then instead of assessing a behavior (working well in a group, timeliness) or what a group produces, we can create an assessment that captures the individual impact of the group work. Feldman outlines some tips for encouraging group work without a group grade:

  1. Have a clear purpose statement and overview for the group work that outlines the rationale and benefit of learning that content in a group context.
  2. Have clear evaluation criteria that show the alignment of the group work with a follow-up individual assessment.
  3. If possible, include students in the process by having a brainstorm or pre-work discussion ahead of time about what makes groups productive, how to ensure students learn material when working in groups, and what kinds of collaborative expectations can be set for a particular cohort of students.
  4. Be patient with students navigating a new assessment strategy for the first time and offer ample feedback throughout the process so students are set up for success on their assessments.
  5. Ensure the follow-up individual assessment is in alignment with learning outcomes and is focused on the content or skills students are expected to gain through group work.

As an added bonus, assessing group work individually in this way is often simpler than elaborate group work rubrics with separate peer evaluations factored in, making it both easier for the instructor and easier for the student to understand how their grade is calculated. Additionally, it will be important to design this group work with intention—if an individual could learn the material on their own, then what is the purpose of the group interaction? Think about a group project you may have assigned or designed in the past. What was the intention for that journey as a group? And how might you reimagine it if there was an individual assessment after its completion? I hope these questions are great starting points for reflecting on group work assessments and redesigning with equity in mind!

References

Feldman, J. (2019). Grading for equity: What it is, why it matters, and how it can transform schools and classrooms. Thousand Oaks, CA: Corwin.