“Belonging is a universal human need that is fundamentally linked to learning and well-being. It describes an individual’s experience of feeling that they are, or are likely to be, accepted and respected as a valued contributor in a specific environment.”           

Structures for Belonging: A Synthesis of Research on Belonging-Supportive Learning Environments
image of Maslow's pyramid of needs

Maslow’s Hierarchy of Needs is a helpful framework when discussing belonging, which falls in the middle, at level three, just above the basics for survival (level one: air, water, food, shelter) and safety (level two: health, employment, family, security).

Image from Wikimedia Commons

Have you heard the word belonging recently in reference to students and employees? At OSU, it seems to be popping up frequently in conversations and discussions, onboardings and trainings, online and off. It has become a buzzword for those concerned with teaching and learning, recruitment and outreach, employee satisfaction, and student success, and a focal point of our ongoing efforts toward diversity, equity, and inclusion. This increased focus on belonging at OSU is reflected in the university’s 2018 Innovate & Integrate: Plan for Inclusive Excellence and echoed by the Oregon Department of Education’s 2021 passage of the Every Student Belongs rule, which states, “It is the policy of the State Board of Education that all students, employees, and visitors in public schools are entitled to learn, work, and participate in an environment that is safe and free from discrimination, harassment, and intimidation.” These initiatives reflect a growing understanding that prevailing systems of power have historically marginalized certain groups and excluded them from many realms of life, including education, and they prioritize an explicit, intentional commitment to changing the status quo.

At Ecampus, belonging is an area of active study, and our effort to extend the feeling of belonging to our online students is an important part of our mission, vision, & values and of the goals in our own Inclusive Excellence Strategic Plan. We realize that our Ecampus students come from a wide range of backgrounds, seek online learning for a variety of reasons, and include higher numbers of students from historically marginalized backgrounds. These factors, combined with the nature of online learning, mean Ecampus students can feel more isolated and experience less of a sense of belonging than their on-campus peers.

What is belonging and why is it important?

Belonging is a complex, multi-layered, and changeable quality that is nonetheless very important for student success. Maslow’s Hierarchy of Needs places belonging in the category of psychological needs, just above the basic needs of food, water, air, safety, and shelter. While there are many definitions, the concept of belonging generally encompasses feeling safe, appreciated, welcomed, valued, and respected in a given situation. Humans learn to search for and interpret signals that they belong, or do not, when entering new situations or contexts. Marginalized groups have had to learn to be cognizant of where and when they could expect to be excluded and to stay alert for cues signaling exclusion. Traditionally, educational institutions have been places of exclusionary practice, often closed to large groups in both policy and practice. Students from marginalized populations, facing this history of exclusion, may not be sure they will be accepted in institutions, departments, courses, and other school environments, and may consciously or unconsciously search for signs and signals that they are valued, respected members of the school community and that they do, in fact, belong.

Belonging is important for student success because it conveys a host of positive benefits and is a crucial aspect of educational accomplishment. When students encounter welcoming, inclusive attitudes, see others like themselves being accepted and thriving, and are made to feel safe, protected, supported, and valued, their sense of belonging increases, which in turn allows them to relax and share more of their full selves with confidence. Students who have a strong sense of belonging show increased academic performance; better attendance, persistence, retention, and motivation; and less likelihood of dropping out. Dr. Terrell Strayhorn, Professor of Urban Education and Vice President for Academic and Student Affairs at LeMoyne-Owen College, concludes in his book College Students’ Sense of Belonging that “deprivation of belonging in college prevents achievement and wellbeing, while satisfaction of college students’ sense of belonging is a key to educational success for all students.”

In education, as in our society at large, belonging is often related to larger systems that privilege and prefer certain groups and their ideas, beliefs, and ways of being. Those whose race, ethnicity, sexual identity, gender, class, indigeneity, language, or ability are not of the majority are especially likely to be anxious and “on alert” to othering, exclusion, bullying, and stereotyping. This can have dramatic negative short- and long-term effects, including lowered cognitive capacity, increased stress, and reduced persistence and achievement. Students who lack a sense of belonging may feel uncomfortable in class or group work, unable to concentrate, and may experience self-consciousness and worry, which makes it that much more difficult to attain higher-level needs such as self-confidence, recognition, respect, fulfillment, and achievement. When students face active discrimination, bullying, or other forms of harassment, they may become depressed, choose to disengage, drop courses, or discontinue studying. Given such dire consequences, taking the time to understand belonging and to help ensure all OSU students feel welcomed and accepted is well worth the effort.

Why do online students sometimes feel less of a sense of belonging? 

There are many contributing factors to the disparity between online and traditional students’ development of a sense of belonging, starting with the very nature of the modality in which they study. Students living and studying on campus often have more frequent contact with instructors, campus staff, and other students, both structured and impromptu, providing opportunities to build relationships that can enhance their sense of community and belonging. The pacing of on-campus courses tends to be predictable, with regular meetings during which students often have the chance to ask questions (and receive answers quickly) and get to know fellow students and instructors. Instructors have dedicated class time to review important concepts, check understanding, and provide opportunities for students to get to know them and their fellow students. The traditional on-campus experience is geared toward taking a diverse group of students and building a cohesive community in many ways: students have a wide array of support services available to them; many activities, sports, and clubs they can join; and a host of opportunities to participate in the rich culture of OSU and in academic and social communities, most of which are easily accessible on campus. Indeed, the very nature of on-campus learning seeks to provide a community for traditional students, many of whom are young and leaving their own homes and communities for the first time.

In contrast, Ecampus courses are asynchronous, with no scheduled meeting times, because our students live across the United States and around the world. While this format allows increased access for students who cannot attend in person, the lack of face-to-face interaction can make it difficult for both students and instructors to make personal connections. Unless their courses are carefully designed to provide chances for interaction, conversation, collaboration, and community building, online students may not often interact with their instructors or peers. Online students can experience feelings of isolation, loneliness, and disengagement, which can greatly affect their sense of belonging as OSU students as well as their success and performance.

Complicating things even further is the tendency toward digital miscommunication: when communicating by text and online, people are less able to infer tone and underlying sentiment and often miss nuance, in part because of the lack of context and visual cues available when interacting face to face. A 2016 literature review on establishing community in online courses found digital communication to be a consistent issue, noting “…the absence of visual meaning-making cues such as gesture, voice tone, and immediate interaction can frustrate students and lead to feelings of isolation and disconnectedness in an online classroom,” and recommended that instructors who teach online learn the nuances of these different communication needs.

It must be noted that some online students, who may be older, working full or part time, caring for family, or otherwise already leading (sometimes overly) full lives, do not particularly want or need the sense of community that younger, traditional students may seek from their university. They may have little time to devote to community building and little interest in superfluous interaction, shying away from an added social burden they may not have the time and energy to fully commit to. Since we cannot know the detailed makeup of our student body in advance, online students are best served when we plan with the assumption that creating belonging is an important aspect of our approach.

Stay tuned for Part 2: What can we do to help? for research-based strategies you can use to improve belonging and inclusion.


Sources

Ally for Canvas | Learn@OregonState

Belonging and Emotional Safety – Casel Schoolguide 

Building Inclusivity and Belonging | Division of Student Affairs

College Students’ Sense of Belonging

Creating a Safe and Respectful Environment in Our Nation’s Classrooms 

Cultural Centers | Oregon State University

Decades of Scientific Research that Started a Growth Mindset Revolution

Ecampus Essentials – Standards and Principles – Faculty Support | Oregon State Ecampus | OSU Degrees Online

Establishing Community in Online Courses: A Literature Review 

Growth Mindset in the Higher Education Classroom | Center for Learning Experimentation, Application, and Research

Innovate & Integrate: Plan for Inclusive Excellence | Institutional Diversity 

Mission, Vision and Values | Oregon State Ecampus | OSU Degrees Online

Online Teaching Principles – Standards and Principles – Faculty Support | Oregon State Ecampus | OSU Degrees Online

Oregon Department of Education 

OSU Search Advocate Program

Peer Mentor Program | TRiO | Oregon State University

Social Justice Education Initiative 

State of Oregon Diversity, Equity, and Inclusion Action Plan

Student Academic Experience Survey 2022

The UDL Guidelines

Update Syllabus – Term Checklist and Forms – Faculty Support | Oregon State Ecampus | OSU Degrees Online

Using a warmer tone in college syllabi makes students more likely to ask for help, OSU study finds | Oregon State University

Utilizing Inclusive and Affirming Language | Institutional Diversity

Wooden sign with the word welcome on it.

“You never get a second chance to make a first impression.” ~ Will Rogers

As Winter Break has begun its rapid descent into the start of a new term, it’s time to take a look at how we will welcome our students back to school in the new year. Winter term brings new beginnings for students as their papers now contain the date 2024. Maybe they’ve made resolutions to do homework on time, to read every last page you assign, or just to be more present. Whatever it is, that first message or impression from you in the new term sets the tone for the class. I’m sure everyone wants to start a class off on a positive note, so let’s look at five ways you can create an informational, welcoming, and inclusive message to start the term or semester off right.

  1. Welcoming tone
  2. Talk about your class
  3. Offer support (and remind them to review the syllabus!)
  4. How to get started
  5. Inspire them

Create a Welcoming Tone

I don’t know about you, but when I think back to the professors and teachers I enjoyed learning from, I remember who they were and how they communicated with the class. They weren’t just educated, knowledgeable, and smart people; they were personable too. Their empathy for their students, and their acknowledgment that we all have a bad day from time to time or might just miss a deadline, meant it didn’t feel daunting if we had to come “begging” for an extension. It didn’t feel like begging, because they had already acknowledged that it could happen. Put your students at ease by recognizing them as people and not just names on a roster.

Talk About the Class

Just think: a brand new set of classes, so many new syllabi to read and materials to devour. Hype your class up by talking about exciting topics and real-world applications, and maybe mention an assignment or two that they’ll be working on.

Offer Support

We know that each of our students begins our class with a different set of circumstances on the other side of that screen. With that in mind, including a reference to support resources can help students know those resources are there and that it’s OK to use them. Mention your syllabus and the getting started or introduction module, and make sure students know resources are listed and available in all of those places, not only for your class but for all those other things that life tosses their way.

How to Get Started

So much information is available at the start of a new term. Sometimes it’s hard to know where to start! And what about before the term starts? Can we help our students prepare for their classes ahead of time and maybe ease their minds a little? Consider a Canvas email that introduces students to their upcoming class before the term begins. You could note whether the class is already published, even if it’s just the welcome page and what OSU Ecampus calls the “Start Here” module, which includes information about the class (the syllabus) and the resources they have access to as Ecampus students. In that same email, you can help them figure out where to begin. By telling them directly, and maybe even providing a link, you give them the information to get started with less anxiety, since they know they’re starting where you think they should.

Inspire them

Your excitement about working with them often evokes excitement and positive anticipation of a great class. Share with them a quote or why you love this topic and maybe give them an interesting fact that can pique their curiosity. The point here is to get them inspired and excited to learn.

Example

Dear Students,

Welcome to QLT 123: Introduction to Quilting! My name is Professor Seam and I’ll be your instructor for this online course. We are going to learn so much this term: the first three months of quilting are simply mind-blowing as you move from not knowing how to start, to drafting a mockup of a quilt you’d like to make, to finishing your first quilt! We’ll explore the basics, you’ll have opportunities to show off your successes and funny failures (because guess what, they happen!), and in the end, you’ll get to showcase all of your hard work in your finished quilt. Guess what? There are no textbooks for this class! Instead, you get to order some fun fabric (but not yet!). Hop into our Canvas site and take a look at the syllabus, find resources for support if you need them, introduce yourself in the first discussion board, and take a look at what’s in the first module. We’ll start next week when the term begins, so get ready to sew the seams of creativity: you’ve just started the most sew-perb quilting class, and I can’t wait to embark on this journey with you.
-Professor Seam

Share out!

Got a great welcome message? Share with us in the comments!

By Greta Underhill

In my last post, I outlined my search for a computer-assisted qualitative data analysis software (CAQDAS) program that would fit our Research Unit’s needs. We needed a program that would enable our team to collaborate across operating systems, easily adding in new team members as needed, while providing a user-friendly experience without a high learning curve. We also needed something that would adhere to our institution’s IRB requirements for data security and preferred a program that didn’t require a subscription. However, the programs I examined were either subscription-based, too cumbersome, or did not meet our institution’s IRB requirements for data security. It seemed that there just wasn’t a program out there to suit our team’s needs.

However, after weeks of continued searching, I found a YouTube video entitled “Coding Text Using Microsoft Word” (Peach, 2014). At first, I assumed this would show me how to use Word comments to highlight certain text in a transcript, which is a handy function, but what about collating those codes into a table or Excel file? What about tracking which member of the team codes certain text? I assumed this would be an explanation of manual coding using Word, which works fine for some projects, but not for our team.

Picture of a dummy transcript using Lorem Ipsum placeholder text. Sentences are highlighted in red or blue depending upon the user. Highlighted passages have an associated “comment” where users have written codes.

Fortunately, my assumption was wrong. Dr. Harold Peach, Associate Professor of Education at Georgetown College, had developed a Word macro to identify and pull all comments from a Word document into a table (Peach, n.d.). A macro is “a series of commands and instructions that you group together as a single command to accomplish a task automatically” (Create or Run a Macro – Microsoft Support, n.d.). Once downloaded, the “Extract Comments to New Document” macro opens a template and produces a table of the coded information, as shown in the image below. The macro identifies the following properties:

  • Page: the page on which the text can be found
  • Comment scope: the text that was coded
  • Comment text: the text contained in the comment; for the purpose of our projects, the code title
  • Author: which member of the team coded the information
  • Date: the date on which the text was coded

Picture of a table of dummy text that was generated from the “Extract Comments to New Document” Macro. The table features the following columns: Page, Comment Scope, Comment Text, Author, and Date.

You can move the data from the Word table into an Excel sheet where you can sort codes for patterns or frequencies, a function that our team was looking for in a program as shown below:

A picture of the dummy text table in an Excel sheet where codes have been sorted and grouped together by code name to establish frequencies.
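If your team prefers scripting over macros, the same idea can be sketched in a few lines of Python. The example below is only a minimal illustration, not Dr. Peach’s macro: it assumes a hypothetical transcript file named interview_transcript.docx, pulls each comment’s author, date, and comment text (the code label, in our workflow) out of the .docx package, writes them to a CSV file that can be opened in Excel, and prints code frequencies. Unlike the macro, this sketch does not capture the page number or the coded passage (the comment scope), which would require matching comment IDs against word/document.xml.

# Minimal sketch: extract Word comments from a .docx and tally code frequencies.
# Assumes the document actually contains comments; the file name is hypothetical.
import csv
import zipfile
import xml.etree.ElementTree as ET
from collections import Counter

W = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

def extract_comments(docx_path):
    """Return (author, date, comment_text) tuples for every comment in the .docx."""
    with zipfile.ZipFile(docx_path) as docx:
        # A .docx is a zip container; comments live in word/comments.xml.
        with docx.open("word/comments.xml") as f:
            root = ET.parse(f).getroot()
    rows = []
    for comment in root.iter(f"{W}comment"):
        author = comment.get(f"{W}author", "")
        date = comment.get(f"{W}date", "")
        # Join all text runs inside the comment (the code label).
        text = "".join(t.text or "" for t in comment.iter(f"{W}t"))
        rows.append((author, date, text))
    return rows

if __name__ == "__main__":
    rows = extract_comments("interview_transcript.docx")

    # Write a table that opens in Excel for sorting and filtering.
    with open("codes.csv", "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(["Author", "Date", "Comment Text"])
        writer.writerows(rows)

    # Quick frequency count of code labels across the transcript.
    for code, count in Counter(text for _, _, text in rows).most_common():
        print(f"{code}: {count}")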

This Word macro was a good fit for our team for many reasons. First, our members could create comments on a Word document regardless of their operating system. Second, we could continue to house our data on our institution’s servers, ensuring our projects meet strict IRB data security measures. Third, the Word macro allowed for basic coding features (coding multiple passages multiple times, highlighting coded text, etc.) and had a very low learning curve: teaching someone how to use Word comments. Lastly, our institution provides access to the complete Microsoft suite, so all team members, including students who would be working on projects, already had access to Word. We contacted our IT department to verify that the macro was safe and to help us download it.

Testing the Word Macro       

Once installed, I tested out the macro with our undergraduate research assistant on a qualitative project and found it to be intuitive and helpful. We coded independently and met multiple times to discuss our work. Eventually we ran the macro, pulled all comments from our data, and moved the macro tables into Excel where we manually merged our work. Through this process, we found some potential drawbacks that could impact certain teams.

First, researchers can view all previous comments, which might influence how teammates code or how second-cycle coding is performed; other programs let you hide previous codes so researchers can come to the text fresh.

Second, coding across paragraphs can create issues with the resulting table; cells merge in ways that make it difficult to sort and filter if moved to Excel, but a quick cleaning of the data took care of this issue.

Lastly, we manually merged our work, negotiating codes and content, as our codes were inductively generated; researchers working on deductive projects may bypass this negotiation and find the process of merging much faster.

Despite these potential drawbacks, we found this macro sufficient for our project: it was free to use, easy to learn, and a helpful way to organize our data. The following table summarizes the pros and cons of this macro.

Pros and Cons of the “Extract Comments to New Document” Word Macro

Pros

  • Easy to learn and use: simply providing comments in a Word document and running the macro
  • Program tracks team member codes which can be helpful in discussions of analysis
  • Team members can code separately by generating separate Word documents, then merge the documents to consensus code
  • Copying Word table to Excel provides a more nuanced look at the data
  • Program works across operating systems
  • Members can house their data in existing structures, not on cloud infrastructures
  • Macro is free to download

Cons

  • Previous comments are visible through the coding process which might impact other members’ coding or second round coding
  • Coding across paragraph breaks creates cell breaks in the resulting table that can make it hard to sort
  • Team members must manually merge their codes and negotiate code labels, overlapping data, etc.

Scientific work can be enhanced and advanced by the right tools; however, it can be difficult to distinguish which computer-assisted qualitative data analysis software program is right for a team or a project. Any of the programs mentioned in this post would be good options for individuals who do not need to collaborate or for those who are working with publicly available data that require different data security protocols. However, the Word macro highlighted here is a great option for many research teams. In all, although there are many powerful computer-assisted qualitative data analysis software programs out there, our team found the simplest option was the best option for our projects and our needs.

References 

Create or run a macro—Microsoft Support. (n.d.). Retrieved July 17, 2023, from https://support.microsoft.com/en-us/office/create-or-run-a-macro-c6b99036-905c-49a6-818a-dfb98b7c3c9c

Peach, H. (2014, June 30). Coding text using Microsoft Word [Video]. YouTube. https://www.youtube.com/watch?v=TbjfpEe4j5Y

Peach, H. (n.d.). Extract comments to new document – Word macros and tips – Work smarter and save time in Word. Retrieved July 17, 2023, from https://www.thedoctools.com/word-macros-tips/word-macros/extract-comments-to-new-document/

By Greta Underhill

Are you interested in qualitative research? Are you currently working on a qualitative project? Some researchers find it helpful to use a computer-assisted qualitative data analysis software (CAQDAS) program to help them organize their data through the analysis process. Although some programs can perform basic categorization for researchers, most software programs simply help researchers stay organized while they conduct the deep analysis needed to produce scientific work. You may find a good CAQDAS program especially helpful when multiple researchers work with the same data set at different times and in different ways. Choosing the right CAQDAS for your project or team can take some time and research but is well worth the investment. You may need to consider multiple factors before choosing a software program, such as cost, operating system requirements, data security, and more.

For the Ecampus Research Unit, issues with our existing CAQDAS prompted our team to search for another program that would fit our specific needs. Here’s what we were looking for:

  • General qualitative analysis: We needed a program for general analysis across multiple types of projects; other programs are designed for specific forms of analysis, such as Leximancer for content analysis.
  • Compatibility across computer operating systems (OS): Our team used both Macs and PCs.
  • Adherence to our institution’s IRB security requirements: Like many others, our institution and our team adhere to strict data security and privacy requirements, necessitating a close look at how a program would manage our data.
  • Basic coding capabilities: Although many programs offer robust coding capabilities, our team needed basic options such as coding one passage multiple times and visually representing coding through highlights.
  • Export of codes into tables or Excel workbooks: This function is helpful for advanced analysis and for reporting themes in multiple file formats for various audiences.
  • A low learning curve: We regularly bring in temporary team members on various projects for mentorship and research experience, making this a helpful feature.
  • A one-time purchase: A one-time purchase was the best fit for managing multiple and temporary team members on various projects.

Testing a CAQDAS

I began systematically researching different CAQDAS options for the team. I searched “computer-assisted qualitative data analysis software” and “qualitative data analysis” in Google and Google Scholar. I also consulted various qualitative research textbooks and articles, as well as blogs, personal websites, and social media handles of qualitative researchers to identify software programs. Over the course of several months, I generated a list of programs to examine and test. Several programs were immediately removed from consideration as they are designed for different types of analysis: DiscoverText, Leximancer, MAXQDA, QDA Miner. These programs are powerful, but best suited for specific analysis, such as text mining. With the remaining programs, I signed up for software trials, attended several product demonstrations, participated in training sessions, borrowed training manuals from the library, studied how-to videos online, and contacted other scholars to gather information about the programs. Additionally, I tested whether programs would work across different operating systems. I kept detailed records on each of the programs tested, including how they handled data, the learning curve for each, their data security, whether they worked across operating systems, how they would manage the export of codes, and whether they required a one-time or subscription-based payment. I started with three of the most popular programs: NVivo, Dedoose, and ATLAS.ti. The table below summarizes which of these programs fit our criteria.

Table: Whether NVivo, Dedoose, and ATLAS.ti meet the team’s requirements (general qualitative analysis, cross-OS collaboration, data security, basic coding capabilities, export of codes, low learning curve, one-time purchase). Details of the requirements are discussed in the text below.

NVivo

I began by evaluating NVivo, a program I had used previously. NVivo is a powerful program that adeptly handled large projects and is relatively easy to learn. The individual license was available for one-time purchase and allowed the user to maintain their data on their own machine or institutional servers. However, it had no capabilities for cross-OS collaboration, even when clients purchased a cloud-based subscription. Our team members could download and begin using the program, but we would not be able to collaborate across operating systems.

Dedoose

I had no prior experience with Dedoose, so I signed up for a trial of the software. I was impressed with the product demonstration, which significantly helped in figuring out how to use the program. This program excelled at data visualization and allowed a research team to blind code the same files for interrater reliability if that suited the project. Additionally, I appreciated the options to view code density (how much of the text was coded) as well as what codes were present across transcripts. I was hopeful this cloud-based program would solve our cross-OS collaboration problem, but it did not pass the test for our institution’s IRB data security requirements because it housed our data on Dedoose servers.

ATLAS.ti

ATLAS.ti was also a new program for me, so I signed up for a trial of this software. It is a well-established program with powerful analysis functions, such as helpful hierarchical coding capabilities and intuitive links among codes, quotations, and comments. But cross-OS collaboration, while possible via the web, proved to be cumbersome, and this option also did not meet the data security threshold for our institution’s IRB. Furthermore, the price point meant we would need to rethink our potential collaborations with other organizational members.

Data Security

Many programs are now cloud-based; these offer powerful analysis options but unfortunately did not meet our IRB data security requirements. Ultimately, we had to cut Delve, MAXQDA, Taguette, Transana, and webQDA. All of these programs would have been low-learning-curve options with basic coding functionality and cross-OS collaboration; however, for our team to collaborate, we would need to purchase a cloud-based subscription, which can quickly become prohibitively expensive, and house our data on company servers, which would not pass our institutional threshold for data security.

Note-taking programs

After testing multiple programs, I started looking beyond just qualitative software programs and into note-taking programs such as DevonThink, Obsidian, Roam Research, and Scrintal. I had hoped these might provide a workaround by organizing data on collaborative teams in ways that would facilitate analysis. However, most of them did not have functionality that could be used for coding or had learning curves too high for our team to adopt them.

It seemed like I had exhausted all options and I still did not have a program to bring back to the Research Unit. I had no idea that a low-cost option was just a YouTube video away. Stay tuned for the follow-up post where we dive into the solution that worked best for our team.

 

For the first part of this post, please see Media Literacy in the Age of AI, Part I: “You Will Need to Check It All.”

Just how, exactly, we’re supposed to follow Ethan Mollick’s caution to “check it all” happens to be the subject of a lively, forthcoming collaboration from two education researchers who have been following the intersection of new media and misinformation for decades.

In Verified: How to Think Straight, Get Duped Less, and Make Better Decisions about What to Believe Online (University of Chicago Press, November 2023), Mike Caulfield and Sam Wineburg provide a kind of user’s manual to the modern internet. The authors’ central concern is that students—and, by extension, their teachers—have been going about the process of verifying online claims and sources all wrong—usually by applying the same rhetorical skills activated in reading a deep-dive on Elon Musk or Yevgeny Prigozhin, to borrow from last month’s headlines. Academic readers, that is, traditionally keep their attention fixed on the text—applying comprehension strategies such as prior knowledge, persisting through moments of confusion, and analyzing the narrative and its various claims about technological innovation or armed rebellion in discipline-specific ways.

The Problem with Checklists

Now, anyone who has tried to hold a dialogue on more than a few pages of assigned reading at the college level knows that sustained focus and critical thinking can be challenging, even for experienced readers. (A majority of high school seniors are not prepared for reading in college, according to 2019 data.) And so instructors, partnering with librarians, have long championed checklists as one antidote to passive consumption, first among them the CRAAP test, which stands for currency, relevance, authority, accuracy, and purpose. (Flashbacks to English 101, anyone?) The problem with checklists, argue Caulfield and Wineburg, is that in today’s media landscape—awash in questionable sources—they’re a waste of time. Such routines might easily keep a reader focused on critically evaluating “gameable signals of credibility” such as functional hyperlinks, a well-designed homepage, airtight prose, digital badges, and other supposedly telling markers of authority that can be manufactured with minimal effort or purchased at little expense, right down to the blue checkmark made infamous by Musk’s platform-formerly-known-as-Twitter.

Three Contexts for Lateral Reading

One of the delights in reading Verified is drawing back the curtains on a parade of little-known hoaxes, rumors, actors, and half-truths at work in the shadows of the information age—ranging from a sugar industry front group posing as a scientific think tank to headlines in mid-2022 warning that clouds of “palm-sized flying spiders” were about to descend on the East Coast. In the face of such wild ideas, Caulfield and Wineburg offer a helpful, three-point heuristic for navigating the web—and a sharp rejoinder to the source-specific checklists of the early aughts. (You will have to read the book to fact-check the spider story, or as the authors encourage, you can do it yourself after reading, say, the first chapter!) “The first task when confronted with the unfamiliar is not analysis. It is the gathering of context” (p. 10). More specifically:

  • The context of the source — What’s the reputation of the source of information that you arrive at, whether through a social feed, a shared link, or a Google search result?
  • The context of the claim — What have others said about the claim? If it’s a story, what’s the larger story? If a statistic, what’s the larger context?
  • Finally, the context of you — What is your level of expertise in the area? What is your interest in the claim? What makes such a claim or source compelling to you, and what could change that?
“The Three Contexts” from Verified (2023)

At a regional conference of librarians in May, Wineburg shared video clips from his scenario-based research, juxtaposing student sleuths with professional fact checkers. His conclusion? By simply trying to gather the necessary context, learners with supposedly low media literacy can be quickly transformed into “strong critical thinkers, without any additional training in logic or analysis” (Caulfield and Wineburg, p. 10). What does this look like in practice? Wineburg describes a shift from “vertical” to “lateral reading” or “using the web to read the web” (p. 81). To investigate a source like a pro, readers must first leave the source, often by opening new browser tabs, running nuanced searches about its contents, and pausing to reflect on the results. Again, such findings hold significant implications for how we train students in verification and, more broadly, in media literacy. Successful information gathering, in other words, depends not only on keywords and critical perspective but also on the ability to engage in metacognitive conversations with the web and its architecture. Or, channeling our eight-legged friends again: “If you wanted to understand how spiders catch their prey, you wouldn’t just look at a single strand” (p. 87).

SIFT graphic by Mike Caulfield with icons for stop, investigate the source, find better coverage, and trace claims, quotes, and media to the original context.

Image 2: Mike Caulfield’s “four moves”

Reconstructing Context

Much of Verified is devoted to unpacking how to gain such perspective while also building self-awareness of our relationships with the information we seek. As a companion to Wineburg’s research on lateral reading, Caulfield has refined a series of higher-order tasks for vetting sources called SIFT, or “The Four Moves” (see Image 2). By (1) Stopping to take a breath and get a look around, (2) Investigating the source and its reputation, (3) Finding better sources of journalism or research, and (4) Tracing surprising claims or other rhetorical artifacts back to their origins, readers can more quickly make decisions about how to manage their time online. You can learn more about the why behind “reconstructing context” at Caulfield’s blog, Hapgood, and as part of the OSU Libraries’ guide to media literacy. (Full disclosure: Mike is a former colleague from Washington State University Vancouver.)

If I have one complaint about Caulfield and Wineburg’s book, it’s that it dwells at length on the particulars of analyzing Google search results, which fill pages of accompanying figures and a whole chapter on the search engine as “the bestie you thought you knew” (p. 49). To be sure, Google still occupies a large share of the time students and faculty spend online. But as in my quest for learning norms protocols, readers are already turning to large language model tools for help in deciding what to believe online. In that respect, I find other chapters in Verified (on scholarly sources, the rise of Wikipedia, deceptive videos, and so-called native advertising) more useful. And if you go there, don’t miss the author’s final take on the power of emotion in finding the truth—a line that sounds counterintuitive, but in context adds another, rather moving dimension to the case against checklists.

Given the acceleration of machine learning, will lateral reading and SIFTing hold up in the age of AI? Caulfield and Wineburg certainly think so. Building out context becomes all the more necessary, they write in a postscript on the future of verification, “when the prose on the other side is crafted by a convincing machine” (p. 221). On that note, I invite you and your students to try out some of these moves on your favorite chatbot.

Another Postscript

The other day, I gave Microsoft’s AI-powered search engine a few versions of the same prompt I had put to ChatGPT. In “balanced” mode, Bing dutifully recommended resources from Stanford, Cornell, and Harvard on introducing norms for learning in online college classes. Over in “creative” mode, Bing’s synthesis was slightly more offbeat—including an early-pandemic blog post on setting norms for middle school faculty meetings in rural Vermont. More importantly, the bot wasn’t hallucinating. Most of the sources it suggested seemed worth investigating. Pausing before each rabbit hole, I took a deep breath.

Related Resource

Oregon State Ecampus recently rolled out its own AI toolkit for faculty, based on an emerging consensus that developing capacities for using this technology will be necessary in many areas of life. Of particular relevance to this post is a section on AI literacy, conceptualized as “a broad set of skills that is not confined to technical disciplines.” As with Verified, I find the toolkit’s frameworks and recommendations on teaching AI literacy particularly helpful. For instance, if students are allowed to use ChatGPT or Bing to brainstorm and evaluate possible topics for a writing assignment, “faculty might provide an effective example of how to ask an AI tool to help, ideally situating explanation in the context of what would be appropriate and ethical in that discipline or profession.”

References

Caulfield, M., & Wineburg, S. (2023). Verified: How to think straight, get duped less, and make better decisions about what to believe online. University of Chicago Press.

Mollick, E. (2023, July 15). How to use AI to do stuff: An opinionated guide. One Useful Thing.

Oregon State Ecampus. (2023). Artificial Intelligence Tools.

Have you found yourself worried or overwhelmed in thinking about the implications of artificial intelligence for your discipline? Whether, for example, your department’s approaches to teaching basic skills such as library research and source evaluation still hold up? You’re not alone. As we enter another school year, many educators continue to think deeply about questions of truth and misinformation, creativity, and how large language model (LLM) tools such as chatbots are reshaping higher education. Along with our students, faculty (oh, and instructional designers) must consider new paradigms for our collective media literacy.

Here’s a quick backstory for this two-part post. In late spring, shortly after the “stable release” of ChatGPT to iOS, I started chatting with the GPT-3.5 model, which innovator Ethan Mollick describes as “very fast and pretty solid at writing and coding tasks,” if a bit lacking in personality. Other, internet-connected models, such as Bing, have made headlines for their resourcefulness and darker, erratic tendencies. But so far, access to GPT-4 remains limited, and I wanted to better understand the more popular engine’s capabilities. At the time, I was preparing a workshop for a creative writing conference. So, I asked ChatGPT to write a short story in the modern style of George Saunders, based in part on historical events. The chatbot’s response, a brief burst of prose it titled “Language Unleashed,” read almost nothing like Saunders. Still, it got my participants talking about questions of authorship, originality, representation, etc. Check, check, check.

The next time I sat down with GPT-3.5, things went a little more off-script.

One faculty developer working with Ecampus had asked our team about establishing learning norms in a 200-level course dealing with sensitive subject matter. As a writing instructor, I had bookmarked a few resources in this vein, including strategies from the University of Colorado Boulder. So, I asked ChatGPT to create a bibliographic citation of Creating Collaborative Classroom Norms, which it did with the usual lightning speed. Then I got curious about what else this AI model could do, as my colleagues Philip Chambers and Nadia Jaramillo Cherrez have been exploring. Could ChatGPT point me to some good resources for faculty on setting norms for learning in online college classes?

“Certainly!” came the cheery reply, along with a summary of five sources that would provide me with “valuable information and guidance” (see Image 1). Noting OpenAI’s fine-print caveat (“ChatGPT may produce inaccurate information about people, places, or facts”), I began opening each link, expecting to be teleported to university teaching centers across the country. Except none of the tabs would load properly.

“Sorry we can’t find what you’re looking for,” reported Inside Higher Ed. “Try these resources instead,” suggested Stanford’s Teaching Commons. A closer look with Internet Archive’s Wayback Machine confirmed that the five sources in question were, like “Language Unleashed,” entirely fictitious.

An early chat with ChatGPT-3.5, asking whether the chatbot can point the author to some good resources for faculty on setting classroom norms for learning in online college classes. "Certainly," replies ChatGPT, in recommending five sources that "should provide you with valuable information and guidance."

Image 1: An early, hallucinatory chat with ChatGPT-3.5

As Mollick would explain months later: “it is very easy for the AI to ‘hallucinate’ and generate plausible facts. It can generate entirely false content that is utterly convincing. Let me emphasize that: AI lies continuously and well. Every fact or piece of information it tells you may be incorrect. You will need to check it all.”

The fabrications and limitations of chatbots lacking real-time access to the ever-expanding web have by now been well documented. But as an early adopter, I found the speed and confidence ChatGPT brought to the task of inventing and describing fake sources unnerving. And without better guideposts for verification, I expect students less familiar with the evolution of AI will continue to experience confusion, or worse. As the Post recently reported, chatbots can easily say offensive things and act in culturally biased ways—”a reminder that they’ve ingested some of the ugliest material the internet has to offer, and they lack the independent judgment to filter that out.”

Just how, exactly, we’re supposed to “check it all” happens to be the subject of a lively, forthcoming collaboration from two education researchers who have been following the intersection of new media and misinformation for decades.

Stay tuned for an upcoming post with the second installment of “Media Literacy in the Age of AI,” a review of Verified: How to Think Straight, Get Duped Less, and Make Better Decisions about What to Believe Online by Mike Caulfield and Sam Wineburg (University of Chicago Press, November 2023).

References

Mollick, E. (2023, July 15). How to use AI to do stuff: An opinionated guide. One Useful Thing.

Wroe, T., & Volckens, J. (2022, January). Creating collaborative classroom norms. Office of Faculty Affairs, University of Colorado Boulder.

Yu Chen, S., Tenjarla, R., Oremus, W., & Harris, T. (2023, August 31). How to talk to an AI chatbot. The Washington Post.

By: Julie Jacobs, Jana King, Dana Simionescu, Tianhong Shi

Overview

A recent scenario with our course development team challenged our existing practices with lecture media. Formerly, we had encouraged faculty to include only slides with narration in their lecture videos due to concerns about increasing learners’ cognitive load. Students voiced their hope for more instructor presence in courses, and some instructors started asking about including video of themselves inserted into their lectures. This prompted us to begin thinking about instructor presence in lecture videos more deeply: why were we discouraging faculty from including their faces in lecture videos? While our practices were informed by research-based media theory, we also recognized those theories might be outdated. 

We began to explore the latest research with the following question in mind: does visual instructor presence in lectures increase extraneous cognitive load in learners? We use the phrase “visual instructor presence” to refer to lecture videos where an instructor’s moving image is seen giving the lecture, composited together with their slides. This technique is also commonly referred to as “picture-in-picture”, as seen in the image below.

Image 1: Adam Vester, instructor in College of Business, in his lecture design for BA 375 Applied Quantitative Methods.

A task force was created to review recent research on visual instructor presence and cognitive load, specifically in lecture-type videos. Our literature review included a look at leading multimedia learning scholar Richard E. Mayer’s newest group of principles. We also reviewed more than 20 other scholarly articles, many of which were focused on learner perception, motivation & engagement, and emotion. 

Findings

According to recent work in multimedia learning, research in this area should focus on three areas, namely learning outcomes (“what works/ what does not work?”), learning characteristics (“when does it work?”), and learning process (“how does it work?”) (Mayer, 2020). Below are our conclusions from the 23 research articles we reviewed regarding instructional videos, attempting to answer the above questions of “what works”, “when does it work”, and “how does it work”.  

  1. This review of recent literature shows no evidence that visual instructor presence increases extraneous cognitive load. 
  2. Students tend to prefer lectures with visual instructor presence – they report increased satisfaction and better perceived learning, which can boost motivation and engagement. 
  3. While some studies find no difference in performance outcomes when visual instructor presence is utilized, others found increased performance with visual instructor presence. Proposed explanations: embodiment techniques such as gestures, eye contact, and body movement foster generative processing (the cognitive processes required for making sense of the material); social cues can help direct learners’ attention; and increased motivation (as per point 2 above) contributes to better learning. 
  4. The effects may depend on the specific type of visual instructor presence (e.g., small picture-in-picture, green-screen, or lightboard) and the characteristics of the content (complex/difficult vs simpler/easier). 

Recommendations

Based on these findings, our team has decided to remove the default discouragement of instructors wishing to use picture-in-picture in lectures. If an instructor is interested in having their visual presence in the lectures, we encourage them to discuss this option with their Instructional Designer and Lecture Media Coordinator to determine if this style is a good fit for them and their content.

Image 2: Bryony DuPont, associate professor of Mechanical Engineering, utilizing visual instructor presence in her lecture design for ME 382 Introduction to Design.

We recommend considering the following points:

  • What is their presentation style? Do they tend to spend a lot of time talking over a slide or is there a lot of text or other action (e.g. software demo) happening in the video? If there’s a lot happening on the screen, perhaps it’s better to not put their video on top of it (the instructor video could be placed only at the beginning and/or end instead).
  • What type of content? Is it simple or more complex? For more visually complex content, a lightboard or digital notation without picture-in-picture may work better, to take advantage of the dynamic drawing principle and the gaze guidance principle. 
  • Is it a foreign language course? If so, it’s likely helpful for the learners to see the instructor’s mouth and body language. 
  • Is the instructor comfortable with being on video? If they’re not comfortable with it, it may not add value. This being said, our multimedia professionals can help make instructors more comfortable in front of the camera and coach them on a high-embodied style of lecturing. 

Since implementing these guidelines and working with an increased number of lectures with visual instructor presence, we also noticed that it works best when the instructor does not look and sound like they’re reading. Therefore, for people who like working with a script, we recommend practicing in advance so they can sound more natural and are able to enhance their presentation with embodiment techniques.

We would love to hear about your opinions or experiences with this type of video. Share them in the comments!

For a detailed summary of our findings and full citation list, please see the full Literature Review.


Some form of group work is a common activity that I help design with faculty every term. Faculty often ask how to account for the different levels of engagement from individual group members and how to assess group work, often in the form of a group grade. Improving group work in asynchronous courses and using group contracts to promote accountability are some of the many ways to guide students into collaborative work. However, collaborative work also requires offering all students equitable opportunities to succeed. Based on the work of Feldman (2019), I’d like to outline some suggestions for assessment design through an equity lens.

Before jumping into assessing group work, Feldman outlines three pillars of equitable grades:

  1. “They are mathematically accurate, validly reflecting a student’s academic performance.
  2. They are bias-resistant, preventing biased subjectivity from infecting our grades.
  3. They motivate students to strive for academic success, persevere, accept struggles and setbacks, and to gain critical lifelong skills” (Feldman, p. 71).

With these three pillars in mind, let’s examine some potential issues with a group receiving one grade for their work.

  1. Accuracy: a collective group grade does not necessarily reflect an individual’s contribution to the group work or assess an individual student’s learning in terms of outcomes. For example, if a group splits up sections of a project into individual responsibilities, a student who did their assigned section very well may not have had an opportunity to gain new knowledge or build on their learning for aspects where they were struggling. And a group grade does not accurately capture their individual work or learning.
  2. Bias: Many times peer evaluations of group work come with some kind of group contract or accountability measure. However, there is a possibility for bias in how students evaluate their peers, especially if that evaluation is based on behaviors like turning things in on time and having strong social skills instead of learning. For example, maybe one of the group members had a job with a variable schedule from week to week, making it difficult to join regular group discussions and complete work at the same pace every week for the duration of the project. Other group members may perceive them as difficult to work with or inconsistent in their commitment and award them fewer points in a peer evaluation, especially if other group members did not have outside factors noticeably impacting their performance.
  3. Motivation: Group contracts and using evaluation as a way to promote productivity is an external motivator and does not instill a sense of internal relevance for students participating in group work. Instead, students may feel resentful that their peers may evaluate them harshly for things outside of their control, which can quickly snowball into a student disengaging from group work entirely.

“The purpose of group work is not to create some product in which all members participate, but for each student to learn specific skills or content through the group’s work together.”

Feldman, p. 104

So how do we assess this learning? Individually. If we can reimagine group work as a journey toward an individual reaching a learning outcome, then instead of assessing a behavior (working well and promptly in a group) or what a group produces, we can create an assessment that captures the individual impact of the group work. Feldman outlines some tips for encouraging group work without a group grade:

  1. Have a clear purpose statement and overview for the group work that outlines the rationale and benefit of learning that content in a group context.
  2. Have clear evaluation criteria that show the alignment of the group work with a follow-up individual assessment.
  3. If possible, include students in the process by having a brainstorm or pre-work discussion ahead of time about what makes groups productive, how to ensure students learn material when working in groups, and what kinds of collaborative expectations can be set for a particular cohort of students.
  4. Be patient with students navigating a new assessment strategy for the first time and offer ample feedback throughout the process so students are set up for success on their assessments.
  5. Ensure the follow-up individual assessment is in alignment with learning outcomes and is focused on the content or skills students are expected to gain through group work.

As an added bonus, assessing group work individually in this way is often simpler than elaborate group work rubrics with separate peer evaluations factored in, making it both easier for the instructor and easier for the student to understand how their grade is calculated. Additionally, it will be important to design this group work with intention—if an individual could learn the material on their own, then what is the purpose of the group interaction? Think about a group project you may have assigned or designed in the past. What was the intention for that journey as a group? And how might you reimagine it if there was an individual assessment after its completion? I hope these questions are great starting points for reflecting on group work assessments and redesigning with equity in mind!

References

Feldman, J. (2019). Grading for equity: What it is, why it matters, and how it can transform schools and classrooms. Thousand Oaks, CA: Corwin.

By Cat Turk and Mary Ellen Dello Stritto

In this time of rapid change in online education, we can benefit from leveraging the expertise of faculty who have experienced the evolution of online education. At the Oregon State University (OSU) Ecampus Research Unit, we have been learning from a group of instructors who have taught online for ten years or more. A review of recent research uncovered that these instructors are an untapped resource. Their insights can provide valuable guidance for instructors who are just beginning their careers or instructors who may be preparing to teach online for the first time. Further, their perspectives can also be enlightening for online students.

In 2018-2019 we conducted interviews with 33 OSU faculty who had been teaching online for 10 years or more as a part of a larger study. Two of the questions we asked them were the following:

  1. What skills do you think are most valuable for online instructors to have?
  2. What skills do you think are most valuable for online students to have?

We will share some of the results of a qualitative analysis of the responses to these questions and highlight the similarities and differences.

When asked about the most valuable skills for online instructors, three key skills emerged: communication, organization, and time management. When asked about the most valuable skills for online students to have, the same skills were among the most frequently mentioned by these instructors.

As the table below shows, in the responses about skills for online instructors, communication emerged as the most prominent skill, with 85% of instructors in the study emphasizing its importance, while time management and organization were split evenly at 45%. In their responses about skills for students, 64% of the instructors emphasized both communication and time management, while 42% discussed organization. When discussing communication for instructors, they indicated that effective communication is essential for building rapport with students, providing clear instructions, and facilitating meaningful interactions in the online environment. Organization (such as structuring course materials or their weekly work process) and time management skills (such as scheduling availability to connect with students) were also highly valued by these instructors. Read more about the analysis of instructor skills here.

| Skill | Skills for Instructors | Skills for Students |
| --- | --- | --- |
| Communication | 28 responses (85%) | 21 responses (64%) |
| Time Management | 15 responses (45%) | 21 responses (64%) |
| Organization | 15 responses (45%) | 14 responses (42%) |
| Self-Motivation | — | 21 responses (64%) |

Frequency of responses of skills for instructors and students.
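
For reference, the percentages appear to be each skill's response count divided by the 33 instructors interviewed, rounded to the nearest whole percent; for example, 28/33 ≈ 85%, 21/33 ≈ 64%, 15/33 ≈ 45%, and 14/33 ≈ 42%.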

The responses to both questions emphasized the significance of communication skills in written assignments and in proactive connections within the scope of the online learning environment. Instructors articulated that online students needed to be proactive communicators themselves. Examples of this include contacting their instructors about questions and clarification in a timely way, interacting with their peers in a respectful manner, and turning in quality written assignments that demonstrate comprehension of their learning material. For students, clear and effective communication ensures understanding and engagement, while organization facilitates seamless navigation through course materials, and time management ensures that students are able to make the most of the asynchronous environment.

While time management and organization were both considered by instructors to be just as crucial for students, their responses demonstrated that these skills were needed for different reasons than for instructors. Instructors personally valued time management and organization due to the nature of facilitating courses online. Because the online classroom can travel from place to place, instructors considered setting intentional blocks of time and structuring their hours accordingly essential for maintaining a work-life balance and ensuring that tasks were not missed.

On the other hand, according to these instructors, students need time management and organization due to the asynchronous and sometimes isolating nature of online courses. One instructor stressed that:

 “[Students] do need to be more organized than on-ground students, because there’s not that weekly meeting to keep students on track.”

These instructors indicated some online students may need to structure their study time to accommodate a different time zone, while others may need to structure their academic pursuits around careers or children. Another instructor emphasized that:

“A lot of our [online students] actually work full-time, so they have families and kids and have to be much more organized too.”

While there were overlaps with the responses to the two questions, a notable difference was the emergence of another skill for students: self-motivation. This concept of self-motivation emerged from the instructor responses about students’ capacity to persevere in online courses. This included their level of motivation, capacity to learn on their own, and comfort with self-paced learning.

One instructor said the following about students’ self-motivation,

“Some people would say it’s self-discipline, but I think it’s more of they have to have a purpose for that class.”

Self-motivation was not mentioned by the instructors as a skill for online instructors, suggesting that these instructors perceive this as more pertinent to students for success in managing their own learning process. It is worth noting that proactive communication was highlighted as an essential aspect of self-motivation, with instructors emphasizing that students who take the initiative in reaching out to them tend to be more successful. This observation suggests that self-motivated individuals are more likely to actively seek support and clarification, which can enhance their learning experience and overall success. 

Another noteworthy aspect was the need for students to be comfortable with learning in physical isolation. Instructors acknowledged that online learners must navigate the challenges of studying independently without the immediate presence of peers and instructors. For online students specifically,

“They need to be motivated because they’re not going to have peers sitting in a classroom with them, and they don’t have a place that they have to physically go every week.”

This finding underscores the importance of maintaining motivation and engagement, as students ideally possess an intrinsic drive to succeed despite the absence of a physical connection to the university and their classmates.

The findings from this study highlight the importance of certain similar skills for both online instructors and students. Effective communication, organization, and time management are vital for success in the online learning environment for both instructors and students. We found this to be an interesting connection that online students might benefit from understanding: these are key skills that students and instructors have in common.

Our findings about self-motivation may be useful for online instructors. Consider incorporating strategies that foster student self-motivation, such as goal-setting exercises, regular check-ins, and providing opportunities for self-reflection. By empowering students to take ownership of their learning, instructors might enhance student engagement and success in the online environment.

Further, students can learn from the instructors’ emphasis on communication, organization, and time management skills. They can intentionally work on improving their communication skills, seeking clarification when needed, and actively participating in online discussions. Developing effective organization and time management strategies, such as creating schedules, prioritizing tasks, and breaking them down into manageable chunks, may significantly enhance their online learning experience.

The field of online education is evolving rapidly, and here we can see how educators and students alike are adapting to these changes. The experiences of long-term online instructors provide valuable insights into the skills necessary for success in the online learning environment. In the future, what answers would we find if we asked students the same question: what do online students think are the skills needed to succeed in the online classroom? By understanding the shared and distinct perspectives of instructors and students, educators can design effective online courses and support systems that foster meaningful learning experiences and empower students to succeed.

An illustration of a person kneeling, surrounded by question marks

Have you ever been assigned a task but found yourself asking, “What’s the point of this task? Why do I need to do this?” Very likely, no one informed you of the purpose of that task! The activity was missing a critical element: the purpose. Just as the purpose of a task can easily be left out, in the context of course design, a purpose statement for an assignment is often missing too.

Creating a purpose statement for assignments is an activity that I enjoy very much. I encourage instructors and course developers to be intentional about that statement, which serves as a declaration of the underlying reasons, direction, and focus of what comes next in an assignment. But most importantly, the statement responds to the question I mentioned at the beginning of this blog: why?

Just as a purpose statement should be powerful enough to guide, shape, and undergird a business (Yohn, 2022), a purpose statement for an assignment can guide students in making decisions about strategies and resources, shape students’ motivation and engagement in the process of completing the assignment, and undergird their knowledge and skills. Let’s take a closer look at the power of a purpose statement.

What does “purpose” mean?

Merriam-Webster defines purpose as “something set up as an object or end to be attained,” while the Cambridge Dictionary defines it as “why you do something or why something exists.” These definitions show us that purpose is the reason and intention behind an action.

Why is a purpose important in an assignment?

The purpose statement in an assignment serves important roles for students, instructors, and instructional designers (believe it or not!).

For students

The purpose will:

  1. answer the question “why will I need to complete this assignment?”
  2. give the reason to spend time and resources working out math problems, outlining a paper, answering quiz questions, posting their ideas in a discussion, and many other learning activities.
  3. highlight its significance and value within the context of the course.
  4. guide them in understanding the requirements and expectations of the assignment from the start.

For instructors

The purpose will:

  1. guide the scope, depth, and significance of the assignment.
  2. help to craft a clear and concise declaration of the assignment’s objective or central argument.
  3. maintain the focus on and alignment with the outcome(s) throughout the assignment.
  4. help identify the prior knowledge and skills students will need to complete the assignment.
  5. guide the selection of support resources.

For instructional designers

The purpose will:

  1. guide building the structure of the assignment components.
  2. help identify additional support resources when needed.
  3. facilitate an understanding of how the assignment aligns with the outcome(s).
  4. help test the assignment from the student’s perspective and experience.

Is there a wrong purpose?

No, not really. But it may be lacking or it may be phrased as a task. Let’s see an example (adapted from a variety of real-life examples) below:

Project Assignment:

“The purpose of this assignment is to work in your group to create a PowerPoint presentation about the team project developed in the course. Include the following in the presentation:

  • Title
  • Context
  • Purpose of project
  • Target audience
  • Application of methods
  • Results
  • Recommendations
  • Sources (at least 10)
  • Images and pictures

The presentation should be a minimum of 6 slides and must include a short reflection on your experience conducting the project as a team.”

What is unclear in this purpose? Well, unless the objective of the assignment is to refine students’ presentation-building skills, it is unclear why students are creating a presentation for a project they have already developed. In this example, creating a presentation and providing specific details about its content and format reads more like a set of instructions than a clear reason for the assignment to exist.

A better description of the purpose could be:

“The purpose of this assignment is to help you convey complex information and concepts in visual and graphic formats. This will help you practice your skills in summarizing and synthesizing your research as well as in effective data visualization.”

The purpose statement particularly underscores transparency, value, and meaning. When students know why, they may be more compelled to engage in the what and how of the assignment. A specific purpose statement can promote appreciation for learning through the assignment (Christopher, 2018).

Examples of purpose statements

Below you will find a few examples of purpose statements from different subject areas.

Example 1: Application and Dialogue (Discussion assignment)

Courtesy of Prof. Courtney Campbell – PHL /REL 344

Example 2: An annotated bibliography (Written assignment)

Courtesy of Prof. Emily Elbom – WR 227Z

Example 3: Reflect and Share (Discussion assignment)

Courtesy of Profs. Nordica MacCarty and Shaozeng Zhang – ANTH / HEST 201

With the increased availability of large language models (LLMs) and artificial intelligence (AI) tools (e.g., ChatGPT, Claude 2), many instructors worry that students will resort to these tools to complete assignments. While a clear and explicit purpose statement won’t deter the use of these highly sophisticated tools, transparency in the assignment description could be a good motivator to complete assignments with little or no assistance from AI tools.

Conclusion

“Knowing why you do what you do is crucial” in life, says Christina Tiplea. The same applies to learning: when the “why” is clear, an activity or assignment becomes more meaningful and can better motivate and engage students. And students may feel less motivated to use AI tools (Trust, 2023).

Note: This blog was written entirely by me without the aid of any artificial intelligence tool. It was peer-reviewed by a human colleague.

Resources:

Christopher, K. (2018). What are we doing and why? Transparent assignment design benefits students and faculty alike. The Flourishing Academic.

Sinek, S. (2011). Start with why. Penguin Publishing Group.

Trust, T. (2023). Addressing the Possibility of AI-Driven Cheating, Part 2. Faculty Focus.

Yohn, D. L. (2022). Making purpose statements matter. SHRM Executive Network.