While attending a panel presentation by students pursuing degrees online, I heard one of the student panelists share something to the effect of, “Oh, I don’t do Office Hours. However, instead of Office Hours, one of my instructors had these ‘Afternoon Tea’ sessions on Zoom that I loved to attend when it worked with my schedule. She answered my questions, and I feel like she got to know me better. She was also available to meet by appointment.” What wasn’t revealed was why this student wouldn’t attend something called “Office Hours” but did attend these other sessions. Did “Office Hours” sound too formal? Was she unsure of what would happen during office hours, or unsure of what the purpose was? Did she think office hours was something students only went to if they were failing the course? The student didn’t say.

There is some mystery around why this student wouldn’t attend office hours, and her comment reminded me of what I had read in Small Teaching Online: Applying Learning Science in Online Classes, by Flower Darby and James Lang (available digitally through the Valley Library if you are part of the OSU community). In Small Teaching Online, under the section titled “Get Creative with Virtual Office Hours,” several tips are highlighted for how to enhance participation in office hours. Here is a summary of a few of those tips presented in the book, which are based on a 2017 study by Lowenthal, Dunlap, and Snelson (Darby & Lang, 2019, pp. 119-121):

  • Rename office hours to sound more welcoming: “Afternoon Tea,” “Consultations,” or “Coffee Breaks” are some ideas to consider (Lowenthal et al., 2017, p. 188).
  • To enhance participation, plan just 3-4 well-timed sessions instead of weekly office hours, and announce them early in the term. For timing, think about holding a session before or after a major assessment or project milestone is due, for example.
  • Collect questions ahead of time, and make office hours optional.

Additionally, outside of office hours, remind students that you are available to meet with them individually by appointment since students’ schedules vary so widely. 

Putting these tips into practice, here is what the redesigned office hours can look like in an asynchronous online course, where this “Coffee Break” happens three times in the term and is presented in the LMS using the discussion board tool or the announcements feature as needed:

Canvas page shows a banner image titled "Coffee Break" and  "Join me for a chat. I hope to get to know each of you in this course, so I would like to invite you to virtual coffee breaks." The description on the page details expectations, tasks, and how to join the Coffee Break.

What I like about this design is that the purpose and expectations of the session are explained, and it is flexible for both students and faculty. The “Coffee Break” is presented in an asynchronous discussion board so that students’ questions can be collected ahead of time and at their convenience. Further, if something comes up and the live “Coffee Break” must be canceled, the instructor can answer questions asynchronously in the discussion board. There is also a reminder that students are invited to make a separate appointment with their instructor at a time that works for them.

Have you tried rebranding your office hours? How did it go?

References

Darby, F., & Lang, J. M. (2019). Small teaching online: Applying learning science in online classes (1st ed.). Jossey-Bass.

Lowenthal, P. R., Dunlap, J. C., & Snelson, C. (2017). Live synchronous web meetings in asynchronous online courses: Reconceptualizing virtual office hours. Online Learning, 21(4), 177. https://doi.org/10.24059/olj.v21i4.1285


Engaged learning design helps students comprehend the learning materials and apply the newly learned knowledge and skills to new contexts. Jessie Moore proposed six key principles for engaged learning, namely:
* “Acknowledging and building on students’ prior knowledge and experiences;
* Facilitating relationships, including substantive interactions with faculty/staff mentors and peers, and development of diverse networks;
* Offering feedback on both students’ work-in-progress and final products;
* Framing connections to broader contexts, including practice in real-world applications of students’ developing knowledge and skills;
* Fostering reflection on learning and self; and
* Promoting integration and transfer of knowledge” (Moore, 2021; Moore, 2023).

This blog post will showcase three course design projects that use engaged learning principles to overcome design challenges, including challenging content, lack of student motivation, and difficulty transferring knowledge.

Design Case #1
Engaged learning Principle: Acknowledging and building on students’ prior knowledge and experiences
Design Challenge: Students are non-accounting majors and need more motivation to study accounting.
Design Solution: All college students have bought textbooks and paid bills for their education, even if they have had no prior accounting training. Building on students’ prior knowledge of bill paying and textbook purchases, the instructor created a mock student-run company and had students work on the accounting for familiar student activities, such as buying and selling textbooks and offering tutoring services, in order to make the materials of “BA 315 Accounting for Decision Making” relevant and meaningful to students. The instructor also collaborated with the Ecampus media team to create an online Monopoly-style simulation game with Oregon State University themes to further engage students in accounting practices.

Design Case #2
Engaged Learning Principle: Facilitating relationships, including substantive interactions with faculty/staff mentors and peers, and development of diverse networks.
Engaged Learning Principle: Offering feedback on students’ work-in-progress and final products.
Engaged Learning Principle: Fostering reflection on learning and self.
Engaged Learning Principle: Promoting integration and transfer of knowledge.
Design Challenge: Trauma-informed helping skills are challenging to teach online in HDFS 462.
Design Solution:
Building on students’ prior knowledge and experiences (Discussion board activities)

The course developer used a case study and group case discussions in which students develop a plan to help a client, while students individually practice attending and listening with single-word responses; the instructor provides feedback on both the group work and the individual work.
The instructor also models the empowerment process with recorded videos, students practice helping skills with peer partners drawn from their classmates, and the instructor offers feedback on those practice sessions.
Connecting to broader contexts and promoting integration and transfer of knowledge: Students practice helping skills with clients who are not classmates, and the instructor provides feedback.



Design Case #3
Engaged Learning Principle: Framing connections to broader contexts, including practice in real-world applications of students’ developing knowledge and skills.
Design Challenge: There is a lack of full access to construction sites especially for students in CE 427 Online Course to get hands-on experience and understand construction site structure fundamentals.
Design Solution: The instructor and instructional designer collaborated with the media team to design an interactive simulation called Clickable Structure to help students understand the most difficult concepts in the course: the elements of structures and how the various pieces relate to each other. The Clickable Structure simulation enables students to see each group of structures layer by layer according to their functions, along with the corresponding equations needed for calculations such as weight bearing.

What we see versus what students in CE 427 need to learn


As a Reflection Tool
Another way to use the six principles of engaged learning is to turn the statements of the principles into a list of questions for students to reflect on:
1. What prior knowledge do I bring to this topic?
2. What new knowledge and skills did I learn about this topic? How are these new concepts, skills, principles, and relationships related to each other? How do individual pieces of information connect to make sense?
3. What feedback did I receive from my instructor and classmates that gives me insight into this topic?
4. How is this topic related to the broader contexts of the main learning outcomes of this course or to real-world applications?
5. How could I apply what I learned about this topic to real-world situations?
6. What new understandings did I gain from this reflection activity?

If you find these six principles of engaged learning meaningful and have adopted or adapted them in your teaching and learning, I encourage you to share them with us (email tianhong.shi@oregonstate.edu) so we can build a collection of engaged learning cases and examples.

References

Moore, J. L. (2023). Key practices for fostering engaged learning: A guide for faculty and staff. Stylus Publishing.

Moore, J. L. (2021). Key practices for fostering engaged learning. Change: The Magazine of Higher Learning, 53(6), 12–18. https://doi.org/10.1080/00091383.2021.1987787

This article has its roots in a discussion I had with an Ecampus intern about going on the job market. This intern is already working in an academic technologies role at a higher ed institution while also pursuing the Instructional Design certificate here at OSU. It was my first time thinking about what the growth of instructional design certificate and degree credentials means for all instructional designers. Very few of the instructional designers I’ve met and worked with here or at my previous institutions actually have degrees in instructional design; I don’t, either. The field of instructional design emerged out of a specific institutional and educational need in higher education and corporate education, which makes for an ever-growing, ever-changing, but always innovative membership. How do we, as a field, continue to be inclusive of all instructional designers, regardless of their academic or educational backgrounds?

One potentially positive fact is that academia moves very slowly, so we have some time to strategize. Instructional design is still an emerging specialty within higher education, with each institution classifying that role differently and providing that role with different levels of support. Some institutions, even today, do not have any instructional designers. Current research indicates that this must change. One of the best sources of data about instructional design and instructional designers in higher education is the Changing Landscape of Online Education (CHLOE) Project. The participants in the CHLOE survey are “the senior online officer at each participating institution.” This survey pool recognizes the variance in organizational structures at different institutions by focusing on the chief online officer’s purview. In the 2019 CHLOE 3 survey, it was reported that the median number of instructional designers employed at 2-year colleges, and public and private 4-year institutions, was four, regardless of enrollment or institution size. In CHLOE 7, one of the conclusions was that “insufficient instructional design staffing may be one of online learning’s most serious long-term vulnerabilities,” with only 10% of chief online officers surveyed describing their ID capacity as sufficient for their current needs, and only 3% believing they would be able to meet anticipated need.

These findings signal that universities should be moving towards a significant fiscal investment in hiring instructional designers. Joshua Kim summarized a few key takeaways from CHLOE 7 in CHLOE 7: The Present and Future of Instructional Design Capacity. Kim predicts not only that universities will need to hire more instructional designers, but that these roles will need to be hybrid or remote to attract the post-pandemic workforce. In addition to hybrid and remote options, Kim posits that, “Forward-thinking universities may find that they need to start offering star non-faculty educators the same recognition and incentives that have long been necessary to recruit and retain star tenure-line faculty.” But what does this mean for instructional designers? How would an instructional designer even become identified as a “star” within the field, or even at a specific institution?

Understanding Branding for Faculty and Non-Faculty Educators

Circling back to my initial inquiry about what instructional designers can do to ensure the field stays inclusive, I believe an individual enterprise will have a collective impact that benefits the largest number of people: personal branding. In What’s the Point of a Personal Brand?, executive coach Harrison Monarth uses the story of his client, Mike, to illustrate that personal branding is now a strategy for gaining visibility within organizations, and that visibility is a key component when employers are thinking about promotion. Monarth observes that “In high-performing organizations, at certain levels, everyone is exceptional. To clearly differentiate your value and what you bring to the table, you need to do more than have a good reputation. You need to have an outstanding personal brand.” Having a brand isn’t the same thing as being a celebrity, although I think many would agree that there are celebrities in every field, even instructional design.

Creating a personal brand is a successful career strategy outside of the corporate world as well, and one of the fields that is encouraging faculty to think about branding is not, as one might think, business but medicine. In 2019, the Academic Medicine blog published Knowing Your Personal Brand: What Academics Can Learn From Marketing 101, the purpose of which was to persuade medical professionals that a brand identity can be empowering. According to the article,

[K]nowing one’s academic brand can (1) help faculty members approach projects and other responsibilities through the lens of building or detracting from that brand, (2) provide a framework for determining how faculty members might best work within their institutions, and (3) help faculty members better understand and advocate their own engagement and advancement.

Although this article speaks specifically to and about academic teaching faculty, instructional designers at institutions are often placed in the professional faculty role, along with librarians and program directors, and have many of the same professional demands in their job descriptions. As former faculty, I can attest that both of my careers have included independent research, departmental service, and conference or publication responsibilities.

Finding Your Personal Brand

If a brand is defined as the opinions that people have about you based on your work, it is important to be self-aware and intentional about the work that you do. Creating your brand can be a difficult task if, like me, you have a variety of experiences and interests. It requires self-reflection about one’s accomplishments and body of work as a whole, and the need to generalize what are sometimes very disparate activities. In Using Your Personal Mission Statement to INSPIRE and Achieve Success, an article published in Academic Pediatrics, the official journal of the Academic Pediatric Association, the authors describe a framework for building a personal mission statement (INSPIRE):

  • Identify Your Core Values
  • Name the Population You Serve
  • Set Your Vision
  • Plan How You Will Achieve Your Vision
  • Identify Activities That Align With Your Mission
  • Review, Revise, and Refine Your Mission Statement
  • Enlist Others to Help You Accomplish Your Mission

A slimmed-down version of this same framework can be a helpful starting point for creating a brand identity. It enables you to identify your core values, name the population you serve, and identify activities that support those values and populations. But unlike a mission statement, this framework is best completed in reverse; a backwards brand design, if you will. (Sidenote: Instructional designers love to do things backwards). I call this framework SIFT:

  • Start with your experience and accomplishments
  • Identify keywords or topics
  • Frame your work and interests
  • Tie everything together

I believe that SIFT-ing has the potential to be a reflective process that will lead to a changing self-awareness of different types of instructional designers, for ourselves, and collectively. 

Start with your experience and accomplishments

The best place to begin is with your complete resume or CV. It might be tempting to start with the tailored version you used to get your last position, but you don’t want to limit your view to only things that you think are relevant to instructional design. I can trace some elements of my brand back to my undergraduate and graduate degrees. I also include my two years as a contracted captioner for 3play and Rev within the same brand. Finding a brand that encompasses all that you are will only be successful if you use the most complete picture of yourself.

Identify keywords or topics

Your brand is more than the places you’ve worked, the committees you’ve served on, and the projects you’ve completed. To understand your brand, you should begin by identifying a perspective, or positionality, that informs the decisions you’ve made in the past, however unconscious those decisions might have been, and that looks towards the future. Keywords can be a useful next step, but you will want to avoid becoming trapped within categories! In a field like medicine, there are already established research interests and specialties. As a field, instructional design hasn’t reached the point of specialization, but we are trending towards accepting that there are too many topics under the broad umbrella of instructional design for everyone to be an expert in everything. For example, the Quality Matters Instructional Designers Association (QM IDA) has 21 expertise categories that you can select from when joining, which others can use to find and connect with you.

A screencapture of the list of categories from the QM IDA website

When I first joined the QM IDA, I didn’t know what boxes to check, or even what some of these categories were. And since they are presented without explanation, the criteria for self-identification are unclear. I can check almost all of these boxes as things I have experience in—with the exception of K12 and the Continuing and Professional Education Rubric, but is experience the same thing as expertise? It might be my imposter syndrome talking, but I am more inclined to identify with interests than areas of expertise. (Sidenote: I still haven’t checked any boxes.) 

Frame your work and interests

I hadn’t noticed a pattern to my interests while I was doing them, but by reflecting on my professional journey, I realized that I could trace one interest all the way back to my undergraduate honors thesis, through to my current career as an instructional designer. I’ve always had an interest in communities and the community spaces they inhabit—especially if they are online. Community doesn’t appear on QM’s list of categories, but it is the lens through which I approach many of the categories on that list. Accessibility, Computer-Based Learning, Distance Education, Hybrid instruction/Design, LMS, Multimedia Creation, Problem-Based Learning—all of these categories need to address questions of community by addressing inclusivity, access, equity, and authentic student-student and student-teacher interactions. Community is the keyword I use to frame my research interests and approach to instructional design, in all of its various forms.

Tie everything together

If you go to my LinkedIn profile, you’ll see that I have “Humanities girl in an instructional technology world” as my headline. That’s my brand. You might notice that it does not include “instructional design” or “community.” But by labeling myself a humanist, I am evoking the words associated with humanities and humanism: things like community, kindness, compassion, human potential, and the arts. Technology is often viewed either as the savior of humanity or as its destruction. In reality, of course, it’s both. By framing myself as a humanist working with technology, I am clueing people in that my perspective on technology will account for its potential negative impacts on people. The playful nature of the headline (using “girl” to rhyme with “world”) also reveals my personality. Compare this headline with something like, “I am interested in humane approaches to technology used in education.” It’s true, but it doesn’t tell you about me as a person beyond my interests.

Being “On Brand”

To declare a brand is not to limit your interests, nor should it be criticized as promoting a non-interest in other topics. Another observation from Kim is that instructional designers are likely very busy, and overstretched. In his words, there is “a significant mismatch between institutional demand for instructional design services and the available supply.” To avoid burnout, instructional designers need to be strategic with the projects they commit to. A brand can also help you be selective about which conferences you attend, or committees you serve on. Being “on brand” can be a way of focusing your energy, and also a touchstone of your identity. 

Using the SIFT framework, you can reflect on your professional values and your professional goals. One of my colleagues in the field is an accessibility expert, and she gets called in to consult on all things related to accessibility in addition to her daily work as an instructional designer. She recently became a certified accessibility professional with the IAAP, and this credential is visible on her LinkedIn profile as an emblem of her brand. Knowing her brand allowed her to appeal to her institution for this opportunity, which enriches not only her own skillset but also the prestige of her institution, which now has an IAAP-certified accessibility professional on staff. In that sense, personal branding can also help institutions build diverse departments that are teams of specialists.

To return to the three benefits of branding for faculty outlined in the Academic Medicine article, knowing my brand helps me decide where to devote my limited bandwidth by pursuing professional activities that are “on brand” for me. I can also use my brand to search for specific opportunities that will build my brand, even if those fall outside the typical skillset of instructional designers. However, moving towards a “branding” mindset also benefits my colleagues, who are equally, individually, uniquely talented, and should be recognized for their specialties and allowed to follow their passions, rather than be constrained to their job duties. As instructional design teams at universities grow, having a team of specialists can also help alleviate burnout by allowing people to play to their strengths. This can ensure that instructional design remains a space where all career pathways are valid and not contingent on specific credentials.

References

Borman-Shoap, E., Li, S.-T. T., St Clair, N. E., Rosenbluth, G., Pitt, S., & Pitt, M. B. (2019). Knowing your personal brand: What academics can learn from Marketing 101. Academic Medicine, 94(9), 1293–1298.

Kim, J. (2022). CHLOE 7: The present and future of instructional design capacity. Inside Higher Ed.

Li, S.-T. T., Frohna, J. G., & Bostwick, S. B. (2017). Using your personal mission statement to INSPIRE and achieve success. Academic Pediatrics, 17(2), 107–109.

Monarth, H. (2022). What’s the point of a personal brand? Harvard Business Review.

Uranis, J. (2023). Definition update: Chief Online Learning Officer (COLO).

“Belonging is a universal human need that is fundamentally linked to learning and well-being. It describes an individual’s experience of feeling that they are, or are likely to be, accepted and respected as a valued contributor in a specific environment.”           

Structures for Belonging: A Synthesis of Research on Belonging-Supportive Learning Environments
image of Maslow's pyramid of needs

Maslow’s Hierarchy of Needs is a helpful framework when discussing belonging, which falls in the middle, at level three, just above the basics for survival (level one: air, water, food, shelter) and safety (level two: health, employment, family, security).

Image from Wikimedia Commons

Have you heard the word belonging recently in reference to students and employees? At OSU, it seems to be popping up frequently in conversations and discussions, onboardings and trainings, online and off. It has become a buzzword for those concerned with teaching and learning, recruitment and outreach, employee satisfaction, and student success, and a focal point of our ongoing efforts towards diversity, equity, and inclusion. This increased focus on belonging at OSU is reflected in the university’s 2018 Innovate & Integrate: Plan for Inclusive Excellence, and is echoed by the Oregon Department of Education’s 2021 adoption of the Every Student Belongs rule, which states, “It is the policy of the State Board of Education that all students, employees, and visitors in public schools are entitled to learn, work, and participate in an environment that is safe and free from discrimination, harassment, and intimidation.” These initiatives reflect a growing understanding that traditionally prevailing systems of power have historically marginalized certain groups and excluded them from many realms of life, including education, and they prioritize a commitment to changing the status quo explicitly and with intention.

At Ecampus, belonging is an area of active study, and our effort to extend the feeling of belonging to our online students is an important part of our mission, vision, & values and our own Inclusive Excellence Strategic Plan’s goals. We realize that our Ecampus students come from a wide range of backgrounds, seek online learning for a variety of reasons, and include higher numbers of students from historically marginalized backgrounds. These factors, combined with the nature of online learning, mean that our students can feel increased isolation and less of a sense of belonging than their on-campus peers.

What is belonging and why is it important?

Belonging is a complex, multi-layered, and changeable quality that is nonetheless very important for student success. Maslow’s Hierarchy of Needs places belonging in the category of psychological needs, just above the basic needs including food, water, air, safety, and shelter. While there are many definitions, the concept of belonging generally encompasses feeling safe, appreciated, welcomed, valued, and respected in a given situation. Humans learn to search for and interpret signals that they belong or do not belong when entering into new situations or contexts. Marginalized groups have had to learn to be cognizant of where and when they could expect to be excluded and on the alert for cues signaling such. Traditionally, educational institutions have been places of exclusionary practices, often closed to large groups in both policy and practice. Students from marginalized populations, facing this problematic history of exclusion, may be looking for signals and signs that indicate the extent to which they are valued and respected as members of the school community. Students may not be sure they will be accepted in institutions, departments, courses, and other school environments and may be consciously or unconsciously searching for such clues as reassurance that they do, in fact, belong. 

Belonging is important for student success because it conveys a host of positive benefits and is a crucial aspect of educational accomplishment. When students find welcoming, inclusive attitudes, see others like themselves being accepted and thriving, and are made to feel safe, protected, supported, and valued, their sense of belonging increases, which in turn allows them to relax and be confident sharing more of their full selves. Students who have a strong sense of belonging show increased academic performance, better attendance, persistence, retention, and motivation, and less likelihood of dropping out. Dr. Terrill Strayhorn, Professor of Urban Education and Vice President for Academic and Student Affairs at LeMoyne-Owen College, concludes in his book College Students’ Sense of Belonging that “deprivation of belonging in college prevents achievement and wellbeing, while satisfaction of college students’ sense of belonging is a key to educational success for all students.”

In education, as in our society at large, belonging is often related to larger systems that privilege and prefer certain groups and their ideas, beliefs, and ways of being. Those whose race, ethnicity, sexual identity, gender, class, indigeneity, language, or ability are not of the majority are especially likely to be anxious and “on alert” to othering, exclusion, bullying, and stereotyping. This can have dramatic negative short- and long-term effects, including lowered cognitive capacity, increased stress, and reduced persistence and achievement. Students who lack a sense of belonging may feel uncomfortable in class or group work, unable to concentrate, and may experience self-consciousness and worry, which makes it that much more difficult to attain higher-level needs such as self-confidence, recognition, respect, fulfillment, and achievement. When students face active discrimination, bullying, or other forms of harassment, they may become depressed, choose to disengage, drop courses, or discontinue studying. With such dire consequences, taking the time to understand and assist in ensuring all OSU students are made to feel welcomed and accepted is well worth the effort.

Why do online students sometimes feel less of a sense of belonging? 

There are many contributing factors to the disparity between online and traditional students’ development of a sense of belonging, starting with the very nature of the modality in which they study. Students living and studying on campus often have more frequent contact with instructors, campus staff, and other students, both structured and impromptu, providing opportunities to build relationships that can enhance their sense of community and belonging. The pacing of on-campus courses tends to be predictable, with regular meetings during which students often have the chance to ask questions (and receive answers quickly) and get to know fellow students and instructors. Instructors have dedicated class time to review important concepts, check understanding, and provide opportunities for students to get to know them and their fellow students. The traditional on-campus experience is geared towards taking a diverse group of students and building a cohesive community in many ways: students have a wide array of support services available to them, many activities, sports, and clubs they can join, and a host of opportunities to participate in the rich culture of OSU and in academic and social communities, most of which are easily accessible on campus. Indeed, the very nature of on-campus learning seeks to provide a community for traditional students, many of whom are young and leaving their own homes and communities for the first time.

In contrast, Ecampus courses are asynchronous, featuring no scheduled meeting times, as our students live around the USA and the world. While this format allows for increased access for students who cannot attend in person, the lack of face-to-face interaction can make it difficult for both students and instructors to make personal connections. Unless their courses are carefully designed to provide chances for interaction, conversation, collaboration, and community building, online students may not often interact with their instructors or peers. Online students can experience feelings of isolation, loneliness, and disengagement, which can greatly affect their sense of belonging as an OSU student as well as their success and performance. 

Complicating things even further is the tendency toward digital miscommunication: when communicating by text and online, people are less able to infer tone and underlying sentiment, and miss nuance in general, partly because of the lack of context and visual cues available when interacting face to face. A 2016 literature review on the topic of establishing community in online courses found digital communication to be a consistent issue, noting “…the absence of visual meaning-making cues such as gesture, voice tone, and immediate interaction can frustrate students and lead to feelings of isolation and disconnectedness in an online classroom” and recommended that instructors who teach online learn the nuances of these different communication needs.

It must be noted that some online students, who may be older, working full or part time, caring for family, or otherwise already leading (sometimes overly) full lives do not particularly want or need the sense of community that younger traditional students may seek out from their university. They may have little time to devote to community building and little interest in superfluous interaction, shying away from an increased social burden they may not have time and energy to fully commit to. Since we cannot know in advance the detailed makeup of our student body, planning with an assumption that creating belonging is an important aspect of our approach serves online students best.

Stay tuned for Part 2: What can we do to help? for research-based strategies you can use to improve belonging and inclusion.


Sources

Ally for Canvas | Learn@OregonState

Belonging and Emotional Safety – Casel Schoolguide 

Building Inclusivity and Belonging | Division of Student Affairs

College Student’s Sense of Belonging

Creating a Safe and Respectful Environment in Our Nation’s Classrooms 

Cultural Centers | Oregon State University

Decades of Scientific Research that Started a Growth Mindset Revolution

Ecampus Essentials – Standards and Principles – Faculty Support | Oregon State Ecampus | OSU Degrees Online

Establishing Community in Online Courses: A Literature Review 

Growth Mindset in the Higher Education Classroom | Center for Learning Experimentation, Application, and Research

Innovate & Integrate: Plan for Inclusive Excellence | Institutional Diversity 

Mission, Vision and Values | Oregon State Ecampus | OSU Degrees Online

Online Teaching Principles – Standards and Principles – Faculty Support | Oregon State Ecampus | OSU Degrees Online

Oregon Department of Education 

OSU Search Advocate Program

Peer Mentor Program | TRiO | Oregon State University

Social Justice Education Initiative 

State of Oregon Diversity, Equity, and Inclusion Action Plan

Student Academic Experience Survey 2022

The UDL Guidelines

Update Syllabus – Term Checklist and Forms – Faculty Support | Oregon State Ecampus | OSU Degrees Online

Using a warmer tone in college syllabi makes students more likely to ask for help, OSU study finds | Oregon State University

Utilizing Inclusive and Affirming Language | Institutional Diversity

By Greta Underhill

In my last post, I outlined my search for a computer-assisted qualitative data analysis software (CAQDAS) program that would fit our Research Unit’s needs. We needed a program that would enable our team to collaborate across operating systems, easily adding in new team members as needed, while providing a user-friendly experience without a high learning curve. We also needed something that would adhere to our institution’s IRB requirements for data security and preferred a program that didn’t require a subscription. However, the programs I examined were either subscription-based, too cumbersome, or did not meet our institution’s IRB requirements for data security. It seemed that there just wasn’t a program out there to suit our team’s needs.

However, after weeks of continued searching, I found a YouTube video entitled “Coding Text Using Microsoft Word” (Harold Peach, 2014). At first, I assumed this would show me how to use Word comments to highlight certain text in a transcript, which is a handy function, but what about collating those codes into a table or Excel file? What about tracking which member of the team codes certain text? In short, I expected an explanation of manual coding using Word, which works fine for some projects, but not for our team.

Picture of a dummy transcript using Lorem Ipsum placeholder text. Sentences are highlighted in red or blue depending upon the user. Highlighted passages have an associated “comment” where users have written codes.

Fortunately, my assumption was wrong. Dr. Harold Peach, Associate Professor of Education at Georgetown College, had developed a Word macro to identify and pull all comments from a Word document into a table (Peach, n.d.). A macro is “a series of commands and instructions that you group together as a single command to accomplish a task automatically” (Create or Run a Macro – Microsoft Support, n.d.). Once downloaded, the “Extract Comments to New Document” macro opens a template and produces a table of the coded information, as shown in the image below. The macro identifies the following properties:

  • Page: the page on which the text can be found
  • Comment scope: the text that was coded
  • Comment text: the text contained in the comment; for the purpose of our projects, the code title
  • Author: which member of the team coded the information
  • Date: the date on which the text was coded
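For teams who want the same kind of table without running a macro, the comment metadata can also be pulled straight out of the file with a short script, since a .docx is simply a zip archive whose comments live in word/comments.xml. The Python sketch below is an illustrative alternative, not part of Peach’s tool; it recovers the comment text (the code title), author, and date, while the comment scope (the highlighted passage) lives in document.xml and is omitted here for brevity. The filename is hypothetical.

```python
# Sketch: extract comment metadata from a .docx using only the Python
# standard library. A .docx is a zip archive; comments are stored in the
# word/comments.xml part.
import zipfile
import xml.etree.ElementTree as ET

# WordprocessingML namespace used for tags and attributes in comments.xml
W = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

def extract_comments(docx_path):
    with zipfile.ZipFile(docx_path) as z:
        root = ET.fromstring(z.read("word/comments.xml"))
    rows = []
    for c in root.iter(W + "comment"):
        # join all text runs inside the comment body
        text = "".join(t.text or "" for t in c.iter(W + "t"))
        rows.append({
            "comment_text": text,           # the code title
            "author": c.get(W + "author"),  # which team member coded it
            "date": c.get(W + "date"),      # when it was coded
        })
    return rows
```

Calling extract_comments("coded_interview.docx") would return one dictionary per comment, ready to paste into Excel or write out as a CSV.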

Picture of a table of dummy text that was generated from the “Extract Comments to New Document” Macro. The table features the following columns: Page, Comment Scope, Comment Text, Author, and Date.

You can move the data from the Word table into an Excel sheet where you can sort codes for patterns or frequencies, a function that our team was looking for in a program as shown below:

A picture of the dummy text table in an Excel sheet where codes have been sorted and grouped together by code name to establish frequencies.
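If Excel is not handy, the same frequency count can be done in a few lines of Python; the rows and code labels below are invented for illustration.

```python
# Sketch: tally how often each code appears once the macro's table has been
# exported as rows. The example rows and code labels are made up.
from collections import Counter

rows = [
    {"comment_text": "isolation", "author": "GU"},
    {"comment_text": "belonging", "author": "RA"},
    {"comment_text": "isolation", "author": "RA"},
]

freq = Counter(row["comment_text"] for row in rows)
print(freq.most_common())  # codes ordered from most to least frequent
```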

This Word macro was a good fit for our team for many reasons. First, our members could create comments on a Word document regardless of their operating system. Second, we could continue to house our data on our institution’s servers, ensuring our projects meet strict IRB data security measures. Third, the macro allowed for basic coding features (coding multiple passages multiple times, highlighting coded text, etc.) and had a very low learning curve: teaching someone how to use Word comments. Lastly, our institution provides access to the complete Microsoft suite, so all team members, including students working on projects, already had access to Word. We contacted our IT department to verify that the macro was safe and to help us download it.

Testing the Word Macro       

Once installed, I tested out the macro with our undergraduate research assistant on a qualitative project and found it to be intuitive and helpful. We coded independently and met multiple times to discuss our work. Eventually we ran the macro, pulled all comments from our data, and moved the macro tables into Excel where we manually merged our work. Through this process, we found some potential drawbacks that could impact certain teams.

First, researchers can view all previous comments, which might affect how teammates code or how second-cycle coding is performed; other programs let you hide previous codes so researchers can come to the text fresh.

Second, coding across paragraphs can create issues with the resulting table; cells merge in ways that make it difficult to sort and filter if moved to Excel, but a quick cleaning of the data took care of this issue.

Lastly, we manually merged our work, negotiating codes and content, as our codes were inductively generated; researchers working on deductive projects may bypass this negotiation and find the process of merging much faster.

Despite these potential drawbacks, we found this macro sufficient for our project, as it was free to use, easy to learn, and a helpful way to organize our data. The following table summarizes the pros and cons of this macro.

Pros and Cons of the “Extract Comments to New Document” Word Macro

Pros

  • Easy to learn and use: simply providing comments in a Word document and running the macro
  • Program tracks team member codes which can be helpful in discussions of analysis
  • Team members can code separately by generating separate Word documents, then merge the documents to consensus code
  • Copying Word table to Excel provides a more nuanced look at the data
  • Program works across operating systems
  • Members can house their data in existing structures, not on cloud infrastructures
  • Macro is free to download

Cons

  • Previous comments are visible through the coding process which might impact other members’ coding or second round coding
  • Coding across paragraph breaks creates cell breaks in the resulting table that can make it hard to sort
  • Team members must manually merge their codes and negotiate code labels, overlapping data, etc.

Scientific work can be enhanced and advanced by the right tools; however, it can be difficult to distinguish which computer-assisted qualitative data analysis software program is right for a team or a project. Any of the programs mentioned in this post would be good options for individuals who do not need to collaborate or for those working with publicly available data that calls for different data security protocols. However, the Word macro highlighted here is a great option for many research teams. In all, although there are many powerful computer-assisted qualitative data analysis software programs out there, our team found the simplest option was the best option for our projects and our needs.

References 

Create or run a macro—Microsoft Support. (n.d.). Retrieved July 17, 2023, from https://support.microsoft.com/en-us/office/create-or-run-a-macro-c6b99036-905c-49a6-818a-dfb98b7c3c9c

Peach, H. [Harold Peach]. (2014, June 30). Coding text using Microsoft Word [Video]. YouTube. https://www.youtube.com/watch?v=TbjfpEe4j5Y

Peach, H. (n.d.). Extract comments to new document – Word macros and tips – Work smarter and save time in Word. Retrieved July 17, 2023, from https://www.thedoctools.com/word-macros-tips/word-macros/extract-comments-to-new-document/

By Greta Underhill

Are you interested in qualitative research? Are you currently working on a qualitative project? Some researchers find it helpful to use a computer-assisted qualitative data analysis software (CAQDAS) program to help them organize their data through the analysis process. Although some programs can perform basic categorization for researchers, most software programs simply help researchers to stay organized while they conduct the deep analysis needed to produce scientific work. You may find a good CAQDAS program especially helpful when multiple researchers work with the same data set at different times and in different ways. Choosing the right CAQDAS for your project or team can take some time and research but is well worth the investment. You may need to consider multiple factors before determining a software program such as cost, operating system requirements, data security, and more.

For the Ecampus Research Unit, issues with our existing CAQDAS prompted our team to search for another program that would fit our specific needs. Here’s what we were looking for:

  • General qualitative analysis: We needed a program for general analysis across multiple types of projects; other programs are designed for specific forms of analysis, such as Leximancer for content analysis.
  • Compatibility across computer operating systems (OS): Our team used both Macs and PCs.
  • Adherence to our institution’s IRB security requirements: Like many others, our institution and our team adhere to strict data security and privacy requirements, necessitating a close look at how a program would manage our data.
  • Basic coding capabilities: Although many programs offer robust coding capabilities, our team needed basic options, such as coding one passage multiple times and visually representing coding through highlights.
  • Export of codes into tables or Excel workbooks: This function is helpful for advanced analysis and for reporting themes in multiple file formats for various audiences.
  • A low learning curve: We regularly bring in temporary team members on various projects for mentorship and research experience, making this a helpful feature.
  • A one-time purchase: A one-time purchase was the best fit for managing multiple and temporary team members on various projects.

Testing a CAQDAS

I began systematically researching different CAQDAS options for the team. I searched “computer-assisted qualitative data analysis software” and “qualitative data analysis” in Google and Google Scholar. I also consulted various qualitative research textbooks and articles, as well as blogs, personal websites, and social media handles of qualitative researchers to identify software programs. Over the course of several months, I generated a list of programs to examine and test. Several programs were immediately removed from consideration because they are designed for different types of analysis: DiscoverText, Leximancer, MAXQDA, QDA Miner. These programs are powerful but best suited to specific analysis, such as text mining. With the remaining programs, I signed up for software trials, attended several product demonstrations, participated in training sessions, borrowed training manuals from the library, studied how-to videos online, and contacted other scholars to gather information about the programs. Additionally, I tested whether programs would work across different operating systems. I kept detailed records about each of the programs tested, including how they handled data, the learning curve for each, their data security, whether they worked across operating systems, how they would manage the export of codes, and whether they required a one-time or subscription-based payment. I started with three of the most popular programs: NVivo, Dedoose, and ATLAS.ti. The table below summarizes which of these programs fit our criteria.

A table demonstrating whether three programs (NVivo, Dedoose, and ATLAS.ti) meet the team’s requirements: general qualitative analysis, cross-OS collaboration, data security, basic coding capabilities, export of codes, low learning curve, and one-time purchase. Details of how each program fared are discussed in the text below.

NVivo

I began by evaluating NVivo, a program I had used previously. NVivo is a powerful program that adeptly handled large projects and is relatively easy to learn. The individual license was available for one-time purchase and allowed the user to maintain their data on their own machine or institutional servers. However, it had no capabilities for cross-OS collaboration, even when clients purchased a cloud-based subscription. Our team members could download and begin using the program, but we would not be able to collaborate across operating systems.

Dedoose

I had no prior experience with Dedoose, so I signed up for a trial of the software. I was impressed with the product demonstration, which significantly helped in figuring out how to use the program. This program excelled at data visualization and allowed a research team to blind code the same files for interrater reliability if that suited the project. Additionally, I appreciated the options to view code density (how much of the text was coded) as well as what codes were present across transcripts. I was hopeful this cloud-based program would solve our cross-OS collaboration problem, but it did not pass the test for our institution’s IRB data security requirements because it housed our data on Dedoose servers.

ATLAS.ti

ATLAS.ti was also a new program for me, so I signed up for a trial of this software. It is a well-established program with powerful analysis functions, such as helpful hierarchical coding capabilities and intuitive links among codes, quotations, and comments. But cross-OS collaboration, while possible via the web, proved cumbersome, and this option also did not meet our institution’s IRB data security threshold. Furthermore, the price point meant we would need to rethink our potential collaborations with other organizational members.

Data Security

Many programs are now cloud-based; these offer powerful analysis options but unfortunately did not meet our IRB data security requirements. Ultimately, we had to cut Delve, MAXQDA, Taguette, Transana, and webQDA. All of these programs would have been low-learning-curve options with basic coding functionality and cross-OS collaboration; however, for our team to collaborate, we would need to purchase a cloud-based subscription, which can quickly become prohibitively expensive, and house our data on company servers, which would not pass our institutional threshold for data security.

Note-taking programs

After testing multiple programs, I started looking beyond qualitative software programs and into note-taking programs such as DevonThink, Obsidian, Roam Research, and Scrintal. I had hoped these might provide a workaround by organizing data on collaborative teams in ways that would facilitate analysis. However, most of them lacked functionality that could be used for coding or had high learning curves that precluded our team from using them.

It seemed I had exhausted all options, and I still did not have a program to bring back to the Research Unit. I had no idea that a low-cost option was just a YouTube video away. Stay tuned for the follow-up post, where we dive into the solution that worked best for our team.


Some form of group work is a common activity that I help design with faculty every term. Faculty often ask how to account for the different levels of engagement from individual group members and how to assess group work, often in the form of a group grade. Improving group work in asynchronous courses and using group contracts to promote accountability are some of many ways to guide students into collaborative work. But truly collaborative work also requires offering all students an equitable opportunity to succeed. Based on the work of Feldman (2019), I’d like to outline some suggestions for assessment design through an equity lens.

Before jumping into assessing group work, Feldman outlines three pillars of equitable grades:

  1. “They are mathematically accurate, validly reflecting a student’s academic performance.
  2. They are bias-resistant, preventing biased subjectivity from infecting our grades.
  3. They motivate students to strive for academic success, persevere, accept struggles and setbacks, and to gain critical lifelong skills” (Feldman, p. 71).

With these three pillars in mind, let’s examine some potential issues with a group receiving one grade for their work.

  1. Accuracy: a collective group grade does not necessarily reflect an individual’s contribution to the group work or assess an individual student’s learning in terms of outcomes. For example, if a group splits up sections of a project into individual responsibilities, a student who did their assigned section very well may not have had an opportunity to gain new knowledge or build on their learning for aspects where they were struggling. And a group grade does not accurately capture their individual work or learning.
  2. Bias: Many times peer evaluations of group work come with some kind of group contract or accountability measure. However, there is a possibility for bias in how students evaluate their peers, especially if that evaluation is based on behaviors like turning things in on time and having strong social skills instead of learning. For example, maybe one of the group members had a job with a variable schedule from week to week, making it difficult to join regular group discussions and complete work at the same pace every week for the duration of the project. Other group members may perceive them as difficult to work with or inconsistent in their commitment and award them fewer points in a peer evaluation, especially if other group members did not have outside factors noticeably impacting their performance.
  3. Motivation: Group contracts and evaluations used to promote productivity are external motivators and do not instill a sense of internal relevance for students participating in group work. Instead, students may feel resentful that their peers may evaluate them harshly for things outside of their control, which can quickly snowball into a student disengaging from group work entirely.

“The purpose of group work is not to create some product in which all members participate, but for each student to learn specific skills or content through the group’s work together.”

Feldman, p. 104

So how do we assess this learning? Individually. If we can reimagine group work as a journey toward an individual reaching a learning outcome, then rather than assessing a behavior (working well in a group, timeliness) or what a group produces, we can create an assessment that captures the individual impact of the group work. Feldman outlines some tips for encouraging group work without a group grade:

  1. Have a clear purpose statement and overview for the group work that outlines the rationale and benefit of learning that content in a group context.
  2. Have clear evaluation criteria that show the alignment of the group work with a follow-up individual assessment.
  3. If possible, include students in the process by having a brainstorm or pre-work discussion ahead of time about what makes groups productive, how to ensure students learn material when working in groups, and what kinds of collaborative expectations can be set for a particular cohort of students.
  4. Be patient with students navigating a new assessment strategy for the first time and offer ample feedback throughout the process so students are set up for success on their assessments.
  5. Ensure the follow-up individual assessment is in alignment with learning outcomes and is focused on the content or skills students are expected to gain through group work.

As an added bonus, assessing group work individually in this way is often simpler than elaborate group work rubrics with separate peer evaluations factored in, making it both easier for the instructor and easier for the student to understand how their grade is calculated. Additionally, it will be important to design this group work with intention—if an individual could learn the material on their own, then what is the purpose of the group interaction? Think about a group project you may have assigned or designed in the past. What was the intention for that journey as a group? And how might you reimagine it if there was an individual assessment after its completion? I hope these questions are great starting points for reflecting on group work assessments and redesigning with equity in mind!

References

Feldman, J. (2019). Grading for equity: What it is, why it matters, and how it can transform schools and classrooms. Thousand Oaks, CA: Corwin.

An illustration of a person kneeling, surrounded by question marks

Have you ever been assigned a task but found yourself asking: “What’s the point of this task? Why do I need to do this?” Very likely, no one told you, because the task was missing a critical element: its purpose. Just as the purpose of a task can easily be left out, in the context of course design, a purpose statement for an assignment is often missing too.

Creating a purpose statement for assignments is an activity that I enjoy very much. I encourage instructors and course developers to be intentional about that statement, which serves as a declaration of the underlying reasons, directions, and focus of what comes next in an assignment. But most importantly, the statement responds to the question I mentioned at the beginning of this blog…why?

Just as a purpose statement should be powerful to guide, shape, and undergird a business (Yohn, 2022), a purpose statement for an assignment can guide students in making decisions about using strategies and resources, shape students’ motivation and engagement in the process of completing the assignment, and undergird their knowledge and skills.  Let’s look closer at the power of a purpose statement.

What does “purpose” mean?

Merriam-Webster defines purpose as “something set up as an object or end to be”, while Cambridge Dictionary defines it as “why you do something or why something exists”. These definitions show us that the purpose is the reason and the intention behind an action.

Why is a purpose important in an assignment?

The purpose statement in an assignment serves important roles for students, instructors, and instructional designers (believe it or not!).

For students

The purpose will:

  1. answer the question “why will I need to complete this assignment?”
  2. give the reason to spend time and resources working out math problems, outlining a paper, answering quiz questions, posting their ideas in a discussion, and many other learning activities.
  3. highlight its significance and value within the context of the course.
  4. guide them in understanding the requirements and expectations of the assignment from the start.

For instructors

The purpose will:

  1. guide the scope, depth, and significance of the assignment.
  2. help to craft a clear and concise declaration of the assignment’s objective or central argument.
  3. maintain the focus on and alignment with the outcome(s) throughout the assignment.
  4. help identify the prior knowledge and skills students will be required to complete the assignment.
  5. guide the selection of support resources.

For instructional designers

The purpose will:

  1. guide building the structure of the assignment components.
  2. help identify additional support resources when needed.
  3. facilitate an understanding of the alignment of outcome(s).
  4. help test the assignment from the student’s perspective and experience.

Is there a wrong purpose?

No, not really. But it may be lacking or it may be phrased as a task. Let’s see an example (adapted from a variety of real-life examples) below:

Project Assignment:

“The purpose of this assignment is to work in your group to create a PowerPoint presentation about the team project developed in the course. Include the following in the presentation:

  • Title
  • Context
  • Purpose of project
  • Target audience
  • Application of methods
  • Results
  • Recommendations
  • Sources (at least 10)
  • Images and pictures

The presentation should be a minimum of 6 slides and must include a short reflection on your experience conducting the project as a team.”

What is unclear in this purpose? Well, unless the objective of the assignment is to refine students’ presentation-building skills, it is unclear why students are being asked to create a presentation for a project that they have already developed. In this example, creating a presentation and providing specific details about its content and format reads more like instructions than a clear reason for the assignment to exist.

A better description of the purpose could be:

“The purpose of this assignment is to help you convey complex information and concepts in visual and graphic formats. This will help you practice your skills in summarizing and synthesizing your research as well as in effective data visualization.”

The purpose statement particularly underscores transparency, value, and meaning. When students know why, they may be more compelled to engage in the what and how of the assignment. A specific purpose statement can promote appreciation for learning through the assignment (Christopher, 2018).

Examples of purpose statements

Below you will find a few examples of purpose statements from different subject areas.

Example 1: Application and Dialogue (Discussion assignment)

Courtesy of Prof. Courtney Campbell – PHL /REL 344

Example 2: An annotated bibliography (Written assignment)

Courtesy of Prof. Emily Elbom – WR 227Z

Example 3: Reflect and Share (Discussion assignment)

Courtesy of Profs. Nordica MacCarty and Shaozeng Zhang – ANTH / HEST 201

With the increased availability of large language models (LLMs) and artificial intelligence (AI) tools (e.g., ChatGPT, Claude 2), many instructors worry that students will resort to these tools to complete assignments. While a clear and explicit purpose statement won’t deter the use of these highly sophisticated tools, transparency in the assignment description could be a good motivator to complete assignments with little or no AI assistance.

Conclusion

“Knowing why you do what you do is crucial” in life, says Christina Tiplea. The same applies to learning: when the “why” is clear, an activity or assignment becomes more meaningful, motivating and engaging students. And students may feel less motivated to use AI tools (Trust, 2023).

Note: This blog was written entirely by me without the aid of any artificial intelligence tool. It was peer-reviewed by a human colleague.

Resources:

Christopher, K. (2018). What are we doing and why? Transparent assignment design benefits students and faculty alike. The Flourishing Academic.

Sinek, S. (2011). Start with why. Penguin Publishing Group.

Trust, T. (2023). Addressing the Possibility of AI-Driven Cheating, Part 2. Faculty Focus.

Yohn, D.L. (2022). Making purpose statements matter. SHRM Executive Network.

I have always struggled with test anxiety. As a student, from first-grade spelling tests through timed essay questions while earning my Master of Science in Education, I started exams feeling nauseous and underprepared. (My MSEd GPA was 4.0.) I blame my parents. Both were college professors and had high expectations for my academic performance. I am in my 50s, and I still shudder remembering bringing home a low B on a history test in eighth grade. My father looked disappointed and told me, “Debbie, I only expect you to do the best you can do. But I do not think this is the best you can do.”

I am very glad my parents instilled in me a high value of education and a strong work ethic. This guidance heavily influenced my own desire to work in higher ed. Reflecting on my own journey and the lingering test anxiety that continues to haunt me, I have come to see that equipping students with comprehensive information to prepare for and navigate quizzes or exams can alleviate the kind of anxiety I once struggled with.

Overlooking the instructions section for an exam, assignment, or quiz is common among instructors during online course development. This might seem inconsequential, but it can significantly impact students’ performance and overall learning experience. Crafting comprehensive quiz instructions can transform your course delivery, fostering a more supportive and successful student learning environment.

The Role of Quizzes in Your Course

Quizzes serve as diagnostic and evaluative tools. They assess students’ comprehension and application of course materials, helping identify knowledge gaps and areas for additional study. The feedback instructors receive through student quiz scores enables them to evaluate the effectiveness of the course learning materials and activities and to understand how well students are mastering the skills necessary to achieve the course learning outcomes. Instructors can then identify aspects of the course design needing improvement and adjust their teaching strategies and course content accordingly. By writing thorough and clear quiz instructions, you support students’ academic growth and improve the overall quality of your course.

Explain the Reason

Explain how the quiz will help students master specific skills to motivate them to study. The skills and knowledge students are expected to develop should be clearly defined and communicated. Connect the quiz to course learning outcomes and encourage students to track their progress against them (Align Assessments, Objectives, Instructional Strategies – Eberly Center – Carnegie Mellon University, n.d.).

Why did you assign the quiz? Would you like your students to receive frequent feedback, engage with learning materials, prepare for high-stakes exams, or improve their study habits?

Equipping Students for Successful Quiz Preparation

Preparing for a quiz can be daunting for students. To help them navigate this process, provide a structured guide for preparation. Leading up to the quiz, you may want to encourage your students to:

  1. Review the lectures: Highlight the importance of understanding key concepts discussed.
  2. Review the readings: Encourage students to reinforce their understanding by revisiting assigned readings and additional materials.
  3. Engage in review activities: Suggest using review materials, practice questions, or study guides to cement knowledge.
  4. Participate in discussions: Reflecting on class discussions can offer unique insights and deepen understanding.
  5. Seek clarification: Remind students to contact their instructor or teaching assistant for any questions or clarifications. You might also add a Q&A discussion forum where students can post questions leading up to the quiz.

Crafting Clear and Detailed Quiz Instructions 

When taking the quiz, clear instructions are vital to ensure students understand what is expected of them. Here’s a checklist of details to include in your quiz instructions:

  1. Time Limit: Explicitly state how much time students have to complete the quiz once they have started it, or note that it is untimed. Suggest how they might pace themselves to ensure they have time to complete all the questions.
  2. Availability Window: For asynchronous online students, specify the availability window, that is, the time frame during which the quiz can be accessed and started. An extended window allows students to take the quiz at a time that suits them; once they begin, the quiz duration applies.
  3. Number of Attempts: Indicate whether students have multiple attempts or just a single opportunity to take the quiz.
  4. Question Format: Provide information about the types of questions included and any specific formatting requirements. 
  5. Quiz Navigation: Have you enforced navigational restrictions on the quiz, such as preventing students from returning to a question or only showing questions one at a time? Share this information in the instructions and explain the reasoning.
  6. Point Allocation: Break down how points are distributed, including details for varying point values and partial credit.
  7. Resources: Specify whether students can use external resources, textbooks, or notes during the quiz.
  8. Academic Integrity Reminders: Reinforce the importance of academic integrity, detailing expectations for honest conduct during the quiz.
  9. Feedback and Grading: Clarify how and when students will receive feedback and their grades.
  10. Showing Work: If relevant, provide clear guidelines on how students present their work (solving equations, pre-writing activities, etc.) or reasoning for particular question types.

End with a supportive “Good Luck!” to ease students’ nerves and inspire confidence.
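For instructors who set up quizzes programmatically, the logistical items in the checklist above (time limit, availability window, attempts, and navigation) can be captured in a small settings payload. The sketch below is a minimal illustration in Python: the parameter names follow the pattern of the Canvas Quizzes API, but the quiz title, dates, and helper function are hypothetical examples, so verify the exact fields against your own LMS documentation before use.

```python
# A hedged sketch (not from this post): packaging checklist items into
# an LMS quiz-settings payload. Field names mirror the Canvas Quizzes
# API; confirm against your institution's LMS documentation.

def build_quiz_settings(title, time_limit_min, attempts, unlock_at, lock_at):
    """Assemble a settings payload covering checklist items 1, 2, 3, and 5."""
    return {
        "quiz[title]": title,
        "quiz[time_limit]": time_limit_min,     # 1. Time Limit (minutes)
        "quiz[unlock_at]": unlock_at,           # 2. Availability window opens
        "quiz[lock_at]": lock_at,               #    ...and closes
        "quiz[allowed_attempts]": attempts,     # 3. Number of Attempts
        "quiz[one_question_at_a_time]": False,  # 5. Quiz Navigation: show all questions
        "quiz[cant_go_back]": False,            #    and let students revisit them
    }

# Hypothetical example: a 30-minute quiz, two attempts, open for one week.
settings = build_quiz_settings(
    title="Week 3 Reading Quiz",
    time_limit_min=30,
    attempts=2,
    unlock_at="2025-10-06T08:00:00Z",
    lock_at="2025-10-13T23:59:00Z",
)
print(settings["quiz[time_limit]"])  # → 30
```

Whether or not you automate quiz setup, the same fields are worth restating in the student-facing instructions, since students cannot see the underlying settings.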

Crafting comprehensive quiz instructions is a vital step in ensuring successful course delivery. Providing students with clear expectations, guidelines, and support enhances their quiz experience and contributes to a positive and productive learning environment (Detterman & Andrist, 1990). As course developers and designers, we are responsible for fostering these optimal conditions for student success. Plus, as my father would say, it is satisfying to know you have “done the best you can do.”

References

Align Assessments, Objectives, Instructional Strategies—Eberly Center—Carnegie Mellon University. (n.d.). Eberly Center: Carnegie Mellon University. Retrieved June 28, 2023, from https://www.cmu.edu/teaching/assessment/basics/alignment.html

Detterman, D. K., & Andrist, C. G. (1990). Effect of Instructions on Elementary Cognitive Tasks Sensitive to Individual Differences. The American Journal of Psychology, 103(3), 367–390. https://doi.org/10.2307/1423216

Footnote: My son called as I was wrapping up this post. I told him I was finishing up a blog post for Ecampus. “I kind of threw Grandpa under the bus,” I said. After I shared the history test example, he said, “You didn’t learn much.” He and his sister felt similar academic pressure; I may have even used the same line about the best you can do. In my defense, he is now a Ph.D. candidate in Medicinal Chemistry, and his sister just completed a master’s in Marine Bio.

This month brings the new and improved QM Higher Education Rubric, Seventh Edition! To see the detailed changes, you can order the new rubric or take the Rubric Update Session, which is a self-paced workshop that will be required for all QM role holders. In the meantime, if you’d like a short summary of the revisions, continue reading below.

The main changes include:

  • The number of Specific Review Standards has increased from 42 to 44.
  • The point values were also slightly revised; the total is now 101.
  • A few terminology updates were implemented.
  • The descriptions and annotations for some of the general and specific standards were revised.
  • The instructions were expanded and clarified, with new additions for synchronous and continuous education courses.

Most of the standards (general or specific) have undergone changes consisting of revised wording, additional special instructions, and/or new examples to make the standards clearer and emphasize the design of inclusive and welcoming courses. In addition, some standards have received more substantial revisions – here are the ones that I found the most significant:

Standard 3: There is a new Specific Standard: SRS 3.6: “The assessments provide guidance to the learner about how to uphold academic integrity.” This standard is met if “the course assessments incorporate or reflect how the institution’s academic integrity policies and standards are relevant to those assessments.” SRS 3.6 is the main addition to the 7th edition, and a very welcome one, especially considering the new complexities of academic integrity policies.

Standard 4: SRS 4.5 (“A variety of instructional materials is used in the course.”) has received an important annotation revision – this standard is met if at least one of the following three types of variety is present in the course: variety in types of media; different perspectives/representations of ideas; or diverse, non-stereotypical representations of persons or demographic groups. I was really happy to see this clarification, since it has always been a little difficult to evaluate what constitutes “variety”, and reviewers will certainly appreciate the recognition of diversity of people and ideas.

Standard 8: SRS 8.3 was divided into two separate Specific Standards: SRS 8.3 “Text in the course is accessible.” and SRS 8.4 “Images in the course are accessible.” At the same time, SRS 8.5 (formerly 8.4) became “Video and audio content in the course is accessible.” This should allow for a more nuanced evaluation of the various accessibility elements, and it is nice to see the focus on captions for both video and audio materials. Moreover, these three standards (SRS 8.3, 8.4, and 8.5) now include publisher-created content – this is an important step forward in terms of advocating for all educational materials to be made accessible upfront.

In addition to the standards themselves, some changes were made to the Course Format Chart, the Course Worksheet, and the Glossary. Notably, a course/alignment map is now required with the Course Worksheet – a change that is sure to spark delight among QM reviewers. The definitions of activities and assessments were also revised to clarify the distinction between the two – another much-needed modification that should eliminate a common point of confusion.

Overall, the new edition brings about clearer instructions, more relevant examples, and a deeper inclusion of diversity, accessibility, and academic integrity. Reviewers and course designers should find it easier to evaluate or create high quality courses with this updated guidance.