An illustration of a person kneeling, surrounded by question marks.

Have you ever been assigned a task but found yourself asking: “What’s the point of this task? Why do I need to do this?” Very likely, no one told you the purpose! That activity was missing a critical element: the purpose. Just as the purpose of a task can easily be left out, in the context of course design, a purpose statement for an assignment is often missing too.

Creating a purpose statement for assignments is an activity that I enjoy very much. I encourage instructors and course developers to be intentional about that statement, which serves as a declaration of the underlying reasons, direction, and focus of what comes next in an assignment. But most importantly, the statement responds to the question I mentioned at the beginning of this blog…why…?

Just as a purpose statement should be powerful to guide, shape, and undergird a business (Yohn, 2022), a purpose statement for an assignment can guide students in making decisions about using strategies and resources, shape students’ motivation and engagement in the process of completing the assignment, and undergird their knowledge and skills.  Let’s look closer at the power of a purpose statement.

What does “purpose” mean?

Merriam-Webster defines purpose as “something set up as an object or end to be attained”, while the Cambridge Dictionary defines it as “why you do something or why something exists”. These definitions show us that purpose is the reason and intention behind an action.

Why is a purpose important in an assignment?

The purpose statement in an assignment serves important roles for students, instructors, and instructional designers (believe it or not!).

For students

The purpose will:

  1. answer the question “Why do I need to complete this assignment?”
  2. give the reason to spend time and resources working out math problems, outlining a paper, answering quiz questions, posting their ideas in a discussion, and many other learning activities.
  3. highlight the assignment’s significance and value within the context of the course.
  4. guide them in understanding the requirements and expectations of the assignment from the start.

For instructors

The purpose will:

  1. guide the scope, depth, and significance of the assignment.
  2. help to craft a clear and concise declaration of the assignment’s objective or central argument.
  3. maintain the focus on and alignment with the outcome(s) throughout the assignment.
  4. help identify the prior knowledge and skills students will need to complete the assignment.
  5. guide the selection of support resources.

For instructional designers

The purpose will:

  1. guide building the structure of the assignment components.
  2. help identify additional support resources when needed.
  3. facilitate an understanding of the alignment of outcome(s).
  4. help test the assignment from the student’s perspective and experience.

Is there a wrong purpose?

No, not really. But it may be lacking or it may be phrased as a task. Let’s see an example (adapted from a variety of real-life examples) below:

Project Assignment:

“The purpose of this assignment is to work in your group to create a PowerPoint presentation about the team project developed in the course. Include the following in the presentation:

  • Title
  • Context
  • Purpose of project
  • Target audience
  • Application of methods
  • Results
  • Recommendations
  • Sources (at least 10)
  • Images and pictures

The presentation should be a minimum of 6 slides and must include a short reflection on your experience conducting the project as a team.”

What is unclear in this purpose? Well, unless the objective of the assignment is to refine students’ presentation-building skills, it is unclear why students would create a presentation for a project they have already developed. In this example, directing students to create a presentation and providing specific details about its content and format reads more like instructions than a clear reason for the assignment to exist.

A better description of the purpose could be:

“The purpose of this assignment is to help you convey complex information and concepts in visual and graphic formats. This will help you practice your skills in summarizing and synthesizing your research as well as in effective data visualization.”

The purpose statement particularly underscores transparency, value, and meaning. When students know why, they may be more compelled to engage in the what and how of the assignment. A specific purpose statement can promote appreciation for learning through the assignment (Christopher, 2018).

Examples of purpose statements

Below you will find a few examples of purpose statements from different subject areas.

Example 1: Application and Dialogue (Discussion assignment)

Courtesy of Prof. Courtney Campbell – PHL /REL 344

Example 2: An annotated bibliography (Written assignment)

Courtesy of Prof. Emily Elbom – WR 227Z

Example 3: Reflect and Share (Discussion assignment)

Courtesy of Profs. Nordica MacCarty and Shaozeng Zhang – ANTH / HEST 201

With the increased availability of large language models (LLMs) and artificial intelligence (AI) tools (e.g., ChatGPT, Claude 2), many instructors worry that students will resort to these tools to complete their assignments. While a clear and explicit purpose statement won’t deter the use of these highly sophisticated tools, transparency in the assignment description could be a good motivator to complete assignments with little or no AI assistance.

Conclusion

“Knowing why you do what you do is crucial in life,” says Christina Tiplea. The same applies to learning: when the “why” is clear, an activity or assignment becomes more meaningful and can better motivate and engage students. Students may also feel less motivated to use AI tools (Trust, 2023).

Note: This blog was written entirely by me without the aid of any artificial intelligence tool. It was peer-reviewed by a human colleague.

Resources:

Christopher, K. (2018). What are we doing and why? Transparent assignment design benefits students and faculty alike. The Flourishing Academic.

Sinek, S. (2011). Start with why. Penguin Publishing Group.

Trust, T. (2023). Addressing the Possibility of AI-Driven Cheating, Part 2. Faculty Focus.

Yohn, D.L. (2022). Making purpose statements matter. SHRM Executive Network.

Introduction

We’ve all heard by now of ChatGPT, the large language model-based chatbot that can seemingly answer almost any question you present to it. What if there were a way to provide this functionality to students on their learning management system, so it could answer questions they have about course content? Sure, this would not completely replace the instructor, nor would it be intended to. Instead, for quick course content questions, a chatbot with access to all course materials could provide students with speedy feedback and clarifications in far less time than the standard turnaround through the usual channels. Of course, more involved questions about assignments, and course content questions outside the scope of course materials, would be better suited to the instructor, and the exact usage of a tool like this would need to be explained, as with anything.

Such a tool could be a useful addition to an online course: not only could it potentially save a lot of time, but it could also keep students on the learning platform instead of turning to a third-party solution to answer their questions, as is likely the case right now with publicly available chatbots.

To find out what this would look like, I researched potential LLM chatbot candidates and came up with a plan to integrate one into a Canvas page.

Disclaimer!
This is simply a proof of concept, and is not in production due to certain unknowns such as origin of the initial training data, CPU-bound performance, and pedagogical implications. See the Limitations and Considerations section for more details.

How it works

The main powerhouse behind this is an open-source large language model (LLM) tool called privateGPT. privateGPT is designed to let you “ask questions to your documents” offline, with privacy as the goal. It therefore seemed like the best way to test this concept. The owner of the privateGPT repository, Iván Martínez, notes that privacy is prioritized over accuracy. To quote the ReadMe file from GitHub:

100% private, no data leaves your execution environment at any point. You can ingest documents and ask questions without an internet connection!

privateGPT, GitHub Site

privateGPT, at the time of writing, was licensed under the Apache-2.0 license, but during this test, no modifications were made to the privateGPT code. Initially, when you run privateGPT, train it on your documents, and ask it questions, you are doing all of this locally through a command line interface in a terminal window. This obviously will not do if we want to integrate it into something like Canvas, so additional tools needed to be built to bridge the gap.

I therefore set about making two additional pieces of software: a web-interface chat box that would later be embedded into a Canvas page, and a small application to connect what the student would type in the chat box to privateGPT, then strip irrelevant data from its response (such as redundant words like “answer” or listing the source documents for the answer) and push that back to the chat box.
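To illustrate the “strip irrelevant data” step, here is a minimal Python sketch. This is hypothetical, not the actual bridge code; it assumes the raw command-line output begins with an “Answer” label and ends with a list of source documents, roughly matching the format privateGPT printed at the time:

```python
def clean_response(raw: str) -> str:
    """Strip boilerplate from a raw privateGPT-style answer.

    Assumes (hypothetically) that the output looks roughly like:
        > Answer:
        <the answer text>
        > source_documents/somefile.pdf:
        <quoted chunk>
    Only the answer text is kept for display in the chat box.
    """
    answer_lines = []
    in_answer = False
    for line in raw.splitlines():
        stripped = line.strip()
        if stripped.startswith("> Answer"):
            in_answer = True   # answer text starts after this label
            continue
        if stripped.startswith("> source"):
            break              # drop the source-document listing
        if in_answer:
            answer_lines.append(line)
    return "\n".join(answer_lines).strip()

raw = """> Answer:
The Ecampus Essentials are design standards for online courses.
> source_documents/essentials.pdf:
(quoted passage)"""
print(clean_response(raw))
```

In the real bridge, a function like this would run between receiving the model’s output and pushing the text back to the chat box.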

A diagram showing how the front-end of the system (what the user sees) interacts with the back-end of the system (what the user does not see). Self-creation.

Once created, the web interface portion, running locally, allows us to plug it into a Canvas page, like so:

A screenshot showing regular Canvas text on the left, and the chat box interface on the right, connected to the LLM.
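Conceptually, the glue between the embedded chat box and the model is a small request/response handler. The sketch below is hypothetical (the function name, the JSON shape, and `query_model` are my own stand-ins for however the local LLM is actually invoked); it shows the shape of the exchange rather than the production code:

```python
import json

def handle_chat_request(body: str, query_model) -> str:
    """Turn a chat-box POST body into a JSON reply.

    body        -- JSON like {"question": "..."} sent by the embedded chat box
    query_model -- callable that passes the question to the local LLM and
                   returns the already-cleaned answer text (an assumption)
    """
    question = json.loads(body).get("question", "").strip()
    if not question:
        # Guard against empty submissions from the chat box
        return json.dumps({"answer": "Please type a question."})
    return json.dumps({"answer": query_model(question)})

# Example with a stand-in model function:
reply = handle_chat_request(
    '{"question": "What are the Ecampus Essentials?"}',
    lambda q: "A set of course design standards.",
)
print(reply)
```

Wrapping a function like this in a small local web server is what would let the Canvas-embedded chat box talk to the model.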

Testing how it works

To begin, I let the LLM ‘ingest’ the Ecampus Essentials document provided to course developers on the Ecampus website. Then I asked some questions to test it out, one of which was: “What are the Ecampus Essentials?”

I am not sure what I expected here, as it is quite an open ended question, only that it would scan its trained model data and the ingested files looking for an answer. After a while (edited for time) the bot responded:

A video showing the result of asking the bot “What are the Ecampus Essentials?”

A successful result! It has indeed pulled text from the Ecampus Essentials document, but interestingly has also paraphrased certain parts of it as well. Perhaps this is down to the amount of text it is capable of generating, along with the model that was initially selected.

A longer text example

So what happens if you give it a longer text, such as an OpenStax textbook? Would it be able to answer questions students might have about course content inside the book?

To find out, I gave the chatbot the OpenStax textbook Calculus 1, which you can download for free at the OpenStax website. No modifications were made to this text.

Then I asked the chatbot some calculus questions to see what it came up with:

Asking two questions about certain topics in the OpenStax Calculus 1 book.

It would appear that if students had any questions about mathematical theory, they could get a nice (and potentially accurate) summary from a chatbot such as this. This does bring up some pedagogical considerations, though: would this make students less likely to read textbooks? Would it be able to search for answers to quiz questions and/or assignment problems? It is already common to ask ChatGPT to provide summaries and discussion board replies; would this bot function in much the same way?

Asking the chatbot to calculate things, however, is where one runs into the current limitations of the program, as it is not designed for that. Simple sums such as “1 + 1” return the correct answer, as this is part of the training data or otherwise common knowledge. Asking it to do something like calculate the hypotenuse of a triangle using Pythagoras’ theorem will not be successful (even using a textbook example of 3² + 4² = c²). The bot will attempt to give an answer, but its accuracy will vary wildly based on the data given to it. I could not get it to give me the correct response, but that was expected, as this was not in the ingested documentation.

Limitations and Considerations

OK, so it’s not all perfect – far from it, in fact! The version of privateGPT I was using, while impressive, had some interesting quirks in certain responses. Responses were never identical either, but perhaps that is to be expected from a generative LLM. Still, this would require further investigation and testing in a production-ready model.

How regular and substantive interaction (RSI) might be affected is an important point to consider: without prior planning around intended usage, a more capable chatbot could displace some of the student–instructor Q&A that normally happens on discussion boards.

A major technical issue was that I was limited to using the central processing unit (CPU) instead of the much faster graphics processing unit (GPU) used by other LLMs and generative AI tools. This meant that the time between sending a question and receiving a generated answer was far longer than desired. As of writing, there appears to be a way to switch privateGPT to the GPU instead, which would greatly increase performance on systems with a modern GPU. Even so, the processing power required for a chatbot that more than one user interacts with simultaneously would be substantial.

Additionally, the incorporation of a chatbot like this has some other pedagogical implications, such as how the program would respond to questions related to assignment answers, which would need to be researched.

We also need to consider the technical skill required to create and upkeep a chatbot. Despite going through all of this, I am no Artificial Intelligence or Machine Learning expert; a dedicated team would be required to maintain the chatbot’s functionality to a high-enough standard.

Conclusion

In the end, the purpose of this little project was to test if this could be a tool students might find useful and could help them with content questions faster than contacting the instructor. From the small number of tests I conducted, it is very promising, and perhaps a properly built version could be used as a private alternative to ChatGPT, which is already being used by students for this very purpose. A major limitation was running the program from a single computer with consumer components made 3 years ago. With modern hardware and software – perhaps a first-party integrated version built directly into a learning management system like Canvas – students could be provided with their own course- or platform-specific chatbot for course documents and texts.

If you can see any additional uses, or potential benefits or downsides to something like this, leave a comment!

Notes

  1. Martínez Toro, I., Gallego Vico, D., & Orgaz, P. (2023). PrivateGPT [Computer software]. https://github.com/imartinez/privateGPT.
  2. “Calculus 1” is copyrighted by Rice University and licensed under an Attribution-NonCommercial-Sharealike 4.0 International License (CC BY-NC-SA).

I have always struggled with test anxiety. As a student, from first-grade spelling tests through timed essay questions while earning my Master of Science in Education, I started exams feeling nauseous and underprepared. (My MSEd GPA was 4.0.) I blame my parents. Both were college professors and had high expectations for my academic performance. I am in my 50s, and I still shudder remembering bringing home a low B on a history test in eighth grade. My father looked disappointed and told me, “Debbie, I only expect you to do the best you can do. But I do not think this is the best you can do.”

I am very glad my parents instilled in me a high regard for education and a strong work ethic. This guidance heavily influenced my own desire to work in higher ed. Reflecting on my own journey and the test anxiety that still lingers, I have come to realize that equipping students with comprehensive information to prepare for and navigate quizzes or exams can alleviate the kind of anxiety I once struggled with.

Overlooking the instructions section for an exam, assignment, or quiz is common among instructors during online course development. This might seem inconsequential, but it can significantly impact students’ performance and overall learning experience. Crafting comprehensive quiz instructions can transform your course delivery, fostering a more supportive and successful student learning environment.

The Role of Quizzes in Your Course

Quizzes serve as diagnostic and evaluative tools. They assess students’ comprehension and application of course materials, helping identify knowledge gaps and areas for additional study. The feedback that quiz scores provide lets instructors evaluate the effectiveness of course learning materials and activities and see how well students are mastering the skills necessary to achieve the course learning outcomes. Instructors can then identify aspects of the course design that need improvement and adjust their teaching strategies and course content accordingly. By writing thorough and clear quiz instructions, you support students’ academic growth and improve the overall quality of your course.

Explain the Reason

Explain how the quiz will help students master specific skills to motivate them to study. The skills and knowledge students are expected to develop should be clearly defined and communicated. Connect the quiz to course learning outcomes and encourage students to track their progress against them (Align Assessments, Objectives, Instructional Strategies – Eberly Center – Carnegie Mellon University, n.d.).

Why did you assign the quiz? Would you like your students to receive frequent feedback, engage with learning materials, prepare for high-stakes exams, or improve their study habits?

Equipping Students for Successful Quiz Preparation

Preparing for a quiz can be daunting for students. To help them navigate this process, provide a structured guide for preparation. Leading up to the quiz, you may want to encourage your students to:

  1. Review the lectures: Highlight the importance of understanding key concepts discussed.
  2. Review the readings: Encourage students to reinforce their understanding by revisiting assigned readings and additional materials.
  3. Engage in review activities: Suggest using review materials, practice questions, or study guides to cement knowledge.
  4. Participate in discussions: Reflecting on class discussions can offer unique insights and deepen understanding.
  5. Seek clarification: Remind students to contact their instructor or teaching assistant with any questions. You could also add a Q&A discussion forum where students can post questions leading up to the quiz.

Crafting Clear and Detailed Quiz Instructions 

When taking the quiz, clear instructions are vital to ensure students understand what is expected of them. Here’s a checklist of details to include in your quiz instructions:

  1. Time Limit: Explicitly mention the duration of the quiz, the amount of time students have to complete the quiz once they have started it, or if it’s untimed. Suggest how they may want to pace the quiz to ensure they have time to complete all the questions.
  2. Availability Window: Specify an availability window for asynchronous online students: the time frame during which the quiz can be accessed and started. An extended window allows students to take the quiz at a time that suits them; once they begin, the quiz duration applies.
  3. Number of Attempts: Indicate whether students have multiple attempts or just a single opportunity to take the quiz.
  4. Question Format: Provide information about the types of questions included and any specific formatting requirements. 
  5. Quiz Navigation: Have you enforced navigational restrictions on the quiz, such as preventing students from returning to a question or only showing questions one at a time? Share this information in the instructions and explain the reasoning.
  6. Point Allocation: Break down how points are distributed, including details for varying point values and partial credit.
  7. Resources: Specify whether students can use external resources, textbooks, or notes during the quiz.
  8. Academic Integrity Reminders: Reinforce the importance of academic integrity, detailing expectations for honest conduct during the quiz.
  9. Feedback and Grading: Clarify how and when students will receive feedback and their grades.
  10. Showing Work: If relevant, provide clear guidelines on how students present their work (solving equations, pre-writing activities, etc.) or reasoning for particular question types.

End with a supportive “Good Luck!” to ease students’ nerves and inspire confidence.

Crafting comprehensive quiz instructions is a vital step in ensuring successful course delivery. Providing students with clear expectations, guidelines, and support enhances their quiz experience and contributes to a positive and productive learning environment (Detterman & Andrist, 1990). As course developers and designers, we are responsible for fostering these optimal conditions for student success. Plus, as my father would say, it is satisfying to know you have “done the best you can do.”

References

Align Assessments, Objectives, Instructional Strategies—Eberly Center—Carnegie Mellon University. (n.d.). Eberly Center: Carnegie Mellon University. Retrieved June 28, 2023, from https://www.cmu.edu/teaching/assessment/basics/alignment.html

Detterman, D. K., & Andrist, C. G. (1990). Effect of Instructions on Elementary Cognitive Tasks Sensitive to Individual Differences. The American Journal of Psychology, 103(3), 367–390. https://doi.org/10.2307/1423216

Footnote: My son called as I was wrapping up this post. I told him I was finishing up a blog post for Ecampus. “I kind of threw Grandpa under the bus,” I said. After I shared the history test example, he said, “You didn’t learn much.” He and his sister felt similar academic pressure; I may have even used the same line about the best you can do. In my defense, he is now a Ph.D. candidate in Medicinal Chemistry, and his sister just completed a Master’s in Marine Bio.

Image by: pingebat, licensed from Adobe Stock

As higher-ed professionals involved in course design, we have the honor, privilege, and responsibility of shaping the learning experiences of countless students. Among the many tools at our disposal, course mapping stands out as a fundamental technique that deserves a spotlight. Course mapping fosters clarity and showcases alignment between the learning outcomes/objectives and course materials, assessments, and activities. In this blog post, we will explore the importance of course mapping in online higher-ed courses, highlighting its role in meeting the new requirements in the recently updated Quality Matters (QM) rubric, 7th edition. Join us as we delve into the transformative power of course mapping, benefiting course developers, instructors, instructional designers, and learners alike.

The Big Picture:

The updated QM rubric (7th edition) recognizes the strength of course maps as a design tool, and has now made them a required element for course review. To quote the QM rubric update workshop (2023), “the course map must include all of the following components mapped to one another so the connection between them is apparent: course learning [outcomes/] objectives, module learning outcomes/objectives, assessments, materials, activities, and tools.” At its core, course mapping involves creating a visual representation of the entire course curriculum, breaking it down into manageable units, and illustrating the relationships between various components. This visual often takes the form of a table, but many variations exist. Course mapping is a holistic approach, which provides a roadmap for instructors, course developers, and designers to create a comprehensive, cohesive and well-structured learning experience; and for students to easily navigate and find the content and assignments. By explicitly relating the aforementioned course components, course maps simply demonstrate alignment and make clear the purpose of each element as part of the larger picture. 

Orchestrating a Symphony of Learning & Student Success:

With the implementation of the new QM rubric (7th edition), course mapping has gained significant prominence as a means of ensuring alignment and coherence across the curriculum. By mapping out the weekly outcomes/objectives, learning activities, materials, tools, and assessments, instructors can ensure that each component of the course aligns with the overall outcomes/objectives. This process can highlight pathways for students to progress logically through the content. Additionally, course mapping facilitates coordination among multiple instructors or instructional designers involved in a course, enabling a consistent design and a more harmonious learning experience for students. Much like a conductor of an orchestra, a course map provides nuanced direction to each section. Harmony in a design means that elements are unified. Learners benefit because they can more clearly connect their learning activities with a specific purpose.

By imbuing the many learning activities with clear purpose (alignment to the outcomes/objectives), learners understand the work they are being asked to complete. Mapping out course activities also provides instructors with a high-level view of their course, helping ensure a balanced distribution of learning strategies that can accommodate a variety of learning needs. As a result, students are more likely to be engaged, motivated, and empowered to take ownership of their learning, which can lead to improved learning. Course maps act as a first step toward transparent course design, which empowers learners to take initiative and work through problems independently. If we give them all the pieces and help them make connections, they can forge their own pathway to success.

Efficiency and Continuous Improvement:

Course mapping also acts as a vehicle for efficiency and continuous improvement in higher education courses. By visualizing the entire course, instructors and instructional designers can identify potential gaps, redundancies, or misalignments, leading to more effective course revisions. Moreover, the iterative nature of course mapping promotes reflection and collaboration among course developers, instructors, instructional designers, and course reviewers, fostering a culture of continuous improvement. 

Additionally, for instructors the course map acts as a blueprint for the course, strengthening the connection between course elements; this is especially helpful if course outcomes/objectives need to change. For instance, courses with detailed maps can be adapted more efficiently, as instructors can easily identify the parts of their courses that will need to change and know where to focus their energy.

Assessment and Accreditation – Meeting Quality Standards:

Accreditation bodies and quality assurance agencies like QM place a strong emphasis on clearly defined learning outcomes/objectives and assessment strategies. Course mapping provides a comprehensive framework for demonstrating alignment with quality standards or accreditation competencies. By mapping learning outcomes/objectives to assessments, instructors can provide evidence of student achievement and ensure that all necessary areas are adequately covered. This not only satisfies accreditation requirements but also enhances transparency and accountability within the course, program, and even the institution. At OSU Ecampus, we use the Ecampus Essentials list to ensure we are creating high-quality online and hybrid learning experiences. All Ecampus courses are expected to meet the essential standards and are strongly encouraged to meet the exemplary standards.

Conclusion:

As higher education professionals, we have a shared responsibility to provide transformative courses and programs that prepare learners for the challenges of the future. Course mapping stands as a crucial tool in achieving this goal by fostering alignment, engagement, and continuous improvement. As the new Quality Matters (QM) rubric (7th edition) recognizes, course mapping is an essential practice in creating intentional and effective courses. By investing time and effort in course mapping, instructors and instructional designers can craft coherent and purposeful learning experiences that empower students and maximize their potential for success.

Let’s embrace course mapping as a tool for success in online higher education, ensuring that our courses are meticulously crafted, intentional, and impactful. 

Course Mapping Tools:

  1. The Online Course Mapping Guide
  2. OSU Ecampus Course Planning Chart
  3. Berkeley Digital Learning Services Course Map Template (Public Use)
  4. University of Arizona Course Map Templates

Course Map Samples Shared in the QM Rubric Update:

  1. ACCT 3551 Course Map
  2. Course Alignment Map for HIS 121 American History to 1865

References:

Beckham, R., Riedford, K., & Hall, M. (2017). Course Mapping: Expectations Visualized. Journal for Nurse Practitioners, 13(10), e471–e476. https://doi.org/10.1016/j.nurpra.2017.07.021 

Digital Learning Hub in the Teaching + Learning Commons at UC San Diego. (n.d.). What is a Course Map? The Online Course Mapping Guide. Retrieved July 5, 2023, from https://www.coursemapguide.com/what-is-a-course-map

Quality Matters. (2023, May 22). QM Course Worksheet, HE Seventh Edition. Retrieved July 5, 2023, from https://docs.google.com/document/d/16d1mDaII_kgXvyjeT_brn-TKqACnr_OY_D_r5SnJlC0/edit 

This month brings the new and improved QM Higher Education Rubric, Seventh Edition! To see the detailed changes, you can order the new rubric or take the Rubric Update Session, which is a self-paced workshop that will be required for all QM role holders. In the meantime, if you’d like a short summary of the revisions, continue reading below.

The main changes include:

  • The number of Specific Review Standards has increased from 42 to 44.
  • The point-value scheme was also slightly revised, with the total now being 101.
  • A few terminology updates were implemented.
  • The descriptions and annotations for some of the general and specific standards were revised.
  • The instructions were expanded and clarified, with new additions for synchronous and continuous education courses.

Most of the standards (general or specific) have undergone changes consisting of revised wording, additional special instructions, and/or new examples to make the standards clearer and emphasize the design of inclusive and welcoming courses. In addition, some standards have received more substantial revisions – here are the ones that I found the most significant:

Standard 3: There is a new Specific Standard: SRS 3.6: “The assessments provide guidance to the learner about how to uphold academic integrity.” This standard is met if “the course assessments incorporate or reflect how the institution’s academic integrity policies and standards are relevant to those assessments.” SRS 3.6 is the main addition to the 7th edition, and a very welcome one, especially considering the new complexities of academic integrity policies.

Standard 4: SRS 4.5 (“A variety of instructional materials is used in the course.”) has received an important annotation revision – this standard is met if at least one out of three of the following types of variety are present in the course: variety of type of media; different perspectives/representations of ideas; diverse, non-stereotypical representations of persons or demographic groups. I was really happy to see this clarification, since it’s always been a little difficult to evaluate what constitutes “variety”, and reviewers will certainly appreciate the recognition of diversity of people and ideas.

Standard 8: SRS 8.3 was divided into two separate Specific Standards: SRS 8.3, “Text in the course is accessible.” and SRS 8.4, “Images in the course are accessible.” At the same time, SRS 8.5 (formerly 8.4) became “Video and audio content in the course is accessible.” This should allow for a more nuanced evaluation of the various accessibility elements, and it is nice to see the focus on captions for both video and audio materials. Moreover, these three standards (SRS 8.3, 8.4, and 8.5) now include publisher-created content – an important step forward in advocating for all educational materials to be made accessible upfront.

In addition to the standards themselves, some changes were made to the Course Format Chart, the Course Worksheet, and the Glossary. Notably, a course/alignment map is now required with the Course Worksheet – a change that is sure to spark delight among QM reviewers. The definitions of activities and assessments were also revised to clarify the distinction between the two – another much-needed modification that should eliminate a common point of confusion.

Overall, the new edition brings about clearer instructions, more relevant examples, and a deeper inclusion of diversity, accessibility, and academic integrity. Reviewers and course designers should find it easier to evaluate or create high quality courses with this updated guidance.

The following is a guest blog post from Julia DeViney. Julia completed an Instructional Design internship with OSU Ecampus during the Spring of 2023.

What are student perceptions of VoiceThread? I observed the pros and cons of VoiceThread (VT) as both a student in my final term of a cohort-structured program and, as an Ecampus intern, from the instructor side. The purpose of this post is to synthesize my experiences with research on VT. Integrated with Canvas as a cloud-based external web tool, VT is an interactive platform that allows instructors and students to create video, audio, and text posts and responses asynchronously. It is used widely at OSU and available in all Ecampus courses.

My unique role, as both a current student using the tool and an intern seeing it from the instructor’s perspective, gave me a thorough understanding of VT. While I was challenged by time requirements and found that more frequent discussions yielded diminishing value, VT, used strategically, can be a worthwhile tool for instructors and students.

The strengths of VT include fostering dense interaction, strong social presence, and ease of use; its drawbacks can be avoided by considering the audience, the frequency of use, and the purpose of using VT in a learning environment.

VT allows users to upload premade slides or images and record text, audio, or video comments to their own and peers’ slides, allowing for a rich back and forth dialogue that fosters dense social presence and interaction in a learning environment.

In my course, students used this video feature exclusively for initial posts and occasionally used audio recordings for peer responses. Hearing vocal inflection and seeing each other on screen in natural environments helped us witness emotions, interact authentically, and build on each other’s ideas to create richer learning. Delmas (2017) and Ching and Hsu (2013) found similar results in their respective studies of using VT to build online community and support collaborative learning.

Another strength of VT is ease of use. Brief VT navigation instructions provided by the instructor shortened the learning curve for students new to the tool. Making a video slide or commenting on peers’ slides was straightforward and simple. VT automatically previews submitter-created slides and comments before saving, which lets students redo a slide or comment if they are not satisfied with their first attempt. I found this feature particularly helpful.

Students’ prior interactions and frequency of use are considerations for instructors’ use of VT. As a student who already intensely engaged with most of the peers in my cohort through discussions, group projects, presentations, and peer feedback assignments, dense social presence was not as valuable to me in my final term. However, this course included a few students from other disciplines, and I appreciated quickly getting to know them through their posts and responses. This class utilized VT intermittently; in later-term posts, I found myself less motivated to respond as robustly as in the beginning of the term. Chen and Bogachenko (2022) echoed my experience: mandated minimum posting requirements and prompt frequency may influence social presence density results.

Student connection may not increase student engagement, and VT is best suited for certain types of knowledge construction. Responding to the minimum required number of students was common practice among graduate students in a 2013 study by Ching and Hsu; this differs from findings from a study by Kidd (2012), which focused on student-instructor interactions. Student obligations outside of school were cited as the primary reason for meeting minimum requirements only (Ching & Hsu, 2013). In my experience, a few classmates responded to more than the minimum required responses, as time allowed. Students tended to develop a stronger consensus of ideas in video-based interactions than in text-based interactions; future research is needed to evaluate the degree of critical or summarizing skills developed in video-based forums (Guo et al., 2022). In my course, VT discussion prompts were largely reflective, which maximized the strengths of the tool.

Time may be another drawback for some students. While many of my classmates created unscripted video posts and responses to discussion prompts, a few of us spent extra time scripting posts and responses, which added time to assignments. Ching and Hsu (2013) found that for contemplative or anxious students who “structure their ideas prior to making their ideas public,” the time requirement is a disadvantage (p. 309). I did not experience technological glitches, but that has been mentioned as an additional time consideration.

For instructors, the time needed to learn to set up and use VT was cited as a major drawback (Salas & Moller, 2015). However, the instructors studied used VT outside of their institution’s learning management system. At OSU, VT is seamlessly integrated into Canvas and SpeedGrader, and easy-to-follow guides and Ecampus support significantly lower the barrier to use for faculty. VT is a superb tool for creating dense social presence in hybrid or online courses for collaborative assignments or consensus-building discussions.

From the instructor side, I recommend carefully considering the pros and cons of assignment type: a) create, b) comment, or c) watch. Remember that “create” assignments require students to post at least one comment and create a slide. The “comment” assignment type still allows students to create a slide, and instructors have more flexibility in establishing minimum slide and/or comment requirements, provided those minimums match the Canvas assignments. “Watch” assignments could work well for crucial announcements or video-based instruction. For all assignments, I also recommend communicating in both Canvas and VT that clicking the “Submit Assignment” button is a very important step (for continuity with SpeedGrader). Setting up assignments in VT was simple and straightforward once I understood the assignment types.

In short, VT powerfully facilitates dense social presence and community through asynchronous video, audio, and text-based interactions among instructors and students. When used as a tool for reflection or consensus-building, students benefit from VT interactions. Overuse and time constraints may reduce its value, particularly for students with anxiety or those who need extra time to prepare. OSU Ecampus offers support and guides to assist instructors with incorporating VT into Canvas. To reap the benefits of this fantastic tool, I recommend exploring the practical uses of VT in hybrid and online courses.

References

Chen, J., & Bogachenko, T. (2022). Online community building in distance education: The case of social presence in the Blackboard discussion board versus multimodal VoiceThread interaction. Journal of Educational Technology & Society, 25(2), 62-75. https://oregonstate.idm.oclc.org/login?url=https://www.proquest.com/scholarly-journals/online-community-building-distance-education-case/docview/2652525579/se-2
Ching, Y.-H., & Hsu, Y.-C. (2013). Collaborative learning using VoiceThread in an online graduate course. Knowledge Management & E-Learning, 5(3), 298-314. https://oregonstate.idm.oclc.org/login?url=https://www.proquest.com/scholarly-journals/collaborative-learning-using-voicethread-online/docview/1955098489/se-2
Delmas, P. M. (2017). Using VoiceThread to create community in online learning. TechTrends, 61(6), 595-602. https://doi.org/10.1007/s11528-017-0195-z
Guo, C., Shea, P., & Chen, X. (2022). Investigation on graduate students’ social presence and social knowledge construction in two online discussion settings. Education and Information Technologies, 27(2), 2751-2769. https://link-gale-com.oregonstate.idm.oclc.org/apps/doc/A706502995/CDB?u=s8405248&sid=bookmark-CDB&xid=7f135a22
Salas, A., & Moller, L. (2015). The value of VoiceThread in online learning: Faculty perceptions of usefulness. Quarterly Review of Distance Education, 16(1), 11-24. https://link-gale-com.oregonstate.idm.oclc.org/apps/doc/A436983171/PROF?u=s8405248&sid=bookmark-PROF&xid=2759f021

Track Late Assignments with a Simple Canvas Quiz

This fall and winter I worked with two instructors from very different disciplines to achieve a common goal – making their courses inclusive through a flexible assignment policy. Both courses gave students opportunities to recover without penalty from what would otherwise be a setback – a late, missed, or low-scoring assignment. Flexible assignment policies aren’t new, but instructors are now extending grace to students without the requirement to ask in advance, provide an excuse, or share official documentation, such as a doctor’s note. Our collective pandemic experience has revealed the inequity of having the instructor adjudicate the validity of the excuse, as well as the impracticality of producing documentation on demand. (Can you readily access a doctor when you’re too sick to work? If you could, wouldn’t that take up time and energy you’d rather use to complete the work itself?) Finally, as a third instructor commented to me, simply reading the traumatic narratives that students share voluntarily (let alone requiring these excuses) implicates her in a form of voyeurism. Discomfort aside, students’ entreaties take up time, requiring at least one exchange of emails, and that can fill up the inbox and prove hard to track over the course of the term.

As an elegant fix, the courses I’ve worked on explicitly state in their syllabi and assignments that no reason is required and that permission is granted automatically – eliminating the need for email back and forth and the inequitable requirement for justification. This policy fit the goals of both instructors – one who was interested in student retention in a difficult course, and the other who had implemented a labor-based grading approach to extend greater agency to 100-level, Gen Ed students.

But the instructors still needed a way to track the excused assignments. So what’s clever and new (to me) about these courses is how the instructors and I executed the policy in our LMS – a graded Canvas quiz (in a 0% weighted assignment group so as not to interfere with the final grade) that inquires only about the logistics:

  • What is the assignment? (Be as specific as possible.) 
  • How are you using this particular opportunity for flexibility (of the three opportunities granted)? (The assignment is late/missing/requires revision.) 
  • When will you turn it in? (This helps me grade it promptly.)

Because the quiz is set to accept just 3 attempts, the instructor doesn’t need to manually tally the number of permissions already granted. And once the student fills out the quiz, safe in the assumption that permission has been secured, they can get down to the business of completing their work, without shame or delay.
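For the curious, the tally logic that the 3-attempt limit performs automatically can be illustrated with a short, hypothetical Python sketch. This is not Canvas code; it only models the bookkeeping, assuming quiz submissions are exported as records with a student name and an attempt number (names and fields here are invented for illustration):

```python
# Tally flexibility "tokens" spent per student from exported quiz records.
# Each record is a hypothetical dict with "student" and "attempt" keys;
# attempts are numbered 1, 2, 3, ..., so the highest attempt per student
# equals the number of tokens they have used.

def tokens_used(submissions, max_tokens=3):
    """Return tokens spent per student and the set who used them all."""
    used = {}
    for sub in submissions:
        used[sub["student"]] = max(used.get(sub["student"], 0), sub["attempt"])
    # Flag anyone who has exhausted their allotment.
    exhausted = {s for s, n in used.items() if n >= max_tokens}
    return used, exhausted

records = [
    {"student": "avery", "attempt": 1},
    {"student": "blake", "attempt": 1},
    {"student": "avery", "attempt": 2},
    {"student": "avery", "attempt": 3},
]
used, exhausted = tokens_used(records)
# used -> {"avery": 3, "blake": 1}; exhausted -> {"avery"}
```

Because Canvas enforces the attempt cap itself, none of this bookkeeping falls on the instructor; the sketch simply shows why a 3-attempt quiz doubles as a token counter.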

One instructor framed these opportunities as “tokens,” like the tokens you spend at Chuck E. Cheese — they’re only good when you’re in the establishment, so while it’s great if you finish the term without needing to use them, there’s no particular reason to save them up! “Tokens are my acknowledgment that we all make mistakes, misread instructions, and that life things come up. We falter. Use a token!” (“What are tokens in this class?”, Dr. Jenna Goldsmith). The Canvas quiz was termed a “Token Tracker,” complete with a cartoonish golden coin icon. By week 6 of the term, just short of half the students had availed themselves of it.

I was excited to learn of these instructors’ policies and to help craft their execution in Canvas. If you’d like to try out this inclusive assignment policy, or have an idea for another policy that will be unfamiliar to students, consider how you can use the tools at hand to present the policy as easy and natural — as much a part of our standard operating procedures as the old, inequitable way of doing things. Students will be more likely to benefit that way.

With gratitude to Dr. Jackie Goldman, who first shared the idea of an automated “extension quiz,” and Dr. Jenna Goldsmith, who adapted the quiz for use in her course.

As educators and instructional designers, one of our tasks is to create online learning environments that students can comfortably use to complete their course activities effectively. These platforms need to be designed in such a way as to minimize extraneous cognitive load and maximize generative processing: that is, making sure that the learners’ efforts are spent on understanding and applying the instructional material and not on figuring out how to use the website or app. Research and practice in User Experience (UX) design – more specifically, usability – can give us insights that we can apply to improve our course page design and organization.

Getting Started: General Recommendations

Steve Krug, in his classic book Don’t Make Me Think: A Common Sense Approach to Web Usability, explains that, in order for a website or app to be easy to use, the essential principle can be stated as “don’t make me think” (Krug, 2014). That may sound like a strange principle in an educational context, but what Krug referred to is precisely the need to avoid wasting the users’ cognitive resources on how a particular platform works (thus reducing extraneous cognitive load), and to make them feel comfortable using that product (enhancing generative processing). When looking at a web page or app, it should be, as much as possible, obvious what information is on there, how it is organized, what can be clicked on, or where to start; this way, the user can focus on the task at hand.

Krug (2014) provided a few guidelines for ensuring that the users effortlessly see and understand what we want them to:

  • Use conventions: Using standardized patterns makes it easier to see them quickly and to know what to do. Thus, in online courses, it helps to have consistency in how the pages are designed and organized: consider using a template and having standard conventions within a program or institution.
  • Create effective visual hierarchies: The visual cues should represent the actual relationships between the things on the page. For instance, the more important elements are larger, and the connected parts are grouped together on the page or designed in the same style. This saves the user effort in the selection and organization processes in the working memory.
  • Separate the content into clearly defined areas: If the content is divided into areas, each with a specific purpose, the page is easier to parse, and the user can quickly select the parts that are the most relevant to them.
  • Make it obvious what is clickable: Figuring out the next thing to click is one of the main things that users do in a digital environment; hence, the designer must make this a painless process. This can be done through shape, location or formatting—for example, buttons can help emphasize important linked content.
  • Eliminate distractions: Too much complexity on a page can be frustrating and can impinge on users’ ability to perform their tasks effectively. Thus, we need to avoid having too many things that are “clamoring for your attention” (Krug, 2014, Chapter 3). This is consistent with the coherence principle of multimedia learning, which states that elements that do not support the learning goal should be kept to a minimum and that clutter should be avoided. Relatedly, usability experts recommend avoiding repeating a link on the same page because of potential cognitive overload. This article from the Nielsen Norman Group explains why duplicate links are a bad idea, and when they might be appropriate.
  • Format text to support scanning: Users often need to scan pages to find what they want. We can do a few things towards this goal: include well-written headings, with clear formatting differences between the different levels and appropriate positioning close to the text they head; make the paragraphs short; use bulleted lists; and highlight key terms.

Putting It to the Test: A UX Study in Higher Education

The online learning field has yet to give much attention to UX testing. However, a team from Penn State has recently published a book chapter describing a think-aloud study with online learners at their institution (Gregg et al., 2020). Here is a brief description of their findings and implications for design:

  • Avoid naming ambiguities – keep wording clear and consistent, and use identical terms for an item throughout the course (e.g., don’t refer to the same lesson as both “L07” and “Lesson 07”).
  • Minimize multiple interfaces – avoid adding another tool/platform if it does not bring significant benefits.
  • Design within the conventions of the LMS – for example, avoid using both “units” and “lessons” in a course; stick to the LMS structure and naming conventions as much as possible.
  • Group related information together – for example, instead of having pieces of project information in different places, put them all on one page and link to that when needed.
  • Consider consistent design standards throughout the University – different departments may have their own way of doing things, but it is best to have some standards across all classes.

Are you interested in conducting UX testing with your students? Good news: Gregg et al. (2020) also reflected on their process and generated advice for conducting such testing, which is included in their chapter and related papers. You can always start small! As Krug (2014, Chapter 9) noted, “Testing one user is 100 percent better than testing none. Testing always works, and even the worst test with the wrong user will show you important things you can do to improve your site”.

References

Gregg, A., Reid, R., Aldemir, T., Gray, J., Frederick, M., & Garbrick, A. (2020). Think-Aloud Observations to Improve Online Course Design: A Case Example and “How-to” Guide. In M. Schmidt, A. A. Tawfik, I. Jahnke, & Y. Earnshaw (Eds.), Learner and User Experience Research: An Introduction for the Field of Learning Design & Technology. EdTech Books. https://edtechbooks.org/ux/15_think_aloud_obser

Krug, S. (2014). Don’t make me think, revisited: A common sense approach to Web usability. New Riders, Peachpit, Pearson Education.

Loranger, H. (2016). The same link twice on the same page: Do duplicates help or hurt? Nielsen Norman Group. https://www.nngroup.com/articles/duplicate-links/

Image by Benjamin Abara from Pixabay 

My family and I were preparing for a move. We packed up some of our things, removing extraneous items from our walls and surfaces and preparing our house to list and show. Not willing to part with these things, we rented a small storage unit to temporarily warehouse all this extra “stuff.” Well, as it turned out, we ended up not moving at all, and after a few months went to clear out the storage unit and retrieve our extra things. The funny thing was, we could hardly remember what had gone in there, and as it turns out, we did not miss most of the items we had packed away. We ended up selling most of what was in that storage unit, and shortly thereafter, we did even more “spring cleaning.” One of the bedrooms, which also doubles as an office, needed particular attention. The space was dysfunctional, in that multiple doors and drawers were blocked from fully opening. After a little purging and reorganization, this room now functions beautifully, with enough space to open every door and drawer. I have been calling this process “moving back into our own house,” and it’s been a joy to rethink, reorganize, and reclaim our living spaces.

Course Design Connection

As I have been working with more instructors who are redeveloping existing courses, I have been trying to bring this mindset into my instructional design work. How can we reclaim our online learning spaces and make them more inviting and functional? How can we help learners open all the proverbial doors and operate fully within the learning environment? You guessed it: while our first instinct might be to add more to the course, the answer might lie in the other direction. With a little editing and a keen eye on alignment, we can very intentionally remove things from our courses that might be needless or even distracting. We can also rearrange our pages and modules to maximize our learners’ attention.

Memory and Course Design

Our working memories, according to Cowan (2010), can only store 3-5 meaningful items at a time. Thus, it becomes essential to consider what is genuinely necessary on any given LMS page. If we focus on helping learners achieve the learning outcomes when choosing the content to keep in each module, we can intentionally remove distractors. There can be a place for tangential or supplemental information, but those items should not live in the limelight. To help get us started on this “cleaning process,” we can ask ourselves a few simple questions. Are there big-ticket items (assignments, discussions, readings) that are not directly helping learners reach the outcomes? Are we formatting pages and arranging content in beneficial and progressive ways? Might we express longer bodies of text more concisely and clearly? Can we break text up with related visuals? Below are some tips to help guide your process as you “clean up” your course and direct your learners where to focus.

Cut out the Bigger Extraneous Content

It is easy to assume that for your learners to meet the course outcomes, they must read and comprehend many things and complete a wide variety of assignments. When planning your learning activities, it’s crucial to keep in mind the limits of the brain, and also that giving learners opportunities to practice applying content will be more successful than asking them to memorize and restate it. For courses with dense content, lean into your course outcomes to guide your editing process. Focusing on the objectives can help you remove extraneous readings and activities, allowing your learners to concentrate on the key points (Cowden & Sze, 2012).

Review Instructions

For the items you choose to keep in your course, reviewing assignment instructions and discussion prompts is helpful. Consider inviting a non-expert to read these items. An outside eye might help you to simplify what you are asking your learners to accomplish by calling to your attention any points of confusion. You may be tempted to add more detail, but try to figure out where you can remove text when possible. Why use a paragraph to explain something that only needs a few sentences? Simplifying your language can enable learners to get to the point faster. (For more on this, see the post by intern Aimee L. Lomeli Garcia about Improving Readability). When reviewing your instructions and prompts, think about what learners want to know:

  • What should they pay attention to?
  • Where do they start?
  • What do they do next?
  • What is expected?
  • How are they being assessed/graded?

(Grennan, 2018)

Utilize Best Practices for Formatting

Use native formatting tools like styles, headers, and lists to help visually break up content and make it more approachable. Here are some examples:

If I were to list my favorite animals here without a list, it would look like this: dogs, turtles, hummingbirds, frogs, elephants, and cheetahs. 

Suppose I give you that same list using a header and number list format. In that case, it becomes much easier to digest mentally, and it looks nicer on the page:

Julie’s Favorite Animals

  1. Dogs
  2. Turtles
  3. Hummingbirds
  4. Frogs
  5. Elephants
  6. Cheetahs

Provide High-Level Overviews

If an assignment does need a more thorough explanation, and your instructions are running long, you can always create a high-level overview, calling out the main points of the page. You could place this in a call-out box or its own section (preferably at the top). This is where learners can quickly look for reminders about what to do next and how to do it. Providing a high-level overview alongside detailed instructions will cater to a variety of learning preferences and help set up your learners for success.

Module Organization

Scaling up beyond single pages and assignments to module organization, consider the order you want learners to encounter ideas and accomplish tasks. Don’t be afraid to move pages around within your modules to help learners find the most efficient and helpful pathway through your material (Shift Elearning, n.d.).

Wrapping It Up

The culture of “more is better” is pervasive, and it’s almost always easier to add rather than to remove information. In online learning, when we buy into the “culture of more,” we can impede the success of our learners. But more isn’t always better; sometimes more is just more. Instead, don’t be afraid to dust off that delete button and start reclaiming and reorganizing your course for ultimate learner success. Sometimes less is best. For more on the art of subtraction, see Elisabeth McBrien’s blog post from February of 2022.

References

Cowan, N. (2010). The magical mystery four. Current Directions in Psychological Science, 19(1), 51–57. https://doi.org/10.1177/0963721409359277

Cowden, P., & Sze, S. (2012). Online learning: The concept of less is more. Allied Academies International Conference. Academy of Information and Management Sciences. Proceedings, 16(2), 1-6. https://oregonstate.idm.oclc.org/login?url=https://www.proquest.com/scholarly-journals/online-learning-concept-less-is-more/docview/1272095325/se-2

Grennan, H. (2018, April 30). Why less is more in Elearning. Belvista Studios – eLearning Blog. Retrieved April 4, 2023, from http://blog.belvistastudios.com/2018/04/why-less-is-more-in-elearning.html

Lomeli Garcia, A. L. (2023, January 17). Five Tips on Improving Readability in Your Courses. Ecampus Course Development and training. Retrieved April 4, 2023, from https://blogs.oregonstate.edu/inspire/2023/01/17/five-tips-on-improving-readability-in-your-courses/

McBrien, E. (2022, February 24). Course design challenge: Try subtraction. Ecampus Course Development and training. Retrieved April 4, 2023, from https://blogs.oregonstate.edu/inspire/2022/02/24/course-design-challenge-try-subtraction/

Parker, R. (2022, June 30). Why less is more for e-learning course materials. Synergy Learning. Retrieved April 4, 2023, from https://synergy-learning.com/blog/why-less-is-sometimes-more-when-it-comes-to-your-e-learning-course-materials/

Shift Elearning. (n.d.). The art of simplification in Elearning Design. The Art of Simplification in eLearning Design. Retrieved April 4, 2023, from https://www.shiftelearning.com/blog/the-art-of-simplification-in-elearning-design

University of Waterloo, Queen’s University, University of Toronto, & Conestoga College (n.d.). Module 3: Quality course structure and content. In High quality online courses. Pressbooks Open Library. https://ecampusontario.pressbooks.pub/hqoc/chapter/3-1-module-overview/


Ecampus students have access to a number of online resources to support their academic success at OSU. Receiving guidance and feedback on their writing assignments can be helpful across courses, throughout their planning and revision process. In this post, we will share more information about the current writing resources available to students, no matter where they are located, along with resources for faculty.

OSU Writing Center

The OSU Writing Center supports any type of writing project, during any stage of the writing process. Instructors can share this resource with students, or even integrate the writing center’s support as a step to receive guidance and feedback from a consultant in coordination with a class assignment.

Online Writing Support (OWS)

According to the OWS website, both written feedback and virtual support (held over Zoom) are available to all OSU community members, including Ecampus students.

Any OSU community member can submit writing for written feedback or schedule a Zoom appointment. This includes students, faculty, staff, and alumni. However, graduate students working on dissertations, theses, IRB applications, grant applications, manuscripts, and other advanced graduate projects should connect with the Graduate Writing Center for support.

Students can choose one of the following appointment types when they submit their request online:

  • Consultation (50 minutes, Zoom)
  • Written Feedback (replies usually within 24 hours, email)
Scheduling options for Online Writing Support (OWS)

The Writing Center’s website includes answers to common questions. Here are some of the responses to questions students might have about this resource:

  1. How often can I use Online Writing Support?
    • You can request written feedback on up to three writing projects (or three drafts of the same project) per week. You can make Zoom appointments as often as you like. We welcome repeat writers as we enjoy being a part of your writing process. You cannot schedule an appointment more than two weeks in advance, but we invite you to work with us often. 
  2. What kind of writing can I submit for written feedback?
    • You can submit any kind of writing, as long as it doesn’t exceed 25 double-spaced pages (around 6,250 words). Ideally, for longer projects, you should be prepared to request several written feedback consultations, each focusing on a different section of the project.
  3. How can I provide my instructor with confirmation that I used Online Writing Support?
    • All OWS consultations will receive an email confirmation after the appointment occurs or after the feedback has been sent to you—usually the next morning. If your instructor requests confirmation that you sought assistance from the OWS, you may forward the confirmation email or capture a screenshot of it.

For more information about the type of support the Writing Center provides, please see their overview video below.

An overview of the resources provided by the OSU Writing Center and how to submit requests via the website

Academic Success Center – Writing Resources

Student Resources

  • Academic Success Workshop Series – Each term the ASC hosts a series of workshops on a variety of topics. Their remote series is available for online registration and hosted via Zoom.
    • For the Spring 2023 term, the workshop series features a writing-focused workshop in Week 6.
    • The details of the workshop series, along with links to register, are available on the Remote Workshop Series website.
  • The Learning Corner – The Learning Corner provides a number of online tools, such as guides and fillable worksheets, to support students in reaching their academic goals.
  • Services & Programs – Supplemental Instruction (SI) is available for certain courses via Zoom, as well as academic coaching support.

Faculty Resources

A number of faculty support options are offered on the Faculty Resources page, including an optional Canvas module, PowerPoint slides, and a sample Syllabus statement. The Online Writing Support group and Academic Success Center partner with faculty to collaborate on assignments and course-specific tips for implementing writing support for their online students.

Instructors can email writing.center@oregonstate.edu to discuss ideas for implementation in their course.