By Greta Underhill

In my last post, I outlined my search for a computer-assisted qualitative data analysis software (CAQDAS) program that would fit our Research Unit’s needs. We needed a program that would enable our team to collaborate across operating systems and easily add new team members as needed, while providing a user-friendly experience without a steep learning curve. We also needed something that would adhere to our institution’s IRB requirements for data security, and we preferred a program that didn’t require a subscription. However, the programs I examined were either subscription-based, too cumbersome, or did not meet our institution’s IRB requirements for data security. It seemed that there just wasn’t a program out there to suit our team’s needs.

However, after weeks of continued searching, I found a YouTube video entitled “Coding Text Using Microsoft Word” (Harold Peach, 2014). At first, I assumed this would show me how to use Word comments to highlight certain text in a transcript, which is a handy function, but what about collating those codes into a table or Excel file? What about tracking which member of the team codes certain text? I assumed this would be an explanation of manual coding using Word, which works fine for some projects, but not for our team.

Picture of a dummy transcript using Lorem Ipsum placeholder text. Sentences are highlighted in red or blue depending upon the user. Highlighted passages have an associated “comment” where users have written codes.

Fortunately, my assumption was wrong. Dr. Harold Peach, Associate Professor of Education at Georgetown College, had developed a Word macro to identify and pull all comments from the Word document into a table (Peach, n.d.). A macro is “a series of commands and instructions that you group together as a single command to accomplish a task automatically” (Create or Run a Macro – Microsoft Support, n.d.). Once downloaded, the “Extract Comments to New Document” macro opens a template and produces a table of the coded information as shown in the image below. The macro identifies the following properties:

  • Page: the page on which the text can be found
  • Comment scope: the text that was coded
  • Comment text: the text contained in the comment; for the purpose of our projects, the code title
  • Author: which member of the team coded the information
  • Date: the date on which the text was coded

Picture of a table of dummy text that was generated from the “Extract Comments to New Document” Macro. The table features the following columns: Page, Comment Scope, Comment Text, Author, and Date.
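Peach’s macro itself is VBA running inside Word, but it helps to see where this metadata actually lives. A .docx file is a zip archive, and the comment text, author, and date are stored in word/comments.xml. The following Python sketch (my own illustration, not part of the macro) pulls those three fields; the comment scope and page number are stored elsewhere in the archive, so they are omitted here.

```python
# Illustrative sketch: a .docx file is a zip archive, and comment metadata
# lives in word/comments.xml. This mirrors three of the columns the VBA
# macro extracts (comment text, author, date).
import zipfile
import xml.etree.ElementTree as ET

W_NS = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

def extract_comments(docx_path):
    """Return a list of dicts with each comment's text, author, and date."""
    with zipfile.ZipFile(docx_path) as z:
        root = ET.fromstring(z.read("word/comments.xml"))
    rows = []
    for comment in root.iter(f"{W_NS}comment"):
        text = "".join(t.text or "" for t in comment.iter(f"{W_NS}t"))
        rows.append({
            "comment_text": text,                    # for us, the code title
            "author": comment.get(f"{W_NS}author"),  # who coded the passage
            "date": comment.get(f"{W_NS}date"),      # when it was coded
        })
    return rows
```

This is a reading aid, not a replacement for the macro: the macro also resolves the coded passage (comment scope) and its page, which requires cross-referencing word/document.xml.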

You can move the data from the Word table into an Excel sheet where you can sort codes for patterns or frequencies, a function that our team was looking for in a program as shown below:

A picture of the dummy text table in an Excel sheet where codes have been sorted and grouped together by code name to establish frequencies.
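If you prefer scripting to spreadsheets, the same frequency count takes only a few lines. The codes below are invented placeholders standing in for a real coded transcript.

```python
# Tally how often each code appears across the extracted comments.
# The rows here are invented placeholders for a real macro output table.
from collections import Counter

rows = [
    {"comment_text": "trust", "author": "GU"},
    {"comment_text": "access", "author": "RA"},
    {"comment_text": "trust", "author": "RA"},
]

frequencies = Counter(row["comment_text"] for row in rows)
print(frequencies.most_common())  # most frequent codes listed first
```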

This Word macro was a good fit for our team for many reasons. First, our members could create comments on a Word document regardless of their operating system. Second, we could continue to house our data on our institution’s servers, ensuring our projects meet strict IRB data security measures. Third, the macro allowed for basic coding features (coding multiple passages multiple times, highlighting coded text, etc.) and had a very low learning curve: teaching someone how to use Word comments. Lastly, our institution provides access to the complete Microsoft Suite, so all team members, including students who would be working on projects, already had access to Word. We contacted our IT department to have them verify that the macro was safe and to help us download it.

Testing the Word Macro       

Once installed, I tested out the macro with our undergraduate research assistant on a qualitative project and found it to be intuitive and helpful. We coded independently and met multiple times to discuss our work. Eventually we ran the macro, pulled all comments from our data, and moved the macro tables into Excel where we manually merged our work. Through this process, we found some potential drawbacks that could impact certain teams.

First, researchers can view all previous comments, which might influence how teammates code or how second-cycle coding is performed; other programs let you hide previous codes so researchers can come to the text fresh.

Second, coding across paragraphs can create issues with the resulting table; cells merge in ways that make it difficult to sort and filter if moved to Excel, but a quick cleaning of the data took care of this issue.

Lastly, we manually merged our work, negotiating codes and content, as our codes were inductively generated; researchers working on deductive projects may bypass this negotiation and find the process of merging much faster.

Despite these potential drawbacks, we found this macro sufficient for our project as it was free to use, easy to learn, and a helpful way to organize our data. The following table summarizes the pros and cons of this macro.

Pros and Cons of the “Extract Comments to New Document” Word Macro

Pros

  • Easy to learn and use: simply providing comments in a Word document and running the macro
  • Program tracks team member codes which can be helpful in discussions of analysis
  • Team members can code separately by generating separate Word documents, then merge the documents to consensus code
  • Copying Word table to Excel provides a more nuanced look at the data
  • Program works across operating systems
  • Members can house their data in existing structures, not on cloud infrastructures
  • Macro is free to download

Cons

  • Previous comments are visible throughout the coding process, which might impact other members’ coding or second-round coding
  • Coding across paragraph breaks creates cell breaks in the resulting table that can make it hard to sort
  • Team members must manually merge their codes and negotiate code labels, overlapping data, etc.

Scientific work can be enhanced and advanced by the right tools; however, it can be difficult to distinguish which computer-assisted qualitative data analysis software program is right for a team or a project. Any of the programs mentioned in this post would be good options for individuals who do not need to collaborate or for those who are working with publicly available data that require different data security protocols. However, the Word macro highlighted here is a great option for many research teams. In all, although there are many powerful computer-assisted qualitative data analysis software programs out there, our team found the simplest option was the best option for our projects and our needs.

References 

Create or run a macro—Microsoft Support. (n.d.). Retrieved July 17, 2023, from https://support.microsoft.com/en-us/office/create-or-run-a-macro-c6b99036-905c-49a6-818a-dfb98b7c3c9c

Harold Peach (Director). (2014, June 30). Coding text using Microsoft Word. https://www.youtube.com/watch?v=TbjfpEe4j5Y

Peach, H. (n.d.). Extract comments to new document – Word macros and tips – Work smarter and save time in Word. Retrieved July 17, 2023, from https://www.thedoctools.com/word-macros-tips/word-macros/extract-comments-to-new-document/

For the first part of this post, please see Media Literacy in the Age of AI, Part I: “You Will Need to Check It All.”

Just how, exactly, we’re supposed to follow Ethan Mollick’s caution to “check it all” happens to be the subject of a lively, forthcoming collaboration from two education researchers who have been following the intersection of new media and misinformation for decades.

In Verified: How to Think Straight, Get Duped Less, and Make Better Decisions about What to Believe Online (University of Chicago Press, November 2023), Mike Caulfield and Sam Wineburg provide a kind of user’s manual to the modern internet. The authors’ central concern is that students—and, by extension, their teachers—have been going about the process of verifying online claims and sources all wrong—usually by applying the same rhetorical skills activated in reading a deep-dive on Elon Musk or Yevgeny Prigozhin, to borrow from last month’s headlines. Academic readers, that is, traditionally keep their attention fixed on the text—applying comprehension strategies such as prior knowledge, persisting through moments of confusion, and analyzing the narrative and its various claims about technological innovation or armed rebellion in discipline-specific ways.

The Problem with Checklists

Now, anyone who has tried to hold a dialogue on more than a few pages of assigned reading at the college level knows that sustained focus and critical thinking can be challenging, even for experienced readers. (A majority of high school seniors are not prepared for reading in college, according to 2019 data.) And so instructors, partnering with librarians, have long championed checklists as one antidote to passive consumption, first among them the CRAAP test, which stands for currency, relevance, authority, accuracy, and purpose. (Flashbacks to English 101, anyone?) The problem with checklists, argue Caulfield and Wineburg, is that in today’s media landscape—awash in questionable sources—they’re a waste of time. Such routines might easily keep a reader focused on critically evaluating “gameable signals of credibility” such as functional hyperlinks, a well-designed homepage, airtight prose, digital badges, and other supposedly telling markers of authority that can be manufactured with minimal effort or purchased at little expense, right down to the blue checkmark made infamous by Musk’s platform-formerly-known-as-Twitter.

Three Contexts for Lateral Reading

One of the delights in reading Verified is drawing back the curtains on a parade of little-known hoaxes, rumors, actors, and half-truths at work in the shadows of the information age—ranging from a sugar industry front group posing as a scientific think tank to headlines in mid-2022 warning that clouds of “palm-sized flying spiders” were about to descend on the East Coast. In the face of such wild ideas, Caulfield and Wineburg offer a helpful, three-point heuristic for navigating the web—and a sharp rejoinder to the source-specific checklists of the early aughts. (You will have to read the book to fact-check the spider story, or as the authors encourage, you can do it yourself after reading, say, the first chapter!) “The first task when confronted with the unfamiliar is not analysis. It is the gathering of context” (p. 10). More specifically:

  • The context of the source — What’s the reputation of the source of information that you arrive at, whether through a social feed, a shared link, or a Google search result?
  • The context of the claim — What have others said about the claim? If it’s a story, what’s the larger story? If a statistic, what’s the larger context?
  • Finally, the context of you — What is your level of expertise in the area? What is your interest in the claim? What makes such a claim or source compelling to you, and what could change that?
“The Three Contexts” from Verified (2023)

At a regional conference of librarians in May, Wineburg shared video clips from his scenario-based research, juxtaposing student sleuths with professional fact checkers. His conclusion? By simply trying to gather the necessary context, learners with supposedly low media literacy can be quickly transformed into “strong critical thinkers, without any additional training in logic or analysis” (Caulfield and Wineburg, p. 10). What does this look like in practice? Wineburg describes a shift from “vertical” to “lateral reading” or “using the web to read the web” (p. 81). To investigate a source like a pro, readers must first leave the source, often by opening new browser tabs, running nuanced searches about its contents, and pausing to reflect on the results. Again, such findings hold significant implications for how we train students in verification and, more broadly, in media literacy. Successful information gathering, in other words, depends not only on keywords and critical perspective but also on the ability to engage in metacognitive conversations with the web and its architecture. Or, channeling our eight-legged friends again: “If you wanted to understand how spiders catch their prey, you wouldn’t just look at a single strand” (p. 87).

SIFT graphic by Mike Caulfield with icons for stop, investigate the source, find better coverage, and trace claims, quotes, and media to the original context.

Image 2: Mike Caulfield’s “four moves”

Reconstructing Context

Much of Verified is devoted to unpacking how to gain such perspective while also building self-awareness of our relationships with the information we seek. As a companion to Wineburg’s research on lateral reading, Caulfield has refined a series of higher-order tasks for vetting sources called SIFT, or “The Four Moves” (see Image 2). By (1) Stopping to take a breath and get a look around, (2) Investigating the source and its reputation, (3) Finding better sources of journalism or research, and (4) Tracing surprising claims or other rhetorical artifacts back to their origins, readers can more quickly make decisions about how to manage their time online. You can learn more about the why behind “reconstructing context” at Caulfield’s blog, Hapgood, and as part of the OSU Libraries’ guide to media literacy. (Full disclosure: Mike is a former colleague from Washington State University Vancouver.)

If I have one complaint about Caulfield and Wineburg’s book, it’s that it dwells at length on the particulars of analyzing Google search results, which fill pages of accompanying figures and a whole chapter on the search engine as “the bestie you thought you knew” (p. 49). To be sure, Google still occupies a large share of the time students and faculty spend online. But as in my quest for learning norms protocols, readers are already turning to large language model tools for help in deciding what to believe online. In that respect, I find other chapters in Verified (on scholarly sources, the rise of Wikipedia, deceptive videos, and so-called native advertising) more useful. And if you go there, don’t miss the authors’ final take on the power of emotion in finding the truth—a line that sounds counterintuitive, but in context adds another, rather moving dimension to the case against checklists.

Given the acceleration of machine learning, will lateral reading and SIFTing hold up in the age of AI? Caulfield and Wineburg certainly think so. Building out context becomes all the more necessary, they write in a postscript on the future of verification, “when the prose on the other side is crafted by a convincing machine” (p. 221). On that note, I invite you and your students to try out some of these moves on your favorite chatbot.

Another Postscript

The other day, I gave Microsoft’s AI-powered search engine a few versions of the same prompt I had put to ChatGPT. In “balanced” mode, Bing dutifully recommended resources from Stanford, Cornell, and Harvard on introducing norms for learning in online college classes. Over in “creative” mode, Bing’s synthesis was slightly more offbeat—including an early-pandemic blog post on setting norms for middle school faculty meetings in rural Vermont. More importantly, the bot wasn’t hallucinating. Most of the sources it suggested seemed worth investigating. Pausing before each rabbit hole, I took a deep breath.

Related Resource

Oregon State Ecampus recently rolled out its own AI toolkit for faculty, based on an emerging consensus that developing capacities for using this technology will be necessary in many areas of life. Of particular relevance to this post is a section on AI literacy, conceptualized as “a broad set of skills that is not confined to technical disciplines.” As with Verified, I find the toolkit’s frameworks and recommendations on teaching AI literacy particularly helpful. For instance, if students are allowed to use ChatGPT or Bing to brainstorm and evaluate possible topics for a writing assignment, “faculty might provide an effective example of how to ask an AI tool to help, ideally situating explanation in the context of what would be appropriate and ethical in that discipline or profession.”

References

Caulfield, M., & Wineburg, S. (2023). Verified: How to think straight, get duped less, and make better decisions about what to believe online. University of Chicago Press.

Mollick, E. (2023, July 15). How to use AI to do stuff: An opinionated guide. One Useful Thing.

Oregon State Ecampus. (2023). Artificial Intelligence Tools.

Have you found yourself worried or overwhelmed in thinking about the implications of artificial intelligence for your discipline? Whether, for example, your department’s approaches to teaching basic skills such as library research and source evaluation still hold up? You’re not alone. As we enter another school year, many educators continue to think deeply about questions of truth and misinformation, creativity, and how large language model (LLM) tools such as chatbots are reshaping higher education. Along with our students, faculty (oh, and instructional designers) must consider new paradigms for our collective media literacy.

Here’s a quick backstory for this two-part post. In late spring, shortly after the “stable release” of ChatGPT to iOS, I started chatting with the GPT-3.5 model, which innovator Ethan Mollick describes as “very fast and pretty solid at writing and coding tasks,” if a bit lacking in personality. Other, internet-connected models, such as Bing, have made headlines for their resourcefulness and darker, erratic tendencies. But so far, access to GPT-4 remains limited, and I wanted to better understand the more popular engine’s capabilities. At the time, I was preparing a workshop for a creative writing conference. So, I asked ChatGPT to write a short story in the modern style of George Saunders, based in part on historical events. The chatbot’s response, a brief burst of prose it titled “Language Unleashed,” read almost nothing like Saunders. Still, it got my participants talking about questions of authorship, originality, representation, etc. Check, check, check.

The next time I sat down with GPT-3.5, things went a little more off-script.

One faculty developer working with Ecampus had asked our team about establishing learning norms in a 200-level course dealing with sensitive subject matter. As a writing instructor, I had bookmarked a few resources in this vein, including strategies from the University of Colorado Boulder. So, I asked ChatGPT to create a bibliographic citation of Creating Collaborative Classroom Norms, which it did with the usual lightning speed. Then I got curious about what else this AI model could do, as my colleagues Philip Chambers and Nadia Jaramillo Cherrez have been exploring. Could ChatGPT point me to some good resources for faculty on setting norms for learning in online college classes?

“Certainly!” came the cheery reply, along with a summary of five sources that would provide me with “valuable information and guidance” (see Image 1). Noting OpenAI’s fine-print caveat (“ChatGPT may produce inaccurate information about people, places, or facts”), I began opening each link, expecting to be teleported to university teaching centers across the country. Except none of the tabs would load properly.

“Sorry we can’t find what you’re looking for,” reported Inside Higher Ed. “Try these resources instead,” suggested Stanford’s Teaching Commons. A closer look with Internet Archive’s Wayback Machine confirmed that the five sources in question were, like “Language Unleashed,” entirely fictitious.

An early chat with ChatGPT-3.5, asking whether the chatbot can point the author to some good resources for faculty on setting classroom norms for learning in online college classes. "Certainly," replies ChatGPT, in recommending five sources that "should provide you with valuable information and guidance."

Image 1: An early, hallucinatory chat with ChatGPT-3.5

As Mollick would explain months later: “it is very easy for the AI to ‘hallucinate’ and generate plausible facts. It can generate entirely false content that is utterly convincing. Let me emphasize that: AI lies continuously and well. Every fact or piece of information it tells you may be incorrect. You will need to check it all.”

The fabrications and limitations of chatbots lacking real-time access to the ever-expanding web have by now been well documented. But as an early adopter, I found the speed and confidence ChatGPT brought to the task of inventing and describing fake sources unnerving. And without better guideposts for verification, I expect students less familiar with the evolution of AI will continue to experience confusion, or worse. As The Washington Post recently reported, chatbots can easily say offensive things and act in culturally biased ways—”a reminder that they’ve ingested some of the ugliest material the internet has to offer, and they lack the independent judgment to filter that out.”

Just how, exactly, we’re supposed to “check it all” happens to be the subject of a lively, forthcoming collaboration from two education researchers who have been following the intersection of new media and misinformation for decades.

Stay tuned for an upcoming post with the second installment of “Media Literacy in the Age of AI,” a review of Verified: How to Think Straight, Get Duped Less, and Make Better Decisions about What to Believe Online by Mike Caulfield and Sam Wineburg (University of Chicago Press, November 2023).

References

Mollick, E. (2023, July 15). How to use AI to do stuff: An opinionated guide. One Useful Thing.

Wroe, T., & Volckens, J. (2022, January). Creating collaborative classroom norms. Office of Faculty Affairs, University of Colorado Boulder.

Yu Chen, S., Tenjarla, R., Oremus, W., & Harris, T. (2023, August 31). How to talk to an AI chatbot. The Washington Post.

Introduction

We’ve all heard by now of ChatGPT, the large language model-based chatbot that can seemingly answer almost any question you present to it. What if there were a way to provide this functionality to students on their learning management system, and it could answer questions they had about course content? Sure, this would not completely replace the instructor, nor would it be intended to. Instead, for quick course content questions, a chatbot with access to all course materials could provide students with speedy feedback and clarifications in far less time than the standard turnaround required through the usual channels. Of course, more involved questions about assignments and course content questions outside of the scope of course materials would be more suited to the instructor, and the exact usage of a tool like this would need to be explained, as with anything.

Such a tool could be a useful addition to an online course: not only could it save a lot of time, but it could also keep students on the learning platform rather than turning to a third-party chatbot, as many are suspected to do now.

To find out what this would look like, I researched a bit on potential LLM chatbot candidates, and came up with a plan to integrate one into a Canvas page.

Disclaimer!
This is simply a proof of concept, and is not in production due to certain unknowns such as origin of the initial training data, CPU-bound performance, and pedagogical implications. See the Limitations and Considerations section for more details.

How it works

The main powerhouse behind this is an open-source project called privateGPT, which runs a large language model (LLM) locally. privateGPT is designed to let you “ask questions to your documents” offline, with privacy as the goal. It therefore seemed like the best way to test this concept out. The owner of the privateGPT repository, Iván Martínez, notes that privacy is prioritized over accuracy. To quote the ReadMe file from GitHub:

100% private, no data leaves your execution environment at any point. You can ingest documents and ask questions without an internet connection!

privateGPT, GitHub Site

privateGPT, at the time of writing, was licensed under the Apache-2.0 license, but during this test, no modifications were made to the privateGPT code. Initially, when you run privateGPT, train it on your documents, and ask it questions, you are doing all of this locally through a command line interface in a terminal window. This obviously will not do if we want to integrate it into something like Canvas, so additional tools needed to be built to bridge the gap.

I therefore set about making two additional pieces of software: a web-interface chat box that would later be embedded into a Canvas page, and a small application to connect what the student would type in the chat box to privateGPT, then strip irrelevant data from its response (such as redundant words like “answer” or listing the source documents for the answer) and push that back to the chat box.

A diagram showing how the front-end of the system (what the user sees) interacts with the back-end of the system (what the user does not see). Self-creation.
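As a rough illustration of that middleware step, here is a minimal sketch of the response-cleaning function. The “Answer:” label and “> source” lines are my assumptions about the raw output format; privateGPT’s actual formatting may differ.

```python
# Hypothetical cleanup of a raw privateGPT reply before it is pushed
# back to the chat box. The label conventions below are assumptions.
def clean_response(raw: str) -> str:
    """Strip the 'Answer' label and any trailing source-document listing."""
    text = raw.strip()
    if text.lower().startswith("answer:"):
        text = text[len("answer:"):].strip()
    # Drop source-document lines appended after the answer body
    lines = [ln for ln in text.splitlines() if not ln.lstrip().startswith("> source")]
    return "\n".join(lines).strip()
```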

Once created, the web interface portion, running locally, allows us to plug it into a Canvas page, like so:

A screenshot showing regular Canvas text on the left, and the chat box interface on the right, connected to the LLM.

Testing how it works

To begin, I let the LLM ‘ingest’ the Ecampus Essentials document provided to course developers on the Ecampus website. Then I asked some questions to test it out, one of which was: “What are the Ecampus Essentials?”

I am not sure what I expected here, as it is quite an open-ended question, only that it would scan its trained model data and the ingested files looking for an answer. After a while (edited for time) the bot responded:

A video showing the result of asking the bot “What are the Ecampus Essentials?”

A successful result! It has indeed pulled text from the Ecampus Essentials document, but interestingly has also paraphrased certain parts of it as well. Perhaps this is down to the amount of text it is capable of generating, along with the model that was initially selected.

A longer text example

So what happens if you give it a longer text, such as an OpenStax textbook? Would it be able to answer questions students might have about course content inside the book?

To find out, I gave the chatbot the OpenStax textbook Calculus 1, which you can download for free at the OpenStax website. No modifications were made to this text.

Then I asked the chatbot some calculus questions to see what it came up with:

Asking two questions about certain topics in the OpenStax Calculus 1 book.

It would appear that if students had any questions about mathematical theory, they could get a nice (and potentially accurate) summary from a chatbot such as this. Though this brings up some pedagogical considerations: would this make students less likely to read textbooks? Would this be able to search for answers to quiz questions and/or assignment problems? It is already common to ask ChatGPT to provide summaries and discussion board replies; would this bot function in much the same way?

Asking the chatbot to calculate things, however, is where one runs into the current limitations of the program, as it is not designed for that. Simple sums such as “1 + 1” return the correct answer, as this is part of the training data or otherwise common knowledge. Asking it to do something like calculate the hypotenuse of a triangle using Pythagoras’ theorem will not be successful (even using a textbook example of 3² + 4² = c²). The bot will attempt to give an answer, but its accuracy will vary wildly based on the data given to it. I could not get it to give me the correct response, but that was expected, as this was not in the ingested documentation.
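For reference, the textbook example the bot could not solve reduces to a one-line computation, which underlines how different retrieval-style answering is from actual calculation:

```python
# The 3-4-5 right triangle: the answer the chatbot could not produce.
import math

c = math.hypot(3, 4)  # equivalent to sqrt(3**2 + 4**2)
print(c)  # 5.0
```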

Limitations and Considerations

OK, so it’s not all perfect – far from it, in fact! The version of privateGPT I was using, while impressive, had some interesting quirks in certain responses. Responses were never identical either, but perhaps that is to be expected from a generative LLM. Still, this would require further investigation and testing in a production-ready model.

How regular and substantive interaction (RSI) might be affected is an important point to consider; without prior planning around intended usage, a more capable chatbot could cut into the student-instructor Q&A that normally happens on discussion boards.

A major technical issue was that I was limited to using the central processing unit (CPU) instead of the much faster graphics processing unit (GPU) used in other LLMs and generative AI tools. This meant that the time between the question being sent and the answer being generated was far higher than desired. As of writing, there appears to be a way to switch privateGPT to GPU instead, which would greatly increase performance on systems with a modern GPU. The processing power required for a chatbot that more than one user would interact with simultaneously would be substantial.

Additionally, the incorporation of a chatbot like this has some other pedagogical implications, such as how the program would respond to questions related to assignment answers, which would need to be researched.

We also need to consider the technical skill required to create and maintain a chatbot. Despite going through all of this, I am no artificial intelligence or machine learning expert; a dedicated team would be required to keep the chatbot functioning to a high enough standard.

Conclusion

In the end, the purpose of this little project was to test if this could be a tool students might find useful and could help them with content questions faster than contacting the instructor. From the small number of tests I conducted, it is very promising, and perhaps a properly built version could be used as a private alternative to ChatGPT, which is already being used by students for this very purpose. A major limitation was running the program from a single computer with consumer components made 3 years ago. With modern hardware and software – perhaps a first-party integrated version built directly into a learning management system like Canvas – students could be provided with their own course- or platform-specific chatbot for course documents and texts.

If you can see any additional uses, or potential benefits or downsides to something like this, leave a comment!

Notes

  1. Martínez Toro, I., Gallego Vico, D., & Orgaz, P. (2023). PrivateGPT [Computer software]. https://github.com/imartinez/privateGPT.
  2. “Calculus 1” is copyrighted by Rice University and licensed under an Attribution-NonCommercial-Sharealike 4.0 International License (CC BY-NC-SA).

Ashlee M. C. Foster, MSEd | Instructional Design Specialist | Oregon State University Ecampus

This is the final installment of a three-part series on project-based learning. The first two articles, Architecture for Authenticity and Mindful Design, explore the foundational elements of project-based learning. This article shifts our attention to generating practical application ideas for your unique course. This series will conclude with a showcase of an exemplary Ecampus course project. 

Over the last couple of years, as an instructional designer, I have observed my faculty developers shifting how they assess student learning. Frequent and varied low-stakes assessments are replacing high-stakes exams in their courses. Therefore, students increasingly have more opportunities to actively engage in meaningful ways. What an exciting time!

Activity Ideas

Instructors commonly express that adopting a new, emerging, or unfamiliar pedagogical approach can be challenging for two reasons: 1) identifying an appropriate activity and 2) thoughtfully designing the activity into a course. Sometimes a brainstorming session is just the ticket. Here are a few activity ideas to get you started.

Oral History
Description: Students pose a problem steeped in historical significance (e.g., racism). Students conduct research using primary sources that corroborate and contextualize the issue. Experts and/or those with direct or indirect experience are interviewed, and the interviews are documented with multimedia.
Resource: Oregon State University SCARC Oral History Program

Renewable Course Materials
Description: Students write, design, and edit a course website that takes the place of a course textbook.
Resource: Open Pedagogy Notebook

App Lab
Description: Students collaboratively design an application that will serve a relevant societal need, resolve barriers, or fix a problem.
Resource: CODE

Problem Solved!
Description: Students select a problem that affects the local, regional, state, national, or global community and conduct research. Students collaboratively create scenarios that authentically contextualize the problem. Students develop solutions that utilize the main course concepts while engaging with the problem in a real-world context.
Resource: Oregon State University Bioenergy Summer Bridge Program

GenderMag Project
Description: GenderMag is a process that guides individuals or groups through any form of technology (e.g., websites, software, systems) to find gender-inclusivity “bugs.” After going through the GenderMag process, the investigators can provide recommendations and fix the bugs.
Resource: The GenderMag Project

Take a moment to explore a few of the following resources for additional project ideas:

While exploring project-based activities and/or assessments, it may also be helpful to consider the following questions: 

  • Does this activity align with the course learning outcomes? 
  • What type of prerequisite knowledge and skills do students need?
  • What types of knowledge and skills will students need after completing the project?
  • Can the activity be modified/customized to fit the needs of the course?
  • What strategies will be employed to foster authentic learning? 
  • What strategies can be used to guide and/or coach teams through the activity?
  • How will the activity foster equitable engagement and active participation?
  • What strategies can be utilized to nurture and build a strong learning community?

Project Spotlight

Becky Crandall

Becky is an Associate Professor of Practice in the Adult and Higher Education (AHE) program at Oregon State University. We had the pleasure of collaborating on the Ecampus course development for AHE 623, Contemporary Issues in Higher Education. With two decades of experience in postsecondary settings, Becky came to the table with a wealth of knowledge, expertise, and strong perspectives grounded in social justice, all of which situated her to create a high-quality, engaging, and inclusive Ecampus course. When I interviewed her for this article, she shared her pedagogical approach to teaching online and hybrid courses, which provides a meaningful context for the project design.

“At the start of every term, I take time to explain the idea that shapes the approach I take as an educator and the expectations that I have of the class: ‘we are a community.’ Inspired by educational heroes like Paulo Freire, bell hooks, and Marcia Baxter Magolda, as well as the excellent teachers who shaped me as a student, I take a constructivist approach to teaching. I also center the ‘so what’ and ‘now what’ of the material we cover through active learning exercises that create space for students to reflect on their learning and its applicability to the real world. Admittedly, such active learning exercises are engaging. Research also highlights their effectiveness as a pedagogical strategy. More importantly, however, they provide a means of disrupting power structures within the classroom (i.e., the students are positioned as experts too), and they serve as mechanisms through which the students and I can bring our full selves to the course.” ~Becky Crandall

The Project

In AHE 623, students complete a term-long project entitled the “Mini-Conference.” The project situates students as the experts, “by disrupting traditional classroom power structures” and provides an opportunity to “simulate the kind of proposal writing and presenting they would do at a professional conference.” The project’s intended goal is to foster deep learning through the exploration of contemporary real-world higher education issues.

Design

The project is a staged design with incremental milestones throughout the 11-week academic term. The project design mimics the process of a professional conference, from proposal to presentation. The project consists of “two elements: (1) a conference proposal that included an abstract, learning outcomes, a literature review, policy and/or practice implications, and a presentation outline and (2) a 20-minute presentation.”  As the term concludes, students deliver the presentation (i.e., conference workshop) that actively engages the audience with the self-selected topic. Students have varied opportunities to receive peer and instructor feedback. The information gleaned from the feedback helps to refine student proposals for submission to a professional organization.

Becky shared how she conceptualized and designed the project using backward design principles. “Specifically, I began by considering the goals of the course and the project. I then researched professional associations’ conference proposal calls to determine what elements to include in the project. When developing learning exercises, I often ask, ‘How might the students use this in the real world?’” This intentional design process results in a project that is strongly aligned, structures learning, and has authentic application.

Project Overview Page

Delivery

The first delivery of AHE 623 launched successfully in spring 2022, with few challenges beyond limited time. “The students engaged fully in the mini-conference. As reflected in the outcomes, they not only learned but were left hungry for more.” Requests flew in for additional opportunities to apply what they had learned! The students raved about the project, even asking whether they could host a virtual conference using their presentations. The project proved to be a transformational experience for students. “Multiple students noted that this opportunity helped them refine their dissertation ideas and related skills.” Looking forward, Becky hopes to restructure the design into a rotating roundtable format, which will ensure that students are exposed to their peers’ perspectives in the course.

Remember that course design and development is an iterative process. Please know you do not have to get it right the first time or even the tenth. Your students do value your enthusiasm for the subject and appreciate the effort you have put into crafting valuable learning experiences for them. You have got this!

Inspire!

Visit the Ecampus Course Development and Training team blog for application tips, course development and design resources, online learning best practices and standards, and emerging trends in Higher Education. We look forward to seeing you there.

Acknowledgments

Dr. Becky Crandall, thank you for candidly sharing your core pedagogical approaches, philosophy of teaching, and the course project with the Oregon State community. Your commitment to social justice continues to shine in your course designs and instructional delivery.

I was recently assigned to be the Instructional Designer for an introductory programming course here at OSU. While working with the instructor, I was happy to see his inventiveness in assessment design. As one example, the instructor created an assignment to introduce loops, blocks of code in a computer program that repeat while a condition is true. Here’s how he described the assignment to the students:

Your assignment is to simulate the progression of a zombie epidemic as it spreads through Portland, Oregon, beginning in the year 2001 (which was about the time that zombies became unnervingly popular). This assignment will test whether you can use loops when translating from a problem to a computational solution.

(Scaffidi, 2019)
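A condition-controlled loop is a natural fit for this kind of year-by-year simulation. The sketch below is purely hypothetical (it is not the instructor’s starter code, and every number in it is invented for illustration); it simply shows how a student might model the outbreak with a `while` loop:

```python
# Hypothetical sketch of the zombie-epidemic exercise; all parameters
# are invented for illustration and are not from the actual assignment.
population = 580_000   # made-up human population of Portland in 2001
zombies = 1            # the outbreak starts with a single zombie
year = 2001

# The loop body repeats while the condition is true:
# humans remain, and there are zombies to spread the infection.
while population > 0 and zombies > 0:
    infected = min(population, zombies * 2)  # each zombie bites ~2 humans per year
    population -= infected
    zombies += infected
    year += 1

print(f"The last human falls in {year}.")  # prints: The last human falls in 2014.
```

Tracing a few iterations by hand (1 zombie becomes 3, then 9, then 27, and so on) is a good way for students to convince themselves that the loop’s condition eventually becomes false and the program terminates.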

I was excited about the design possibilities this introduced to a usually dry topic. Zombies! I built the page in our LMS, Canvas, and was excited to review it with him.

“Isn’t this fun?” I asked, showing him the assignment page I had created:

Zombie epidemic programming assignment introduction

“I guess so,” he said. “Is there any research to indicate that decorative graphics support learning?” I guess that’s fair to ask, even if it was a bit of a buzzkill.

I had no idea if including cool pictures was a research-based best practice in online course design. While I really wanted it to be true and felt like it should be true, I could not immediately cite peer-reviewed studies that supported the use of zombie images to improve learner engagement; I had never seen such research. But, I was determined to look before our next meeting.

The instructor’s research challenge led me to discover Research Rabbit, a relatively new online platform that helps users find academic research. Research Rabbit has users organize the research they find into collections; as articles are added to a collection, it identifies related research.

Without realizing how much time had passed, I spent four hours wholly engrossed in the search to justify including a zombie picture in one assignment for one instructor. Below, I will share a few of the features that won me over and why I continue to use Research Rabbit regularly.

Why I love Research Rabbit

Visualization of Search Results

Rather than combing through reference lists at the bottom of a paper, you can quickly view any works cited by a paper you have selected or change views and get a list of articles that have cited the selected document. Those results are presented in a list view, a network view, or on a timeline.

A Tool for Discovery

Research Rabbit starts generating suggested additions as soon as you add a paper to a collection. The more papers you add, the more accurate these recommendations become. It works somewhat like personalized Netflix or Spotify recommendations (ResearchRabbit, n.d.), helping you discover research you may not have been aware of in this same area of study.

Using its discovery functionality, you can identify clusters of researchers (those who have published together or frequently cite each other’s work). You can also use the “Earlier Work” option to see when research on a particular topic may have started and identify foundational papers in the field. Looking for “Later Work” helps you find the latest research and stay current on your research topic.

Free Forever

The Research Rabbit founders explain their reasoning for keeping their tool Free Forever as follows:

Why? It’s simple, really.

Researchers commit years of time, energy, and more to advance human knowledge. Our job is to help you discover work that is relevant, not to sell your work back to you.

(Research Rabbit FAQ)

Research Rabbit Syncs Collections to Zotero

I would have lost a lot of enthusiasm for Research Rabbit if I had to manually add each new paper to my Zotero collection. But Research Rabbit integrates with Zotero, and automatically syncs any designated collections. If you use a different reference tool, you can also export Research Rabbit collections in common bibliographic formats.

A Tool for Sharing and Collaboration

Once you have created a collection, you can invite other researchers to view or edit a collection based on the permissions you set. Collaborators can also add comments to individual items. Research Rabbit also gives you an opportunity to create public collections that can be shared with a custom link.

How to Explore Research Rabbit on Your Own

The feature set of Research Rabbit is beautifully demoed on the Research Rabbit website. From there, you can explore how to visualize papers, discover author networks, and start building collections. There is also a growing list of introductory and instructional videos by the academic community online.

So What Happened with the Zombies?

You can review some of the research yourself by checking out my Research Rabbit Collection of Articles on Visual Design in Online Learning. Much to my delight, after conducting my (4-hour) search, I did find some research-based evidence that aesthetics improve engagement and recall (Grant-Smith et al., 2019). Many of the studies, however, also suggested that visuals in online courses should have some instructional function and help communicate ideas to avoid cognitive overload (Rademacher, 2019).

Maybe next time, I’ll suggest embedding this:

A flowchart of a conditional loop feature Zombie images.
Zombie Images by Freepik

References

Grant-Smith, D., Donnet, T., Macaulay, J., & Chapman, R. A. (2019). Principles and practices for enhanced visual design in virtual learning environments: Do looks matter in student engagement? https://doi.org/10.4018/978-1-5225-5769-2.ch005

Rademacher, C. (2019, May 13). Value of Images in Online Learning. Ecampus Course Development & Training. http://blogs.oregonstate.edu/inspire/2019/05/13/the-value-of-images-in-online-learning/

Research Rabbit FAQ. (n.d.). [Online tool]. Research Rabbit. Retrieved October 3, 2022, from https://researchrabbit.notion.site/Welcome-to-the-FAQ-c33b4a61e453431482015e27e8af40d5

ResearchRabbit. (n.d.). ResearchRabbit. Retrieved October 4, 2022, from https://www.researchrabbit.ai

Scaffidi, C. (2019). CS 201: Computer Programming for Non-CS Majors.

In Dr. Freeman Hrabowski’s TED Talk “4 Pillars of College Success in Science”, he told the story of Nobel laureate Isidor Isaac Rabi’s mother’s famous question: Did you ask a good question today? Let’s pause for a minute and reflect: What is a good question? What questions do you ask most frequently? What questions do your students or children ask most?

Types of Questions

Teachers usually encourage students to ask questions. However, Dr. Peter Liljedahl, author of “Building Thinking Classrooms in Mathematics” and professor of mathematics education at Simon Fraser University in Canada, points out that not all questions need to be, or should be, answered directly. According to Liljedahl, there are three types of questions, and only one type requires direct answers. Liljedahl categorizes questions in K-12 mathematics classrooms into the following three types:

  1. Proximity Questions
  2. Stop Thinking Questions
  3. Keep Thinking Questions (Liljedahl, 2020)
Building Thinking Classrooms Book Cover

Proximity questions are, as the name suggests, questions students ask when the teacher is close by. Liljedahl’s research showed that the information gained from such proximity questions was not being used at all. Stop-thinking questions are questions students ask just to get the teacher to do the thinking for them, in the hope that the teacher will answer and they can stop thinking, such as “Is this right?”, “Do we have to learn this?”, or “Is this going to be on the test?” Unlike the first two types, keep-thinking questions are often clarification questions or questions about extensions the students want to pursue. According to Liljedahl, if you have an authentic, level-appropriate task for students to work on, 90% of the questions asked are proximity or stop-thinking questions, and only 10% are keep-thinking questions. Liljedahl points out that answering proximity and stop-thinking questions is harmful to learning because it stops students from thinking.

Next, how could teachers differentiate the types of questions being asked? Liljedahl offers a simple solution to separate keep-thinking questions from the other two types of questions: Are they asking for more activity or less, more work or less, more thinking or less?

After differentiating the types of questions, what should teachers do with these proximity and stop-thinking questions? Ignore them? No, not at all! Liljedahl emphasizes that there is a big difference between having students’ questions heard but not answered and having their questions not heard at all. How, then, should teachers answer these proximity and stop-thinking questions?

Ten Things to Say to Proximity And Stop-Thinking Questions

Liljedahl provides the following list of ten responses to a proximity or stop-thinking question so that you are not giving away the answer and taking the thinking opportunity away from students. Basically, you turn the questions back to your students!

  1. Isn’t that interesting?
  2. Can you find something else?
  3. Can you show me how you did that?
  4. Is that always true?
  5. Why do you think that is?
  6. Are you sure?
  7. Does that make sense?
  8. Why don’t you try something else?
  9. Why don’t you try another one?
  10. Are you asking me or telling me? (Liljedahl, 2021, p. 90)

Cross-Discipline Nature of Good Questions

“Building Thinking Classrooms” was recommended to me by some college biology teachers in the US. Biology teachers recommending a math teaching book: isn’t that interesting? The reason behind this recommendation is that the techniques taught in this book can easily be applied to any other teaching context to get your students engaged in thinking, whether in K-12 or college education, in math or in another subject.

If this brief introduction has you interested in reading the book and finding out what else the author has to share, it is available through the Oregon State University library as an ebook, or you can purchase it online.

Asking Good Questions for Management and Education Administration

If you are not directly involved in teaching and learning but serve in an administrative or management role in an organization, Dr. Amy Edmondson has some practical suggestions for asking good questions to keep an organization growing healthily. Edmondson, author of “The Fearless Organization” and Novartis Professor of Leadership and Management at the Harvard Business School, states that good questions focus on what matters, invite careful thought, and give people room to respond. Edmondson also suggests three strategies for framing good questions:

  1. Broaden the discussion. For example: What do others think?
  2. Ask what we are missing. For example: What other options could we consider?
  3. Ask how XXX (such as our role model, our mentor, or our competitor) would approach this. For example: Who has a different perspective?

With the above tips for asking questions, are you ready to ask a good question today?

References

Edmondson, A. (2018). The Fearless Organization: Creating Psychological Safety in the Workplace for Learning, Innovation and Growth. Hoboken, NJ: John Wiley & Sons, Inc.

Hrabowski, F. (2013). 4 Pillars of College Success in Science. TED Talk. https://www.ted.com/talks/freeman_hrabowski_4_pillars_of_college_success_in_science?language=en

Liljedahl, P. (2020). Building Thinking Classrooms in Mathematics, Grades K-12: 14 Teaching Practices for Enhancing Learning. Thousand Oaks, CA: Corwin.

This is a guest post by Ecampus Instructional Design Intern Chandler Gianattasio.

At DePaul School for Dyslexia, I teach 5th-8th graders conceptual mathematics, ranging from basic number sense to advanced topics in Algebra 1. Through this experience, I discovered something I had never heard of before: a learning disability that commonly co-occurs with dyslexia, called dyscalculia. Many people describe dyscalculia as “dyslexia for math.” Dyscalculia affects one’s ability to take in mathematical information, connect with and build upon prior concepts learned, discern cues for application, and accurately retrieve information. DePaul provides an alternative education for students with learning disabilities, emphasizing explicit instruction to best support its students. In this post, I will discuss the common deficits that make up dyscalculia, where it falls in the realm of disabilities, and some ways we can accommodate students with dyscalculia in higher education.

Having dyscalculia can be debilitating. It can make it seem nearly impossible to keep up with neurotypical classmates, especially when your class is on a fast-paced schedule and when you are subliminally being told that asking for extra help will make you appear lazy, unintelligent, and unable to help yourself, and will just be one more problem for the instructor to resolve. In Figure 1 (seen below), I have divided the deficits of dyscalculia into five categories: executive functioning deficits1, auditory processing deficits2, nonverbal learning deficits3, language processing deficits4, and visual-spatial deficits5. Each category has a set of commonly experienced difficulties below it; however, these lists of difficulties are not exhaustive.

Figure 1. “Common Challenges Faced by Learners with Dyscalculia” By Chandler Gianattasio – CC BY-NC.

Looking at some of the common symptoms of dyscalculia, you may be thinking that many of these symptoms are also found in other disabilities, such as dyslexia, autism, ADHD, and more, and you would be correct. There is a large amount of overlap among developmental disabilities, and each disabled individual presents with their own unique combination of symptoms and, often, coexisting disabilities. To understand where dyscalculia falls within the world of developmental disabilities, I referred to the Individuals with Disabilities Education Act (IDEA) from the Department of Education. IDEA distinguishes 13 categories of disability:

  1. [Specific] Learning Disability ([S]LD)
  2. Other Health Impairment (conditions that limit a child’s strength, energy or alertness)
    • ADHD, EFD, NVLD
  3. Autism Spectrum Disorder
  4. Emotional Disturbance 
    • Generalized Anxiety Disorder, Bipolar Personality Disorder, Obsessive-Compulsive Disorder, Major Depressive Disorder, etc. 
  5. Speech or Language Impairment 
    • LPD
  6. Visual Impairment (including blindness)
    • VPD
  7. Deafness
  8. Hearing Impairment 
    • APD
  9. Deaf-Blindness
  10. Orthopedic Impairment
  11. Intellectual Disability
  12. Traumatic Brain Injury
  13. Multiple Disabilities 

From this list, you may notice that only three diagnoses are recognized as LDs: dyslexia, a reading disability; dysgraphia, a writing disability; and dyscalculia, a mathematical disability. All three of these LDs, as well as the majority of the other disabilities, share very similar biological limitations, each with the potential for a visual-spatial deficit, language processing deficit, nonverbal learning deficit, auditory processing deficit, and executive functioning deficit. Because individuals within each diagnosis have their own unique combination of symptoms, some may not have deficits in every one of these areas. Figure 2 represents the potential combinations of deficits an individual with each of these LDs may have. For instance, if you were to draw a straight line from dyslexia to the outer edge of the figure, the deficits the line intersects would represent the profile of one individual. This figure shows the fluidity between different diagnoses and how easily co-existing conditions occur, due to a very similar underlying makeup.

Figure 2. “Potential Combinations of Deficits Behind Each Learning Disability” By Chandler  Gianattasio, CC BY-NC-SA.

Figure 3, shown below, reiterates the significant overlap found between disabilities. Looking specifically at this study, weaker listening skills, a characteristic originally classified as an APD trait, are also very prevalent in individuals with LPD, dyslexia, ADHD, and other LDs.

Auditory Processing Disorder (APD), Specific Language Impairment (SLI),  Learning Disorders (LD), Attention Deficit Hyperactivity Disorder (ADHD), Autism Spectrum Disorders (ASD). [Colors shown in the charts do not correlate with Figures 1 & 2].

Figure 3. “Same or Different: The Overlap Between Children with Auditory Processing Disorders and Children with Other Developmental Disorders: A Systematic Review” By Ellen de Wit, et al., CC BY-NC-ND.

Supportive Instructional Strategies

Supporting Executive Functioning Deficits

Students struggling with executive functioning often become overwhelmed by external or internal stimuli. They often struggle with rejection sensitivity dysphoria, emotional dysregulation, and “time blindness.” The most significant way you can support these students is by creating a safe, non-judgemental space for them to communicate with you and by encouraging them to always self-advocate, no matter what. You can further support students navigating executive functioning difficulties by doing the following:

  • Providing easily accessible reminders of important events and assignment due dates. Time management can be a major difficulty, especially with projects over an extended period of time.
  • Giving lots of positive reinforcement, checking in frequently, and having regularly scheduled meetings.
  • Pointing out any prior knowledge that is being built upon, and have them answer any questions based on prior knowledge that they’ve mastered in class to boost their confidence and increase their motivation.
  • Reducing extraneous stimuli and eliminating background noise as much as possible – whether that be in their environment, on a worksheet, in a presentation, in reference material, etc. 

Supporting Visual-Spatial Deficits

Students with visual-spatial deficits often struggle with creating their own visual representations of concepts being discussed, especially abstract or microscopic concepts. When navigating directions, these students also struggle with orientation – both cardinal and left versus right. This often presents when they are performing calculations with negative numbers. Most students with dyscalculia learn adding and subtracting via number lines that they manually traverse to understand the connection between operations properly. To help your students who have visual-spatial difficulties, you could offer the following:

  • Providing tangible manipulatives that they can touch and physically move in order to see how parts function together 
    • This could be provided when introducing a new concept as an addition to 2D drawings or descriptions
    • It’s always beneficial to have manipulatives available for your students whether that’s physical objects for in-person sessions or interactive virtual manipulatives for online sessions.
  • Providing graph paper, templates, and/or graphic organizers can be highly beneficial to your students to organize their thoughts and break up information. 
  • Integrating as much UDL in as possible! Presenting information in multiple formats and allowing the students to demonstrate their learning in multiple ways will allow these students to participate in class and showcase their knowledge confidently. 

Supporting Auditory Processing Deficits

Students with an auditory processing deficit struggle to comprehend directions and content from listening. The most significant way you can help your students who struggle with auditory processing is by doing the following:

  • Speaking clearly. 
  • Trying your best not to explain things too quickly.
  • Being very consistent with the terminology you use (try not to use multiple names for one entity or idea).
  • Checking in with students often to see if they understand what is being taught or asked of them.
  • Always encouraging students to ask questions as they work.

Supporting Nonverbal Learning Deficits

Students with nonverbal learning deficits are very literal and struggle to see the overall picture, especially when it comes to abstract concepts. You can support students with an NVLD by doing the following:

  • Creating a lot of associations and parallels between content and what they already know (common, everyday nuances) could help these individuals a lot. 
  • Teaching concepts alongside any procedural knowledge you want them to retain will help them significantly, as they understand the “why” behind the steps. 
  • Making sure to provide a lot of concrete examples, especially when introducing a new topic. 
  • Structuring classes, making them as consistent and repetitive as possible. If they know what to expect and they are confident that they know how to handle it, these students will thrive. 

Supporting Language Processing Deficits

Language processing deficits very commonly occur alongside auditory processing deficits. Students with this difficulty struggle with comprehending directions and content both from listening and reading. To help support these students, try the following:

  • Keeping language used simple and consistent.
  • Ensuring adequate (and modifiable if possible) background to text contrast/color.
  • Using 1.5+ line spacing and increased spacing between letters or symbols.
  • Chunking content and utilizing bullet points when possible.
  • Using consistent color associations with certain topics can go a long way (e.g., red for negative and blue for positive).
  • Starting lessons off with a graphic organizer or outline, showing how ideas fit together.
  • Including simple diagrams to illustrate concepts or procedures.
  • Highlighting keywords, numbers in word problems, or other important information you want to make sure they see.
  • Creating accessible asynchronous/recorded lectures. 
  • Providing access to pre-written notes, “cheat sheets” displaying steps and formulas needed and worked-out sample problems so students can see what they are to do.

Remember that improving the learning experiences of our learners with special needs almost never comes at a cost to the “typical” learner – improving access, accessibility, and support for one improves these areas for all.

Available Accommodations at OSU:

In the Comments Below, Tell Me: 

Do you have any experience designing for learners with dyscalculia? What are some strategies you have found beneficial?

Resources to Explore:

Information shown in Figures 1 & 2  derived from the following sources:

“Dyscalculia: neuroscience and education” By Liane Kaufmann; “Double dissociation of functions in developmental dyslexia and dyscalculia” By O. Rubinsten & A. Henik; “Numerical estimation in adults with and without developmental dyscalculia” By S. Mejias, J. Grégoire, M. Noël; “A general number-to-space mapping deficit in developmental dyscalculia” By S. Huber, et al.; “Developmental Dyscalculia in Adults: Beyond Numerical Magnitude Impairment” By A. de Visscher, et al.; “Working Memory Limitations in Mathematics Learning: Their Development, Assessment, and Remediation” By Daniel Berch; “Learning Styles and Dyslexia Types-Understanding Their Relationship and its Benefits in Adaptive E-learning Systems” By A. Y. Alsobhi & K. H. Alyoubi; “The Cognitive Profile of Math Difficulties: A Meta-Analysis Based on Clinical Criteria” By S. Haberstroh & G. Schulte-Körne; “Mathematical Difficulties in Nonverbal Learning Disability or Co-Morbid Dyscalculia and Dyslexia” By I. Mammarella, et al.; “Developmental dyscalculia is related to visuo-spatial memory and inhibition impairment” By D. Szucs, et al.; “Dyscalculia and the Calculating Brain” By Isabelle Rapin

The categories of potential difficulties one might experience were determined by the symptoms commonly seen in the following disorders:

  1. Executive Functioning Disorder (EFD)
  2. Auditory Processing Disorder (APD)
  3. Nonverbal Learning Disorder (NVLD)
  4. Language Processing Disorder (LPD)
  5. Visual Processing Disorder (VPD)

The concept of resilient teaching has come to the forefront in the 2-plus years since the COVID-19 pandemic suddenly and radically altered the landscape of higher education. As faculty, students, and administrators devise strategies to cope with the myriad changes brought about by the pandemic, questions are ubiquitous about how to best support students and colleagues; how to adapt to changes in perceptions, practices, and expectations regarding teaching and learning; and how to avoid burnout.

Definitions and Scope

What is resilience? Resilience, in the physical sciences, refers to “the capability of a strained body to recover its size and shape after deformation caused especially by compressive stress.” Resilience, in a psychological sense, is “the process and outcome of successfully adapting to difficult or challenging life experiences, especially through mental, emotional, and behavioral flexibility and adjustment to external and internal demands.” And a classic definition of ecological resilience states, “Resilience is the capacity of complex systems of people and nature to withstand disturbance without shifting into an alternate regime, or a different type of system organized around different processes and structures.” 

How about resilience in teaching? In the early months of the pandemic, Rebecca Quintana and James DeVaney of the University of Michigan Center for Academic Innovation posited an emergent definition of resilient teaching:

The ability to facilitate learning experiences that are designed to be adaptable to fluctuating conditions and disruptions. This teaching ability can be seen as an outcome of a design approach that attends to the relationship between learning goals and activities, and the environments they are situated in. Resilient teaching approaches take into account how a dynamic learning context may require new forms of interactions between teachers, students, content, and tools. 

As Quintana and DeVaney note, resilient teaching goes far beyond pedagogy per se. Narrowly construed, resilient teaching might be seen as coping with the pivot to emergency remote teaching. More broadly, it stretches to encompass course design, the well-being of students and faculty, and the capacity of instructors to avoid burnout and sustain productive careers, all while attending to student success, equity, and inclusion.

Need

Because fully asynchronous online teaching–the mode of most Ecampus courses–doesn’t involve synchronous class meetings, it may receive less attention than on-campus teaching in conversations about resilient teaching. However, the manifold disruptions created by COVID-19, along with the importance of instructor and student wellness, are profoundly applicable to asynchronous online learning. Resilient teaching is essential in every teaching and learning modality.

How serious is the need for resilient teaching and learning? Nationally, faculty report A Stunning Level of Student Disconnection, and both faculty and administrators see the value of supporting students through trauma-informed teaching practices. Many institutions are also concerned about The Great Faculty Disengagement and anecdotal evidence that many faculty are Calling It Quits. UC Irvine has even created a new position for a pedagogical wellness specialist both to support the well-being of faculty and graduate teaching assistants and to train them in teaching practices that support student well-being. 

OSU Activities

An ongoing series of activities at Oregon State University is focusing on resilient teaching:

  • Inara Scott, Senior Associate Dean in the College of Business, has written about Increasing Resilience through Modular Teaching. Scott has proposed modular course design of on-campus courses as a way to build in future flexibility:

Modular teaching allows us to transition quickly from in-person to remote synchronous or HyFlex teaching; it also creates pathways for addressing quarantines and family emergencies–both our own and our students’. Finally, it lays the foundation for future blended learning experiences, where students might learn in a hybrid format with both in-person and remote online elements. 

Scott’s modular approach is in keeping with Ecampus online and hybrid course design principles, using backward course design to align learning activities, assignments, and assessments with course learning outcomes. Scott urges instructors to rethink activities that are “modality limited” to synchronous classroom delivery, and to be prepared for the exigencies of asynchronous or remote delivery. 

  • In April 2022, the OSU Center for Teaching and Learning sponsored a Resilient Teaching Symposium. In the keynote, Inara Scott discussed exhaustion, cynical detachment, and reduced sense of efficacy as distinct types of burnout that could affect teaching faculty. Scott offered the modular design approach as a tool to build teaching resilience. 

Symposium participants reflected in small groups on their resiliency and evidence they’ve seen of possible burnout. Then they discussed potential strategies to avoid burnout and build resilience into their teaching and their lives. Their resilient teaching suggestions on a participant Jamboard included:

    • Taking stock at the beginning of each term to think about resilience. 
    • Remembering to be kind! Being kind and flexible shows strength, not weakness.
    • Finding ways to connect individually with my students, especially in my online classes where it’s easy to disconnect.
    • Curriculum planning with lots of options to pivot and adapt.
    • Saying “yes” to most things and “no” to others.
  • OSU instructional faculty have been exploring resilient teaching in term-long resilient teaching faculty learning communities. These professional development opportunities are co-sponsored by Academic Technologies and the Center for Teaching and Learning. Participants learn about flexible solutions for teaching challenges, techniques for integrating in-class and online learning activities, and strategies to build resilience in teaching. Sound interesting? See the Call for Participation for the Spring ’24 resilient teaching faculty learning community.

Learn More

Want to learn more about resilient teaching? Recommended starting points:

Resilient teaching and learning will continue to garner much-needed attention as higher education moves through the long wake of the pandemic. What are your strategies for maintaining resilience? Let’s talk about it.

Introduction

Getting students to read the syllabus is often a challenge in online courses. It is not uncommon for students to ask faculty questions whose answers are easily found in the document. Even if students do read the syllabus, they may only skim it. Strategies to encourage a thorough reading include “easter egg” hunts, in which students must find particular items in the syllabus to pass a syllabus quiz. This article explores another method, one that uses a software application called Perusall, which is designed to encourage close reading.

Perusall is used at Oregon State University Ecampus as a learning technology integration with Canvas, the learning management system. Using Perusall, students can highlight text, make comments, and ask questions on a shared document. Perusall provides a grading interface with Canvas and a variety of settings, including reminders for students to complete the assignment. It offers a useful way for students to engage with the syllabus together, which can lead to closer reading than if they had done so individually.

Results

To test this idea, a professor used this approach in a 400/500-level class that involved multiple Perusall assignments throughout the term. If the syllabus assignment proved useful, it would also serve as the students’ introduction to the platform. Here are some examples of student engagement that resulted from this activity:

  • Requests for additional background material to check for prerequisite knowledge.
  • Interest in the website of the professor (linked to in the syllabus).
  • Shoutouts to the course teaching assistant.
  • Concerns about the prerequisites for the class, which the professor addressed directly.
  • Questions about technology used in the course based on students’ previous experiences in other courses.
  • Gratitude for ending the course week on Mondays instead of Sundays.
  • Confirmation by a student that the textbook is available as an electronic copy at the library.
  • Inquiries into the length and other logistics of Zoom office hours.
  • Excitement expressed by a student about a focus paper requirement.
  • Queries about how grade numbers are rounded and types of quiz questions.
  • Exchanges between a TA and a student looking forward to further discussions in Perusall.
  • Clarifications about the different work expected for undergraduates and graduates.
  • Ideas about how to communicate as a class.
  • Questions about the details of major assignments.
  • Appreciation of opportunities to participate in frequent knowledge checks.
  • Thanks for the late assignment policy and statements about flexibility.
  • Advice about how to check assignment due dates.

Conclusions

Students’ comments and conversations helped to initiate a feeling of community in the course. Providing and encouraging a forum for discussion clarified many logistical issues for students. Students highlighted and commented on seven of the syllabus’s ten pages; the three undiscussed pages contained university-required policies. There were no negative comments about using Perusall for a syllabus activity. This seems like a good method to engage students at the beginning of a course and prepare them for success. It may be especially helpful for classes using Perusall in other assignments, because it gives students a chance to practice using the application.

References

  • Johnson. (2006). Best practices in syllabus writing: contents of a learner-centered syllabus. The Journal of Chiropractic Education, 20(2), 139–144. https://doi.org/10.7899/1042-5055-20.2.139
  • Lund Dean, & Fornaciari, C. J. (2014). The 21st-Century Syllabus. Journal of Management Education, 38(5), 724–732. https://doi.org/10.1177/1052562913504764
  • Sager, Azzopardi, W., & Cross, H. (2008). Syllabus selection: innovative learning activity. The Journal of Nursing Education, 47(12), 576.
  • Stein, & Barton, M. H. (2019). The “Easter egg” syllabus: Using hidden content to engage online and blended classroom learners. Communication Teacher, 33(4), 249–255. https://doi.org/10.1080/17404622.2019.1575440
  • Wagner, Smith, K. J., Johnson, C., Hilaire, M. L., & Medina, M. S. (2022). Best Practices in Syllabus Design. American Journal of Pharmaceutical Education, Article 8995. https://doi.org/10.5688/ajpe8995