In the ever-evolving landscape of higher education, online distance learning has emerged as a dynamic and accessible platform for students worldwide. However, with this shift to asynchronous online classrooms, we must prioritize inclusivity and engagement in our educational strategies. Recognizing this need, Ecampus embarked on a journey to understand inclusive course design and teaching practices through the eyes of the learners.

Survey Summary 

In 2021, Ecampus implemented an Inclusive Excellence Strategic Plan. One goal of this plan focused on enhancing inclusive teaching and learning in online courses. As part of this initiative, a pilot study was conducted during the 2022-2023 academic year to develop a mechanism for students to provide feedback on their learning experiences. The study employed a series of weekly surveys designed to elicit responses regarding moments of engagement and distancing within online courses.

Administered across five Ecampus courses, the pilot study garnered responses from 163 enrolled students. The findings provide invaluable insights into the nuances of online learning design and offer actionable recommendations for educators seeking to cultivate inclusive excellence in their own asynchronous, online classrooms. The questions were as follows:

  1. At what moment (point) in class this week were you most engaged as a learner?
  2. At what moment (point) in class this week were you most distanced as a learner?
  3. What else about your experience as a learner this week would you like to share?

These questions were carefully crafted to elicit responses related to diversity, equity, and inclusion (DEI). By using the verbs “engaged” and “distanced,” students were prompted to reflect on moments of connection and disconnection within their learning environments. The open-ended nature of the questions allowed students to provide contextual feedback, offering valuable insights beyond the scope of predefined categories.

The results of the survey provide a multifaceted understanding of students’ experiences in online courses. Across all five courses, certain patterns emerged regarding elements that students found most engaging and most distancing. These insights served as a springboard for the development of actionable recommendations aimed at enhancing course design and fostering inclusive learning environments.

Alignment

One crucial area highlighted by the survey results was the importance of alignment. Students noticed when their courses had assessments that were aligned with course content, and they noticed when this alignment was missing. Ensuring that learning objectives are represented in instructional materials, practice activities, assessments, and evaluation criteria is key. For more on this, please see “Alignment” by Karen Watté from 2017.

Learning Materials

Another prominent theme in the survey responses was the overwhelming nature of long, uncurated lists of readings and learning materials, which tended to alienate learners. To address this, providing a reading guide or highlighting key points can alleviate feelings of overwhelm. Optimizing content presentation and learning activities emerged as a key factor in promoting engagement and inclusivity. 

Incorporating interactive elements such as knowledge checks and practice activities within or between short lectures keeps students actively engaged and reinforces learning objectives. By utilizing multiple modes of content delivery–videos, lectures, and readings–educators can cater to diverse learning styles and preferences. Providing study guides is also noted as an effective strategy for enhancing comprehension and engagement with learning materials. 

Community & Connection

Supporting student-to-student interaction is pivotal in fostering a sense of community and participation (Akyol & Garrison, 2008). Many learners noted that they enjoyed engaging in small group discussions; in fact, 50% of students in one course noted that the week 1 introductory discussion was the point at which they felt most engaged. Additionally, students across the courses were excited to view and respond to the creative work of their peers. Community-building course elements like these foster a sense of community and collaboration within the virtual classroom. 

While some students had mixed feelings about peer review activities, voicing concerns about feeling unqualified to judge their peers’ work, distinct guidelines and rubrics can empower learners to develop critical thinking, increase ownership, and enhance their communication skills. Thus, thoughtfully crafted peer review processes can also help to enhance the educational experience.

Authentic Activities

Incorporating authentic or experiential learning activities was also highlighted in student responses as a means of connecting course content to real-world scenarios. By integrating professional case studies, practical exercises, real-world applications, and reflective activities, educators can deepen students’ understanding of course material. Survey respondents noted again and again how they felt engaged when coursework was relevant and applicable outside the classroom. This type of authentic work in courses can also increase learner motivation (Gulikers, Bastiaens, & Kirschner, 2004).

Timely Feedback & RSI

By offering timely feedback on student work, online educators demonstrate their active presence and assist students in understanding the critical aspects of assessments, ultimately enhancing their chances of success. One student is quoted as saying,

“I really appreciate the involvement of the instructor. In the past I’ve had Ecampus classes where the teacher was doing the bare minimum and didn’t grade things until the last minute so I wasn’t even sure how I was doing in the class until it was almost over. I appreciate the speed at which things have been graded and the feedback I’ve already received. I appreciate the care put into announcements too!”

Timely feedback and time-bound announcements are also notable ways to showcase Regular and Substantive Interaction (RSI). Please also see “Regular & Substantive Interaction in Your Online Course” by Christine Scott.

Scaffolding

Another noteworthy recommendation from the survey findings was the importance of providing scaffolding and support throughout the course. Respondents expressed appreciation for feedback from peers and instructors to improve their writing. One student noted, “When I used my peer review feedback to improve my draft.” Offering additional resources and tutorials for unfamiliar or complex concepts ensures that all students have the support they need to succeed. Moreover, breaking down larger, high-stakes assignments into smaller, manageable tasks can reduce feelings of overwhelm, provide a sense of accomplishment, increase early feedback, and promote overall success. 

Autonomy

Furthermore, offering choice and flexibility in assignments and assessments empowers students to take ownership of their learning journey. Whether it’s offering choice in topics, deliverable types, or exercise formats, providing students with agency fosters a sense of autonomy and engagement. One respondent noted, “I think choosing a project topic was the most engaging part of this week, because allowing students to research things that they are interested [in,] within some constraints is a good way to get them engaged and interested in the topics.” 

Note on Survey Administration

One final takeaway from the study underscores the importance of thoughtful survey administration. While weekly surveys offer robust results, participating faculty indicated that surveying students every week was too frequent. Instead, it is recommended to conduct surveys one to three times throughout the course, striking a balance between gathering insights and respecting students’ time. Additionally, transparent communication about the purpose and use of student feedback is essential for fostering trust and eliciting honest responses. Students should understand that their feedback is valued and how it will be utilized to improve their learning experience in both the current term and future iterations of the course.

Conclusion

Engagement and inclusion in online education is multifaceted and ongoing. By listening to student feedback, implementing actionable recommendations, and fostering a culture of continuous improvement, educators can create transformative learning experiences that empower students to thrive in the digital age. Together, let us embark on this journey towards inclusive excellence, ensuring that every learner has the opportunity to succeed while feeling valued, supported, and empowered to reach their full potential.

References

Akyol, Z., & Garrison, D. R. (2008). The development of a community of inquiry over time in an online course: Understanding the progression and integration of social, cognitive and teaching presence. Journal of Asynchronous Learning Networks, 12(3-4), 3-22. https://doi.org/10.24059/olj.v12i3.72

Gulikers, J.T.M., Bastiaens, T.J. & Kirschner, P.A. (2004). A five-dimensional framework for authentic assessment. ETR&D 52, 67–86. https://doi.org/10.1007/BF02504676 

Scott, C. (2022, November 7). Regular & Substantive Interaction in Your Online Course. Ecampus Course Development & Training Blog. https://blogs.oregonstate.edu/inspire/2022/11/07/regular-substantive-interaction-in-your-online-course/

Watté, K. (2017, January 27). Alignment. Ecampus Course Development & Training Blog. https://blogs.oregonstate.edu/inspire/2017/01/27/alignment/


By Greta Underhill

In my last post, I outlined my search for a computer-assisted qualitative data analysis software (CAQDAS) program that would fit our Research Unit’s needs. We needed a program that would enable our team to collaborate across operating systems, easily adding in new team members as needed, while providing a user-friendly experience without a high learning curve. We also needed something that would adhere to our institution’s IRB requirements for data security and preferred a program that didn’t require a subscription. However, the programs I examined were either subscription-based, too cumbersome, or did not meet our institution’s IRB requirements for data security. It seemed that there just wasn’t a program out there to suit our team’s needs.

However, after weeks of continued searching, I found a YouTube video entitled “Coding Text Using Microsoft Word” (Harold Peach, 2014). At first, I assumed this would show me how to use Word comments to highlight certain text in a transcript, which is a handy function, but what about collating those codes into a table or Excel file? What about tracking which member of the team codes certain text? I assumed this would be an explanation of manual coding using Word, which works fine for some projects, but not for our team.

Picture of a dummy transcript using Lorem Ipsum placeholder text. Sentences are highlighted in red or blue depending upon the user. Highlighted passages have an associated “comment” where users have written codes.

Fortunately, my assumption was wrong. Dr. Harold Peach, Associate Professor of Education at Georgetown College, had developed a Word macro to identify and pull all comments from the Word document into a table (Peach, n.d.). A macro is “a series of commands and instructions that you group together as a single command to accomplish a task automatically” (Create or Run a Macro – Microsoft Support, n.d.). Once downloaded, the “Extract Comments to New Document” macro opens a template and produces a table of the coded information as shown in the image below. The macro identifies the following properties:

  • Page: the page on which the text can be found
  • Comment scope: the text that was coded
  • Comment text: the text contained in the comment; for the purpose of our projects, the code title
  • Author: which member of the team coded the information
  • Date: the date on which the text was coded

Picture of a table of dummy text that was generated from the “Extract Comments to New Document” Macro. The table features the following columns: Page, Comment Scope, Comment Text, Author, and Date.
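For readers curious what a macro like this looks like under the hood, here is a minimal sketch of the general approach. It is not Dr. Peach’s actual macro (which you can download from his site), and the procedure name ExtractCommentsToTable is just an illustrative placeholder; the sketch simply loops over every comment in the active document and writes one row per comment into a table in a new document, mirroring the columns described above.

```vba
' Illustrative sketch only (not Dr. Peach's macro): pull every comment from
' the active Word document into a five-column table in a new document.
Sub ExtractCommentsToTable()
    Dim src As Document, dest As Document
    Dim tbl As Table
    Dim c As Comment
    Dim r As Long

    Set src = ActiveDocument
    If src.Comments.Count = 0 Then Exit Sub   ' nothing to extract

    Set dest = Documents.Add
    Set tbl = dest.Tables.Add(Range:=dest.Range, _
                              NumRows:=src.Comments.Count + 1, NumColumns:=5)

    ' Header row matching the properties listed above
    tbl.Cell(1, 1).Range.Text = "Page"
    tbl.Cell(1, 2).Range.Text = "Comment Scope"
    tbl.Cell(1, 3).Range.Text = "Comment Text"
    tbl.Cell(1, 4).Range.Text = "Author"
    tbl.Cell(1, 5).Range.Text = "Date"

    For r = 1 To src.Comments.Count
        Set c = src.Comments(r)
        tbl.Cell(r + 1, 1).Range.Text = CStr(c.Scope.Information(wdActiveEndPageNumber))
        tbl.Cell(r + 1, 2).Range.Text = c.Scope.Text   ' the passage that was coded
        tbl.Cell(r + 1, 3).Range.Text = c.Range.Text   ' the comment text (the code label)
        tbl.Cell(r + 1, 4).Range.Text = c.Author       ' which team member coded it
        tbl.Cell(r + 1, 5).Range.Text = Format(c.Date, "yyyy-mm-dd")
    Next r
End Sub
```

To try something along these lines, you would paste the code into a macro-enabled template (or your Normal template) and run it from Word’s Macros dialog; Dr. Peach’s downloadable macro handles the details and edge cases for you.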

You can move the data from the Word table into an Excel sheet where you can sort codes for patterns or frequencies, a function that our team was looking for in a program as shown below:

A picture of the dummy text table in an Excel sheet where codes have been sorted and grouped together by code name to establish frequencies.
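If you want to tally code frequencies without sorting by hand, a short Excel macro (or even a worksheet formula) can do the counting for you. The sketch below is a hypothetical helper, not part of Dr. Peach’s macro; it assumes the code labels from the “Comment Text” column landed in column C of the active sheet, so adjust the column letters to match your own layout.

```vba
' Hypothetical helper: count how often each code label appears in column C
' of the pasted table and write the tally to columns E and F.
Sub CountCodeFrequencies()
    Dim ws As Worksheet
    Dim freq As Object          ' Scripting.Dictionary: code label -> count
    Dim lastRow As Long, r As Long, outRow As Long
    Dim code As String
    Dim k As Variant

    Set ws = ActiveSheet
    Set freq = CreateObject("Scripting.Dictionary")
    lastRow = ws.Cells(ws.Rows.Count, "C").End(xlUp).Row

    For r = 2 To lastRow                     ' row 1 is the header
        code = Trim(ws.Cells(r, "C").Value)
        If Len(code) > 0 Then freq(code) = freq(code) + 1
    Next r

    ' Write the tally next to the data
    ws.Cells(1, "E").Value = "Code"
    ws.Cells(1, "F").Value = "Frequency"
    outRow = 2
    For Each k In freq.Keys
        ws.Cells(outRow, "E").Value = k
        ws.Cells(outRow, "F").Value = freq(k)
        outRow = outRow + 1
    Next k
End Sub
```

A plain worksheet formula such as =COUNTIF(C:C, C2), copied down a helper column, accomplishes much the same thing without any code.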

This Word macro was a good fit for our team for many reasons. First, our members could create comments on a Word document regardless of their operating system. Second, we could continue to house our data on our institution’s servers, ensuring our projects met strict IRB data security measures. Third, the Word macro allowed for basic coding features (coding multiple passages multiple times, highlighting coded text, etc.) and had a very low learning curve: teaching someone how to use Word comments. Lastly, our institution provides access to the complete Microsoft Suite, so all team members, including students who would be working on projects, already had access to Word. We contacted our IT department to verify that the macro was safe and to help us download it.

Testing the Word Macro       

Once installed, I tested out the macro with our undergraduate research assistant on a qualitative project and found it to be intuitive and helpful. We coded independently and met multiple times to discuss our work. Eventually we ran the macro, pulled all comments from our data, and moved the macro tables into Excel where we manually merged our work. Through this process, we found some potential drawbacks that could impact certain teams.

First, researchers can view all previous comments, which might influence how teammates code or how second-cycle coding is performed; other programs let you hide previous codes so researchers can come to the text fresh.

Second, coding across paragraphs can create issues with the resulting table; cells merge in ways that make it difficult to sort and filter if moved to Excel, but a quick cleaning of the data took care of this issue.

Lastly, we manually merged our work, negotiating codes and content, as our codes were inductively generated; researchers working on deductive projects may bypass this negotiation and find the process of merging much faster.

Despite these potential drawbacks, we found this macro sufficient for our project as it was free to use, easy to learn, and a helpful way to organize our data. The following table summarizes the pros and cons of this macro.

Pros and Cons of the “Extract Comments to New Document” Word Macro

Pros

  • Easy to learn and use: simply providing comments in a Word document and running the macro
  • Program tracks team member codes which can be helpful in discussions of analysis
  • Team members can code separately by generating separate Word documents, then merge the documents to consensus code
  • Copying Word table to Excel provides a more nuanced look at the data
  • Program works across operating systems
  • Members can house their data in existing structures, not on cloud infrastructures
  • Macro is free to download

Cons

  • Previous comments are visible through the coding process which might impact other members’ coding or second round coding
  • Coding across paragraph breaks creates cell breaks in the resulting table that can make it hard to sort
  • Team members must manually merge their codes and negotiate code labels, overlapping data, etc.

Scientific work can be enhanced and advanced by the right tools; however, it can be difficult to distinguish which computer-assisted qualitative data analysis software program is right for a team or a project. Any of the programs mentioned in this paper would be good options for individuals who do not need to collaborate or for those who are working with publicly available data that require different data security protocols. However, the Word macro highlighted here is a great option for many research teams. In all, although there are many powerful computer-assisted qualitative data analysis software programs out there, our team found the simplest option was the best option for our projects and our needs.

References 

Create or run a macro—Microsoft Support. (n.d.). Retrieved July 17, 2023, from https://support.microsoft.com/en-us/office/create-or-run-a-macro-c6b99036-905c-49a6-818a-dfb98b7c3c9c

Harold Peach. (2014, June 30). Coding text using Microsoft Word [Video]. YouTube. https://www.youtube.com/watch?v=TbjfpEe4j5Y

Peach, H. (n.d.). Extract comments to new document – Word macros and tips – Work smarter and save time in Word. Retrieved July 17, 2023, from https://www.thedoctools.com/word-macros-tips/word-macros/extract-comments-to-new-document/

By Greta Underhill

Are you interested in qualitative research? Are you currently working on a qualitative project? Some researchers find it helpful to use a computer-assisted qualitative data analysis software (CAQDAS) program to help them organize their data through the analysis process. Although some programs can perform basic categorization for researchers, most software programs simply help researchers to stay organized while they conduct the deep analysis needed to produce scientific work. You may find a good CAQDAS program especially helpful when multiple researchers work with the same data set at different times and in different ways. Choosing the right CAQDAS for your project or team can take some time and research but is well worth the investment. You may need to consider multiple factors before determining a software program such as cost, operating system requirements, data security, and more.

For the Ecampus Research Unit, issues with our existing CAQDAS prompted our team to search for another program that would fit our specific needs. Here’s what we were looking for:

  • General qualitative analysis: We needed a program for general analysis across multiple types of projects; other programs are designed for specific forms of analysis, such as Leximancer for content analysis.
  • Compatibility across computer operating systems (OS): Our team used both Macs and PCs.
  • Adherence to our institution’s IRB security requirements: Like many others, our institution and our team adhere to strict data security and privacy requirements, necessitating a close look at how a program would manage our data.
  • Basic coding capabilities: Although many programs offer robust coding capabilities, our team needed basic options such as coding one passage multiple times and visually representing coding through highlights.
  • Export of codes into tables or Excel workbooks: This function is helpful for advanced analysis and for reporting themes in multiple file formats for various audiences.
  • A low learning curve: We regularly bring in temporary team members on various projects for mentorship and research experience, making this a helpful feature.
  • A one-time purchase: A one-time purchase was the best fit for managing multiple and temporary team members on various projects.

Testing a CAQDAS

I began systematically researching different CAQDAS options for the team. I searched “computer-assisted qualitative data analysis software” and “qualitative data analysis” in Google and Google Scholar. I also consulted various qualitative research textbooks and articles, as well as blogs, personal websites, and social media handles of qualitative researchers to identify software programs. Over the course of several months, I generated a list of programs to examine and test. Several programs were immediately removed from consideration as they are designed for different types of analysis: DiscoverText, Leximancer, MAXQDA, QDA Miner. These programs are powerful, but best suited for specific analysis, such as text mining. With the remaining programs, I signed up for software trials, attended several product demonstrations, participated in training sessions, borrowed training manuals from the library, studied how-to videos online, and contacted other scholars to gather information about the programs. Additionally, I tested whether programs would work across different operating systems. I kept detailed records about each of the programs tested, including how they handled data, the learning curve for each, their data security, whether they worked across operating systems, how they would manage the export of codes, and whether they required a one-time or subscription-based payment. I started with three of the most popular programs, NVivo, Dedoose, and ATLAS.ti. The table below summarizes which of these programs fit our criteria.

A table demonstrating whether three programs (NVivo, Dedoose, and ATLAS.ti) meet the team’s requirements: general qualitative analysis, cross-OS collaboration, data security, basic coding capabilities, export of codes, a low learning curve, and a one-time purchase. Details of the requirements are discussed in the text of the blog below.

NVivo

I began by evaluating NVivo, a program I had used previously. NVivo is a powerful program that adeptly handled large projects and is relatively easy to learn. The individual license was available for one-time purchase and allowed the user to maintain their data on their own machine or institutional servers. However, it had no capabilities for cross-OS collaboration, even when clients purchased a cloud-based subscription. Our team members could download and begin using the program, but we would not be able to collaborate across operating systems.

Dedoose

I had no prior experience with Dedoose, so I signed up for a trial of the software. I was impressed with the product demonstration, which significantly helped in figuring out how to use the program. This program excelled at data visualization and allowed a research team to blind code the same files for interrater reliability if that suited the project. Additionally, I appreciated the options to view code density (how much of the text was coded) as well as what codes were present across transcripts. I was hopeful this cloud-based program would solve our cross-OS collaboration problem, but it did not pass the test for our institution’s IRB data security requirements because it housed our data on Dedoose servers.

ATLAS.ti

ATLAS.ti was also a new program for me, so I signed up for a trial of this software. It is a well-established program with powerful analysis functions, such as helpful hierarchical coding capabilities and intuitive links among codes, quotations, and comments. However, the cross-OS collaboration, while possible via the web, proved to be cumbersome, and this option also did not meet the data security threshold for our institution’s IRB. Furthermore, the price point meant we would need to rethink our potential collaborations with other organizational members.

Data Security

Many programs are now cloud-based; these offer powerful analysis options but unfortunately did not meet our IRB data security requirements. Ultimately, we had to cut Delve, MAXQDA, Taguette, Transana, and webQDA. All of these programs would have been low-learning-curve options with basic coding functionality and cross-OS collaboration; however, for our team to collaborate, we would need to purchase a cloud-based subscription, which can quickly become prohibitively expensive, and house our data on company servers, which would not pass our institutional threshold for data security.

Note-taking programs

After testing multiple programs, I started looking beyond qualitative software programs and into note-taking programs such as DevonThink, Obsidian, Roam Research, and Scrintal. I had hoped these might provide a workaround by organizing data on collaborative teams in ways that would facilitate analysis. However, most of them did not have functionalities that could be used for coding or had high learning curves that precluded our team from using them.

It seemed like I had exhausted all options and I still did not have a program to bring back to the Research Unit. I had no idea that a low-cost option was just a YouTube video away. Stay tuned for the follow-up post where we dive into the solution that worked best for our team.

 

Each year, the Oregon State University Ecampus Research Unit funds projects, up to $20,000 each, to support the research, development and scholarship efforts of faculty and/or departments in the area of online education through the OSU Ecampus Research Fellows program.

This program aims to:

  • Fund research that is actionable and impacts student online learning
  • Provide resources and support for research leading to external grant applications
  • Promote effective assessment of online learning
  • Encourage the development of a robust research pipeline on online teaching and learning at Oregon State

Fellows program applications are due Nov. 1 each year. If you are interested in submitting an application, reach out to Naomi Aguiar, the OSU Ecampus assistant director of research. Research Unit staff are available to help you design a quality research project and maximize your potential for funding.

Many Oregon State colleagues have had transformative experiences in this program.  A Fellows study funded in 2020 highlights the ways in which these projects have advanced research in online/hybrid education, as well as Fellows’ programs of research.

Fellows program highlight

Funding recipients expand the inclusivity mindset of computer science students

Lara Letaw, an experienced online instructor and lead researcher from Oregon State’s School of Electrical Engineering and Computer Science, partnered with Heather Garcia, an OSU Ecampus inclusive instructional designer, on a research study called “Impacting the Inclusivity Mindset of Online Computer Science Students.”

Together with their team, Letaw and Garcia implemented an intervention that was designed to improve feelings of gender inclusivity among online computer science students and to train these students to develop more gender-inclusive software applications.

In this intervention, online computer science students experienced new curriculum developed by Letaw and Garcia’s team. The curriculum was based on GenderMag, a software inspection method for identifying and correcting gender biases in software. Curriculum for teaching GenderMag concepts can be found on the GenderMag Teach website. Students completed a set of assignments and, if they chose to participate in the research study, questionnaires about inclusivity climate, both in the course and in the computer science major. Students’ software design work was also evaluated for the use of gender-inclusive principles.

The image below shows examples of the cognitive facet values people (e.g., Letaw and Garcia) bring to their use of software, shown across the spectra of GenderMag facets (information processing style, learning style, motivations, attitude toward risk, and computer self-efficacy).

Computer science students in the Ecampus courses Letaw and Garcia modified learned about their own cognitive styles and those of their teammates. They also built software that supports the cognitive diversity of software users. One student reflected, “Identifying my facet values was tremendously helpful [for articulating what had] been abstract… I feel much more confident.”

The results of their study showed that, overall, students felt included by the GenderMag curriculum (nobody felt excluded by it), it increased their interest in computer science, and it had positive effects on their team dynamics and self-acceptance. Students who completed the GenderMag intervention were also more effective in developing gender-inclusive software designs, and they reported greater recognition and respect for the diversity of software users.

The image below highlights what students considered when designing a software user interface before (left) and after (right) learning GenderMag concepts. As one student put it, “Now when I think of users using a piece of software I don’t picture them … just jumping in and tinkering … I am more aware that there are [people whose] interests in using a software … might not align with mine.”

As a result of this project, Letaw and Garcia published a paper in the ACM’s International Computing Education Research conference proceedings in 2021. This project contributed to a $300,000 National Science Foundation grant awarded to Oregon State’s Margaret Burnett, Letaw, and Kean University. With this funding from the NSF, they will partner on a project entitled, “Embedding Equitable Design through Undergraduate Computing Curricula.”

This Fellows project has also provided research opportunities for two female Ecampus computer science students (Rosalinda Garcia and Aishwarya Vellanki), a group that is typically underrepresented in STEM fields. Rosalinda Garcia successfully defended her honors thesis with these data in the spring of 2021, and Vellanki is currently working on her own.

Join the Ecampus Research Fellows Program

Learn more about the Fellows Program and what materials are needed to prepare your proposal.

Last fall, my colleague featured the Ecampus Research Fellows (ECRF) program in her blog post. The ECRF program, which began in 2016, funds OSU faculty-led research on online and hybrid education. Each year, approximately five projects are selected to receive funding. One unique aspect of the program is that, in the past few years, 1-2 members of the Ecampus Course Development and Training (CDT) team have been paired with the faculty on funded research projects. The CDT team includes instructional designers and media developers. These professionals have expressed interest in conducting research, but in most cases have had few opportunities to engage in formal research projects. Similar to faculty, CDT fellows have to apply to the ECRF program.

For this blog post, I’d like to share some takeaways from my experience as a CDT research fellow, as well as some takeaways my CDT colleagues have shared with me. I will also share some feedback from faculty fellows who have had CDT colleagues join their research teams. But before I dig into these valuable takeaways from past participants, let me first address the importance of this program for instructional designers and related disciplines.

In 2017, the Ecampus Research Unit published a report titled “Research Preparation and Engagement of Instructional Designers in U.S. Higher Education.” This report was the result of a national study of instructional designers working in higher education environments. Among the many findings of this study, one compelling result was that more than half (55%) of respondents indicated that instructional designers need more training in research methods to fulfill their role. Instructional designers also indicated why they think it is important to gain more experience in research. Among the reasons, respondents indicated that research skill development would allow them to grow professionally, further their discipline, better understand the needs of students and faculty, and collaborate with faculty.

The Ecampus Research Unit (ECRU) answers this call through their CDT research fellows program.

In the summer of 2020 at the NWeLearn conference, three CDT fellows reflected upon their participation in the program, sharing valuable insights and experiences. I, Heather Garcia, was one of them. The other participants were Susan Fein and Tianhong Shi. The full recording of the session is available on YouTube, but I’ll summarize some highlights in the following paragraphs.

The projects undertaken by CDT research fellows in partnership with faculty spanned disciplines from computer science to field-based courses. 

When asked why they were interested in being research fellows, all three participants indicated that they were pursuing additional graduate education at the time they applied. One participant also indicated that acquiring more knowledge and experience with research would allow faculty to see course design suggestions as “more convincing and easily accepted,” giving her additional credibility when recommending new design approaches to faculty.

The fellows also shared details about their contributions to the research projects they were working on. All of the instructional designers spoke to ways their existing expertise was valued by the researchers. They gave examples of the expertise they offered, which ranged from reviewing course design and educational technologies to designing surveys to offering a fresh perspective and a critical eye. In addition to contributing their design expertise to the research projects, CDT research fellows contributed to the research processes as well, through data analysis and research paper writing and reviewing.

All of the CDT research fellows indicated that they learned a lot from their experiences partnering with faculty on research. One particular highlight in this area is that fellows learned that they contribute diverse perspectives to the research process; they have different backgrounds, experiences, and areas of expertise, and everyone on the team contributes something valuable. CDT fellows also indicated that they learned about the IRB process and the importance of asking questions. Perhaps most importantly, they learned that their expertise is valuable to research teams.

Faculty fellows were also given the opportunity to share how having a CDT fellow on their research team enhanced the research experience, and their feedback was shared in the conference session. They expressed many positive sentiments about the experience including the following:

  • “Our research team started as a group of inspired but like-minded computer scientists wanting to make better online classrooms for diverse students. After she joined the team as an instructional design fellow, the work became credentialed, interdisciplinary, and stronger. She brings expertise and sees what we miss—she not only makes us better able to serve the students we hope to, she makes our team better by adding diversity of thought.”
  • “The combined knowledge and experience of teaching faculty and an instructional designer is incredibly powerful.”
  • “She viewed the scope of the research and content of the courses involved through a different lens than I did.”
  • “The instructional designer provided valuable input on areas of my project merging the instructional design with the research.”
  • “My work with the instructional designer let me explore very practical logistic issues that are often not included in the literature.”

Altogether, it becomes clear that many instructional designers are eager to participate in research projects and that they are valuable contributors to the research process. The questions I have now are: How can we continue these partnerships into the future? And how can we create more research partnership opportunities for other instructional designers and teaching and learning professionals who aren’t traditionally involved in research?

References

Dello Stritto, M.E., Fein, S., Garcia, H., Shi, T. (2020). Instructional Designers and Faculty Partnerships in Online Teaching Research. NWeLearn 2020 Conference.

Linder, K. & Dello Stritto, M.E. (2017). Research Preparation and Engagement of Instructional Designers in U.S. Higher Education. Corvallis, OR: Oregon State University Ecampus Research Unit.

Loftin, D. (2020). Ecampus Research Fellows Program. Ecampus CDT Blog.

Over the last several years, research on online education has been growing rapidly. There has been an increased demand for quality research on online teaching and learning. This demand now seems more urgent as teaching modalities are changing due to the COVID-19 pandemic. Since 2016, the Ecampus Research Unit has been funding OSU faculty-led research on online and hybrid education through the Ecampus Research Fellows Program. The goals of the program are the following:

  • To fund research that is actionable and that impacts students’ learning online;
  • To provide the resources and support to “seed” pilot research leading to external grant applications;
  • To promote effective assessment of online learning at the course and program levels at OSU;
  • To encourage the development of a robust research pipeline on online teaching and learning at OSU.

Ecampus Research Fellows are funded for one year to engage in an independent research project on a topic related to online teaching and learning. Fellows may apply for up to $20,000 to support their research project. Up to 5 projects are funded each year. The program follows a cohort model in which fellows meet on a quarterly basis as a group to discuss their projects and receive support from the Research Unit. Each fellow completes an Institutional Review Board (IRB)-approved independent research project, and they are required to write a white paper based on their project results. The program’s white papers are published by the Ecampus Research Unit.

Actionable research impacting online education

In the past five years, the program has funded 24 projects with 34 faculty from across the university. The funded research has been conducted in anthropology, biology, chemistry, education, engineering, geography, mathematics, philosophy, physics, psychology, public health, rangeland science, sociology, statistics, and veterinary medicine. The faculty have benefitted from having dedicated time and resources to undertake these research projects. Their fellows’ projects are significant for their own research pipelines, and their findings are valuable to Ecampus as we continue to innovate in our development of online courses. An example is geography instructor Damien Hommel’s project, which led to a larger effort toward expanding experiential education for Ecampus courses beyond his discipline. Other fellows’ projects are providing valuable information about peer influence, inclusive teaching, hybrid laboratories, video segmentation, online research platforms, and more.

Becoming a research fellow

Are you an OSU faculty member interested in doing research on online education in your discipline? Previous experience with classroom-based or human subjects research is not a requirement. The Ecampus Research Unit is available to support you with your application and the research design process. We will be accepting our 6th cohort in 2021. The application is available now and is due on November 1st. Funded projects will be notified by December 1st.

If you have questions about the program, contact Mary Ellen Dello Stritto (maryellen.dellostritto@oregonstate.edu), the director of research for OSU Ecampus. Additionally, attend an information session on Tuesday, September 29, 2020, at 1 p.m. or Friday, October 2, 2020, at 11 a.m. To register for one of these information sessions, email maryellen.dellostritto@oregonstate.edu.

About the Oregon State University Ecampus Research Unit

The Oregon State University Ecampus Research Unit responds to and forecasts the needs and challenges of the online education field through conducting original research; fostering strategic collaborations; and creating evidence-based resources and tools that contribute to effective online teaching, learning and program administration. The OSU Ecampus Research Unit is part of Oregon State Ecampus, the university’s top-ranked online education provider. Learn more at ecampus.oregonstate.edu/research.

 

By Susan Fein, Instructional Designer, OSU Ecampus

I recently volunteered to lead a book club at my institution for staff participating in a professional development program focused on leadership. The book we are using is The 9 Types of Leadership by Dr. Beatrice Chestnut. Using principles from the enneagram personality typing system, the book describes nine behavioral styles and assesses them in the context of leadership.

At the same time, a colleague asked me to review a book chapter draft she is co-authoring that summarizes contemporary learning pedagogical approaches. These theories are derived from every conceivable arena, including psychology, philosophy, epistemology, neuroscience, and so on. In both of these situations, I found myself immersed in far-reaching and seemingly unlimited perspectives, principles, beliefs and approaches to explain the constructs of human behavior.

Was the universe trying to tell me something?

Here’s What Happened

To prepare for the book club, I completed five or six free online tests designed to identify my predominant enneagram style. Imagine my surprise when my results were all different! A few trends emerged, but the tests failed to consistently identify me as the same enneagram type. Does that mean the tests were flawed? Certainly that may have been a contributing factor. After all, these were not the full-length battery that would be used if I were paying for an assessment administered by a certified enneagram practitioner.

But frankly, I think the variation had more to do with me. My mood, the time of day, my frame of mind, whether I was hungry or tired, and a myriad of other factors likely affected my responses. The questions were subjective, scenario-based choices, so depending on my perspective in that instant, my selection varied, producing significantly different results. I suddenly realized that I wasn’t the same person from moment to moment!

Does that sound absurdly obvious? Was this a “duh” moment? At one level, yes, but for me, it was also an “ah-ha” moment. As educators, do we expect students to respond or react in a predictable and consistent way? Is that practical or realistic? I don’t think so.

Now I was intrigued! How could my role as an instructional designer be enhanced and improved through recognition of this changeability? How might I apply this new insight to support the design and development of effective online learning?

I didn’t have a clear-cut answer, but I recognized a strong desire to communicate this newfound awareness to others. My first thought was to find research articles. Google Scholar to the rescue! After a nearly fruitless search, I found two loosely related articles. I realized I was grasping at straws trying to cull out a relevant quote. I had to stop myself; why did I feel the need to cite evidence to validate my experience? I was struggling with how to cohesively convey my thoughts and connect them in a practicable, actionable way to my job as an instructional designer. My insight felt important and worth sharing via this blog post, but what could I write that would be meaningful to others? I was stumped!

I decided I should talk it over with a colleague, and that opened up a new inquiry into design thinking. Rushing back to my computer, I pulled up images of the design thinking process, trying to incorporate the phases into my experience. Was my insight empathy? Did it fit with ideation? Once again, I had to force myself to stop and just allow my experience to live on its own, without support from theories, models, or research.

In desperation, I sought advice from another trusted co-worker, explaining my difficulty unearthing some significant conclusion. We had a pleasant conversation and she related my experience to parenting. She said that sometimes she lets stuff roll right off when her teenager acts out, but at other times, under nearly identical circumstances, she struggles to hold it together and not scream. Then she mentioned a favorite educational tool, the grading rubric, and I was immediately relieved. Yes, that’s the ticket! I can relate my situation to a rubric. Hurray! This made sense. I rewrote my blog post draft explaining how rubrics allow us to more fairly and consistently assess student work, despite changes in mood, time of day, energy level, and all the other tiny things that affect us. Done!

Satisfied, I asked a third colleague to review my draft and offer comments. Surely she would be approving. After all, there were no facts, tips, tools, research or actionable conclusions to correct. What could she possibly find to negatively critique? She felt that the ending was rushed and artificially trying to solve a problem. Oh, my, how on target she was! I realized that I had no idea how to elegantly extricate myself from this perilous journey I’d started. My blog posts are usually research-based summaries of the benefits of active learning, blended learning and the like. Safe and secure ground. What was I doing writing a personal reflection with absolutely no solid academic foundation? This was new and scary territory.

Who Cares? I Do

In the end, I had to let go of my need to cite valid research-based arguments. I gave up my desire to offer pithy words of wisdom or quotes from authorities. Ultimately, this was a personal reflection and, as my colleague gently reminded me, I had to be vulnerable.

So what, exactly, is my point? What is it about those chameleon-like outcomes that feels important to share? What do I want to say as a take-away? Honestly, I’m not sure. I only know that in recognizing the influence of human factors on my moment-to-moment reactions, I was unexpectedly expanded. I felt more empathy for the faculty I work with and the students they teach. (Maybe I can fit design thinking in here after all…kidding!) I sensed a stronger connection to my humanity. I deepened my compassion. But is any of this important? I mean, really, who cares?

I do. I care. I work with people and for people. I work to support student success. My job allows me to partner with instructors and bolster their confidence to have positive impact on their students’ futures. If I am more open, more inclusive, more humble, more willing to consider other people’s ideas or perspectives, that’s not such a bad thing. And I don’t need research to validate my experience. It’s okay for me to just be present to a new awareness. It’s okay for me to just be human.

""

Are you interested in reading about research in the field of online teaching and learning? Could you use some help in reading and digesting the results of various research reports in the field? Would you like to be able to identify the strengths and weaknesses of the study reports that you read? If you answered “yes” to one or more of these questions, then you might be interested in the Ecampus Research Unit’s new resource: the Report Reader Checklist.

The Report Reader Checklist includes a comprehensive set of criteria that offers you a guide to evaluate the quality and rigor of study reports. The checklist is intended to provide an overview of the foundational elements that should be included when reporting on the results of a study. You can apply each checklist criterion to a report to see whether that element has been included or not.

Here is an overview of the six areas of the checklist and the criteria in each area:

  1. Context: Does the report describe the larger purpose of the study? Does it explain the history or theoretical framework? Does the report include research goals and suggestions for further research?
  2. Methodology: Does the report have a methodology section? Is it clear how data were collected and analyzed? If the study used statistics, were they named? If coding was used, was the procedure described?
  3. Sample: Are the study participants described in detail? Is it clear how participants were recruited? Does the sample represent an appropriate level of diversity? Are subgroups appropriately identified?
  4. Reporting Results: Are all numbers in the report easy to comprehend? Is the “N” provided? Does the report identify missing data? Is it clear where study findings fit with the study’s purpose? Do data visualizations enhance your understanding of the results?
  5. Transparency: Are raw data included in the report? Are instruments or study protocols provided in the report? Are the authors clear about any conflicts of interest? Is the discussion rooted in data results?
  6. Reader Experience: Does the report use language that is easy to understand? Is the report ADA accessible? Does it include a summary or abstract? Is the study an appropriate length?

There are no “points” or “weighting” within the checklist, but if you find one area (e.g., “Context” or “Methodology”) that is missing several criteria within a report, that would indicate that a report is weaker in that particular area.

You can download a one-page PDF of the checklist or visit our supplementary website that provides more details on each of the criteria. Further, the site includes sample reports for each criterion so that you can learn more about areas that you are unfamiliar with.

We hope you find this resource useful for reading and evaluating reports in the field. We also hope it helps you make data-driven decisions for your work.

About the Oregon State University Ecampus Research Unit: The Oregon State University Ecampus Research Unit makes research actionable through the creation of evidence-based resources related to effective online teaching, learning and program administration. The OSU Ecampus Research Unit is part of Oregon State Ecampus, the university’s top-ranked online education provider. Learn more at ecampus.oregonstate.edu/research.

 

Mary Ellen Dello Stritto, Assistant Director, Ecampus Research Unit

Online Learning Efficacy Research Database

Despite the prevalence of online and hybrid or blended courses in higher education, there is still skepticism among faculty and administrators about the effectiveness of online learning compared to traditional classroom learning. While some individuals may have a basic awareness of the published research on online learning, others want to know about the research findings in their own home discipline. The Ecampus Research Unit has developed the Online Learning Efficacy Research Database, a tool to help address these needs and concerns. This searchable database contains research published in academic journals over the past 20 years that compares student outcomes in online, hybrid/blended, and face-to-face courses.

Using the Database

The database currently includes 206 research citations across 73 discrete disciplines from 153 different journals. The database allows users to find discipline-specific research that compares two or more modalities (e.g. online versus hybrid). Users can search the database by keyword, discipline, modality, sample size, education level, date range, and journal name. The database also includes the ability to filter results by discipline, modality, sample size, and peer review status.

This new database improves upon older searchable databases by adding the capability to search by specific disciplines. The database is updated monthly with the latest published research. To learn more about the scope of the database, to sign up for monthly database updates, or to suggest a publication for inclusion in the database, see our FAQ page.

The database is also a valuable tool for those who are interested in or are currently engaging in research on the Scholarship of Teaching and Learning. It will provide users with an efficient way to find gaps in discipline-specific literature and pursue research to fill those gaps.

What does the latest research say about the most effective approaches to online and blended learning?  Consider adding one or more of these peer-reviewed journals to your summer reading list:

International Review of Research in Open and Distance Learning – The current issue of this twice-a-year journal is a special edition on the hot topic of open educational resources.

Journal of Asynchronous Learning Networks – Articles in the latest issue delve into mobile learning, e-portfolios, and student engagement.  JALN is published by the Sloan Consortium, whose website has a wealth of resources about online and blended learning.

Journal of Interactive Online Learning – Recent articles cover learning analytics as predictive tools, the challenge of establishing a sense of community in an online course, and a comparative study of student performance in online and face-to-face chemistry courses.

Journal of Online Learning and Teaching – The current issue of JOLT (the best journal acronym here!) includes such diverse topics as instructor-made videos as a tool to scaffold learning, comparative usefulness of web-based tech tools in online teaching, and student perceptions of online discussions.  JOLT is published by MERLOT, the Multimedia Educational Resource for Learning and Online Teaching, a great collection of peer-reviewed open educational materials that could be useful in your online or classroom teaching.