Last fall, my colleague featured the Ecampus Research Fellows (ECRF) program in her blog post. The ECRF program, which began in 2016, funds OSU faculty-led research on online and hybrid education. Each year, approximately five projects are selected to receive funding. One unique aspect of the program is that, in the past few years, one or two members of the Ecampus Course Development and Training (CDT) team have been paired with the faculty on funded research projects. The CDT team includes instructional designers and media developers. These professionals have expressed interest in conducting research but, in most cases, have had few opportunities to engage in formal research projects. Like faculty, CDT fellows must apply to the ECRF program.

For this blog post, I’d like to share some takeaways from my experience as a CDT research fellow, as well as some takeaways my CDT colleagues have shared with me. I will also share some feedback from faculty fellows who have had CDT colleagues join their research teams. But before I dig into these valuable takeaways from past participants, let me first address the importance of this program for instructional designers and related disciplines.

In 2017, the Ecampus Research Unit published a report titled “Research Preparation and Engagement of Instructional Designers in U.S. Higher Education.” This report was the result of a national study of instructional designers working in higher education environments. One compelling finding was that more than half (55%) of respondents indicated that instructional designers need more training in research methods to fulfill their role. Respondents also indicated why they think it is important to gain more experience in research: research skill development would allow them to grow professionally, further their discipline, better understand the needs of students and faculty, and collaborate with faculty.

The Ecampus Research Unit (ECRU) answers this call through their CDT research fellows program.

In the summer of 2020 at the NWeLearn conference, three CDT fellows reflected upon their participation in the program, sharing valuable insights and experience. I, Heather Garcia, was one of them. The other participants were Susan Fein and Tianhong Shi. The full recording can be viewed on YouTube at this link, but I’ll summarize some highlights from the session in the following paragraphs.

The projects undertaken by CDT research fellows in partnership with faculty spanned disciplines from computer science to field-based courses. 

When asked why they were interested in being research fellows, all three participants indicated that they were pursuing additional graduate education at the time they applied. One participant also indicated that acquiring more knowledge and experience with research would allow faculty to see course design suggestions as “more convincing and easily accepted,” giving her additional credibility when recommending new design approaches to faculty.

The fellows also shared details about their contributions to the research projects they were working on. All of the instructional designers spoke to ways their existing expertise was valued by the researchers. They gave examples of the expertise they offered, which ranged from reviewing course design and educational technologies to designing surveys to offering a fresh perspective and a critical eye. In addition to contributing their design expertise to the research projects, CDT research fellows contributed to the research processes as well, through data analysis and research paper writing and reviewing.

All of the CDT research fellows indicated that they learned a lot from their experiences partnering with faculty on research. One particular highlight in this area is that fellows learned that they contribute diverse perspectives to the research process; they have different backgrounds, experiences, and areas of expertise, and everyone on the team contributes something valuable. CDT fellows also indicated that they learned about the IRB process and the importance of asking questions. Perhaps most importantly, they learned that their expertise is valuable to research teams.

Faculty fellows were also given the opportunity to share how having a CDT fellow on their research team enhanced the research experience, and their feedback was shared in the conference session. They expressed many positive sentiments about the experience, including the following:

  • “Our research team started as a group of inspired but like-minded computer scientists wanting to make better online classrooms for diverse students. After she joined the team as an instructional design fellow, the work became credentialed, interdisciplinary, and stronger. She brings expertise and sees what we miss—she not only makes us better able to serve the students we hope to, she makes our team better by adding diversity of thought.”
  • “The combined knowledge and experience of teaching faculty and an instructional designer is incredibly powerful.”
  • “She viewed the scope of the research and content of the courses involved through a different lens than I did.”
  • “The instructional designer provided valuable input on areas of my project merging the instructional design with the research.”
  • “My work with the instructional designer let me explore very practical logistic issues that are often not included in the literature.”

Altogether, it becomes clear that many instructional designers are eager to participate in research projects, and they are valuable contributors to the research process. The questions I have now are: How can we continue these partnerships into the future? And how can we create more research partnership opportunities for other instructional designers and teaching and learning professionals, who aren’t traditionally involved in research?

References

Dello Stritto, M. E., Fein, S., Garcia, H., & Shi, T. (2020). Instructional Designers and Faculty Partnerships in Online Teaching Research. NWeLearn 2020 Conference.

Linder, K., & Dello Stritto, M. E. (2017). Research Preparation and Engagement of Instructional Designers in U.S. Higher Education. Corvallis, OR: Oregon State University Ecampus Research Unit.

Loftin, D. (2020). Ecampus Research Fellows Program. Ecampus CDT Blog.

Over the last several years, research on online education has been growing rapidly. There has been an increased demand for quality research on online teaching and learning. This demand now seems more urgent as teaching modalities are changing due to the COVID-19 pandemic. Since 2016, the Ecampus Research Unit has been funding OSU faculty-led research on online and hybrid education through the Ecampus Research Fellows Program. The goals of the program are the following:

  • To fund research that is actionable and that impacts students’ learning online;
  • To provide the resources and support to “seed” pilot research leading to external grant applications;
  • To promote effective assessment of online learning at the course and program levels at OSU;
  • To encourage the development of a robust research pipeline on online teaching and learning at OSU.

Ecampus Research Fellows are funded for one year to engage in an independent research project on a topic related to online teaching and learning. Fellows may apply for up to $20,000 to support their research project. Up to five projects are funded each year. The program follows a cohort model in which fellows meet quarterly as a group to discuss their projects and receive support from the Research Unit. Each fellow completes an Institutional Review Board (IRB)-approved independent research project and is required to write a white paper based on the project results. The program’s white papers are published by the Ecampus Research Unit.

Actionable research impacting online education

In the past five years, the program has funded 24 projects with 34 faculty from across the university. The funded research has been conducted in anthropology, biology, chemistry, education, engineering, geography, mathematics, philosophy, physics, psychology, public health, rangeland science, sociology, statistics, and veterinary medicine. The faculty have benefitted from having dedicated time and resources to undertake these research projects. The fellows’ projects are significant for their own research pipelines, and their findings are valuable to Ecampus as we continue to innovate in our development of online courses. One example is geography instructor Damien Hommel’s project, which led to a larger effort to expand experiential education in Ecampus courses beyond his discipline. Other fellows’ projects are providing valuable information about peer influence, inclusive teaching, hybrid laboratories, video segmentation, online research platforms, and more.

Becoming a research fellow

Are you an OSU faculty member interested in doing research on online education in your discipline? Previous experience with classroom-based or human subjects research is not a requirement. The Ecampus Research Unit is available to support you with your application and the research design process. We will be accepting our 6th cohort in 2021. The application is available now and is due on November 1st. Applicants will be notified of funding decisions by December 1st.

If you have questions about the program, contact Mary Ellen Dello Stritto (maryellen.dellostritto@oregonstate.edu), the director of research for OSU Ecampus. Additionally, you can attend an information session on Tuesday, September 29, 2020, at 1 p.m. or Friday, October 2, 2020, at 11 a.m. To register for one of these information sessions, email maryellen.dellostritto@oregonstate.edu.

About the Oregon State University Ecampus Research Unit

The Oregon State University Ecampus Research Unit responds to and forecasts the needs and challenges of the online education field through conducting original research; fostering strategic collaborations; and creating evidence-based resources and tools that contribute to effective online teaching, learning and program administration. The OSU Ecampus Research Unit is part of Oregon State Ecampus, the university’s top-ranked online education provider. Learn more at ecampus.oregonstate.edu/research.

 

By Susan Fein, Instructional Designer, OSU Ecampus

I recently volunteered to lead a book club at my institution for staff participating in a professional development program focused on leadership. The book we are using is The 9 Types of Leadership by Dr. Beatrice Chestnut. Using principles from the enneagram personality typing system, the book describes nine behavioral styles and examines them in the context of leadership.

At the same time, a colleague asked me to review a book chapter draft she is co-authoring that summarizes contemporary learning pedagogical approaches. These theories are derived from every conceivable arena, including psychology, philosophy, epistemology, neuroscience, and so on. In both of these situations, I found myself immersed in far-reaching and seemingly unlimited perspectives, principles, beliefs and approaches to explain the constructs of human behavior.

Was the universe trying to tell me something?

Here’s What Happened

To prepare for the book club, I completed five or six free online tests designed to identify my predominant enneagram style. Imagine my surprise when my results were all different! A few trends emerged, but the tests failed to consistently identify me as the same enneagram type. Does that mean the tests were flawed? Certainly that may be a contributing factor. After all, these were not the full-length battery that would be used if I were paying for an assessment administered by a certified enneagram practitioner.

But frankly, I think the variation had more to do with me. My mood, the time of day, my frame of mind, whether I was hungry or tired, and a myriad of other factors likely affected my responses. The questions were subjective, scenario-based choices, so depending on my perspective in that instant, my selection varied, producing significantly different results. I suddenly realized that I wasn’t the same person from moment to moment!

Does that sound absurdly obvious? Was this a “duh” moment? At one level, yes, but for me, it was also an “ah-ha” moment. As educators, do we expect students to respond or react in a predictable and consistent way? Is that practical or realistic? I don’t think so.

Now I was intrigued! How could my role as an instructional designer be enhanced and improved through recognition of this changeability? How might I apply this new insight to support the design and development of effective online learning?

I didn’t have a clear-cut answer, but I recognized a strong desire to communicate this newfound awareness to others. My first thought was to find research articles. Google Scholar to the rescue! After a nearly fruitless search, I found two loosely related articles. I realized I was grasping at straws trying to cull out a relevant quote. I had to stop myself; why did I feel the need to cite evidence to validate my experience? I was struggling with how to cohesively convey my thoughts and connect them in a practical, actionable way to my job as an instructional designer. My insight felt important and worth sharing via this blog post, but what could I write that would be meaningful to others? I was stumped!

I decided I should talk it over with a colleague, and that opened up a new inquiry into design thinking. Rushing back to my computer, I pulled up images of the design thinking process, trying to incorporate the phases into my experience. Was my insight empathy? Did it fit with ideation? Once again, I had to force myself to stop and just allow my experience to live on its own, without support from theories, models, or research.

In desperation, I sought advice from another trusted co-worker, explaining my difficulty unearthing some significant conclusion. We had a pleasant conversation and she related my experience to parenting. She said that sometimes she lets stuff roll right off when her teenager acts out, but at other times, under nearly identical circumstances, she struggles to hold it together and not scream. Then she mentioned a favorite educational tool, the grading rubric, and I was immediately relieved. Yes, that’s the ticket! I can relate my situation to a rubric. Hurray! This made sense. I rewrote my blog post draft explaining how rubrics allow us to more fairly and consistently assess student work, despite changes in mood, time of day, energy level, and all the other tiny things that affect us. Done!

Satisfied, I asked a third colleague to review my draft and offer comments. Surely she would be approving. After all, there were no facts, tips, tools, research or actionable conclusions to correct. What could she possibly find to negatively critique? She felt that the ending was rushed and artificially trying to solve a problem. Oh, my, how on target she was! I realized that I had no idea how to elegantly extricate myself from this perilous journey I’d started. My blog posts are usually research-based summaries of the benefits of active learning, blended learning and the like. Safe and secure ground. What was I doing writing a personal reflection with absolutely no solid academic foundation? This was new and scary territory.

Who Cares? I Do

In the end, I had to let go of my need to cite valid research-based arguments. I gave up my desire to offer pithy words of wisdom or quotes from authorities. Ultimately, this was a personal reflection and, as my colleague gently reminded me, I had to be vulnerable.

So what, exactly, is my point? What is it about those chameleon-like outcomes that feels important to share? What do I want to say as a take-away? Honestly, I’m not sure. I only know that in recognizing the influence of human factors on my moment-to-moment reactions, I was unexpectedly expanded. I felt more empathy for the faculty I work with and the students they teach. (Maybe I can fit design thinking in here after all…kidding!) I sensed a stronger connection to my humanity. I deepened my compassion. But is any of this important? I mean, really, who cares?

I do. I care. I work with people and for people. I work to support student success. My job allows me to partner with instructors and bolster their confidence to have positive impact on their students’ futures. If I am more open, more inclusive, more humble, more willing to consider other people’s ideas or perspectives, that’s not such a bad thing. And I don’t need research to validate my experience. It’s okay for me to just be present to a new awareness. It’s okay for me to just be human.


Are you interested in reading about research in the field of online teaching and learning? Could you use some help in reading and digesting the results of various research reports in the field? Would you like to be able to identify the strengths and weaknesses of the study reports that you read? If you answered “yes” to one or more of these questions, then you might be interested in the Ecampus Research Unit’s new resource: the Report Reader Checklist.

The Report Reader Checklist includes a comprehensive set of criteria that offers you a guide to evaluate the quality and rigor of study reports. The checklist is intended to provide an overview of the foundational elements that should be included when reporting on the results of a study. You can apply each checklist criterion to a report to see whether that element has been included or not.

Here is an overview of the six areas of the checklist and the criteria in each area:

  1. Context: Does the report describe the larger purpose of the study? Does it explain the history or theoretical framework? Does the report include research goals and suggestions for further research?
  2. Methodology: Does the report have a methodology section? Is it clear how data were collected and analyzed? If the study used statistics, were they named? If coding was used, was the procedure described?
  3. Sample: Are the study participants described in detail? Is it clear how participants were recruited? Does the sample represent an appropriate level of diversity? Are subgroups appropriately identified?
  4. Reporting Results: Are all numbers in the report easy to comprehend? Is the “N” provided? Does the report identify missing data? Is it clear where study findings fit with the study’s purpose? Do data visualizations enhance your understanding of the results?
  5. Transparency: Are raw data included in the report? Are instruments or study protocols provided in the report? Are the authors clear about any conflicts of interest? Is the discussion rooted in data results?
  6. Reader Experience: Does the report use language that is easy to understand? Is the report ADA accessible? Does it include a summary or abstract? Is the study an appropriate length?

There are no “points” or “weighting” within the checklist, but if you find one area (e.g., “Context” or “Methodology”) that is missing several criteria within a report, that would indicate that a report is weaker in that particular area.
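The checklist lends itself to a simple tally: count the unmet criteria in each area to see where a report is weakest. A minimal sketch in Python of that tally, with paraphrased area names and purely hypothetical example judgments (this is not an official Ecampus tool):

```python
# Illustrative sketch: tallying unmet Report Reader Checklist criteria by area.
# Area names paraphrase the checklist; the True/False judgments are invented
# examples of one reader's assessment of a single report.

checklist = {
    "Context": {
        "larger purpose described": True,
        "theoretical framework explained": False,
    },
    "Methodology": {
        "methodology section present": True,
        "analysis procedure described": True,
    },
    "Sample": {
        "participants described in detail": False,
        "recruitment explained": False,
    },
}

def weakest_areas(results):
    """Return (area, missing-count) pairs, most missing criteria first."""
    missing = {
        area: sum(1 for met in criteria.values() if not met)
        for area, criteria in results.items()
    }
    return sorted(missing.items(), key=lambda kv: kv[1], reverse=True)

for area, n_missing in weakest_areas(checklist):
    print(f"{area}: {n_missing} criteria missing")
```

As the post notes, there is no weighting involved; the area with the most unmet criteria simply flags where the report is weakest.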

You can download a one-page PDF of the checklist or visit our supplementary website, which provides more details on each of the criteria. Further, the site includes sample reports for each criterion so that you can learn more about areas that you are unfamiliar with.

We hope you find this resource useful for reading and evaluating reports in the field. We also hope it helps you make data-driven decisions for your work.

About the Oregon State University Ecampus Research Unit: The Oregon State University Ecampus Research Unit makes research actionable through the creation of evidence-based resources related to effective online teaching, learning and program administration. The OSU Ecampus Research Unit is part of Oregon State Ecampus, the university’s top-ranked online education provider. Learn more at ecampus.oregonstate.edu/research.

 

Mary Ellen Dello Stritto, Assistant Director, Ecampus Research Unit

Online Learning Efficacy Research Database


Despite the prevalence of online and hybrid or blended courses in higher education, there is still skepticism among faculty and administrators about the effectiveness of online learning compared to traditional classroom learning. While some individuals may have a basic awareness of the published research on online learning, some want to know about the research findings in their own home discipline. The Ecampus Research Unit has developed the Online Learning Efficacy Research Database, a tool to help address these needs and concerns. This searchable database contains research published in academic journals over the past 20 years that compares student outcomes in online, hybrid/blended, and face-to-face courses.

Using the Database


The database currently includes 206 research citations across 73 discrete disciplines from 153 different journals. The database allows users to find discipline-specific research that compares two or more modalities (e.g., online versus hybrid). Users can search the database by keyword, discipline, modality, sample size, education level, date range, and journal name. The database also includes the ability to filter results by discipline, modality, sample size, and peer review status.
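To illustrate the kind of multi-field filtering described above, here is a minimal sketch in Python; the record fields, example citations, and the `search` function are hypothetical and do not reflect the database’s actual schema or interface:

```python
# Illustrative sketch of multi-field filtering over citation records.
# All field names and example records are invented for illustration.
from dataclasses import dataclass

@dataclass
class Citation:
    title: str
    discipline: str
    modalities: tuple   # modalities compared in the study
    sample_size: int
    peer_reviewed: bool

citations = [
    Citation("Study A", "chemistry", ("online", "face-to-face"), 120, True),
    Citation("Study B", "psychology", ("hybrid", "face-to-face"), 45, True),
    Citation("Study C", "chemistry", ("online", "hybrid"), 300, False),
]

def search(records, discipline=None, modality=None, peer_reviewed=None):
    """Return records matching all supplied filters (None means no filter)."""
    results = []
    for r in records:
        if discipline is not None and r.discipline != discipline:
            continue
        if modality is not None and modality not in r.modalities:
            continue
        if peer_reviewed is not None and r.peer_reviewed != peer_reviewed:
            continue
        results.append(r)
    return results

hits = search(citations, discipline="chemistry", modality="online")
print([c.title for c in hits])  # both chemistry studies compare an online modality
```

Combining filters this way mirrors how a user might narrow the database to, say, peer-reviewed studies in their home discipline that include a particular modality.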

This new database improves upon older searchable databases by adding the capability to search by specific disciplines. The database is updated monthly with the latest published research. To learn more about the scope of the database, sign up for monthly database updates, or suggest a publication for inclusion in the database, see our FAQ page.

The database is also a valuable tool for those who are interested in or are currently engaging in research on the Scholarship of Teaching and Learning. It will provide users with an efficient way to find gaps in discipline-specific literature and pursue research to fill those gaps.

What does the latest research say about the most effective approaches to online and blended learning?  Consider adding one or more of these peer-reviewed journals to your summer reading list:

International Review of Research in Open and Distance Learning – The current issue of this twice-a-year journal is a special edition on the hot topic of open educational resources.

Journal of Asynchronous Learning Networks – Articles in the latest issue delve into mobile learning, e-portfolios, and student engagement.  JALN is published by the Sloan Consortium, whose website has a wealth of resources about online and blended learning.

Journal of Interactive Online Learning – Recent articles cover learning analytics as predictive tools, the challenge of establishing a sense of community in an online course, and a comparative study of student performance in online and face-to-face chemistry courses.

Journal of Online Learning and Teaching – The current issue of JOLT (the best journal acronym here!) includes such diverse topics as instructor-made videos as a tool to scaffold learning, comparative usefulness of web-based tech tools in online teaching, and student perceptions of online discussions.  JOLT is published by MERLOT, the Multimedia Educational Resource for Learning and Online Teaching, a great collection of peer-reviewed open educational materials that could be useful in your online or classroom teaching.