Traveler items

Have you ever taken a trip with a tour group? Or looked at an itinerary of places and activities to see if it meets your expectations and fits into your schedule? Most guided tours include an itinerary with a list of destinations, activities, and time allotted. This helps travelers manage their expectations and time.

Now, have you ever thought of an online course as a guided trip? The instructor is similar to a tour guide, leading student travelers to their destination. And, like travelers, students naturally want to know what to expect and how much time to commit to their learning. They could benefit from a detailed itinerary, or schedule of activities, that includes estimated time commitment for each week.

As an instructional designer for hybrid and online courses, I like to include a detailed schedule for each week to help students organize their time and stay on task. To determine what goes on that schedule, I begin the design process with a draft of the course syllabus that outlines where the students are headed (learning outcomes) and how the instructor will know they have arrived (assessments). This draft helps me understand the instructor’s plans for the course. Together, we look at the learning outcomes and assessments, as well as course requirements like credit hours, to determine appropriate learning activities along the way.

The course credit hours inform the workload requirements for students. For example, Oregon State University is on the quarter system, and its policy states that one credit hour is equivalent to 3-4 hours of course work each week. If a course is worth 3 credit hours, then students should expect to dedicate 9-12 hours each week to the course. I use a workload estimator created by the Center for Teaching Excellence at Rice University to help with the estimates. This tool provides a reasonable estimation of the workload expectations for students and can be used to verify whether the course meets the university’s guidelines for the assigned credit hours. (For more information on how the estimates are made, see the Rice University CTE blog post.)
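
The credit-hour arithmetic can be sketched in a few lines of code. This is a rough illustration only; the activity names and per-item time estimates below are hypothetical, not output from the Rice workload estimator.

```python
# Sketch of the quarter-system workload guideline described above:
# 1 credit hour ~ 3-4 hours of course work per week.

HOURS_PER_CREDIT = (3, 4)  # (low, high) hours per credit, per week

def weekly_workload_range(credits):
    """Return the (min, max) weekly hours implied by the credit hours."""
    low, high = HOURS_PER_CREDIT
    return credits * low, credits * high

def estimate_week(activities):
    """Sum the estimated hours for one week's activities (name -> hours)."""
    return sum(activities.values())

# A 3-credit course should land between 9 and 12 hours per week.
target = weekly_workload_range(3)

# Hypothetical week-1 activities with rough time estimates (hours).
week_1 = estimate_week({
    "Read chapters 1-2": 3.5,
    "Watch lecture videos": 1.5,
    "Discussion post + replies": 2.0,
    "Homework problem set": 3.0,
})

print(target)                            # (9, 12)
print(week_1)                            # 10.0
print(target[0] <= week_1 <= target[1])  # True: within guidelines
```

If the weekly total falls outside the target range, activities can be trimmed or supplemented before the schedule is shared with students.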

While all of this information is useful to instructors, I also encourage them to share a weekly list of activities, along with the calculations, with students. Tour guides provide detailed schedules informing travelers where they are going, the order of the activities, and the time allotted to each, so why not do the same for students? Below, I’ve included a sample of how I do this in my courses. I create a weekly table on an introduction page at the beginning of each module within our LMS. This table includes a suggested order of the activities and the estimated time commitment to complete them, along with the official due dates. Anecdotally, students appreciate the schedule and use it to manage their time. I encourage you to consider using a detailed schedule in your future courses.

Example of a weekly Detailed Schedule

References


Barre, E. (2016, July 11). How much should we assign? Estimating out of class workload [Blog post]. Retrieved from http://cte.rice.edu/blogarchive/2016/07/11/workload.

Photo by Dariusz Sankowski on Unsplash

This post is the second in a three-part series that summarizes conclusions and insights from research of active, blended, and adaptive learning practices. Part one covered active learning, and today’s article focuses on the value of blended learning.

First Things First

What, exactly, is “blended” learning? Dictionary.com defines it as a “style of education in which students learn via electronic and online media as well as traditional face-to-face learning.” This is a fairly simplistic view, so Clifford Maxwell (2016), on the Blended Learning Universe website, offers a more detailed definition that clarifies three distinct parts:

  1. Any formal education program in which at least part of the learning is delivered online, wherein the student controls some element of time, place, path or pace.
  2. Some portion of the student’s learning occurs in a supervised physical location away from home, such as in a traditional on-campus classroom.
  3. The learning design is structured to ensure that both the online and in-person modalities are connected to provide a cohesive and integrated learning experience.

It’s important to note that a face-to-face class that simply uses an online component as a repository for course materials is not true blended learning. The first element in Maxwell’s definition, where the student independently controls some aspect of learning in the online environment, is key to distinguishing blended learning from the mere addition of technology.

You may also be familiar with other popular terms for blended learning, including hybrid or flipped classroom. Again, the common denominator is that the course design intentionally, and seamlessly, integrates both modalities to achieve the learning outcomes.

Let’s examine what the research says about the benefits of combining asynchronous, student-controlled learning with instructor-driven, face-to-face teaching.

Does Blended Learning Offer Benefits?

Blended Learning Icon

The short answer is yes.

The online component of blended learning can help “level the playing field.” In many face-to-face classes, students may be too shy or reluctant to speak up, ask questions, or offer an alternate idea. A blended environment combines the benefit of giving students time to compose thoughtful comments for an online discussion without the pressure and think-on-your-feet demand of live discourse, while maintaining direct peer engagement and social connections during in-classroom sessions (Hoxie, Stillman, & Chesal, 2014). Blended learning, through its asynchronous component, allows students to engage with materials at their own pace and reflect on their learning when applying new concepts and principles (Margulieux, McCracken, & Catrambone, 2015).

Since well-designed online learning produces equivalent outcomes to in-person classes, lecture and other passive information can be shifted to the online format, freeing up face-to-face class time for active learning, such as peer discussions, team projects, problem-based learning, supporting hands-on labs or walking through simulations (Bowen, Chingos, Lack, & Nygren, 2014). One research study found that combining online activities with in-person sessions also increased students’ motivation to succeed (Sithole, Chiyaka, & McCarthy, 2017).

What Makes Blended Learning So Effective?


Nearly all the research reviewed concluded that blended learning affords measurable advantages over exclusively face-to-face or fully online learning (U.S. Department of Education, Office of Planning, Evaluation, and Policy Development, 2009). The combination of technology with well-designed in-person interaction provides fertile ground for student learning. Important behaviors and interactions such as instructor feedback, assignment scaffolding, hands-on activities, reflection, repetition and practice were enhanced, and students also gained advantages in terms of flexibility, time management, and convenience (Margulieux, McCracken, & Catrambone, 2015).

Blended learning tends to benefit disadvantaged or academically underprepared students, groups that typically struggle in fully online courses (Chingos, Griffiths, Mulhern, & Spies, 2017). Combining technology with in-person teaching helped to mitigate some challenges faced by many students in scientific disciplines, improving persistence and graduation rates. And since blended learning can be supportive for a broader range of students, it may increase retention and persistence for underrepresented groups, such as students of color (Bax, Campbell, Eabron, & Thomson, 2014–15).

Blended learning benefits instructors, too. When asked about blended learning, most university faculty and instructors believe it to be more effective (Bernard, Borokhovski, Schmid, Tamim, & Abrami, 2014). The technologies used often capture and provide important data analytics, which help instructors more quickly identify under-performing students so they can provide extra support or guidance (McDonald, 2014). Many online tools are interactive, fun, and engaging, which encourages student interaction and enhances collaboration (Hoxie, Stillman, & Chesal, 2014). Blended learning is growing in acceptance and is often seen as a favorable approach because it synthesizes the advantages of traditional instruction with the flexibility and convenience of online learning (Liu et al., 2016).

A Leap of Faith

Is blended learning right for your discipline or area of expertise? If you want to give it a try, there are many excellent internet resources available to support your transition.

Though faculty can choose to develop a blended class on their own, Oregon State instructors who develop a hybrid course through Ecampus receive full support and resources, including collaboration with an instructional designer, video creation and media development assistance. The OSU Center for Teaching and Learning offers workshops and guidance for blended, flipped, and hybrid classes. The Blended Learning Universe website, referenced earlier, also provides many resources, including a design guide, to support the transformation of a face-to-face class into a cohesive blended learning experience.

If you are ready to reap the benefits of both online and face-to-face teaching, I urge you to go for it! After all, the research shows that it’s a pretty safe leap.

For those of you already on board with blended learning, let us hear from you! Share your stories of success, lessons learned, do’s and don’ts, and anything else that would contribute to instructors still thinking about giving blended learning a try.

Susan Fein, Oregon State University Ecampus Instructional Designer
susan.fein@oregonstate.edu | 541-747-3364

References

  • Bax, P., Campbell, M., Eabron, T., & Thomson, D. (2014–15). Factors that Impede the Progress, Success, and Persistence to Pursue STEM Education for Henderson State University Students Who Are Enrolled in Honors College and in the McNair Scholars Program. Henderson State University. Arkadelphia: Academic Forum.
  • Bernard, R. M., Borokhovski, E., Schmid, R. F., Tamim, R. M., & Abrami, P. C. (2014). A meta-analysis of blended learning and technology use in higher education: From the general to the applied. J Comput High Educ, 26, 87–122.
  • Bowen, W. G., Chingos, M. M., Lack, K. A., & Nygren, T. I. (2014). Interactive learning online at public universities: Evidence from a six-campus randomized trial. Journal of Policy Analysis and Management, 33(1), 94–111.
  • Chingos, M. M., Griffiths, R. J., Mulhern, C., & Spies, R. R. (2017). Interactive online learning on campus: Comparing students’ outcomes in hybrid and traditional courses in the university system of Maryland. The Journal of Higher Education, 88(2), 210-233.
  • Hoxie, A.-M., Stillman, J., & Chesal, K. (2014). Blended learning in New York City. In A. G. Picciano, & C. R. Graham (Eds.), Blended Learning Research Perspectives (Vol. 2, pp. 327-347). New York: Routledge.
  • Liu, Q., Peng, W., Zhang, F., Hu, R., Li, Y., & Yan, W. (2016). The effectiveness of blended learning in health professions: Systematic review and meta-analysis. Journal of Medical Internet Research, 18(1). doi:10.2196/jmir.4807
  • Margulieux, L. E., McCracken, W. M., & Catrambone, R. (2015). Mixing in-class and online learning: Content meta-analysis of outcomes for hybrid, blended, and flipped courses. In O. Lindwall, P. Hakkinen, T. Koschmann, & P. Tchoun (Ed.), Exploring the Material Conditions of Learning: Computer Supported Collaborative Learning (CSCL) Conference (pp. 220-227). Gothenburg, Sweden: The International Society of the Learning Sciences.
  • Maxwell, C. (2016, March 4). What blended learning is – and isn’t [Blog post]. Retrieved from Blended Learning Universe.
  • McDonald, P. L. (2014). Variation in adult learners’ experience of blended learning in higher education. In Blended Learning Research Perspectives (Vol. 2, pp. 238-257). Routledge.
  • Sithole, A., Chiyaka, E. T., & McCarthy, P. (2017). Student attraction, persistence and retention in STEM programs: Successes and continuing challenges. Higher Education Studies, 7(1).
  • U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. (2009). Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies. Washington, D.C.

Image Credits

  • Blended Learning Icon: Innovation Co-Lab Duke Innovation Co-Lab [CC0]
  • Leap of Faith: Photo by Denny Luan on Unsplash
  • School photo created by javi_indy – www.freepik.com

There are many benefits to using rubrics for both instructors and students, as discussed in Rubrics Markers of Quality Part 1 – Unlock the Benefits. Effective rubrics serve as a tool to foster excellence in teaching and learning, so let’s take a look at some best practices and tips to get you started.

Best Practices

Alignment

Rubrics should articulate a clear connection between how students demonstrate learning and the course learning outcomes (CLOs). Solely scoring gateway criteria, the minimum expectations for a task (e.g., word count, number of discussion responses), can be alluring. Instead, consider designing the rubric to move past minimum expectations and assess what students should be able to do after completing a task.

Detailed, Measurable, and Observable

Clear and specific rubrics communicate how students can demonstrate learning, how their performance will be measured, and what marks excellence. These details provide students with a tool to self-assess their progress and level up their performance autonomously.

Language Use

Rubrics create the opportunity to foster an inclusive learning environment. Clear and consistent language takes a diverse student body into consideration: online students hail from around the world, speak various native languages, and may interpret the same words differently. Use simple terms with specific, detailed descriptions. Doing so creates space for students to focus on learning instead of decoding expectations. Additionally, apply parallel language consistently. Similar phrasing across performance levels (e.g., demonstrates, mostly demonstrates, doesn’t demonstrate) can be helpful to differentiate between levels within each criterion.

Tips of the Trade!

Suitability

Consider the instructional aim, learning outcomes, and the purpose of a task when choosing the best rubric for your course.

  • Analytic Rubrics: The hallmark of an analytic rubric is that each performance criterion is evaluated separately. Characteristically, this rubric’s structure is a grid, and performance is scored on a continuum of levels. Analytic rubrics are detailed, specific, measurable, and observable, making this rubric type an excellent tool for formative feedback and assessment of learning outcomes.
  • Holistic Rubrics: Holistic rubrics evaluate all criteria together in one general description for each performance level, assessing the overall quality of a task. Consider applying a holistic rubric when an exact answer isn’t needed, when deviation or errors are allowed, and for interpretive or exploratory activities.
  • General Rubrics: Generalized rubrics can be leveraged to assess multiple tasks that share the same learning outcomes (e.g., reflection papers, journals). Performance dimensions focus solely on outcomes rather than discrete task features.
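
To make the grid structure of an analytic rubric concrete, here is a minimal sketch in code. The criteria, level names, and point values are hypothetical examples for illustration, not a recommended rubric.

```python
# A minimal sketch of an analytic rubric as a grid: each criterion is rated
# separately on a continuum of performance levels, then totaled.
# Criteria and levels below are hypothetical examples.

LEVELS = {"Exceeds": 3, "Meets": 2, "Approaching": 1, "Missing": 0}

CRITERIA = [
    "Thesis aligns with learning outcome",
    "Evidence and analysis",
    "Organization",
]

def score(ratings):
    """Total an analytic rubric: one performance level per criterion, summed."""
    return sum(LEVELS[ratings[criterion]] for criterion in CRITERIA)

# One student's ratings, scored criterion by criterion.
ratings = {
    "Thesis aligns with learning outcome": "Meets",        # 2
    "Evidence and analysis": "Exceeds",                    # 3
    "Organization": "Approaching",                         # 1
}

max_score = len(CRITERIA) * max(LEVELS.values())
print(f"{score(ratings)} / {max_score}")  # 6 / 9
```

Because each criterion carries its own rating, this structure naturally supports formative feedback: a student can see exactly which dimension needs work.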

Explicit Expectations

Demystifying expectations can be challenging. Consider articulating performance expectations in the task description before deploying a learning task, and refrain from using rubrics as a standalone vehicle to communicate expectations; students may miss the rubric altogether and fail to meet them. Secondly, make the implicit explicit! Be transparent, and provide students with all the information and tools they need to be successful from the outset.

Iterate

A continuous improvement process is key to developing high-quality assessment rubrics, so plan for multiple tests and revisions. There are several strategies for testing a rubric: 1) ask students, teaching assistants, or professional colleagues to score a range of work samples with it; 2) integrate opportunities for students to conduct self-assessments; 3) assess the same task with the same rubric across course sections and academic terms. After testing is complete, reflect on how effectively and accurately the rubric performed, then revise and redeploy as needed.

Customize

Save some time, and don’t reinvent the wheel: leverage existing samples and templates. Keep in mind, though, that existing resources weren’t designed for your specific course, so customization will be needed to ensure the accuracy and effectiveness of the rubric.

Are you interested in learning more about rubrics and how they can enrich your course? Your Instructional Designer can help you craft effective rubrics that will be the best fit for your unique course.


One of the most common questions I get as an Instructional Designer is, “How do I prevent cheating in my online course?” Instructors are looking for detection strategies and, often, punitive measures to catch, report, and punish academic cheaters. Their concerns are understandable: searching Google for the phrase “take my test for me” returns pages and pages of results from services with names like “Online Class Hero” and “Noneedtostudy.com” that promise to use “American Experts” to help you pass your course with “flying grades.” 1 But by focusing only on what detection measures we can implement and on the means and methods by which students cheat, we are asking the wrong questions. Instead, let’s consider what we can do to understand why students cheat, and how careful course and assessment design might reduce their motivation to do so.

A study published in Computers & Education identified five themes in the reasons students provided when seeking help from contract cheating services (Amigud & Lancaster, 2019):

  • Academic Aptitude – “Please teach me how to write an essay.”
  • Perseverance – “I can’t look at it anymore.”
  • Personal Issues – “I have such a bad migraine.”
  • Competing Objectives – “I work so I don’t have time.”
  • Self-Discipline – “I procrastinated until today.”

Their results showed that students don’t begin a course with the intention of academic misconduct. Rather, they reach a point, often after initially attempting the work, when perceived pressures, lack of skills, or lack of resources erode their will to complete the course themselves. Online students may be especially likely to have external obligations and involvement in non-academic activities; according to a 2016 study, a significant majority of online students juggle other obligations, including raising children and working while earning their degrees (Clinefelter & Aslanian, 2016).

While cheating will never be completely eliminated, several strategies have emerged in recent research that approach the problem through a lens of design rather than punishment. Here are ten of my favorite approaches, each speaking to the justifications students gave for cheating:

  1. Make sure that students are aware of academic support services (Yu, Glanzer, Johnson, Sriram, & Moore, 2018). Oregon State, like many universities, offers writing help, subject-area tutors and for Ecampus students, a Student Success team that can help identify resources and provide coaching on academic skills. Encourage students, leading up to exams or big assessment projects, to reach out during online office hours or via email if they feel they need assistance.
  2. Have students create study guides as a precursor assignment to an exam—perhaps using online tools to create mindmaps or flashcards. Students who are better prepared for assessments have a reduced incentive to cheat. Study guides can be a non-graded activity, like a game or practice quiz, or provided as a learning resource.
  3. Ensure that students understand the benefits of producing their own work and that the assessment is designed to help them develop and demonstrate subject knowledge (Lancaster & Clarke, 2015). Clarify for students the relevance of a particular assessment and how it relates to the weekly and larger course learning outcomes.
  4. Provide examples of work that meets your expectations along with specific evaluation criteria. Students need to understand how they are being graded and be able to judge the quality of their own work. A student feeling in the dark about what is expected from them may be more likely to turn to outside help.
  5. Provide students with opportunities throughout the course to participate in activities, such as discussions and assignments, that will prepare them for summative assessments (Morris, 2018).
  6. Allow students to use external sources of information while taking tests. Assessments in which students may leverage the materials they have learned from to construct a response do a better job of assessing higher-order learning. Memorizing and repeating information is rarely what we hope students will achieve at the end of instruction.
  7. Introduce alternative forms of assessment. Creative instructors can design learning activities that require students to develop a deeper understanding and take on more challenging assignments. Examples of these include recorded presentations, debates, case studies, portfolios, and research projects.
  8. Rather than a large summative exam at the end of a course, focus on more frequent smaller, formative assessments (Lancaster & Clarke, 2015). Provide students with an ongoing opportunity to demonstrate their knowledge without the pressure introduced by a final exam that accounts for a substantial portion of their grade.
  9. Create a course environment that is safe to make and learn from mistakes. Build into a course non-graded activities in which students can practice the skills they will need to demonstrate during an exam.
  10. Build a relationship with students. When instructors are responsive to student questions, provide substantive feedback throughout a course, and find other ways to interact with students, those students are less likely to cheat. It matters if students believe an instructor cares about them (Bluestein, 2015).

No single strategy is guaranteed to immunize your course against the possibility that a student will use some form of cheating; almost any type of assignment can be purchased quickly online. The goal of any assessment should be to ensure that students have met the learning outcomes, not to see if we can catch them cheating. Focus instead on understanding the pressures a student might face to succeed in a course and the obstacles they could encounter in doing so. Work hard to connect with your students during course delivery and humanize the experience of learning online. Thoughtful design strategies that prioritize supporting student academic progress can alleviate the conditions that lead to academic integrity issues.


1 This search was suggested by an article published in the New England Board of Higher Education on cheating in online programs. (Berkey & Halfond, 2015)

References

Amigud, A., & Lancaster, T. (2019). 246 reasons to cheat: An analysis of students’ reasons for seeking to outsource academic work. Computers & Education, 134, 98–107. https://doi.org/10.1016/j.compedu.2019.01.017

Berkey, D., & Halfond, J. (2015). Cheating, student authentication and proctoring in online programs.

Bluestein, S. A. (2015). Connecting Student-Faculty Interaction to Academic Dishonesty. Community College Journal of Research and Practice, 39(2), 179–191. https://doi.org/10.1080/10668926.2013.848176

Clinefelter, D. L., & Aslanian, C. B. (2016). Online college students 2016: Comprehensive data on demands and preferences. Louisville, KY: The Learning House, Inc.

Lancaster, T., & Clarke, R. (2015). Contract Cheating: The Outsourcing of Assessed Student Work. In T. A. Bretag (Ed.), Handbook of Academic Integrity (pp. 1–14). https://doi.org/10.1007/978-981-287-079-7_17-1

Morris, E. J. (2018). Academic integrity matters: five considerations for addressing contract cheating. International Journal for Educational Integrity, 14(1), 15. https://doi.org/10.1007/s40979-018-0038-5

Yu, H., Glanzer, P. L., Johnson, B. R., Sriram, R., & Moore, B. (2018). Why College Students Cheat: A Conceptual Model of Five Factors. The Review of Higher Education, 41(4), 549–576. https://doi.org/10.1353/rhe.2018.0025

Oregon State University’s Learning Management System (LMS) migrated to Canvas in 2014-2015. The migration was based not only on the product’s alignment with our learning platform needs but also on the outstanding customer service Instructure, the company behind Canvas, has provided to our LMS user community, including students, faculty, instructional designers, and administrators. Instructure’s approach to customer service offers a model we can follow to continue to exceed student expectations.

According to Michael Feldstein’s July 8, 2018 report, the major players in the US LMS market include Blackboard, Canvas, Moodle, Brightspace, Sakai, and Schoology, among others (Feldstein, 2018).

LMS Market share in North America

Figure 1: US Primary LMS Systems, July 6th, 2018 (Feldstein, 2018)

 

Of these major players in the LMS field, Canvas is the most notable, with the fastest growth in market share among U.S. and Canadian higher education institutions.

LMS history and Market Share

Figure 2. LMS Market Share for US and Canadian Higher Ed Institutions (Feldstein, 2018)

 

Different people suggest different criteria when comparing LMSs. Udutu.com provided a list of seven things to think about before purchasing an LMS:

  1. Be clear on your learning and training objectives;
  2. Don’t be fooled by the high costs of an LMS;
  3. Know the limitations of your internal team and users;
  4. Pay for the features you need, not for what you might need;
  5. The latest new technology is not necessarily the best one;
  6. Customer support is everything; and
  7. Trust demos and trials over reviews, ratings and “industry experts” (Udutu, 2016).

Noud (2016) suggested the following ten factors to consider when selecting an LMS:

  1. Unwanted Features;
  2. Mobile Support;
  3. Integrations (APIs, SSO);
  4. Customer Support;
  5. Content Support;
  6. Approach to pricing;
  7. Product roadmap;
  8. Scalability, Reliability and Security;
  9. Implementation Timeframe; and
  10. Hidden costs.

Christopher Pappas (2017) suggested 9 factors to consider when calculating your LMS budget:

  1. Upfront costs;
  2. LMS training;
  3. Monthly Or Annual Licensing Fees;
  4. Compatible eLearning Authoring Tools;
  5. Pay-per-User/Learner Fee;
  6. Upgrades and Add-Ons;
  7. Learning and Development Team Payroll;
  8. Online Training Development Costs; and
  9. Ongoing Maintenance.
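
One way to act on checklists like these is a simple weighted decision matrix: score each candidate LMS against the criteria that matter most to your institution and rank the totals. The criteria, weights, and scores below are invented for illustration, not ratings of any real product.

```python
# Toy weighted decision matrix for comparing LMS candidates.
# Weights reflect institutional priorities and must sum to 1.0.
# All names, weights, and scores here are hypothetical.

weights = {"customer support": 0.4, "reliability": 0.3, "cost": 0.3}

candidates = {
    "LMS A": {"customer support": 9, "reliability": 8, "cost": 6},
    "LMS B": {"customer support": 6, "reliability": 7, "cost": 9},
}

def weighted_score(scores):
    """Combine per-criterion scores (0-10) into one weighted total."""
    return sum(weights[criterion] * scores[criterion] for criterion in weights)

# Rank candidates from best to worst weighted total.
ranked = sorted(candidates,
                key=lambda name: weighted_score(candidates[name]),
                reverse=True)

for name in ranked:
    print(name, round(weighted_score(candidates[name]), 2))
```

Adjusting the weights makes the trade-offs explicit: a team that prizes customer support above all (as the Udutu list suggests) would weight that criterion heavily before comparing totals.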

Of all of the above lists, I like Udutu’s best because it matches my personal experience with LMS migrations.

I first used WebCT between 2005 and 2007, participated in a migration from WebCT Vista to Blackboard in 2008, and in an Angel-to-Blackboard migration in 2013-2014. During my seven years using Blackboard as an instructional designer and faculty support staff member, my biggest complaint was its unexpected server outages during peak times, such as the beginning of the term and finals week. In 2014, I moved to Oregon State University (OSU). The OSU community had begun looking for a new LMS in 2013 and started piloting Canvas in 2014. At the end of the pilot, instructor and student feedback was mostly positive: not subject to local server outages, the cloud-based system was stable and had remained available to users throughout the pilot. Of course, no LMS is perfect, but after careful comparison and feedback collection, we migrated from Blackboard to Canvas in 2015. In my four years of using Canvas since, there has not been a single server outage, and Canvas covers the basic functionality expected of an LMS.

Canvas wanted to expand its market share by building up positive customer experiences. Instructure was eager to please OSU and provided us with 24/7 on-call customer service during our first two years of using Canvas, at a relatively reasonable price. The pilot users were highly satisfied with the customer service; several instructors reported that they contacted the Canvas hotline on Thanksgiving or Christmas, their calls were answered immediately, and their issues were resolved.

Michael Feldstein (2018) summarized that Canvas’ “cloud-based offering, updated user interface, reputation for outstanding customer service and brash, in-your-face branding” have helped its steady rise in LMS market share. As instructors and instructional designers, we can learn a lot from Instructure’s success story and focus on improving the service we provide to our students, such as student success coaching, online resources, and online learning communities. Would you agree? If you have specific suggestions on how to improve the way we serve our students, feel free to let us know (Tianhong.shi@oregonstate.edu; @tianhongshi)!

 

References:

Goldberg, M., Salari, S., & Swoboda, P. (1996). World Wide Web – Course Tool: An environment for building WWW-based courses. Computer Networks and ISDN Systems, 28(7-11), 1219-1231.

Feldstein, Michael. (2018). Canvas surpasses Blackboard Learn in US Market Share. E-Literate, July 8, 2018. Retrieved from https://mfeldstein.com/canvas-surpasses-blackboard-learn-in-us-market-share/ on February 2, 2019.

McKenzie, Lindsay. (2018). Canvas catches, and maybe passes, Blackboard. InsideHigherEd. July 10, 2018. Retrieved from https://www.insidehighered.com/digital-learning/article/2018/07/10/canvas-catches-and-maybe-passes-blackboard-top-learning on February 2, 2019.

Moran, Gwen. (October 2010). “The Rise of the Virtual Classroom.” Entrepreneur Magazine. Irvine, California. Retrieved July 15, 2011.

Noud, Brendan. (February 9, 2016). 10 Things to consider when selecting an LMS. Retrieved from https://www.learnupon.com/blog/top-10-considerations-when-selecting-a-top-lms/ on February 2, 2019.

Pappas, Christopher. (June 13, 2017). Top 9 Factors to consider when calculating Your LMS Budget. Retrieved from https://blog.lambdasolutions.net/top-9-factors-to-consider-when-calculating-your-lms-budget on February 2, 2019.

Udutu. (May 30, 2016). How to choose the best Learning Management System. Retrieved from https://www.udutu.com/blog/lms/ on February 2, 2019.

Wikipedia. (n.d.). WebCT. Retrieved from https://en.wikipedia.org/wiki/WebCT on February 2, 2019.

 

Would you like to save time grading, accurately assess student learning, provide timely feedback, track student progress, demonstrate teaching and learning excellence, foster communication, and much more? If you answered yes, then rubrics are for you! Let’s explore why the intentional use of rubrics can be a valuable tool for instructors and students.

Value for instructors

  • Time management: Have you ever found yourself drowning in a sea of student assignments that needed to be graded ASAP (like last week)? Grading with a rubric can quicken the process because each student is graded the same way using the same criteria. Rubrics that are detailed, specific, organized, and measurable clearly communicate expectations. As you become familiar with how students commonly respond to an assessment, feedback can be easily personalized and readily deployed.
  • Timely and meaningful feedback: Research has shown that several factors enhance student motivation. One factor is feedback that is shared often and is detailed, timely, and useful. When students receive relevant, meaningful, and useful feedback quickly, they have an opportunity to self-assess their progress, course correct (if necessary), and level up their performance.
  • Data! Data! Data! Not only can rubrics provide a panoramic view of student progress, but the tool can also help identify teaching and learning gaps. Instructors will be able to identify whether students are improving, struggling, remaining consistent, or missing the mark completely. The information gleaned from rubrics can be used to compare student performance within a course, between course sections, or even across time. The information can also serve as feedback to the instructor regarding the effectiveness of the assessment.
  • Effectiveness: When a rubric is designed from the outset to measure the course learning outcomes, it can serve as a tool for effective and accurate assessment. Tip! Refrain from solely scoring gateway criteria (e.g., organization, mechanics, and grammar). This is important because students will interpret meeting those criteria as a demonstration that they have met the learning outcomes, even if they haven’t. If learning gaps are consistently identified, consider evaluating the task and rubric to ensure instructions, expectations, and performance dimensions are clear and aligned.
  • Shareable: As academic programs begin to develop courses for various modalities (e.g., on campus, hybrid, online), consistently assessing student learning can be a challenge. An advantage of rubrics is that they can be easily shared and applied across course sections and modalities. This can be especially valuable when the same course is taught by multiple instructors and teaching assistants.
  • Fosters communication: Instructors can clearly articulate performance expectations and outcomes to key stakeholders such as teaching assistants, other instructors, academic programs, and student service representatives (e.g., the Ecampus Student Success Team or Writing Center). Rubrics provide additional context above and beyond what is outlined in the course syllabus. A rubric can communicate how students will be assessed, what students should attend to, and how institutional representatives can best support students. Imagine a scenario where a student contacts the Writing Center to review a draft term paper, and the representative asks for the grading criteria or rubric. The grading criteria furnished by the instructor outline only the requirements for word length, formatting, and citation conventions. None of these criteria communicate the learning outcomes or make any reference to the quality of the work. In this example, the representative might find it challenging to effectively support the student without understanding the instructor’s implicit expectations.
  • Justification: Have you ever been tasked with justifying a contested grade? Rubrics can help you through the process! Rubrics that are detailed, specific, measurable, complete, and aligned can be used to explain why a grade was awarded. A rubric can quickly and accurately highlight where a student failed to meet specific performance dimensions and/or the learning outcomes.
  • Evidence of teaching improvement: The values of continuous improvement, lifelong learning, and ongoing professional development are woven into the very fabric of academia. Curating effective assessment tools and methods can provide a means of demonstrating performance and providing evidence to support professional advancement.
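For instructors who are comfortable with a little scripting, the kind of section-to-section comparison described above can be sketched in a few lines. This is a hypothetical illustration only: it assumes rubric scores have been exported as simple student-to-criterion mappings, and the criterion names and point values below are invented.

```python
# Hypothetical sketch: averaging rubric criteria across sections to spot
# teaching and learning gaps. All names and scores are invented examples.

def criterion_averages(scores):
    """Average each rubric criterion across all students in a section.

    `scores` maps student name -> {criterion: points earned}.
    Returns {criterion: mean points}.
    """
    totals, counts = {}, {}
    for rubric in scores.values():
        for criterion, points in rubric.items():
            totals[criterion] = totals.get(criterion, 0) + points
            counts[criterion] = counts.get(criterion, 0) + 1
    return {c: totals[c] / counts[c] for c in totals}

# Two course sections scored with the same rubric (invented data).
section_a = {
    "Ana": {"thesis": 4, "evidence": 3, "mechanics": 5},
    "Ben": {"thesis": 3, "evidence": 2, "mechanics": 5},
}
section_b = {
    "Cara": {"thesis": 5, "evidence": 4, "mechanics": 4},
    "Dev":  {"thesis": 4, "evidence": 4, "mechanics": 4},
}

for name, section in [("A", section_a), ("B", section_b)]:
    print(name, criterion_averages(section))
```

A low average on one criterion in a single section (here, "evidence" in section A) can prompt a closer look at the task instructions or rubric wording for that section, exactly the kind of gap analysis described above.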

Value for students

  • Equity: Using rubrics creates an opportunity for consistent and fair grading for all students. Each student is assessed on the same criteria and in the same way. If performance criteria are not clearly communicated from the outset, then evaluations may be based on implicit expectations. Implicit expectations are not known or understood by students, which can create an unfair assessment structure.
  • Clarity: Ambiguity is decreased by using student-centered language. Student populations are highly diverse, and many students have different native languages. Therefore, students may have different interpretations of what certain words mean (e.g., critical thinking). Using clear and simple language can mitigate unintended barriers and decrease confusion.
  • Expectations: Students know exactly what they need to do to demonstrate learning, what instructors are looking for, how to meet the instructor’s expectations, and how to level up their performance. A challenge can be to ensure that all expectations (implicit and explicit) are clearly communicated to students. Tip! Consider explaining expectations in the description of the task as well.
  • Skill development: Rubrics can introduce new concepts and terminology and help students develop authentic skills (e.g., critical thinking) that can be applied outside of their academic life.
  • Promotes metacognition and self-regulatory behavior: Guidance and feedback help students reflect on their thought processes, self-assess, and foster positive learning behaviors.

As an Ecampus course developer, you have a wide array of support services and experts available to you. Are you interested in learning more about rubric design, development, and implementation? Contact your Instructional Designer today to begin exploring best-fit options for your course. Stay tuned for Rubrics: Markers of Quality (Part 2) – Tips & Best Practices.

References:

  • Brookhart, S. M. (2013). How to create and use rubrics for formative assessment and grading. Alexandria, VA: ASCD.
  • Richter, D., & Ehlers, U.-D. (2013). Open learning cultures: A guide to quality, evaluation, and assessment for future learning (1st ed.). Berlin, Heidelberg: Springer.
  • Stevens, D. D., & Levi, A. J. (2013). Introduction to rubrics: An assessment tool to save grading time, convey effective feedback, and promote student learning (2nd ed.). Sterling, VA: Stylus.
  • Walvoord, B. E., & Anderson, V. J. (2010). Effective grading: A tool for learning and assessment in college (2nd ed.). San Francisco, CA: Jossey-Bass.

 

 

 

Curious what an Ecampus Instructional Designer is looking for when they approve slides for narrated lectures?  It certainly depends on the course content.

Generally, the top three things I am looking at are copyright, accessibility, and aesthetics.

For this post, I am going to focus on copyright; I will return to the other topics in a future post. A copy of the slides, which includes links to helpful materials, is available below the video, as well as a list of resources.

Slides: Copyright Considerations for Narrated Slides

Resources:

What’s An Image’s Value?

Image of a postcard with “A picture is worth a thousand words” written on it.

Have you ever created an online course without using images? No?

That is not surprising, as images can convey emotions, ideas, and much more. Their value is captured in an old adage: a picture is worth a thousand words.

This article will discuss the value of images in online course design and how using visuals to accompany instruction via text or narration might contribute to or detract from an online learning experience. Let’s begin.

Multimedia Learning: Images, Text, and More

Online learning is a modern form of multimedia learning. Richard Mayer (2009) described multimedia learning as learning that integrates the use of words and pictures. In traditional classrooms, these learning resources might be experienced as:

  • Textbooks: text and illustrations.
  • Computer-based lessons: narration with animation.
  • Face-to-face slide presentations: graphics and audio.

In online learning, multimedia may also include:

  • eBooks: text and digital images.
  • Video: text, images, and animations coupled with audio.
  • Interactives: maps, images, and video.
  • Digital visual representations: virtual worlds and 3D models.
  • Screencasts: software demos, faculty video feedback, and more.
  • Audio: enhanced podcasts or narrated lectures.

These two short lists, although not exhaustive, demonstrate the importance of visual elements to multimedia-based learning in online courses. There are many reasons why we might include any one of these multimedia learning experiences in an online course. For our purposes, we will explore the instructional value of visuals in online learning in a bit more depth.

So, how do words and pictures work together to help shape learning? Given that the word-and-picture combination is perhaps the most common learning object used in an online course, it seems useful to understand this simple form of visual literacy for learning (Aisami, 2015).

Visual Engagement Of A Learning Object

In a recent study of how people acquire knowledge from an instructional web page, Ludvik Eger (2018) used eye-tracking technology to examine a simple learning object composed of a title (headline), a visual element (a diagram), and a box of written text. With no audio support for the learning object in this study, participants engaged the content through visual engagement alone. Results indicated that the majority of students started their learning process at the headline, or at the headline and visual element. The box of information, in text form, was the third part of the learning object engaged.

Within this context, eye movement analysis indicates a learning process that depends on a consistent visual flow. Purposely connecting the title, visual element, and information text of a learning object may best reinforce learning. By doing this, the course designer or instructor becomes a sort of cognitive guide, focusing (or failing to focus) learning via the meaning structure of the learning object’s elements. In our case, we want to use visual elements to support performance and achievement of learning tasks.

Choosing Visual Elements

In order to explore the choice of visual elements in an online learning experience it is helpful to understand how we process that experience from a cognitive science perspective.

Clark and Mayer (2016) explain that cognitive science suggests knowledge construction rests on three principles: dual channels, limited capacity, and active processing. Let’s briefly examine each.

Dual channels:

People have two channels of cognitive processing: one for visual/pictorial material and one for auditory/verbal material. See Figure 1 below.

 

Figure 1: Model of the Cognitive Theory of Multimedia Learning

Limited capacity:

Humans can only process a few pieces of information in each channel at the same time.

Active processing:

Learning occurs as people engage in cognitive processing during learning. This may include attending to relevant material, organizing that material into a coherent structure, and integrating that material with prior knowledge.

Due to the limits on any learner’s processing capability, it is paramount that we select visual images that help manage the learning process. Our goal is to limit extraneous processing that clutters the learning experience, to build visual support for representing the core learning material, and to provide visual support that fosters deeper understanding of the learning at hand. What does this mean in practice?

Managing Processing Via Image Use

Making decisions about image selection and use is key to managing this learning process. Understanding the meaning of candidate images is also key, and is really a function of literacy in one’s field and of visual literacy in general (Kennedy, 2013).

In practice we can use the following guidelines to make decisions about image use in multimedia-based online learning. 

  • Control Visual Elements – Too many images on a web page or slide may force extraneous cognitive processing that does not support the instructional objective.
  • Select Visual Elements Carefully – Images that are difficult to discern are likely to negatively impact learning. Consider visual quality, the emotional and intellectual message of the image, information value, and readability.
  • Use Focused Visual Elements – Target visual support to those images that represent the core learning material and/or provide access to deeper understanding of that core content.

Other Image Tips

Emotional Tone: Emotional design elements (e.g., visuals) can play important roles in motivating learners and in the achievement of learning outcomes (Mayer, 2014).

Interest: Decorative images may boost learner interest but do not contribute to higher performance in testing (Mayer, 2014). Use decorative images prudently so they do not contribute to extraneous processing (Pettersson & Avgerinou, 2016).

Challenge: Image selections that introduce a degree of confusion may challenge learners to dive more deeply into the core learning. This is a delicate decision, in that a challenge to sense-making may instead foster excessive processing.

Access: To be practical, images must be presented in a format that users can actually view. This involves an understanding of the technical features of image formats, download capability, mobile use, and universal design techniques.

Final Thoughts

It is valuable to remember that visuals communicate nonverbally. They are most effective when carefully selected and paired with text or audio narration. Visuals appeal to the sense of sight and come in many forms: pictures, symbols, signs, maps, graphs, diagrams, charts, models, and photographs. Knowing their form, meaning, and application is part of being a visually literate course developer or instructional designer.

References

Aisami, R. S. (2015). Learning Styles and Visual Literacy for Learning and Performance. Procedia – Social and Behavioral Sciences, 176, 538-545. doi:10.1016/j.sbspro.2015.01.508

Clark, R. C., & Mayer, R. E. (2016). E-learning and the science of instruction : Proven guidelines for consumers and designers of multimedia learning. Retrieved from http://ebookcentral.proquest.com

Eger, L. (2018). How people acquire knowledge from a web page: An eye tracking study. Knowledge Management & E-Learning: An International Journal 10(3), 350-366.

Kennedy, B. (2013, November 19). What is visual literacy? [Video file]. Retrieved from https://www.youtube.com/watch?time_continue=1&v=O39niAzuapc

Mayer, R. E. (2009). Multimedia learning (2nd ed.). New York: Cambridge University Press.

Mayer, R. E. (2014). Incorporating motivation into multimedia learning. Learning and Instruction, 29, 171-173. doi:10.1016/j.learninstruc.2013.04.003

Pettersson, R., & Avgerinou, M. D. (2016). Information design with teaching and learning in mind. Journal of Visual Literacy, 35(4), 253–267. doi:10.1080/1051144X.2016.1278341

 

Credit: Embedded image by Kelly Sikkema on Unsplash.com

“Diversity is our world’s greatest asset, and inclusion is our biggest challenge. And the way that we are going to address that challenge is by extending our empathy.” -Jutta Treviranus, Founder of the Inclusive Design Research Centre, OCAD University


Sure, you’ve been teaching online courses for a few terms or years now, but have you ever been an online student? Many current faculty members earned their degrees in traditional face-to-face settings and have learned how to migrate their courses to the online environment by using research-based best practices and support from instructional designers and media experts. However, are there benefits to experiencing this fledgling educational modality from the perspective of the online student? I argue that faculty who challenge themselves to take an online course experience both personal and professional benefits and become more empathic, inclusive, creative, and reflective.

Benefits for Faculty Members

Challenge yourself to try out something completely different from your specialization or discipline: Are you a STEM professor with a screenplay idea? Perhaps you have a trip to the French Riviera on your bucket list, or your college Spanish is rusty; try a foreign language course this summer. Are you a humanities professor who is curious about the composition of the soil in your garden? Find out about the dirt in your yard as a soil science student.

Here are some benefits to consider:

  • Taking an online course may give you ideas or inspiration for something that you want to try in your own course.
  • Continuing education may benefit brain health.
  • Stretching yourself may spur creativity and innovation.
  • You are modeling lifelong learning for your students and family.
  • Most importantly, it just might be fun!

Building Empathy

I’m consistently impressed with the care and concern OSU faculty have for their students, and taking an online course is one way to demonstrate that concern. By changing roles, such as by becoming an online student, faculty expand their perspectives, which results in the potential for even greater student support and understanding.

Yes, faculty members contend with heavy workloads and may feel that taking an online course on top of everything else would be overwhelming. However, your Ecampus students may also struggle with feeling maxed out.

Did you know that the average age of a student taking an Ecampus course is 31 years old? This means that it is likely your online students are responsible for full-time work as well as family obligations. Taking online courses helps faculty members build empathy for their students by giving themselves opportunities to experience the excitement, anxiety, and pride of successfully completing an online course.

Furthermore, by increasing empathy, faculty members may become more inclusive and reflective practitioners. For example, as an online student, you know how it feels to be welcomed (or not) by your instructor, or to receive feedback within a few days as opposed to a few weeks. As an adult learner, you also may desire to share your prior experience or professional background with the instructor or students. Does your course give you the opportunity to introduce yourself to the instructor and other students, to describe your background and some strengths that you bring to the course community, or are you left feeling invisible in the course, with your expertise unacknowledged?

Tuition Reduction for OSU Employees

As OSU employees, faculty and staff are now eligible to take Ecampus courses at the reduced tuition rate, according to the staff fee privileges.

  • Summer courses begin on June 24th, and fall courses begin on September 25th.

Share Your Experience!

Have you been an online student as well as an online instructor? How did being an online student inform your teaching practices? Reply in the comments section below.

I pledge that I have acted honorably in completing this assessment.

There are two sides to the story of security of online assessments. On the one side, cheating does exist in online assessments. Examity’s president Michael London summarized five common ways students cheat on online exams:

  1. The old-school try of notes;
  2. The screenshot;
  3. The water break;
  4. The cover-up; and
  5. The big listen, through devices such as a Bluetooth headset (London, 2017).

Newton (2015) even reported the disturbing fact that “cheating in online classes is now big business.” On the other side, academic dishonesty is a long-standing problem, both on college campuses and in online courses. The rate of students who admit to cheating at least once in their college careers has held steady at around 75 percent since the first major survey on cheating in higher education in 1963 (Lang, 2013). Around 2000, many faculty and students believed it was easier to cheat in online classes (Kennedy, 2000), and about a third of academic leaders perceived online outcomes to be inferior to those of traditional classes (Allen & Seaman, 2011). However, according to Watson and Sottile (2010) and other comparative studies, there is no conclusive evidence that online students are more likely to cheat than face-to-face students: “Online learning is, itself, not necessarily a contributing factor to an increase in academic misconduct” (Pilgrim & Scanlon, 2018).

Since there are so many ways for students to cheat in online assessments, how can we make online assessments more effective in evaluating students’ learning? Online proctoring is one solution that is easy for instructors but adds a cost burden for students. Common online proctoring service providers include ProctorU, Examity, Proctorio, and Honorlock, to name just a few (Bentley, 2017).

Fortunately, there are other ways to assess online learning without being overly concerned about academic dishonesty. Vicky Phillips (n.d.) suggested that authentic assessment makes it extremely difficult to fake or copy one’s work. The University of Maryland University College has consciously moved away from proctored exams and uses scenario-based projects as assessments instead (Lieberman, 2018). James Lang (2013) suggested that smaller class sizes allow instructors more one-on-one interaction with students and can therefore keep cheating to a minimum. Pilgrim and Scanlon (2018) suggest changing assessments to reduce the likelihood of cheating (such as having students demonstrate problem solving in person or via video, or using plagiarism detection software such as TurnItIn), promoting and establishing a culture of academic integrity (such as an honor code or integrity pledge), and supporting academic integrity through appropriate policies and processes. Konheim-Kalkstein (2006) reports that the use of a classroom honor code has been shown to reduce cheating. Konheim-Kalkstein, Stellmack, and Shilkey (2008) report that a classroom honor code improves rapport between faculty and students and increases feelings of trust and respect among students. Gurung, Wilhelm, and Filz (2012) suggest that an honor pledge should include formal language, state the specific consequences for cheating, and require a signature. For the honor pledge to be most effective, Shu, Mazar, Gino, Ariely, and Bazerman (2012) suggest including it on the first page of an online assessment or assignment, before students begin.

Rochester Institute of Technology’s (2014) Teaching Elements: Assessing Online Students offers a variety of ways to assess students, including discussions, low-stakes quizzes, writing assignments (such as a muddiest-point paper), and individual activities (such as staged assignments that give students ongoing feedback), among many others.

In summary, there are plenty of ways to design effective formative and summative assessments online that encourage academic honesty, if instructors and course designers are willing to spend the time trying out strategies suggested in the literature.

References

Bentley, K. (2017). What to consider when selecting an online exam proctoring service. Inside Higher Ed (June 21, 2017). Retrieved from https://www.insidehighered.com/digital-learning/views/2017/06/21/selecting-online-exam-proctoring-service on February 22, 2019.

Gurung, R. A. R., Wilhelm, T. M., & Filz, T. (2012). Optimizing honor codes for online exam administration. Ethics & Behavior, 22, 158–162.

Konheim-Kalkstein, Y. L. (2006). Use of a classroom honor code in higher education. Journal of Credibility Assessment and Witness Psychology, 7, 169–179.

Konheim-Kalkstein, Y. L., Stellmack, M. A., & Shilkey, M. L. (2008). Comparison of honor code and non-honor code classrooms at a non-honor code university. Journal of College & Character, 9, 1–13.

Lang, J. M. (2013). How college classes encourage cheating. Boston Globe. Retrieved from https://www.bostonglobe.com/ideas/2013/08/03/how-college-classes-encourage-cheating/3Q34x5ysYcplWNA3yO2eLK/story.html on February 21, 2019.

Lieberman, M. (2018). Exam proctoring for online students hasn’t yet transformed. Inside Higher Ed (October 10, 2018). Retrieved from https://www.insidehighered.com/digital-learning/article/2018/10/10/online-students-experience-wide-range-proctoring-situations-tech on February 22, 2019.

London, M. (2017). 5 ways to cheat on online exams. Inside Higher Ed (September 20, 2017). Retrieved from https://www.insidehighered.com/digital-learning/views/2017/09/20/creative-ways-students-try-cheat-online-exams on February 21, 2019.

Newton, D. (2015). Cheating in online classes is now big business. The Atlantic. Retrieved from https://www.theatlantic.com/education/archive/2015/11/cheating-through-online-courses/413770/ on February 21, 2019.

Phillips, V. (n.d.). Big fat online education myths – students cheat like weasels in online classes. GetEducated. Retrieved from https://www.geteducated.com/elearning-education-blog/big-fat-online-education-myths-students-cheat-like-weasels-in-online-classes/ on February 21, 2019.

Pilgrim, C., & Scanlon, C. (2018). Don’t assume online students are more likely to cheat. The evidence is murky. Retrieved from https://phys.org/news/2018-07-dont-assume-online-students-evidence.html on February 21, 2019.

Rochester Institute of Technology. (2014). Teaching Elements: Assessing Online Students. Retrieved from https://www.rit.edu/academicaffairs/tls/sites/rit.edu.academicaffairs.tls/files/docs/TE_Online%20Assessmt.pdf on February 21, 2019.

Shu, L. L., Mazar, N., Gino, F., Ariely, D., & Bazerman, M. H. (2012). Signing at the beginning makes ethics salient and decreases dishonest self-reports in comparison to signing at the end. PNAS, 109, 15197–15200.

Watson, G., & Sottile, J. (2010). Cheating in the digital age: Do students cheat more in online courses? Online Journal of Distance Learning Administration, 13(1). Retrieved from https://www.westga.edu/~distance/ojdla/spring131/watson131.html on February 21, 2019.