One of the most common questions I get as an Instructional Designer is, “How do I prevent cheating in my online course?” Instructors are looking for detection strategies and often punitive measures to catch, report, and punish academic cheaters. Their concerns are understandable—searching Google for the phrase “take my test for me” returns pages and pages of results from services with names like “Online Class Hero” and “Noneedtostudy.com” that promise to use “American Experts” to help you pass your course with “flying grades.”1 But by focusing only on what detection measures we can implement and the means and methods by which students are cheating, we are asking the wrong questions. Instead, let’s consider what we can do to understand why students cheat, and how careful course and assessment design might reduce their motivation to do so.

A new study published in Computers & Education identified five themes in the reasons students gave when seeking help from contract cheating services (Amigud & Lancaster, 2019):

  • Academic Aptitude – “Please teach me how to write an essay.”
  • Perseverance – “I can’t look at it anymore.”
  • Personal Issues – “I have such a bad migraine.”
  • Competing Objectives – “I work so I don’t have time.”
  • Self-Discipline – “I procrastinated until today.”

Their results showed that students don’t begin a course intending to commit academic misconduct. Rather, they reach a point, often after initially attempting the work, when the perception of pressure, a lack of skills, or a lack of resources erodes their will to complete the work themselves. Online students may be more likely to have external obligations and involvement in non-academic activities. According to a 2016 study, a significant majority of online students are juggling other obligations, including raising children and working while earning their degrees (Clinefelter & Aslanian, 2016).

While issues with cheating are never going to be completely eliminated, several strategies have emerged in recent research that approach the problem through a lens of design rather than one of punishment. Here are ten of my favorite approaches that address the justifications students identified for cheating:

  1. Make sure that students are aware of academic support services (Yu, Glanzer, Johnson, Sriram, & Moore, 2018). Oregon State, like many universities, offers writing help, subject-area tutors, and, for Ecampus students, a Student Success team that can help identify resources and provide coaching on academic skills. Leading up to exams or big assessment projects, encourage students to reach out during online office hours or via email if they feel they need assistance.
  2. Have students create study guides as a precursor assignment to an exam—perhaps using online tools to create mind maps or flashcards. Students who are better prepared for assessments have less incentive to cheat. Study guides can be a non-graded activity, like a game or practice quiz, or provided as a learning resource.
  3. Ensure that students understand the benefits of producing their own work and that the assessment is designed to help them develop and demonstrate subject knowledge (Lancaster & Clarke, 2015). Clarify for students the relevance of a particular assessment and how it relates to the weekly and larger course learning outcomes.
  4. Provide examples of work that meets your expectations along with specific evaluation criteria. Students need to understand how they are being graded and be able to judge the quality of their own work. A student feeling in the dark about what is expected from them may be more likely to turn to outside help.
  5. Provide students with opportunities throughout the course to participate in activities, such as discussions and assignments, that will prepare them for summative assessments (Morris, 2018).
  6. Allow students to use external sources of information while taking tests. Assessments in which students are allowed to leverage the materials they have learned to construct a response do a better job of assessing higher-order learning. Memorizing and repeating information is rarely what we hope students will achieve by the end of instruction.
  7. Introduce alternative forms of assessment. Creative instructors can design learning activities that require students to develop a deeper understanding and take on more challenging assignments. Examples of these include recorded presentations, debates, case studies, portfolios, and research projects.
  8. Rather than a large summative exam at the end of a course, focus on more frequent smaller, formative assessments (Lancaster & Clarke, 2015). Provide students with an ongoing opportunity to demonstrate their knowledge without the pressure introduced by a final exam that accounts for a substantial portion of their grade.
  9. Create a course environment that is safe to make and learn from mistakes. Build into a course non-graded activities in which students can practice the skills they will need to demonstrate during an exam.
  10. Build a relationship with students. When instructors are responsive to student questions, provide substantive feedback throughout a course, and find other ways to interact with students, those students are less likely to cheat. It matters if students believe an instructor cares about them (Bluestein, 2015).

No single strategy is guaranteed to immunize your course against the possibility that a student will use some form of cheating. Almost any type of assignment can be purchased quickly online. The goal of any assessment should be to ensure that students have met the learning outcomes—not to see if we can catch them cheating. Instead, focus on understanding pressures a student might face to succeed in a course, and the obstacles they could encounter in doing so. Work hard to connect with your students during course delivery and humanize the experience of learning online. Thoughtful design strategies, those that prioritize supporting student academic progress, can alleviate the conditions that lead to academic integrity issues.


1 This search was suggested by an article published by the New England Board of Higher Education on cheating in online programs (Berkey & Halfond, 2015).

References

Amigud, A., & Lancaster, T. (2019). 246 reasons to cheat: An analysis of students’ reasons for seeking to outsource academic work. Computers & Education, 134, 98–107. https://doi.org/10.1016/j.compedu.2019.01.017

Berkey, D., & Halfond, J. (2015). Cheating, student authentication and proctoring in online programs. New England Journal of Higher Education.

Bluestein, S. A. (2015). Connecting Student-Faculty Interaction to Academic Dishonesty. Community College Journal of Research and Practice, 39(2), 179–191. https://doi.org/10.1080/10668926.2013.848176

Clinefelter, D. L., & Aslanian, C. B. (2016). Online college students 2016: Comprehensive data on demands and preferences. Louisville, KY: The Learning House, Inc.

Lancaster, T., & Clarke, R. (2015). Contract Cheating: The Outsourcing of Assessed Student Work. In T. A. Bretag (Ed.), Handbook of Academic Integrity (pp. 1–14). https://doi.org/10.1007/978-981-287-079-7_17-1

Morris, E. J. (2018). Academic integrity matters: five considerations for addressing contract cheating. International Journal for Educational Integrity, 14(1), 15. https://doi.org/10.1007/s40979-018-0038-5

Yu, H., Glanzer, P. L., Johnson, B. R., Sriram, R., & Moore, B. (2018). Why College Students Cheat: A Conceptual Model of Five Factors. The Review of Higher Education, 41(4), 549–576. https://doi.org/10.1353/rhe.2018.0025

Oregon State University’s Learning Management System (LMS) migrated to Canvas in 2014–2015. The migration was based not only on how well Canvas’s features aligned with our learning platform needs but also on the outstanding customer service its maker, Instructure, has provided to our LMS user community, including students, faculty, instructional designers, and administrators. How Instructure provides customer service offers an example we can model to continue to exceed student expectations.

According to Michael Feldstein’s July 8, 2018 report, the major players in the US LMS market include Blackboard, Canvas, Moodle, Brightspace, Sakai, Schoology, and others (Feldstein, 2018).


Figure 1: US Primary LMS Systems, July 6th, 2018 (Feldstein, 2018)


Of these major players in the LMS field, Canvas stands out with the fastest growth in market share among U.S. and Canadian higher education institutions.


Figure 2: LMS Market Share for US and Canadian Higher Ed Institutions (Feldstein, 2018)


Different people suggest different criteria when comparing LMSs. Udutu.com provided a list of 7 things to think about before purchasing an LMS:

  1. Be clear on your learning and training objectives;
  2. Don’t be fooled by the high costs of an LMS;
  3. Know the limitations of your internal team and users;
  4. Pay for the features you need, not for what you might need;
  5. The latest new technology is not necessarily the best one;
  6. Customer support is everything; and
  7. Trust demos and trials over reviews, ratings and “industry experts”

(Udutu, 2016). Noud (2016) suggested the following ten factors to consider when selecting an LMS:

  1. Unwanted Features;
  2. Mobile Support;
  3. Integrations (APIs, SSO);
  4. Customer Support;
  5. Content Support;
  6. Approach to pricing;
  7. Product roadmap;
  8. Scalability, Reliability and Security;
  9. Implementation Timeframe; and
  10. Hidden costs.

Christopher Pappas (2017) suggested 9 factors to consider when calculating your LMS budget:

  1. Upfront costs;
  2. LMS training;
  3. Monthly Or Annual Licensing Fees;
  4. Compatible eLearning Authoring Tools;
  5. Pay-per-User/Learner Fee;
  6. Upgrades and Add-Ons;
  7. Learning and Development Team Payroll;
  8. Online Training Development Costs; and
  9. Ongoing Maintenance.

Of all of the above lists, I like Udutu’s best because it matches my personal experience with LMS migrations.

I first used WebCT between 2005 and 2007, participated in migrating from WebCT Vista to Blackboard in 2008, and in an Angel-to-Blackboard migration in 2013–2014. During my seven years of using Blackboard as an instructional designer and faculty support staff member, my biggest complaint was its unexpected server outages during peak times, such as the beginning of the term and finals weeks. In 2014, I moved to Oregon State University (OSU). The OSU community had begun looking for a new LMS in 2013 and started piloting Canvas in 2014. At the end of the pilot, instructor and student feedback was mostly positive. Not subject to local server outages, the cloud-based system was stable and had remained available to users throughout the pilot. Of course, no LMS is perfect. But after careful comparison and feedback collection, we migrated from Blackboard to Canvas in 2015. In my four years of using Canvas so far, there has not been a single server outage, and Canvas has all the basic functionality of an LMS.

Canvas wanted to expand its market share by building positive customer experiences. Instructure was eager to please OSU and provided us with 24/7 on-call customer service during our first two years on Canvas, at a relatively reasonable price. The pilot users were all highly satisfied with the customer service. Several instructors reported that they contacted the Canvas hotline on Thanksgiving or Christmas, their calls were answered immediately, and their issues were resolved.

Michael Feldstein (2018) summarized that Canvas’s “cloud-based offering, updated user interface, reputation for outstanding customer service and brash, in-your-face branding” have helped its steady rise in LMS market share. As instructors and instructional designers, we can learn a lot from Instructure’s success story and focus on improving the service we provide to our students, such as student success coaching, online resources, online learning communities, etc. Would you agree? If you have specific suggestions on how to improve the way we serve our students, feel free to let us know (Tianhong.shi@oregonstate.edu; @tianhongshi)!


References:

Goldberg, M., Salari, S., & Swoboda, P. (1996). World Wide Web – Course Tool: An environment for building WWW-based courses. Computer Networks and ISDN Systems, 28(7–11), 1219–1231.

Feldstein, Michael. (2018). Canvas surpasses Blackboard Learn in US Market Share. E-Literate, July 8, 2018. Retrieved from https://mfeldstein.com/canvas-surpasses-blackboard-learn-in-us-market-share/ on February 2, 2019.

McKenzie, Lindsay. (2018). Canvas catches, and maybe passes, Blackboard. InsideHigherEd. July 10, 2018. Retrieved from https://www.insidehighered.com/digital-learning/article/2018/07/10/canvas-catches-and-maybe-passes-blackboard-top-learning on February 2, 2019.

Moran, Gwen. (October 2010). The rise of the virtual classroom. Entrepreneur Magazine. Irvine, California. Retrieved July 15, 2011.

Noud, Brendan. (February 9, 2016). 10 Things to consider when selecting an LMS. Retrieved from https://www.learnupon.com/blog/top-10-considerations-when-selecting-a-top-lms/ on February 2, 2019.

Pappas, Christopher. (June 13, 2017). Top 9 Factors to consider when calculating Your LMS Budget. Retrieved from https://blog.lambdasolutions.net/top-9-factors-to-consider-when-calculating-your-lms-budget on February 2, 2019.

Udutu. (May 30, 2016). How to choose the best Learning Management System. Retrieved from https://www.udutu.com/blog/lms/ on February 2, 2019.

Wikipedia. (n.d.). WebCT. Retrieved from https://en.wikipedia.org/wiki/WebCT on February 2, 2019.


Would you like to save time grading, accurately assess student learning, provide timely feedback, track student progress, demonstrate teaching and learning excellence, foster communication, and much more? If you answered yes, then rubrics are for you! Let’s explore why the intentional use of rubrics can be a valuable tool for instructors and students.

Value for instructors

  • Time management: Have you ever found yourself drowning in a sea of student assignments that need to be graded ASAP (like last week)? Grading with a rubric can quicken the process because each student is graded in the same way using the same criteria. Rubrics that are detailed, specific, organized, and measurable clearly communicate expectations. As you become familiar with how students commonly respond to an assessment, feedback can be easily personalized and readily deployed.
  • Timely and meaningful feedback: Research has shown that several factors enhance student motivation. One factor is feedback that is shared often and is detailed, timely, and useful. When students receive relevant, meaningful, and useful feedback quickly, they have an opportunity to self-assess their progress, course correct (if necessary), and level up their performance.
  • Data! Data! Data! Not only can rubrics provide a panoramic view of student progress, but they can also help identify teaching and learning gaps. Instructors will be able to identify whether students are improving, struggling, remaining consistent, or missing the mark completely. The information gleaned from rubrics can be used to compare student performance within a course, between course sections, or even across time. That information can also serve as feedback to the instructor regarding the effectiveness of the assessment.
  • Effectiveness: When a rubric is designed from the outset to measure the course learning outcomes, it can serve as a tool for effective, and accurate, assessment. Tip! Refrain from solely scoring gateway criteria (i.e., organization, mechanics, and grammar). This is important because students will interpret meeting those criteria as a demonstration that they have met the learning outcomes, even if they haven’t. If learning gaps are consistently identified, consider evaluating the task and rubric to ensure instructions, expectations, and performance dimensions are clear and aligned.
  • Shareable: As academic programs begin to develop courses for various modalities (i.e., on campus, hybrid, online), consistently assessing student learning can be a challenge. The advantage of rubrics is that they can be easily shared and applied across course sections and modalities. This can be especially valuable when the same course is taught by multiple instructors and teaching assistants.
  • Fosters communication: Instructors can clearly articulate performance expectations and outcomes to key stakeholders such as teaching assistants, instructors, academic programs, and student service representatives (e.g., Ecampus Student Success Team, Writing Center). Rubrics provide additional context above and beyond what is outlined in the course syllabus. A rubric can communicate how students will be assessed, what students should attend to, and how institutional representatives can best support students. Imagine a scenario where a student contacts the Writing Center to review a draft term paper, and the representative asks for the grading criteria or rubric. The grading criteria furnished by the instructor outline only the requirements for word length, formatting, and citation conventions. None of these criteria communicate the learning outcomes or make any reference to the quality of the work. In this example, the representative might find it challenging to effectively support the student without understanding the instructor’s implicit expectations.
  • Justification: Have you ever been tasked with justifying a contested grade? Rubrics can help you through the process! Rubrics that are detailed, specific, measurable, complete, and aligned can be used to explain why a grade was awarded. A rubric can quickly and accurately highlight where a student failed to meet specific performance dimensions and/or the learning outcomes.
  • Evidence of teaching improvement: The values of continuous improvement, lifelong learning, and ongoing professional development are woven into the very fabric of academia. Curating effective assessment tools and methods can provide a means of demonstrating performance and providing evidence to support professional advancement.

Value for students

  • Equity: Using rubrics creates an opportunity for consistent and fair grading for all students. Each student is assessed on the same criteria and in the same way. If performance criteria are not clearly communicated from the outset, then evaluations may be based on implicit expectations, which are not known or understood by students and can create an unfair assessment structure.
  • Clarity: Ambiguity is decreased by using student-centered language. The student body is highly diverse, and many students speak different native languages; therefore, students may have different interpretations of what words mean (e.g., critical thinking). Using clear and simple language can mitigate unintended barriers and decrease confusion.
  • Expectations: Students know exactly what they need to do to demonstrate learning, what instructors are looking for, how to meet the instructor’s expectations, and how to level up their performance. A challenge can be to ensure that all expectations (implicit and explicit) are clearly communicated to students. Tip! Consider explaining expectations in the description of the task as well.
  • Skill development: Rubrics can introduce new concepts/terminology and help students develop authentic skills (e.g., critical thinking) that can be applied outside of their academic life.
  • Promotes metacognition and self-regulatory behavior: Guidance and feedback help students reflect on their thought processes, self-assess, and foster positive learning behaviors.

As an Ecampus course developer, you have a wide array of support services and experts available to you. Are you interested in learning more about rubric design, development, and implementation? Contact your Instructional Designer today to begin exploring best-fit options for your course. Stay tuned for Rubrics: Markers of Quality (Part 2) – Tips & Best Practices.

References:

  • Brookhart, S. M. (2013). How to create and use rubrics for formative assessment and grading. Alexandria, Va.: ASCD.
  • Richter, D., & Ehlers, Ulf-Daniel. (2013). Open Learning Cultures: A Guide to Quality, Evaluation, and Assessment for Future Learning. (1st ed.). Berlin, Heidelberg: Springer.
  • Stevens, D. D., & Levi, Antonia. (2013). Introduction to rubrics: an assessment tool to save grading time, convey effective feedback, and promote student learning (2nd ed.). Sterling, Va.: Stylus.
  • Walvoord, B. E. F., & Anderson, Virginia Johnson. (2010). Effective grading: a tool for learning and assessment in college (Second edition.). San Francisco, CA: Jossey-Bass.


Curious what an Ecampus Instructional Designer is looking for when they approve slides for narrated lectures? It certainly depends on the course content.

Generally, the top three things I am looking at are copyright, accessibility, and aesthetics.

For this post, I am going to focus on copyright and will return to the other topics in a future post. A copy of the slides, which includes links to helpful materials, is available below the video, along with a list of resources.

Slides: Copyright Considerations for Narrated Slides


What’s An Image’s Value?

Image: a postcard with “a picture is worth a thousand words” written on it.

Have you ever created an online course without using images? No?

That is not surprising as images can convey emotions, ideas, and much more. Their value is often captured in an old adage: A picture is worth a thousand words.

This article will discuss the value of images in online course design and how using visuals to accompany instruction via text or narration might contribute to or detract from an online learning experience. Let’s begin.

Multimedia Learning: Images, Text, and More

Online learning is a modern form of multimedia learning. Richard Mayer (2009) described multimedia learning as learning that integrates the use of words and pictures. In traditional classrooms, these learning resources might be experienced as:

  • Textbooks: Text and illustrations.
  • Computer-based lessons: Narration with animation.
  • Face-to-face slide presentations: Graphics and audio.

In online learning multimedia may also include:

  • eBooks: Text and digital images.
  • Video: Text, images, and animations, coupled with audio.
  • Interactives: Maps, images, and video.
  • Digital visual representations: Virtual worlds and 3D models.
  • Screencasts: Software demos, faculty video feedback, and more.
  • Audio: Enhanced podcasts or narrated lectures.

These two short lists, although not exhaustive, demonstrate the importance of visual elements to multimedia-based learning in online courses. There are many reasons why we might include any one of these multimedia learning experiences in an online course. For our purposes, we will explore the instructional value of visuals to online learning.

So, how do words and pictures work together to help shape learning? Given that the pairing of words and pictures is perhaps the most common learning object used in an online course, it is useful to understand this simple form of visual literacy for learning (Aisami, 2015).

Visual Engagement Of A Learning Object

In a recent study of how people acquire knowledge from an instructional web page, Ludvik Eger (2018) used eye-tracking technology to examine a simple learning object composed of a title (headline), a visual element (i.e., a diagram), and a box of written text. With no audio support for the learning object in this study, participants engaged with the content through sight alone. Results indicated that the majority of students started their learning process at the headline, or at the headline and visual element together. The box of information, in text form, was the third part of the learning object they engaged with.

Within this context, eye-movement analysis indicates a learning process that depends on a consistent visual flow. Purposely connecting the title, visual element, and information text of a learning object may best reinforce learning. By doing this, the course designer/instructor becomes a sort of cognitive guide, focusing (or failing to focus) learning via the meaning structure of the various learning object elements. In our case, we want to use visual elements to support performance and achievement of learning tasks.

Choosing Visual Elements

In order to explore the choice of visual elements in an online learning experience, it is helpful to understand how we process that experience from a cognitive science perspective.

Clark and Mayer (2016) describe how cognitive science suggests that knowledge construction is based upon three principles: dual channels, limited capacity, and active processing. Let’s briefly examine each.

Dual channels:

People have two channels of cognitive processing: one for visual/pictorial material and one for auditory/verbal material. See Figure 1 below.


Figure 1: Model of the Cognitive Theory of Multimedia Learning

Limited capacity:

Humans can only process a few pieces of information in each channel at the same time.

Active processing:

Learning occurs as people engage in cognitive processing during learning. This may include attending to relevant material, organizing that material into a coherent structure, and integrating that material with prior knowledge.

Due to the limits on any learner’s processing capability, it is paramount that we select visual images that help manage the learning process. Our goal is to limit excessive processing that clutters the learning experience, build visual support that represents the core learning material, and provide visual support that fosters deeper understanding of the learning at hand. What does this mean in practice?

Managing Processing Via Image Use

Making decisions about image selection and use is key to managing this learning process. Understanding the meaning of the images we select is also key, and is really a function of literacy in one’s field and visual literacy in general (Kennedy, 2013).

In practice we can use the following guidelines to make decisions about image use in multimedia-based online learning. 

  • Control Visual Elements – Too many images on a web page or slide may force extraneous cognitive processing that does not support the instructional objective. 
  • Select Visual Elements Carefully – Images difficult to discern are likely to negatively impact learning. Think about good visual quality, emotional and intellectual message of the image, information value, and readability.
  • Use Focused Visual Elements – Target selection of visual support to those images that represent the core learning material and/or provide access to deeper understanding of that core content.

Other Image Tips

Emotional Tone: Emotional design elements (e.g., visuals) can play important roles in motivating learners and in the achievement of learning outcomes (Mayer, 2014).

Interest: Decorative images may boost learner interest but do not contribute to higher performance in testing (Mayer, 2014). Use decorative images prudently so they do not contribute to extraneous processing (Pettersson & Avgerinou, 2016).

Challenge: Image selections that introduce a degree of confusion may challenge learners to dive more deeply into the core learning. This is a delicate decision, in that a challenge to sense-making may instead foster excessive processing.

Access: Images must be presented in a format that is viewable by users to be practical. This involves an understanding of the technical features of image formats, download capability, mobile use, and universal design techniques.

Final Thoughts

It is valuable to remember that visuals communicate nonverbally. They are most effective when carefully selected and paired with text or audio narration. Visuals appeal to the sense of sight, and they come in different classifications: pictures, symbols, signs, maps, graphs, diagrams, charts, models, and photographs. Knowing their form, meaning, and application is part of being a visually literate course developer or instructional designer.


References

Aisami, R. S. (2015). Learning Styles and Visual Literacy for Learning and Performance. Procedia – Social and Behavioral Sciences, 176, 538-545. doi:10.1016/j.sbspro.2015.01.508

Clark, R. C., & Mayer, R. E. (2016). E-learning and the science of instruction : Proven guidelines for consumers and designers of multimedia learning. Retrieved from http://ebookcentral.proquest.com

Eger, L. (2018). How people acquire knowledge from a web page: An eye tracking study. Knowledge Management & E-Learning: An International Journal 10(3), 350-366.

Kennedy, B. (2013, November 19). What is visual literacy?. [Video file]. Retrieved from https://www.youtube.com/watch?time_continue=1&v=O39niAzuapc

Mayer, R. E. (2009). Multimedia learning (2nd ed.). New York: Cambridge University Press.

Mayer, R. E. (2014). Incorporating motivation into multimedia learning. Learning and Instruction, 29, 171-173. doi:10.1016/j.learninstruc.2013.04.003

Pettersson, R., & Avgerinou, M. D. (2016). Information design with teaching and learning in mind. Journal of Visual Literacy, 35(4), 253–267. doi:10.1080/1051144X.2016.1278341


Credit: Embedded image by Kelly Sikkema on Unsplash.com

“Diversity is our world’s greatest asset, and inclusion is our biggest challenge. And the way that we are going to address that challenge is by extending our empathy.” -Jutta Treviranus, Founder of the Inclusive Design Research Centre, OCAD University


Sure, you’ve been teaching online courses for a few terms or years now, but have you ever been an online student? Many current faculty members earned their degrees in traditional face-to-face settings and have learned how to migrate their courses to the online environment by using research-based best practices and support from instructional designers and media experts. However, are there benefits to experiencing this fledgling educational modality from the perspective of the online student? I argue that faculty who challenge themselves to take an online course experience both personal and professional benefits and become more empathic, inclusive, creative, and reflective.

Benefits for Faculty Members

Challenge yourself to try out something completely different from your specialization or discipline: Are you a STEM professor who has a screenplay idea? Try a screenwriting course. Perhaps you have a trip to the French Riviera on your bucket list, or your college Spanish is rusty; try a foreign language course this summer. Are you a humanities professor who is curious about the composition of the soil in your garden? Find out about the dirt in your yard as a soil science student.

Here are some benefits to consider:

  • Taking an online course may give you ideas or inspiration for something that you want to try in your own course.
  • Continuing education may benefit brain health.
  • Stretching yourself may spur creativity and innovation.
  • You are modeling lifelong learning for your students and family.
  • Most importantly, it just might be fun!

Building Empathy

I’m consistently impressed with the care and concern OSU faculty have for their students, and taking an online course is one way to demonstrate that concern. By changing roles, such as by becoming an online student, faculty expand their perspectives, which results in the potential for even greater student support and understanding.

Yes, faculty members contend with heavy workloads and may feel that taking an online course on top of everything else would be overwhelming. However, your Ecampus students may also struggle with feeling maxed out.

Did you know that the average age of a student taking an Ecampus course is 31? This means it is likely that your online students are juggling full-time work as well as family obligations. Taking an online course helps faculty members build empathy for their students by giving them the opportunity to experience the excitement, anxiety, and pride of successfully completing an online course.

Furthermore, by increasing empathy, faculty members may become more inclusive and reflective practitioners. For example, as an online student, you know how it feels to be welcomed (or not) by your instructor, or to receive feedback within a few days as opposed to a few weeks. As an adult learner, you also may desire to share your prior experience or professional background with the instructor or students. Does your course give you the opportunity to introduce yourself to the instructor and other students, to describe your background and some strengths that you bring to the course community, or are you left feeling invisible in the course, with your expertise unacknowledged?

Tuition Reduction for OSU Employees

As OSU employees, faculty and staff are now eligible to take Ecampus courses at a reduced tuition rate under the university's staff fee privileges.

  • Summer courses begin on June 24th, and fall courses begin on September 25th.

Share Your Experience!

Have you been an online student as well as an online instructor? How did being an online student inform your teaching practices? Reply in the comments section below.


I pledge that I have acted honorably in completing this assessment.

There are two sides to the story of online assessment security. On one side, cheating does exist in online assessments. Examity’s president, Michael London, summarized five common ways students cheat on online exams:

  1. The old-school try of notes;
  2. The screenshot;
  3. The water break;
  4. The cover-up; and
  5. The big listen through devices such as a Bluetooth headset (London, 2017).

Newton (2015) even reported the disturbing fact that “cheating in online classes is now big business.” On the other side, academic dishonesty is a long-standing problem, both on college campuses and in online courses. The rate of students who admit to cheating at least once in their college careers has held steady at around 75 percent since the first major survey on cheating in higher education in 1963 (Lang, 2013). Around 2000, many faculty and students believed it was easier to cheat in online classes (Kennedy, 2000), and about a third of academic leaders perceived online outcomes to be inferior to those of traditional classes (Allen & Seaman, 2011). However, according to Watson and Sottile (2010) and other comparative studies, there is no conclusive evidence that online students are more likely to cheat than face-to-face students. “Online learning is, itself, not necessarily a contributing factor to an increase in academic misconduct” (Pilgrim & Scanlon, 2018).

Since there are so many ways for students to cheat in online assessments, how can we make online assessments more effective in evaluating students’ learning? Online proctoring is a solution that is easy for instructors but adds a cost burden for students. Common online proctoring service providers include ProctorU, Examity, Proctorio, and Honorlock, to name just a few (Bentley, 2017).

Fortunately, there are other ways to assess online learning without being overly concerned about academic dishonesty. Vicky Phillips (n.d.) suggested that authentic assessment makes it extremely difficult to fake or copy one’s homework. The University of Maryland University College has consciously moved away from proctored exams and uses scenario-based projects as assessments instead (Lieberman, 2018). James Lang (2013) suggested that smaller class sizes allow instructors more one-on-one interaction with students and therefore keep cheating to a minimum. Pilgrim and Scanlon (2018) suggest changing assessments to reduce the likelihood of cheating (such as having students demonstrate problem solving in person or via video, or using plagiarism detection software like Turnitin), promoting and establishing a culture of academic integrity (such as an honor code or integrity pledge), and supporting academic integrity through appropriate policies and processes. Konheim-Kalkstein (2006) reports that the use of a classroom honor code has been shown to reduce cheating. Konheim-Kalkstein, Stellmack, and Shilkey (2008) report that a classroom honor code improves rapport between faculty and students and increases feelings of trust and respect among students. Gurung, Wilhelm, and Filz (2012) suggest that an honor pledge should include formal language, state the specific consequences for cheating, and require a signature. For the honor pledge to be most effective, Shu, Mazar, Gino, Ariely, and Bazerman (2012) suggest placing it on the first page of an online assessment or assignment, before students begin their work.

Rochester Institute of Technology’s (2014) Teaching Elements: Assessing Online Students offers a variety of ways to assess students, including discussions, low-stakes quizzes, writing assignments (such as a muddiest-point paper), and individual activities (such as staged assignments that give students ongoing feedback).

In summary, there are plenty of ways to design effective formative and summative assessments online that encourage academic honesty, if instructors and course designers are willing to spend the time trying out strategies suggested in the literature.

References

Bentley, K. (2017, June 21). What to consider when selecting an online exam proctoring service. Inside Higher Ed. Retrieved February 22, 2019, from https://www.insidehighered.com/digital-learning/views/2017/06/21/selecting-online-exam-proctoring-service

Gurung, R. A. R., Wilhelm, T. M., & Filz, T. (2012). Optimizing honor codes for online exam administration. Ethics & Behavior, 22, 158–162.

Konheim-Kalkstein, Y. L. (2006). Use of a classroom honor code in higher education. Journal of Credibility Assessment and Witness Psychology, 7, 169–179.

Konheim-Kalkstein,Y. L., Stellmack, M. A., & Shilkey, M. L. (2008). Comparison of honor code and non-honor code classrooms at a non-honor code university. Journal of College & Character, 9, 1–13.

Lang, J. M. (2013, August 3). How college classes encourage cheating. Boston Globe. Retrieved February 21, 2019, from https://www.bostonglobe.com/ideas/2013/08/03/how-college-classes-encourage-cheating/3Q34x5ysYcplWNA3yO2eLK/story.html

Lieberman, M. (2018, October 10). Exam proctoring for online students hasn’t yet transformed. Inside Higher Ed. Retrieved February 22, 2019, from https://www.insidehighered.com/digital-learning/article/2018/10/10/online-students-experience-wide-range-proctoring-situations-tech

London, M. (2017, September 20). 5 ways to cheat on online exams. Inside Higher Ed. Retrieved February 21, 2019, from https://www.insidehighered.com/digital-learning/views/2017/09/20/creative-ways-students-try-cheat-online-exams

Newton, D. (2015). Cheating in online classes is now big business. The Atlantic. Retrieved February 21, 2019, from https://www.theatlantic.com/education/archive/2015/11/cheating-through-online-courses/413770/

Phillips, V. (n.d.). Big fat online education myths: Students cheat like weasels in online classes. GetEducated. Retrieved February 21, 2019, from https://www.geteducated.com/elearning-education-blog/big-fat-online-education-myths-students-cheat-like-weasels-in-online-classes/

Pilgrim, C., & Scanlon, C. (2018). Don’t assume online students are more likely to cheat. The evidence is murky. Retrieved February 21, 2019, from https://phys.org/news/2018-07-dont-assume-online-students-evidence.html

Rochester Institute of Technology. (2014). Teaching Elements: Assessing Online Students. Retrieved from https://www.rit.edu/academicaffairs/tls/sites/rit.edu.academicaffairs.tls/files/docs/TE_Online%20Assessmt.pdf on February 21, 2019.

Shu, L. L., Mazar, N., Gino, F., Ariely, D., & Bazerman, M. H. (2012). Signing at the beginning makes ethics salient and decreases dishonest self-reports in comparison to signing at the end. PNAS, 109, 15197–15200.

Watson, G., & Sottile, J. (2010). Cheating in the digital age: Do students cheat more in online courses? Online Journal of Distance Learning Administration, 13(1). Retrieved February 21, 2019, from https://www.westga.edu/~distance/ojdla/spring131/watson131.html

First, let’s start by considering the characteristics of effective feedback in general. What comes to mind?

[Image: sound waves]

Perhaps you hear in your head (in the authentically authoritative voice of a past professor) the words timely, frequent, regular, balanced, specific. Perhaps you recall the feedback sandwich: corrective feedback sandwiched between positive feedback. Perhaps you consider rubrics or ample formative feedback to be critical components of effective feedback. You wouldn’t be wrong.

As educators, we understand the main characteristics of effective feedback. But despite this, students are often disappointed by the feedback they receive, and faculty find the feedback process time-consuming, often wondering if the commitment is worth it. As an instructional designer, I hear from faculty who struggle to get students to pay attention to feedback and make appropriate changes based on it. I hear from faculty who struggle to find the time to provide quality feedback, especially in large classes. The struggle is real. I know this because I hear about it all the time.

I’m glad I hear about these concerns. I always want faculty to share their thoughts about what’s working and what’s not working in their classes. About a year or two ago, I also started hearing rave reviews from faculty who decided to try audio feedback in their online courses. They loved it and reported that their students loved it. Naturally, I wanted to know if these reports were outliers or if there’s evidence supporting audio feedback as an effective pedagogical practice.

I started by looking for research on how audio feedback influences student performance, but what I found was research on how students and faculty perceive and experience audio feedback.

What I learned was that, overall, students tend to prefer audio feedback. Faculty perceptions, however, are mixed, especially in terms of the potential for audio feedback to save them time.

While the research was limited and the studies often had contradictory results, there was one consistent takeaway from multiple studies: audio feedback supports social presence, student-faculty connections, and engagement.

While research supports the value of social presence online, audio feedback is not always considered for this purpose. Yet, audio feedback is an excellent opportunity to focus on teaching presence by connecting one-to-one with students.

If you haven’t tried audio feedback in your classes, and you want to, here are some tips to get you started:

  1. Use the Canvas audio tool in Speedgrader. See the “add media comment” section of the Canvas guide to leaving feedback comments. Since this tool is integrated with Canvas, you won’t have to worry about upload and download times for you or your students.
  2. Start slow. You don’t have to jump into the deep end and provide audio comments on all of your students’ assignments. Choose one or two to get started.
  3. Ask your students what they think. Any time you try something new, it’s a good idea to hear from your students. Creating a short survey in your course to solicit student feedback is an excellent way to get informal feedback.
  4. Be flexible. If you have a student with a hearing impairment or another barrier that makes audio feedback a less than optimal option for them, be prepared to provide them with written feedback or another alternative.

Are you ready to try something new? Have you tried using audio feedback in your course? Tell us how it went!

References:

Image by mtmmonline on Pixabay.

Note: This post was based on a presentation given at the STAR Symposium in February 2019. For more information and a full list of references, see the presentation slide deck.

 

“As a stranger give it welcome.” – Shakespeare

Students need tactics for when they encounter strange people or strange ideas (Wilson, 2018). First-time online students are a perfect example of individuals encountering something new, strange, and often uncomfortable. Welcoming that strange experience should include a little information gathering: look for the positives and negatives in the situation to help decide how you view it and, most of all, keep an open mind.

To help students who are considering their first online course, Marie Fetzner (2013) asked unsuccessful online students: “What advice would you give to students who are considering registering for an online course?”

Their top 13 responses:

  1. Stay up with the course activities—don’t get behind
  2. Use good time management skills
  3. Use good organizational skills
  4. Set aside specific times during each week for your online class
  5. Know how to get technical help
  6. A lot of online writing is required
  7. There is a lot of reading in the textbook and in online discussions—be prepared
  8. Regular online communications are needed
  9. Ask the professor if you have questions
  10. Carefully read the course syllabus
  11. Be sure you understand the requirements of the online course discussions
  12. Understand how much each online activity is worth toward your grade
  13. Go to the online student orientation, if possible

 

These responses raise the question: how can we better help our students? From the advice above, we know students struggle with time management, expectations, communication, and more. So, what can we do to foster their success?

  1. Reach out to students who seem to be lagging behind. A quick email is sometimes all it takes to open up that line of communication between you and the student.
  2. Provide approximate times for course materials and activities. Students can use this to better plan for the requirements that week.
  3. Keep your course organized so students can spend more time engaging with the content instead of searching for it.
  4. Remind students about where to access help and support services.
  5. Develop a Q&A discussion board for student questions about the course. Often, more than one student has the same question and often other students might already know the answer. Have this be something you check daily to answer questions quickly so students can continue with their learning.
  6. Use rubrics for grading. Rubrics show students what is expected, yield responses closer to your expectations, and make grading easier!

 

Welcome these ideas as you would a new experience. Give them a little try, jump right in, confer with colleagues, or choose your own path. Know that as an instructor or developer of an online course, you have the ability to help your students be successful!

References

Fetzner, M. (2013). What do unsuccessful online students want us to know? Journal of Asynchronous Learning Networks, 17(1), 13–27.

Wilson, J. (2018). “As a stranger give it welcome”: Shakespeare’s Advice for First-Year College Students. Change, 50(5), 60.

 


In my last post, I described how Ecampus courses use synchronous study sessions to provide listening and speaking practice to students of world languages. Much of the Ecampus language learning experience is entirely asynchronous, however, to provide flexibility for our students. So how exactly do students converse asynchronously? This post will describe the design of asynchronous listening and speaking exercises in 300-level French conversation courses, executed by Ana-Maria M’Enesti, PhD, and facilitated via VoiceThread, a slide show within the LMS that displays course content about which participants comment via text, audio, or video.

[Image: Title slide and intro slide]
In these two slides, Ana-Maria introduces the topic via video comment, contextualizes the resource via audio, and links out to the resource. The “i” icon indicates an “Instructions” comment, and the numbered icons indicate links 1 and 2.

VoiceThread was an appealing platform because of the ease with which students can add audio or video comments, more streamlined than the protocol for uploading video to a discussion board, and because of its display of content in sequential slides. When Ana-Maria and I began exploring how to present her asynchronous conversational lessons within VoiceThread, we realized that we could chunk each stage of the activity into these individual slides. This made the cognitive load at each stage manageable, yet provided continuity across the activity, because the slides are contained in a single assignment; students navigate by advancing horizontally from slide to slide. VoiceThread allows each slide to link to external content, so students can maintain their place in the sequence of the assignment while engaging with linked resources in another window. Most importantly, since students encounter all the related learning activities from within a single context, it is clear to them why they are investing time in reading or watching a resource – they anticipate that, at the end of the assignment, they will complete a culminating speaking activity.

For the culminating speaking activity, we used VoiceThread to provide each student with a place to upload his or her initial post as a new, individual slide that occupies the entire horizontal pane. Replies from peers are then appended to each student’s initial slide post. Visually, this is easier to follow than a text-based discussion, with its long, vertical display of posts that uses nesting to establish the hierarchy of threaded replies. Within VoiceThread, as students advance through the slides, they are able to focus their attention on each student’s initial post and the associated peer replies, one at a time.

[Image: Student response slide]
A student’s initial slide post displays her individual environmental footprint, calculated with the resource linked earlier. On the left are an audio explanation and comments between the student, “AC,” the instructor, and peers, labeled by their initials or profile pictures.

Now that I’ve discussed how we exploited the mechanics of VoiceThread, I’ll review the learning design. To progressively scaffold students’ conversational skills, Ana-Maria builds each assignment as a series of activities of increasing difficulty. On the first slide, students might be prompted to share opinions or personal experiences of a topic in order to activate prior knowledge of thematic vocabulary and associated grammatical structures. Then, on subsequent slides, students are challenged to read or watch related content that is comprehensible, but a bit beyond their current language competence, the “i+1” level, as Krashen coined it. Afterwards, to ensure they’ve grasped the resource, Ana-Maria typically poses factual comprehension questions and then asks students to re-read or re-watch so that they can grasp any meanings they may have missed on the initial encounter.

Finally, students are asked to speak critically on what they read or watched, express a solution to a problem, or place the topic within their own cultural context, using topic-specific vocabulary and associated grammatical structures that they’ve heard or read from the included resources. The instructor is present throughout, mediating the interaction between student and content, since Ana-Maria narrates each slide, reading the instructions aloud and adding additional context. There is also support for listening comprehension, as the most critical instructions are written on each slide.

For the feedback stage of the assignment, students learn from each other’s responses, listening and providing replies to at least two peers on two different days of the week. This requirement allows conversations to develop between students and provides the third type of interaction, learner-to-learner, so that the activity sequence facilitates all three of the interactions described by Moore (1989): learner to content, learner to instructor, and learner to learner.

As expressed by one of our own students, “I was uncertain how a conversation course online would really work,” but “VoiceThread proved to be a helpful tool.” It allowed us to solve the puzzle of providing asynchronous conversational activities for students, who reported in surveys that it helped:

  • to “humanize” them to each other, like being “in an actual classroom”
  • to connect them with their instructor
  • to provide “access to multiple tasks within one [assignment]”
  • to improve listening and speaking skills
  • to make “group projects flow better”

VoiceThread is quite a versatile tool and is being piloted for use with many other disciplines at Ecampus. I’m sure you can imagine other ways to adapt it to your own context and content!