Learning from teaching

Clara Bird, PhD Student, OSU Department of Fisheries and Wildlife, Geospatial Ecology of Marine Megafauna Lab

Based on my undergrad experience, I assumed that most teaching in grad school would be done as a teaching assistant, and that it would consist of teaching labs, grading, leading office hours, etc. However, now that I'm in graduate school, I realize that there are many different forms of teaching as a graduate student. This summer I worked as an instructor for an e-campus course, which mainly involved grading and mentoring students as they developed their own projects. This past week, though, I was a guest teacher for Physiology and Behavior of Marine Megafauna, which was a bit more involved.

I taught a whale photogrammetry lab that I originally developed as a workshop with a friend and former lab mate, KC Bierlich, at the Duke University Marine Robotics and Remote Sensing (MaRRS) lab when I worked there. Similar to Leila's work, we were using photogrammetry to measure whales and assess their body condition. Measuring a whale is a deceptively simple task that gets complicated once you account for all the sources of error that might affect measurement accuracy. It is important to understand the different sources of error so that we can be sure our results reflect actual differences between whales rather than differences in error.
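To give a sense of the underlying arithmetic: for a drone image taken straight down, a length drawn in pixels is scaled to meters using the altitude, the camera's focal length, and the sensor width. The snippet below is only a generic sketch of that conversion with made-up example numbers; it is not code from MorphoMetriX itself.

```python
def pixels_to_meters(length_px, altitude_m, focal_length_mm, sensor_width_mm, image_width_px):
    """Convert a length measured in pixels to meters for a straight-down (nadir) image.

    The ground sampling distance (GSD) is how much ground one pixel covers.
    """
    gsd_m_per_px = (altitude_m * sensor_width_mm) / (focal_length_mm * image_width_px)
    return length_px * gsd_m_per_px

# Hypothetical example: a whale spanning 2,150 pixels, photographed from 40 m
# with a 35 mm lens on a 13.2 mm-wide sensor producing 4,000-pixel-wide images.
print(f"{pixels_to_meters(2150, 40.0, 35.0, 13.2, 4000):.2f} m")  # about 8.1 m
```

Because altitude enters that conversion linearly, an altimeter that reads 5% high inflates every length by 5%, which is one reason altimeter accuracy matters so much.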

Error can come from distortion due to the camera lens, inaccurate altitude measurements from the altimeter, the whale being arched, or from the measurement process itself. When we draw a line on the image to make a measurement (Image 1), measurement process errors come from the line being drawn incorrectly. This potential human error can affect results, especially if the measurer is inexperienced or rushing. The quality of the image also plays a role: if glare, wake, blow, or refraction covers or distorts the measurer's view of the full body of the whale, then the measurer has to estimate where to begin and end the line. This estimation is subjective and, therefore, a source of error. We used the workshop as an opportunity to study these measurement process errors because we could provide a dataset including images of varying quality and collect data from different measurers.
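A simple way to put numbers on that measurement process error is to have several people measure the same images and compare how much their answers spread, for example with a coefficient of variation per image, grouped by image quality. The sketch below uses made-up data and hypothetical column names; it is not the actual analysis we ran for the workshop.

```python
import pandas as pd

# Hypothetical collated data: one row per measurer per image.
# Column names are assumptions for illustration only.
df = pd.DataFrame({
    "image_id": ["w01", "w01", "w01", "w02", "w02", "w02"],
    "quality":  ["good", "good", "good", "poor", "poor", "poor"],
    "measurer": ["A", "B", "C", "A", "B", "C"],
    "length_m": [8.11, 8.05, 8.14, 7.60, 8.40, 7.95],
})

# Coefficient of variation (spread among measurers relative to the mean)
# as a simple index of measurement process error for each image.
cv = (
    df.groupby(["image_id", "quality"])["length_m"]
      .agg(lambda x: 100 * x.std(ddof=1) / x.mean())
      .rename("cv_percent")
      .reset_index()
)
print(cv)  # expect a larger spread for the poor-quality image
```

Comparing that spread between good- and poor-quality images is one way to see how image quality feeds into measurement error.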

Image 1. Screenshot of measuring the widths along a minke whale in MorphoMetriX. Source: https://github.com/wingtorres/morphometrix/blob/master/images/Picture4.png

This workshop started as a one-day lecture and lab that we designed for the summer drone course at the Duke Marine Lab. The idea was to simultaneously teach the students about photogrammetry and the methods we use, while also using all the students' measurements to study the effect of human error and image quality on measurement accuracy. Given this one-day format, we ambitiously decided to teach and measure in the morning, compile and analyze the students' measurements over lunch, and then present the results of our error analysis in the afternoon. To accomplish this, we prepared as much as we could and set up all the code for the analysis ahead of time. This preparation meant several days of non-stop working, discussing, and testing, all to anticipate any issues that might come up on the day of the class. We used the measuring software MorphoMetriX (Torres & Bierlich, 2020), which was developed by KC and a fellow Duke Marine Lab grad student, Walter Torres. MorphoMetriX was brand new at the time, which meant we didn't yet know all the issues that might come up, and we had little time to troubleshoot. We knew this meant that helping the students install the software might be a bit tricky, and sure enough, all I remember from the beginning of that first lab is running around the room helping multiple people troubleshoot at the same time, using all the programming knowledge I had to discover new solutions on the fly.

While troubleshooting on the fly can be stressful and overwhelming, I've come to appreciate it as good practice. Not only did we learn how to develop and teach a workshop, we also used what we had learned from all the troubleshooting to improve the software. I also used the code we developed for the analysis as the starting point for a software package I later wrote, CollatriX (Bird & Bierlich, 2020), a follow-up to MorphoMetriX. Aside from the initial troubleshooting stress, the workshop was a success, and we were excited to have a dataset to study measurement process errors. Given that we already had all the materials for the workshop prepared, we decided to run a few more workshops to collect more data.
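For anyone curious what that collation step roughly involves, here is a minimal sketch of pulling every measurer's exported CSV into one table so it can be analyzed together. The folder layout and column handling are assumptions for illustration; this is not CollatriX's actual interface.

```python
from pathlib import Path
import pandas as pd

# Hypothetical folder of per-student CSV exports; the layout is assumed for illustration.
measurement_dir = Path("student_measurements")

frames = []
for csv_path in sorted(measurement_dir.glob("*.csv")):
    frame = pd.read_csv(csv_path)
    frame["measurer"] = csv_path.stem  # tag each row with the file it came from
    frames.append(frame)

# One combined table, ready for the error analysis.
all_measurements = pd.concat(frames, ignore_index=True)
all_measurements.to_csv("collated_measurements.csv", index=False)
```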

That brings me to my time here at OSU. I left the Duke MaRRS lab to start graduate school shortly after we taught the workshop. Interested in running the workshop here, I reached out to a few different people, and I first ran it as an event organized by the undergraduate club Ocean11 (Image 2). It was fun running the workshop a second time, as I used what I learned from the first round; I felt more confident, and I knew what the common issues would likely be and how to solve them. Sure enough, while there were still some troubleshooting issues, the process was smoother, and I enjoyed teaching, getting to know OSU undergraduate students, and collecting more data for the project.

Image 2. Ocean11 students measuring during the workshop (Feb 7, 2020).
Image credit: Clara Bird

The next opportunity to run the lab came through Renee Albertson's Physiology and Behavior of Marine Megafauna class, which had its own challenges in the COVID era. While it's easier to teach in person, this workshop was well suited to being converted to a remote activity because it only requires a computer, the data can be easily sent to the students, and screen sharing is an effective way to demonstrate how to measure. So, the photogrammetry module was a good fit for the marine megafauna class this term, which has been fully remote due to COVID-19. My first challenge was converting the workshop into a lab assignment with learning outcomes and analysis questions. The process also involved writing R code for the students to use and writing step-by-step instructions in a way that was clear and easy to understand. While stressful, I appreciated the process of developing the lab and these accompanying materials because, as you've probably heard from a teacher, a good test of your understanding of a concept is being able to teach it. I was also challenged to think of the best way to communicate and explain these concepts. I tried to think of a few different explanations, so that if a student did not understand it one way, I could offer an alternative that might work better. Similar to the preparation for the first workshop, I also prepared to troubleshoot the students' issues with the software. However, unlike my previous experiences, this time I had to troubleshoot remotely.

After teaching this photogrammetry lab last week, my respect for teachers who are teaching remotely has only increased. Helping students without being able to sit next to them and walk them through things on their computer is not easy. Beyond the few virtual office hours I hosted, I was primarily troubleshooting over email, using screenshots from the students to try to figure out what was going on. It felt like the ultimate test of my programming knowledge and experience, having to draw from memories of past errors and solutions, and thinking of alternative solutions if the first one didn't work. It was also an exercise in communication, because programming can be daunting to many students, so I worked to be encouraging and to communicate the instructions clearly. All in all, I ended this week feeling exhausted but accomplished, proud of the students, and grateful for the reminder of how much you learn when you teach.

References

Bird, C. N., & Bierlich, K. (2020). CollatriX: A GUI to collate MorphoMetriX outputs. Journal of Open Source Software, 5(51), 2328. https://doi.org/10.21105/joss.02328

Torres, W., & Bierlich, K. (2020). MorphoMetriX: a photogrammetric measurement GUI for morphometric analysis of megafauna. Journal of Open Source Software, 5(45), 1825. https://doi.org/10.21105/joss.01825

