Learning from teaching

Clara Bird, PhD Student, OSU Department of Fisheries and Wildlife, Geospatial Ecology of Marine Megafauna Lab

Based on my undergrad experience, I assumed that most teaching in grad school would be done as a teaching assistant and would mostly consist of teaching labs, grading, and leading office hours. Now that I'm in graduate school, however, I realize that teaching as a graduate student takes many different forms. This summer I worked as an instructor for an e-campus course, which mainly involved grading and mentoring students as they developed their own projects. This past week, though, I was a guest teacher for Physiology and Behavior of Marine Megafauna, which was a bit more involved.

I taught a whale photogrammetry lab that I originally developed as a workshop with a friend and former lab mate, KC Bierlich, at the Duke University Marine Robotics and Remote Sensing (MaRRS) lab when I worked there. Similar to Leila's work, we used photogrammetry to measure whales and assess their body condition. Measuring a whale is a deceptively simple task that gets complicated once you account for all the sources of error that might affect measurement accuracy. It is important to understand these different sources of error so that we can be sure our results reflect actual differences between whales rather than differences in error.

Error can come from distortion due to the camera lens, inaccurate altitude measurements from the altimeter, the whale being arched, or from the measurement process itself. When we draw a line on the image to make a measurement (Image 1), measurement process errors arise from the line being drawn incorrectly. This potential human error can affect results, especially if the measurer is inexperienced or rushing. The quality of the image also matters: if glare, wake, blow, or refraction covers or distorts the measurer's view of the whale's full body, the measurer has to estimate where to begin and end the line. This estimation is subjective and, therefore, a source of error. We used the workshop as an opportunity to study these measurement process errors because we could provide a dataset of images of varying quality and collect data from different measurers.

Image 1. Screenshot of measuring the widths along a minke whale in MorphoMetriX. Source: https://github.com/wingtorres/morphometrix/blob/master/images/Picture4.png
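To make the altitude error concrete: the ground distance covered by each pixel scales linearly with altitude, so any altimeter bias scales the whole measurement by the same proportion. Below is a minimal sketch in R (not code from MorphoMetriX or our analysis; the camera parameters and values are hypothetical) of how a line drawn in pixels is converted to meters and how an altitude error carries through.

```r
# Minimal sketch, not MorphoMetriX code; all parameter values are hypothetical.
# Meters per pixel (ground sample distance) from altitude and camera geometry.
ground_sample_distance <- function(altitude_m, focal_mm, sensor_w_mm, image_w_px) {
  (altitude_m * sensor_w_mm) / (focal_mm * image_w_px)
}

pixel_length <- 3500   # length of the line drawn along the whale, in pixels

# A +2 m altimeter error scales the whole measurement by the same proportion.
len_true   <- pixel_length * ground_sample_distance(40, 35, 13.2, 5472)
len_biased <- pixel_length * ground_sample_distance(42, 35, 13.2, 5472)

round(c(true_m = len_true, biased_m = len_biased,
        pct_error = 100 * (len_biased - len_true) / len_true), 2)
```

In this hypothetical example, a 2 m altitude error at 40 m altitude produces a 5% length error, which is one reason these other error sources have to be separated from the measurement process errors we set out to study.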

This workshop started as a one-day lecture and lab that we designed for the summer drone course at the Duke Marine Lab. The idea was to teach the students about photogrammetry and the methods we use, while also using all the students' measurements to study the effect of human error and image quality on measurement accuracy. Given this one-day format, we ambitiously decided to teach and measure in the morning, compile and analyze the students' measurements over lunch, and then present the results of our error analysis in the afternoon. To accomplish this, we prepared as much as we could and set up all the code for the analysis ahead of time. That preparation meant several days of non-stop working, discussing, and testing, all to anticipate any issues that might come up on the day of the class. We used the measuring software MorphoMetriX (Torres & Bierlich, 2020), developed by KC and a fellow Duke Marine Lab grad student, Walter Torres. MorphoMetriX was brand new at the time, which meant we didn't yet know all the issues that might come up and had not had time to troubleshoot them. We knew that helping the students install the software might be a bit tricky, and sure enough, all I remember from the beginning of that first lab is running around the room helping multiple people troubleshoot at the same time, using all the programming knowledge I had to come up with solutions on the fly.

While troubleshooting on the fly can be stressful and overwhelming, I've come to appreciate it as good practice. Not only did we learn how to develop and teach a workshop, we also used what we had learned from all the troubleshooting to improve the software. The code we developed for the analysis also became the starting blocks for CollatriX (Bird & Bierlich, 2020), a follow-up software package to MorphoMetriX that I later wrote. Aside from the initial troubleshooting stress, the workshop was a success, and we were excited to have a dataset to study measurement process errors. Since we already had all the workshop materials prepared, we decided to run a few more workshops to collect more data.
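For a sense of what that analysis code roughly does, here is a hedged sketch in R of collating many measurers' output files and summarizing how much their length measurements vary. It is not CollatriX itself, and the folder, file layout, and column names (whale_id, image_quality, total_length_m) are hypothetical stand-ins for the kind of output each measurer produces.

```r
# Rough sketch of collating measurers' output files and summarizing variability;
# not CollatriX code, and the file layout and column names are hypothetical.
library(dplyr)

files <- list.files("measurements", pattern = "\\.csv$", full.names = TRUE)
measurements <- bind_rows(lapply(files, read.csv))  # one row per measurer x whale

# How much do measurers disagree on total length, and does image quality matter?
measurements %>%
  group_by(whale_id, image_quality) %>%
  summarise(mean_length_m = mean(total_length_m),
            cv_percent    = 100 * sd(total_length_m) / mean(total_length_m),
            .groups = "drop")
```

Under this kind of setup, a larger coefficient of variation for low-quality images than for high-quality ones would be the signature of the subjective line-placement error described above.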

That brings me to my time here at OSU. I left the Duke MaRRS lab to start graduate school shortly after we taught the workshop. Interested in running the workshop here, I reached out to a few different people and first ran it as an event organized by the undergraduate club Ocean11 (Image 2). It was fun running the workshop a second time because I could use what I learned from the first round; I felt more confident, and I knew what the common issues would likely be and how to solve them. Sure enough, while there was still some troubleshooting, the process was smoother, and I enjoyed teaching, getting to know OSU undergraduate students, and collecting more data for the project.

Image 2. Ocean11 students measuring during the workshop (Feb 7, 2020).
Image credit: Clara Bird

The next opportunity to run the lab came through Renee Albertson's Physiology and Behavior of Marine Megafauna class, which, in the COVID era, brought new challenges. While it's easier to teach in person, this workshop was well suited to become a remote activity: it only requires a computer, the data can easily be sent to the students, and screen sharing is an effective way to demonstrate how to measure. So the photogrammetry module was a good fit for the marine megafauna class, which has been fully remote this term due to COVID-19. My first challenge was converting the workshop into a lab assignment with learning outcomes and analysis questions. The process also involved writing R code for the students to use and writing step-by-step instructions that were clear and easy to follow. While stressful, I appreciated developing the lab and these accompanying materials because, as you've probably heard from a teacher, a good test of your understanding of a concept is being able to teach it. I was also challenged to think of the best way to communicate and explain these concepts, so I prepared a few different explanations; if a student did not understand one, I could offer an alternative that might work better. As with the first workshop, I also prepared to troubleshoot the students' issues with the software. Unlike my previous experiences, however, this time I had to troubleshoot remotely.

After teaching this photogrammetry lab last week, my respect for teachers who are teaching remotely has only increased. Helping students without being able to sit next to them and walk them through things on their computer is not easy. Beyond the few virtual office hours I hosted, I was primarily troubleshooting over email, using screenshots from the students to figure out what was going on. It felt like the ultimate test of my programming knowledge and experience: drawing from memories of past errors and solutions, and thinking of alternatives when the first fix didn't work. It was also an exercise in communication, because programming can be daunting to many students, so I worked to be encouraging and to communicate the instructions clearly. All in all, I ended the week feeling exhausted but accomplished, proud of the students, and grateful for the reminder of how much you learn when you teach.

References

Bird, C. N., & Bierlich, K. (2020). CollatriX: A GUI to collate MorphoMetriX outputs. Journal of Open Source Software, 5(51), 2328. https://doi.org/10.21105/joss.02328

Torres, W., & Bierlich, K. (2020). MorphoMetriX: a photogrammetric measurement GUI for morphometric analysis of megafauna. Journal of Open Source Software, 5(45), 1825. https://doi.org/10.21105/joss.01825

Why Feeling Stupid is Great: How stupidity fuels scientific progress and discovery

By Alexa Kownacki, Ph.D. Student, OSU Department of Fisheries and Wildlife, Geospatial Ecology of Marine Megafauna Lab

It all started with a paper. On Halloween, I sat at my desk searching for papers that could answer my questions about bottlenose dolphin metabolism, and realized I had forgotten to check my email earlier. In my inbox was a new message from Dr. Leigh Torres to the GEMM Lab members, with an attachment she described as a "must-read" article. The suggested paper, Martin A. Schwartz's 2008 essay "The importance of stupidity in scientific research", published in the Journal of Cell Science, highlighted universal themes across science. In a single, powerful page, Schwartz captured my feelings, and those of many scientists: the feeling of being stupid.

For the next few minutes, I stood at the printer and absorbed the article, commenting out loud, "YES!", "So true!", and "This person can see into my soul". Meanwhile, colleagues entered my office to see me, dressed in my Halloween costume as "Amazon's Alexa", talking aloud to myself. Coincidentally, I was feeling pretty stupid at that moment, having just returned from a weekly meeting where everyone asked me questions I clearly did not have the answers to (all because of my costume). This paper seemed too relevant; the timing was uncanny. For the past few weeks, I have been writing my PhD research proposal (a requirement for our department), and my goodness, have I felt stupid. The proposal outlines my dissertation objectives, puts my work into context, and provides background research on common bottlenose dolphin health. There is so much to know that I don't know!

Alexa dressed as “Amazon Alexa” on Halloween at her office in San Diego, CA.

When I read Schwartz’s 2008 paper, there were a few takeaway messages that stood out:

  1. People take different paths. One path is not necessarily right or wrong; it is simply different. I compared this to how I split my time between OSU and San Diego, CA. Spending half of the year away from my lab and my department is incredibly challenging; I constantly feel behind, and I miss the support that physically being with other students provides. However, I recognize the opportunities I have in San Diego, where I work directly with collaborators who teach and challenge me in new ways, bringing new skills and perspective.

    (Image source: St. Albert's Place)
  2. Feeling stupid is not bad. It can be a good feeling, or at least one we should treat as positive. It shows we have more to learn. It means that we have not reached our maximum potential for learning (who ever does?). While writing my proposal, I realized just how little I know about ecotoxicology, chemistry, and statistics. I re-read papers that are critical to understanding my own research, like "Nontargeted biomonitoring of halogenated organic compounds in two ecotypes of bottlenose dolphins (Tursiops truncatus) from the Southern California Bight" (2014) by Shaul et al. and "Bottlenose dolphins as indicators of persistent organic pollutants in the western North Atlantic Ocean and northern Gulf of Mexico" (2011) by Kucklick et al. These articles took me down what I thought were wormholes that turned out to be important rivers of information. Because I recognized my knowledge gap, I can now articulate the purpose of, and methods for, the analyses of specific compounds that I will conduct using blubber samples of common bottlenose dolphins.

    (Image source: memegenerator.net)
  3. Drawing upon experts, albeit intimidating, is beneficial for scientific consulting as well as for our mental health; no one person knows everything. That realization can bring us together, because when people work together, everyone benefits. I am also reminded that we are our own harshest critics; sometimes our colleagues are the best champions of our own successes. It is also why historical articles are foundational. In the hunt for the newest technology and the latest and greatest in research, it is important to acknowledge the basis for discoveries. My data begin in 1981, when the first of many researchers began surveying the California coastline for common bottlenose dolphins. Geographic information systems (GIS) were different back then, so the data require conversions and investigative work, and I had to learn how the data were collected and how to interpret that information. Therefore, it should be no surprise that I cite literature from the 1970s, such as "Results of attempts to tag Atlantic bottlenose dolphins (Tursiops truncatus)" by Irvine and Wells. Although that paper was published in 1972, the questions the authors tried to answer are very similar to what I am looking at now: how site fidelity and home ranges are impacted by natural and anthropogenic processes. While Irvine and Wells used large bolt tags to identify individuals, my project uses much less invasive techniques (photo-identification and blubber biopsies) to track animals, their health, and their exposure to contaminants.

    (Image source: imgflip.com)
  4. Struggling is part of the solution. Science is about discovery, and without the feeling of stupidity, discovery would not be possible. Feeling stupid is the first step in the discovery process: the spark that fuels wanting to explore the unknown. It can also lead to a feeling of accomplishment when we find answers to the very questions that made us feel stupid. Part of being a student and a scientist is identifying those weaknesses and not letting them stop me. Pausing, reflecting, course-correcting, and researching are all productive in the end, but stopping is not. Coursework is the easy part of a PhD; the hard part is constantly diving deeper into the great unknown that is research. The great unknown is simultaneously alluring and frightening, but it must be faced head on. Schwartz describes "productive stupidity [as] being ignorant by choice." I picture this as essentially walking blindly into the future with confidence. Although a bit of an oxymoron, it underscores the importance of perseverance and conviction in the midst of uncertainty.

    (Image source: Redbubble)

Now I think back to my childhood, when "stupid" was one of the forbidden s-words, and I question whether society had it all wrong. Maybe we should teach children to acknowledge ignorance and pursue the unknown. Stupid is a feeling, not a character flaw. Stupidity is important in science and in life. Fascination and the emotional desire to discover new things are healthy. Next time you feel stupid, try running with it, because more often than not, you will learn something.

Alexa teaching about marine mammals to students ages 2-6 and learning from educators about new ways to engage young students. San Diego, CA in 2016. (Photo source: Lori Lowder)