Measure faster! New tools for automatically obtaining body length and body condition of whales from drone videos

Dr. KC Bierlich, Postdoctoral Scholar, OSU Department of Fisheries, Wildlife, & Conservation Sciences, Geospatial Ecology of Marine Megafauna Lab

Monitoring the body length and body condition of animals can provide important information on the health of individuals and their populations, and can even serve as an early warning sign that a population is adapting to habitat changes or is at risk of collapse (Cerini et al., 2023). As discussed in previous blogs, drone-based photogrammetry provides a non-invasive method for collecting important size measurements of whales, such as detecting differences in body condition and length between populations, and even diagnosing pregnancy. Thus, using drones to collect measurement data on the growth, body condition, and pregnancy rates of whales can help expedite population health assessments and inform conservation and management actions.

However, it takes a long time to manually measure whales filmed in drone imagery. For every video collected, an analyst must carefully watch the footage and manually select frames with whales in good positions for measuring (flat and straight at the surface). Once frames are selected, each image must then be ranked and filtered for quality before finally being measured using photogrammetry software, such as MorphoMetriX. This entire manual processing pipeline ultimately delays results, which hinders the ability to rapidly assess population health. If only there were a way to automate this process of obtaining measurements…

Well now there is! Recently, a collaboration between researchers from the GEMM Lab, CODEX, and OSU’s Department of Engineering and Computer Science published a manuscript introducing automated methods for obtaining body length and body condition measurements (Bierlich et al., 2024). The manuscript describes two user-friendly models: 1) “DeteX”, which automatically detects whales in drone videos to output frames for measuring, and 2) “XtraX”, which automatically extracts body length and body condition measurements from input frames (Figure 1). We found that DeteX and XtraX produce measurements just as good as manual measurements (coefficient of variation < 5%) while reducing processing time by almost 90%. This increased efficiency not only saves hours (weeks!) of manual processing time, but also enables more rapid assessments of population health.

Future steps for DeteX and XtraX are to adapt the models so that measurements can be extracted from multiple whales in a single frame, which could be particularly useful for analyzing images containing mothers with their calves. We also look forward to adapting DeteX and XtraX to accommodate more species. While DeteX and XtraX were trained using only gray whale imagery, we were pleased to see that these models performed well when trialed on imagery of a blue whale (Figure 2). These results are encouraging because they show that the models can be adapted to accommodate other species with different body shapes, such as belugas or beaked whales, with the inclusion of more training data.

We are excited to share these methods with the drone community and the rest of this blog walks through the features and steps for running DeteX and XtraX to make them even easier to use.

Figure 1. Overview of DeteX and XtraX for automatically obtaining body length and body condition measurements from drone-based videos.

Figure 2. Example comparing manual (MorphoMetriX) vs. automated (XtraX) measurements of a blue whale.

DeteX and XtraX walkthrough

Both DeteX and XtraX are web-based applications designed to be intuitive and user-friendly. Instructions to install and run DeteX and XtraX are available on the CODEX website. Once DeteX is launched, the default web browser automatically opens the application, where the user is asked to select 1) the folder containing the drone-based videos to analyze and 2) the folder to save output frames (Figure 3). Then, the user can select ‘start’ to begin. By default, DeteX analyzes the entire video from start to finish at one frame per second; if a video was recorded at 30 frames per second, the last (or 30th) frame is processed for each second of the video. There is also a “finetune” version of DeteX that offers users much more control by letting them change these default settings (Figure 4). For example, users can increase the number of frames processed per second (e.g., 10 instead of 1), target a specific region in the video rather than the entire video, and adjust the “detection model threshold” to change how confident the model must be before it flags a whale. These features for enhanced control may be particularly helpful when a user wants more flexibility in selecting specific frames from a particular surfacing sequence for measuring.
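DeteX handles whale detection with a trained model, but the default frame-sampling convention described above is straightforward to illustrate. Below is a minimal sketch (not DeteX code; the function, paths, and file names are hypothetical) of how one might save the last frame of each second from a 30 fps video using OpenCV.

```python
# Minimal sketch (not DeteX itself): save the last frame of each sampling window,
# mirroring the default convention of one frame per second described above.
# Requires opencv-python (cv2); paths and file names are hypothetical.
import cv2
from pathlib import Path

def sample_frames(video_path, out_dir, frames_per_second=1):
    cap = cv2.VideoCapture(str(video_path))
    native_fps = cap.get(cv2.CAP_PROP_FPS) or 30      # fall back to 30 fps if metadata is missing
    step = max(int(round(native_fps / frames_per_second)), 1)
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)

    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        idx += 1
        # keep the last frame of each window (e.g., the 30th frame of each second at 1 fps)
        if idx % step == 0:
            cv2.imwrite(str(out_dir / f"frame_{idx:06d}.png"), frame)
    cap.release()

# e.g., sample_frames("gray_whale_flight.MP4", "output_frames", frames_per_second=1)
```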

Figure 3. A screenshot of the DeteX web-based application interface.

Figure 4. The DeteX “finetune” version provides more control for users to change the default settings: targeting a specific region of the video (here between 3 min 00 sec and 3 min 05 sec), changing the number of frames processed per second (here 10 per second), and adjusting the detection threshold, or level of confidence for identifying a whale in the video (here raised to 0.9 from the default of 0.8).

Once output frames are generated by DeteX, the user can select which frames to input into XtraX to measure. Once XtraX is launched, the default web browser automatically opens the application, where the user is asked to select 1) the folder containing the frames to measure and 2) the folder to save the output measurements. If the input frames were generated using DeteX, the barometric altitude is automatically extracted from the file name (note that altitudes collected from a LiDAR altimeter can be joined to the XtraX output .csv file to then calculate measurements using this altitude). The image width (pixels) is automatically extracted from the input frame metadata. Users can then input specific camera parameters, such as the sensor width (mm) and focal length of the camera (mm), the launch height of the drone (e.g., if launching from hand when on a boat), and the region along the body used to measure body condition (Figure 5). This region, called the Head-Tail range, covers the area of the body where most lipid storage takes place and is used to estimate body condition. To run, the user selects “start”. XtraX will then output a .png file of each frame showing the keypoints (used for the body length measurement) and the shaded region (used for the body condition estimate) along the body to help visualize results so users can filter for quality (Figure 6). XtraX also outputs a single .csv containing all the measurements (in meters and pixels) with their associated metadata.
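To give a sense of how the altitude and camera parameters relate to a measurement in meters, the sketch below applies the standard single-camera photogrammetry scaling (ground sampling distance) commonly used in drone-based morphometrics. The function and the example values are hypothetical illustrations, not code or values from the paper.

```python
# Minimal sketch of the standard drone-photogrammetry scaling used to convert a
# measurement in pixels to meters; the numbers below are hypothetical examples.
def pixels_to_meters(length_px, altitude_m, focal_length_mm, sensor_width_mm, image_width_px):
    """Ground sampling distance (m per pixel) times the measured pixel length."""
    gsd = (altitude_m * sensor_width_mm) / (focal_length_mm * image_width_px)
    return length_px * gsd

# Example: a 1,200-pixel whale filmed at 25 m total altitude (barometric altitude + launch height)
# with an 8.8 mm focal length, 13.2 mm sensor width camera recording 3840-pixel-wide video.
body_length_m = pixels_to_meters(
    length_px=1200, altitude_m=25.0,
    focal_length_mm=8.8, sensor_width_mm=13.2, image_width_px=3840,
)
print(round(body_length_m, 2))  # ≈ 11.72 m
```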

Figure 5. User interface for XtraX. The user specifies a folder containing the images to measure and a folder to save the output measurements, and can then enter camera specifications, the launch height of the drone (to be added to the barometric altitude), and the range of body widths to include in the body condition measurement (in this case, 0.2 and 0.7 correspond to the body region between 20% and 70% of the total length, respectively).
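For readers curious how a Head-Tail range like 0.2–0.7 can translate into a body condition value, the sketch below illustrates one common approach in drone-based body condition work: approximate the projected dorsal surface area between 20% and 70% of total length from widths measured at regular increments along the body, then normalize by length squared (a body-area-style index). This is only an illustration under those assumptions, not the exact computation XtraX performs, and the width values are hypothetical.

```python
# Illustrative sketch only (not the exact XtraX computation): approximate the
# projected dorsal surface area within the Head-Tail range using the trapezoidal
# rule over widths measured at evenly spaced stations along the body, then
# normalize by total length squared (a body-area-style condition index).
# All width and length values below are hypothetical.

def body_condition_index(total_length_m, widths_m, head=0.2, tail=0.7):
    """widths_m: widths (m) at evenly spaced stations from 0% to 100% of body length."""
    n = len(widths_m) - 1                        # number of equal intervals between stations
    step = total_length_m / n                    # spacing between stations (m)
    lo, hi = round(head * n), round(tail * n)    # station indices bounding the Head-Tail range
    # trapezoidal rule over the stations inside the Head-Tail range
    area = sum(0.5 * (widths_m[i] + widths_m[i + 1]) * step for i in range(lo, hi))
    return area / total_length_m ** 2            # dimensionless, comparable across individuals

# Example: 21 width stations at 5% increments along a hypothetical 12 m gray whale
widths = [0.0, 0.9, 1.4, 1.8, 2.1, 2.3, 2.4, 2.4, 2.3, 2.2,
          2.1, 1.9, 1.7, 1.5, 1.2, 1.0, 0.8, 0.6, 0.4, 0.2, 0.0]
print(round(body_condition_index(12.0, widths), 3))  # ≈ 0.085
```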

Figure 6. Example output from XtraX showing (red) keypoints along the body to measure body length and the (green) shaded region used for body condition.

We hope this walkthrough is helpful for researchers interested in using and adapting these tools for their projects. There is also a video tutorial available online. Happy (faster) measuring!

References

Bierlich, K. C., Karki, S., Bird, C. N., Fern, A., & Torres, L. G. (2024). Automated body length and body condition measurements of whales from drone videos for rapid assessment of population health. Marine Mammal Science, e13137. https://doi.org/10.1111/mms.13137

Cerini, F., Childs, D. Z., & Clements, C. F. (2023). A predictive timeline of wildlife population collapse. Nature Ecology & Evolution, 7(3), 320–331. https://doi.org/10.1038/s41559-023-01985-2

Robots are taking over the oceans

By Leila Lemos, PhD Student

In the past few weeks I read an article on the use of aquatic robots in the ocean for research. Since my PhD project uses technology, such as drones and GoPros, to monitor the body condition of gray whales and the availability of their prey along the Oregon coast, I became really interested in the new perspective these robots could provide. Drones produce aerial images, while GoPros generate underwater snapshots. The new perspective provided by a robot moving under the water could be amazing and could potentially be used in many different applications.

The article was published on March 21st by The New York Times, and described a new finned robot named “SoFi” or “Sophie”, short for Soft Robotic Fish (Figure 1; The New York Times 2018). The aquatic robot was designed by scientists at the Massachusetts Institute of Technology (MIT) Computer Science and Artificial Intelligence Lab, with the purpose of studying marine life in their natural habitats.

Figure 1: “SoFi”, a robotic fish designed by MIT scientists.
Source: The New York Times 2018.


SoFi’s first swim trial occurred on a coral reef in Fiji, and the footage recorded can be seen in the following video:


SoFi can swim at depths of up to 18 meters and at speeds of up to half its body length per second (an average of 23.5 cm/s in a straight path; Katzschmann et al. 2018). SoFi can swim for up to ~40 minutes, limited by battery life. The robot is also well equipped (Figure 2). It has a compact buoyancy control mechanism and includes a wide-view video camera, a hydrophone, a battery, environmental sensors, and operating and communication systems. The operating and communication systems allow a diver to issue commands using a controller that operates through sound waves.

Figure 2: “SoFi” system subcomponents overview.
Source: Katzschmann et al. 2018.


The robot designers highlight that while SoFi was swimming, fish didn’t seem to be bothered or scared by SoFi’s presence. Some fish were seen swimming near the robot, suggesting that SoFi has the potential to integrate into the natural underwater environment and therefore record undisturbed behaviors. However, a limitation of this invention is that SoFi needs a diver on scene to control the robot. Therefore, SoFi’s ability to study marine life without human interference may be compromised until the technology develops further.

Another potential impact of SoFi we might be concerned about is noise. Does this device produce noise levels that marine fauna can sense or might be stressed by? Unfortunately, the answer is yes. Even if fish don’t seem to be bothered by SoFi’s presence, it might bother other animals with hearing sensitivity in the same frequency range as SoFi’s acoustic controller. Katzschmann and colleagues (2018) explained that they chose an operating frequency for SoFi that would minimally impact marine fauna. They studied the frequencies used by aquatic animals and, since the hearing ranges of most aquatic species decay significantly above 10 kHz, they selected a frequency above this range (i.e., 36 kHz). However, this high frequency range can still be sensed by some species of cetaceans and pinnipeds, and any negative effects on these animals will depend on the sound amplitude that is produced.

Although not perfect (but what tool is?), SoFi can be seen as a great first step toward a future of underwater robots to assist research efforts.  Battery life, human disturbance, and noise disturbance are limitations, but through thoughtful application and continued innovation this fishy tool can be the start of something great.

The use of aquatic robots, such as SoFi, can help us advance our knowledge of underwater ecosystems. These robots could promote a better understanding of marine life in its natural habitat by documenting behaviors, interactions, and responses to threats. They may offer important new tools for protecting animals against the effects of anthropogenic activities. Additionally, aquatic robots may substitute for remotely operated vehicles and submersibles in some circumstances, much as drones are sometimes substituting for airplanes, thus providing a less expensive and better-tolerated way of monitoring wildlife.

Through continued multidisciplinary collaboration among robot designers, biologists, meteorologists, and others, innovation will continue to allow data collection with minimal to no disturbance to wildlife, at lower cost and with greater safety for researchers.

It is impressive to see how technology efforts are expanding into the oceans. As drones conquer our skies today and bring us so much valuable information for wildlife monitoring, I believe the same will occur in our oceans in the near future, assisting marine life conservation.

References:

Katzschmann RK, DelPreto J, MacCurdy R, Rus D. 2018. Exploration of Underwater Life with an Acoustically Controlled Soft Robotic Fish. Sci. Robot. 3, eaar3449. DOI: 10.1126/scirobotics.aar3449.

The New York Times. 2018. Robotic Fish to Keep a Fishy Eye on the Health of the Oceans. Available at: https://www.nytimes.com/2018/03/21/science/robot-fish.html.