{"id":4487,"date":"2022-01-31T12:37:17","date_gmt":"2022-01-31T19:37:17","guid":{"rendered":"https:\/\/blogs.oregonstate.edu\/gemmlab\/?p=4487"},"modified":"2022-01-31T12:37:17","modified_gmt":"2022-01-31T19:37:17","slug":"the-costs-and-benefits-of-automated-behavior-classification","status":"publish","type":"post","link":"https:\/\/blogs.oregonstate.edu\/gemmlab\/2022\/01\/31\/the-costs-and-benefits-of-automated-behavior-classification\/","title":{"rendered":"The costs and benefits of automated behavior classification"},"content":{"rendered":"\n<p><a href=\"https:\/\/mmi.oregonstate.edu\/people\/clara-bird\">Clara Bird<\/a>, PhD Student, OSU Department of Fisheries, Wildlife, and Conservation Sciences,\u00a0<a href=\"https:\/\/mmi.oregonstate.edu\/gemm-lab\">Geospatial Ecology of Marine Megafauna Lab<\/a><\/p>\n\n\n\n<p>\u201cWhy don\u2019t you just automate it?\u201d This is a question I am frequently asked when I tell someone about my work. My thesis involves watching many hours of drone footage of gray whales and meticulously coding behaviors, and there are plenty of days when I have asked myself that very same question. Streamlining my process is certainly appealing, and given how widespread and effective machine learning methods have become, it is a tempting option to pursue. That said, machine learning is only appropriate for certain research questions and scales, and it\u2019s important to consider these before investing in a new tool.<\/p>\n\n\n\n<p>The application of machine learning methods to behavioral ecology is called computational ethology (Anderson &amp; Perona, 2014). To identify behaviors from videos, the model tracks individuals across video frames and identifies patterns of movement that form a behavior. 
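<\/p>\n\n\n\n<p>To make that concrete with a toy example (a hypothetical sketch, not the method used in any study cited here; the function name and the 0.8 threshold are invented), a track of (x, y) centroid positions can be labeled by comparing its net displacement to its total path length:<\/p>

```python
import numpy as np

def classify_track(points, straightness_threshold=0.8):
    # Label a centroid track as 'traveling' or 'foraging'.
    # points: (n, 2) array-like of (x, y) centroid positions, one per frame.
    # The 0.8 threshold is an arbitrary placeholder, not a published value.
    points = np.asarray(points, dtype=float)
    # Length of each frame-to-frame step along the track.
    steps = np.linalg.norm(np.diff(points, axis=0), axis=1)
    path_length = steps.sum()
    if path_length == 0:
        return 'stationary'
    # Straightness index: net displacement divided by total path length.
    # Near 1 for a straight track, near 0 for a track looping in place.
    straightness = np.linalg.norm(points[-1] - points[0]) / path_length
    return 'traveling' if straightness >= straightness_threshold else 'foraging'
```

<p>A nearly straight track scores close to 1, while a track that loops within a small area scores close to 0. A real pipeline would also need to smooth noisy detections and choose the threshold from labeled examples.<\/p>\n\n\n\n<p>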
This concept is similar to the way we identify a whale as traveling if it\u2019s moving in a straight line and as foraging if it\u2019s swimming in circles within a small area (Mayo &amp; Marx, 1990, check out this\u00a0<a href=\"https:\/\/blogs.oregonstate.edu\/gemmlab\/2019\/11\/18\/classifying-cetacean-behavior\/\">blog<\/a>\u00a0to learn more). The level of behavioral detail that the model is able to track depends on the chosen method (Figure 1,\u00a0Pereira et al., 2020). These methods range from tracking each animal as a single point (called a centroid) to tracking the animal\u2019s body positioning in 3D (a method called pose estimation), and they provide correspondingly coarser or finer behavior definitions. For example, tracking an individual as a centroid could be used to classify traveling and foraging behaviors, while pose estimation could identify specific foraging tactics.\u00a0<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><a href=\"https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/fig1.png\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"876\" src=\"https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/fig1-1024x876.png\" alt=\"\" class=\"wp-image-4488\" srcset=\"https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/fig1-1024x876.png 1024w, https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/fig1-300x257.png 300w, https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/fig1-768x657.png 768w, https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/fig1-1536x1314.png 1536w, https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/fig1.png 1662w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/a><figcaption><em>Figure 1. Figure from Pereira et al. 
(2020) illustrating the different methods of animal behavior tracking that are possible using machine learning.<\/em><\/figcaption><\/figure>\n\n\n\n<p>Pose estimation involves training the machine learning algorithm to track specific anatomical features of an individual (e.g., the head, legs, and tail of a rat), meaning that it can define behaviors in great detail. A behavior state could be defined as a combination of the angle between the head and tail and the stride length.&nbsp;<\/p>\n\n\n\n<p>For example, Mearns et al. (2020) used pose estimation to study how zebrafish larvae in a lab captured their prey. They tracked the tail movements of individual larvae when presented with prey and classified these movements into separate behaviors, which allowed them to associate specific behaviors with prey capture (Figure 2). The authors found that these behaviors occurred in a specific sequence, that the behaviors kept the prey within the larvae\u2019s line of sight, and that the sequence was triggered by visual cues. In fact, when they removed the visual cue of the prey, the larvae terminated the behavior sequence, meaning that the larvae are continually choosing to do each behavior in the sequence, rather than the sequence being one long behavior event triggered only by the initial visual cue. This study is a good example of the applicability of machine learning models for questions aimed at kinematics and fine-scale movements. 
Pose estimation has also been used to study the role of facial expression and body language in rat social communication (Ebbesen &amp; Froemke, 2021).\u00a0<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><a href=\"https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/fig2.png\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"297\" src=\"https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/fig2-1024x297.png\" alt=\"\" class=\"wp-image-4489\" srcset=\"https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/fig2-1024x297.png 1024w, https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/fig2-300x87.png 300w, https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/fig2-768x223.png 768w, https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/fig2-1536x446.png 1536w, https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/fig2.png 1710w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/a><figcaption><em>Figure 2. Excerpt from figure 1 of Mearns et al. (2020) illustrating (A) the camera setup for their experiment, (B) how the model tracked the eye angles and tail of the larval fish, and (C) the kinematics extracted from the footage. In panel (C), the top plot shows how the eyes converged on the same object (the prey) during a prey capture event, the middle plot shows when the tail was curved to the left or the right, and the bottom plot shows the angle of the tail tip relative to the body.<\/em><\/figcaption><\/figure>\n\n\n\n<p>While previous machine learning methods to track animal movements required individuals to be physically marked, current methods can perform markerless tracking (Pereira et al., 2020). This improvement has broadened the kinds of studies that are possible. 
For example, Bozek et al. (2021) developed a model that tracked individuals throughout an entire honeybee colony and showed that certain individual behaviors were spatially distributed within the colony (Figure 3). Machine learning enabled the researchers to track over 1000 individual bees over several months, a task that would be infeasible for someone to do by hand.\u00a0<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><a href=\"https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/fig3.png\"><img loading=\"lazy\" decoding=\"async\" width=\"576\" height=\"177\" src=\"https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/fig3.png\" alt=\"\" class=\"wp-image-4491\" srcset=\"https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/fig3.png 576w, https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/fig3-300x92.png 300w\" sizes=\"auto, (max-width: 576px) 100vw, 576px\" \/><\/a><figcaption><em>Figure 3. Excerpt from figure 1 of Bozek et al. (2021) showing how individual bees and their trajectories were tracked.<\/em><\/figcaption><\/figure>\n\n\n\n<p>These studies highlight the potential benefits of using machine learning when studying fine-scale behaviors (like kinematics) or when tracking large groups of individuals. Furthermore, once it\u2019s trained, the model can process large quantities of data in a standardized way to free up time for the scientists to focus on other tasks.<\/p>\n\n\n\n<p>While machine learning is an exciting and enticing tool, automating behavior detection via machine learning could be its own PhD dissertation. Like most things in life, there are costs and benefits to using this technique. First, it is a technically difficult tool, and while applications exist to make it more accessible, knowledge of the computer science behind it is necessary to apply it effectively and correctly. 
Second, it can be tedious and time-consuming to create a training dataset for the model to \u201clearn\u201d what each behavior looks like, as this step involves manually labeling examples for the model to use.\u00a0<\/p>\n\n\n\n<p>As I\u2019ve mentioned in a previous\u00a0<a href=\"https:\/\/blogs.oregonstate.edu\/gemmlab\/2021\/03\/01\/defining-behaviors\/\">blog<\/a>, I came quite close to trying to study the kinematics of gray whale foraging behaviors but ultimately decided that counting fluke beats wasn\u2019t necessary to answer my behavioral\u00a0<a href=\"https:\/\/blogs.oregonstate.edu\/gemmlab\/2020\/08\/04\/connecting-research-questions\/\">research questions<\/a>. It was important to consider the\u00a0<strong>scale<\/strong>\u00a0of my questions (as described in <a href=\"https:\/\/mmi.oregonstate.edu\/people\/allison-dawn\">Allison<\/a>\u2019s\u00a0<a href=\"https:\/\/blogs.oregonstate.edu\/gemmlab\/2021\/11\/22\/weighing-in-on-scale\/\">blog<\/a>), and I think that diving into more fine-scale kinematics questions could be a fascinating follow-up to the questions I\u2019m asking in my PhD.\u00a0<\/p>\n\n\n\n<p>For instance, it would be interesting to quantify how gray whales use their flukes for different behavior tactics. Do gray whales in better body condition beat their flukes more frequently while headstanding? Does the size of the fluke affect how efficiently they can perform certain tactics? While these analyses would help quantify the energetic costs of different behaviors in better detail, they aren\u2019t necessary for my broad-scale questions. Consequently, taking the time to develop and train a pose estimation machine learning model is not the best use of my time.<\/p>\n\n\n\n<p>That being said, I am interested in applying machine learning methods to a specific subset of my dataset. In social behavior, it is not only useful to quantify the behaviors exhibited by each individual but also the distance between them. 
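<\/p>\n\n\n\n<p>Mechanically, once two individuals are tracked as centroids, computing their separation frame by frame is simple. Here is a minimal sketch (the function name and the meters_per_pixel scale factor are hypothetical; in practice, the pixel-to-meter conversion depends on drone altitude and camera parameters):<\/p>

```python
import numpy as np

def pairwise_distance(track_a, track_b, meters_per_pixel=1.0):
    # Frame-by-frame distance between two tracked individuals.
    # track_a, track_b: (n, 2) array-likes of (x, y) centroid positions
    # in pixels, aligned by frame index.
    # meters_per_pixel is a hypothetical scale factor; a real value would
    # be derived from drone altitude and camera parameters.
    a = np.asarray(track_a, dtype=float)
    b = np.asarray(track_b, dtype=float)
    # Euclidean distance per frame, converted from pixels to meters.
    return np.linalg.norm(a - b, axis=1) * meters_per_pixel
```

<p>The hard machine learning problem is producing the two tracks reliably in the first place; the distance calculation itself is the easy part.<\/p>\n\n\n\n<p>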
For example, the distance between a mom and her calf can be indicative of the calf\u2019s dependence on its mom (Nielsen et al., 2019). However, continuously measuring the distance between two individuals throughout a video is tedious and time-intensive, so training a machine learning model could be an effective use of time. I plan to work with an intern this summer to develop a machine learning model to track the distance between pairs of gray whales in our drone footage and then relate this distance data to the manually coded behaviors to examine patterns in social behavior (Figure 4). Stay tuned to learn more about our progress!<\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter size-full\"><a href=\"https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/210707_I2F_S3_U3_DJI0009.MOV_00_02_37_vlc00002.png\"><img loading=\"lazy\" decoding=\"async\" width=\"432\" height=\"174\" src=\"https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/210707_I2F_S3_U3_DJI0009.MOV_00_02_37_vlc00002.png\" alt=\"\" class=\"wp-image-4492\" srcset=\"https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/210707_I2F_S3_U3_DJI0009.MOV_00_02_37_vlc00002.png 432w, https:\/\/osu-wams-blogs-uploads.s3.amazonaws.com\/blogs.dir\/2115\/files\/2022\/01\/210707_I2F_S3_U3_DJI0009.MOV_00_02_37_vlc00002-300x121.png 300w\" sizes=\"auto, (max-width: 432px) 100vw, 432px\" \/><\/a><figcaption><em>Figure 4. A mom and calf pair surfacing together. Image collected under NOAA\/NMFS permit #21678<\/em><\/figcaption><\/figure><\/div>\n\n\n\n<p><em>Did you enjoy this blog? Want to learn more about marine life, research, and conservation? Subscribe to our blog and get a weekly alert when we make a new post! Just add your name into the subscribe box on the left panel.\u00a0<\/em><\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>References<\/strong><\/h2>\n\n\n\n<p>Anderson, D. 
J., &amp; Perona, P. (2014). Toward a Science of Computational Ethology.&nbsp;<em>Neuron<\/em>,&nbsp;<em>84<\/em>(1), 18\u201331. https:\/\/doi.org\/10.1016\/j.neuron.2014.09.005<\/p>\n\n\n\n<p>Bozek, K., Hebert, L., Portugal, Y., Mikheyev, A. S., &amp; Stephens, G. J. (2021). Markerless tracking of an entire honey bee colony.&nbsp;<em>Nature Communications<\/em>,&nbsp;<em>12<\/em>(1), 1733. https:\/\/doi.org\/10.1038\/s41467-021-21769-1<\/p>\n\n\n\n<p>Ebbesen, C. L., &amp; Froemke, R. C. (2021). Body language signals for rodent social communication.&nbsp;<em>Current Opinion in Neurobiology<\/em>,&nbsp;<em>68<\/em>, 91\u2013106. https:\/\/doi.org\/10.1016\/j.conb.2021.01.008<\/p>\n\n\n\n<p>Mayo, C. A., &amp; Marx, M. K. (1990). Surface foraging behaviour of the North Atlantic right whale, Eubalaena glacialis, and associated zooplankton characteristics.&nbsp;<em>Canadian Journal of Zoology<\/em>,&nbsp;<em>68<\/em>(10), 2214\u20132220. https:\/\/doi.org\/10.1139\/z90-308<\/p>\n\n\n\n<p>Mearns, D. S., Donovan, J. C., Fernandes, A. M., Semmelhack, J. L., &amp; Baier, H. (2020). Deconstructing Hunting Behavior Reveals a Tightly Coupled Stimulus-Response Loop.&nbsp;<em>Current Biology<\/em>,&nbsp;<em>30<\/em>(1), 54-69.e9. https:\/\/doi.org\/10.1016\/j.cub.2019.11.022<\/p>\n\n\n\n<p>Nielsen, M., Sprogis, K., Bejder, L., Madsen, P., &amp; Christiansen, F. (2019). Behavioural development in southern right whale calves.&nbsp;<em>Marine Ecology Progress Series<\/em>,&nbsp;<em>629<\/em>, 219\u2013234. https:\/\/doi.org\/10.3354\/meps13125<\/p>\n\n\n\n<p>Pereira, T. D., Shaevitz, J. W., &amp; Murthy, M. (2020). Quantifying behavior to understand the brain.&nbsp;<em>Nature Neuroscience<\/em>,&nbsp;<em>23<\/em>(12), 1537\u20131549. 
https:\/\/doi.org\/10.1038\/s41593-020-00734-z<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Clara Bird, PhD Student, OSU Department of Fisheries, Wildlife, and Conservation Sciences,\u00a0Geospatial Ecology of Marine Megafauna Lab \u201cWhy don\u2019t you just automate it?\u201d This is a question I am frequently asked when I tell someone about my work. My thesis involves watching many hours of drone footage of gray whales and meticulously coding behaviors, and &hellip; <a href=\"https:\/\/blogs.oregonstate.edu\/gemmlab\/2022\/01\/31\/the-costs-and-benefits-of-automated-behavior-classification\/\" class=\"more-link\">Continue reading <span class=\"screen-reader-text\">The costs and benefits of automated behavior classification<\/span><\/a><\/p>\n","protected":false},"author":9938,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[1310686],"tags":[1834,1310578,1310532,916414,1237819,214862,635445,44681,1310685,634945,1310662,1310750,1310749,214860],"class_list":["post-4487","post","type-post","status-publish","format-standard","hentry","category-behavior-and-body-condition","tag-behavior","tag-behavioral-ecology","tag-clara-bird","tag-drone","tag-drone-footage","tag-drones","tag-gemm-lab","tag-gray-whale","tag-gray-whale-individual-behavior-and-body-condition","tag-gray-whales","tag-machine-learning","tag-marine-ma","tag-social-behavior","tag-uas"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"post_mailing_queue_ids":[],"_links":{"self":[{"href":"https:\/\/blogs.oregonstate.edu\/gemmlab\/wp-json\/wp\/v2\/posts\/44
87","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blogs.oregonstate.edu\/gemmlab\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blogs.oregonstate.edu\/gemmlab\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blogs.oregonstate.edu\/gemmlab\/wp-json\/wp\/v2\/users\/9938"}],"replies":[{"embeddable":true,"href":"https:\/\/blogs.oregonstate.edu\/gemmlab\/wp-json\/wp\/v2\/comments?post=4487"}],"version-history":[{"count":2,"href":"https:\/\/blogs.oregonstate.edu\/gemmlab\/wp-json\/wp\/v2\/posts\/4487\/revisions"}],"predecessor-version":[{"id":4494,"href":"https:\/\/blogs.oregonstate.edu\/gemmlab\/wp-json\/wp\/v2\/posts\/4487\/revisions\/4494"}],"wp:attachment":[{"href":"https:\/\/blogs.oregonstate.edu\/gemmlab\/wp-json\/wp\/v2\/media?parent=4487"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blogs.oregonstate.edu\/gemmlab\/wp-json\/wp\/v2\/categories?post=4487"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blogs.oregonstate.edu\/gemmlab\/wp-json\/wp\/v2\/tags?post=4487"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}