Tag Archives: Oregon State University

Lean, Mean, Bioinformatics Machine

Machines take me by surprise with great frequency. – Alan Turing

This week we have Nima Azbijari, a PhD student in the College of Engineering advised by Dr. Maude David in Microbiology, joining us to discuss how he uses machine learning to better understand biology. Before we dig into the research, let's dig into what exactly machine learning is, and how it differs from artificial intelligence (AI). Both AI and machine learning learn patterns from the data they are fed, but the difference is that AI is typically developed to be interacted with and to make decisions in real time. If you've ever lost a game of chess to a computer, that was AI playing against you. But don't worry, even the world champion at an even more complex game, Go, was beaten by AI. AI utilizes machine learning, but not all machine learning is AI. Kind of like how a square is a rectangle, but not all rectangles are squares. The goal of machine learning is to use the data it is fed to get better at a task.

So how exactly does a machine, one of the least biological things on this planet, help us understand biology? 

Ten years ago it was big news that a computer was able to recognize images of cats, but now photo recognition is quite common. Similarly, Nima uses machine learning with large sets of genomic (genes/DNA), proteomic (proteins), and even gut microbiomic data (symbiotic microbes in the digestive tract) to see if the computer can predict varying patient outcomes. With that computational power, larger data sets and the relationships between the different kinds of data can be analyzed more quickly. This is great both for understanding the biological world in which we live and for the potential future of patient care.

How exactly do you teach an old machine a new trick?

First, it's important to note that he's using a machine, not magic, and it can be massively time consuming (even for a computer) to run any kind of analysis on every element of a massive data set: potentially millions of computations, or even more. So to isolate only the data that matters, Nima uses graph neural networks to extract the important pieces. Imagine you had a data set about your home, and you counted both the number of windows and the number of blinds and found that they were the same. You might conclude that you only need to count windows, and that counting blinds doesn't tell you anything new. The same idea works for reducing data to only the components that add meaning.
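For the curious, here is a minimal sketch of that windows-and-blinds idea in Python (the data and column names are invented for illustration; Nima's actual pipeline is far more sophisticated):

```python
import pandas as pd

# Hypothetical data set about a home: "blinds" tracks "windows" exactly.
home = pd.DataFrame({
    "windows": [4, 6, 2, 8],
    "blinds":  [4, 6, 2, 8],   # perfectly correlated with "windows"
    "doors":   [2, 1, 3, 1],
})

# Drop any column that is (almost) perfectly correlated with an earlier one,
# since it adds no new information.
corr = home.corr().abs()
redundant = [col for i, col in enumerate(corr.columns)
             if (corr.iloc[i, :i] > 0.999).any()]
reduced = home.drop(columns=redundant)

print(reduced.columns.tolist())  # ['windows', 'doors']
```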

The phrase 'neural network' can evoke imagery of a massive computer-brain made of wires, but what does this neural network look like, exactly? The 1999 movie The Matrix borrowed its name from a mathematical object that contains columns and rows of data, much like the iconic green columns of data from the movie posters. These matrices are useful for storing and computing on data sets since they can be arranged much like an Excel sheet, with a column for each patient and a row for each type of recorded data. Nima can then work with that matrix to build the neural network's graph. The neural network determines which data is relevant and can also illustrate connections between the different pieces of data. It's much like how you might be connected to friends, coworkers, and family on a social network, except in this case each profile is a compound or molecule and the connections can be any kind of relationship, such as a common reaction between the pair. However, unlike a social network, no one cares how many degrees from Kevin Bacon they are. The goal here isn't to connect one molecule to another but to identify unknown relationships. Perhaps that makes it more like 23andMe than Facebook.
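As a rough illustration of the 'graph' part of a graph neural network, here is a tiny sketch in Python. The molecules, their connections, and their feature numbers are all made up; a real graph neural network adds learned weights, nonlinearities, and many stacked layers, but the core move is the same: each node updates its features by blending in its neighbors'.

```python
import numpy as np

# Three hypothetical molecules A, B, C. adjacency[i, j] = 1 means
# "there is some known relationship between molecule i and molecule j".
adjacency = np.array([[0, 1, 1],
                      [1, 0, 0],
                      [1, 0, 0]], dtype=float)

# One row of made-up measurements per molecule.
features = np.array([[1.0, 0.0],
                     [0.0, 1.0],
                     [2.0, 2.0]])

# Add self-loops so each node keeps its own information, then row-normalize
# so every node averages over itself and its neighbors.
a_hat = adjacency + np.eye(3)
a_hat = a_hat / a_hat.sum(axis=1, keepdims=True)

# One "message-passing" step: each node's new features mix in its neighborhood.
updated = a_hat @ features
print(updated)
```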

TLDR

Nima is using machine learning to discover previously unknown relationships between various kinds of human biological data such as genes and the gut microbiome. Now, that’s a machine you don’t need to rage against.

Excited to learn more about machine learning?
Us too. Be sure to listen live on Sunday November 13th at 7PM on 88.7FM, or download the podcast if you missed it. And if you want to stay up to date on Nima’s research, you can follow them on Twitter.

Stressed out corals

Coral reef ecosystems offer a multitude of benefits, from protecting coastlines against storms and erosion to providing food through fishing and harvest. In fact, it is estimated that over half a billion people depend on reefs for food, income, and/or protection. However, coral reefs face many threats in our rapidly changing world. Climate change and nutrient input due to run-off from land are two stressors that can affect coral health. How exactly do these stressors impact corals? This week's guest Alex Vompe is trying to figure that out!

Alex is a 4th year PhD candidate in the Department of Microbiology at OSU, where he is co-advised by Dr. Becky Vega-Thurber and Dr. Tom Sharpton. The goal of Alex's research is to understand how coral microbial communities change over time and across various sources of stress. While the microbial communities of different coral species can differ, they typically look quite similar under normal, non-stressed conditions. Once exposed to a stressor, however, differences start to arise in the microbial communities of different coral species, which can have different outcomes for the coral host. This pattern has been coined the 'Anna Karenina principle': all happy corals are alike, but as soon as things start to go wrong, each coral suffers in its own way.

Alex is testing how this Anna Karenina principle plays out for three different coral species (Acropora retusa, Pocillopora verrucosa [also known as cauliflower coral], and Porites lobata [also known as lobe coral]) in the tropical Pacific Ocean. The stressors that Alex is investigating are reduced herbivory and the introduction of fertilizer. A big source of stress for reefs is low fish populations, which result in a lack of grazing on macroalgae. In extreme situations, macroalgae can overgrow a coral reef completely and outcompete it for light and resources. Fertilizers contain a whole host of nutrients intended to increase plant growth and production on land. However, these fertilizers run off from land into aquatic ecosystems, which can often be problematic for aquatic flora and fauna.

How is Alex testing the effects of these stressors on the corals? He is doing this both in situ and in the lab. Alex and his lab conduct field work on coral reefs off the island of Moorea in French Polynesia. There, they have set up experimental apparatus on the reefs (via scuba diving!) to simulate the effects of reduced herbivory and fertilizer introduction. This field work is conducted three times a year. When not under the water's surface, Alex sets up aquarium experiments on land in Moorea using coral fragments, which he has been able to grow in order to investigate the microbial communities more closely. These samples then get processed in the lab at OSU for genomic analysis, and Alex uses bioinformatics to investigate the coral microbiome dynamics.

Curious to know more about Alex’s research? Listen live on Sunday, October 23, 2022 at 7 PM on KBVR 88.7FM. Missed the live show? You can download the episode on our Podcast Pages! Also, feel free to follow Alex on Twitter (@AVompe) and Instagram (@vompedomp) to learn more about him and his research.

Schmitty Thompson wears glasses and a sweater, and smiles at the camera while standing in front of a vast field.

What ice sheets can teach us about ancient ocean shorelines

Around 80,000 years ago, the Earth was in the middle of the late Pleistocene, and much of Canada and the northern part of the United States was blanketed in ice. The massive Laurentide Ice Sheet covered millions of square miles and was, in some places, up to 2 miles thick. Over vast timescales this ice sheet slowly advanced across the continent, gouging out what we now know as the Great Lakes, carving valleys, depositing glacial tills, and transforming the surface geology of much of southern Canada and the northern US. Further west, the Cordilleran ice sheet stretched across what is now Alaska, British Columbia, and the northern parts of the western US, compressing the ground under its massive weight. As these ice sheets depressed the land beneath them, the Earth's crust bulged outwards around them, and as the planet warmed and the ice sheets began to melt, the pressure was released and the crust underneath returned to its previous shape. As this happened, ocean water flowed away, resulting in lower sea levels locally but higher levels on the other side of the planet.

The effects of massive bodies of ice forming, moving, and melting are far from negligible in their impact on the overall geology of a region, sea level throughout history, and the patterns of a changing climate. Though there are only two ice sheets on the planet today, deducing the ancient patterns and dynamics of ice sheets can help researchers fill gaps in the geological record and even make predictions about what the planet might look like in the future. Our guest on Inspiration Dissemination this week is PhD candidate and researcher Schmitty Thompson, of the Department of Geology in CEOAS. Thompson is ultimately trying to answer questions about ice distribution, sea levels, and other parameters that are missing from the geologic record during two different ice age warming periods. Their research is very interdisciplinary – Thompson has degrees in both math and geology, and also uses a lot of data science, computer science, and physics in their work. They are using computer modeling to figure out just what the shorelines looked like during this time period around 80,000 years ago.

Schmitty Thompson, fourth year PhD candidate with Jessica Creveling in the Geology Department.

“I use models because the geologic record is pretty incomplete – the further back you go, the less complete it is. So by matching my models to the existing data, we can then infer more information about what the shoreline was like,” they explain. To do this accurately, Thompson feeds the model what the ice sheets looked like over the course of around 250,000 years. They also need to incorporate other inputs to the model to get an accurate picture – variables such as the composition of the interior of the Earth, the physics of Earth’s interior, and even the ice sheets’ own gravitational pull (ice sheets are so massive they exert a gravitational pull on the water around them!)

Using math to learn about ice

The first equation to describe global changes in sea level was published in 1976, with refinements throughout the 90s and early 2000s. Thompson's model builds on these equations in two versions: one that can run in about 10 minutes on their laptop, and another that can take multiple weeks and must run on a supercomputer. The quicker version uses spherical harmonics as the basis functions for a pseudospectral formulation – essentially sums of standard wave-like patterns on a sphere, with coefficients that incorporate the Earth's radius, meridional wave numbers, variation across north/south and east/west, and a few other variables. The short of it is that it can perform these calculations across a 250,000-year span relatively quickly, but it makes assumptions about the homogeneity of the Earth's crust and mantle viscosity. Think of it like a gumball: a giant, magma-filled gumball with a smooth outer surface and even layers. So while this method is fast, the assumptions it makes mean the output data is limited in its usefulness. When Thompson needs a more accurate picture, they turn to collaborators who are able to run the models on a supercomputer, and then they work with the model's outputs.
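For readers who like to see the math, 'spherical harmonics' just means that a quantity spread over the globe, such as the change in sea-surface height, gets written as a sum of standard wave-like patterns on a sphere. The full sea-level equation layers more physics on top of this, so take the expression below as the general shape of the expansion rather than Thompson's exact formulation:

```latex
% A field f (for example, sea-level change) at colatitude \theta, longitude \phi,
% and time t, expanded in spherical harmonics Y_{lm} up to truncation degree L:
f(\theta, \phi, t) = \sum_{l=0}^{L} \sum_{m=-l}^{l} f_{lm}(t)\, Y_{lm}(\theta, \phi)
```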

While the model is useful for filling in gaps in the historical record, Thompson also points out that it has uses in predicting what the future will look like in the context of a changing climate. Once these models have been tested and their sensitivity is understood, they could be used by researchers looking at much smaller time scales and tighter constraints for current and future predictions. "There are still lots of open questions – if we warm the planet by a few degrees, are we going to collapse a big part of Antarctica or a small part? How much ice will melt?"


To learn more about ice sheets, sea levels, and using computer models to figure out how the shoreline looked thousands of years ago, tune in to Schmitty Thompson’s episode on Inspiration Dissemination this upcoming Sunday evening at 7 PM PST. Catch the show live by streaming on https://kbvrfm.orangemedianetwork.com/, or check out the show later wherever you get your podcasts!

Thompson was also recently featured on Alie Ward’s popular podcast Ologies. You can catch up with all things geology by checking out their episode here.

Warming waters, waning nutrition

Here at Inspiration Dissemination, we are fascinated by the moments of inspiration that lead people to pursue graduate studies. For our next guest, an experience like this came during a boat trip accompanying the National Oceanic and Atmospheric Administration (NOAA) on a research expedition. Becky Smoak, an M.S. student in OSU's Marine Resource Management program, remembers feeling in awe of the vibrant array of marine life that she saw, including whales, sunfish, and sharks. Growing up on a farm in eastern Washington, Becky had always wanted to be a veterinarian. During her undergraduate studies at Washington State University, however, she came to feel that the culture among pre-veterinary students was too cutthroat. In search of something more collaborative, she came to Oregon State in summer 2019 for a Research Experience for Undergraduates (REU) and was impressed by the support and inclusivity of her research mentors. A couple of years later, Becky is now on the cusp of graduation after her time spent studying marine life.

Becky's graduate work is the continuation of a long-running collaboration between Oregon State and NOAA out of the Hatfield Marine Science Center in Newport. Beginning in 1996 under the direction of Bill Peterson, a team of researchers has monitored oceanic conditions along a route called the Newport Hydrographic Line, which extends in a straight line westward from the Oregon coast and intersects the northern part of the vast California Current. The team takes samples of ocean water at fixed points along the route and analyzes the concentrations of plankton and other organisms or compounds of interest.

Becky Smoak, teaching on the OSU research vessel The Elakha.

The specific biochemicals that Becky studies are omega-3 fatty acids. In a set of experiments from the 1930s, rats fed a diet poor in omega-3 fatty acids eventually died, demonstrating that these compounds are essential to life and are not produced by mammals. Two types of omega-3 fatty acids, called EPA and DHA, can only be synthesized by phytoplankton, microscopic photosynthetic organisms that live in the ocean. The ability of phytoplankton to produce fatty acids is intimately linked with ocean temperature. Studies have shown that increases in sea surface temperature and decreases in nutrient availability can decrease the quality of fatty acids in phytoplankton, thus decreasing food availability and quality in the marine environment. Fatty acid levels have downstream effects on the ecosystem, for example on copepods, a type of zooplankton that feeds on phytoplankton. Becky's team affectionately refers to the copepod community of the chilly northern Pacific as the "cheeseburger" copepods, in contrast to the "celery" copepods of the southern Pacific community. The present-day effect of temperature also points to a key ecological challenge, as warming oceans due to climate change could disrupt the supply of this vital nutrient.

In her thesis work, Becky seeks to untangle the contributions of phytoplankton community structure to oceanic omega-3 fatty acid levels. She uses a statistical method called nonmetric multidimensional scaling (NMDS) to uncover patterns in the datasets. A particularly interesting instrument used to collect her data is a flow cytometry robot dubbed 'Lucy'. Lucy uses advanced imaging to count individual plankton and characterize their sizes, an improvement in accuracy over older monitoring techniques that assumed a fixed size for all plankton. Becky's goal for her thesis is to create a statistical procedure for predicting fatty acid availability from information on phytoplankton community structure.
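As a generic illustration of what nonmetric multidimensional scaling does (this is not Becky's actual analysis; the counts, the choice of Bray-Curtis dissimilarity, and the libraries are all assumptions for the sketch), the idea is to measure how different every pair of samples is and then arrange the samples on a 2-D map so that more-different samples land farther apart:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

# Made-up phytoplankton counts: rows are water samples, columns are taxa.
counts = np.array([
    [120, 30,   0,  5],
    [100, 40,  10,  0],
    [  5,  0, 200, 80],
    [ 10,  5, 180, 90],
])

# Bray-Curtis dissimilarity between every pair of samples.
dissimilarity = squareform(pdist(counts, metric="braycurtis"))

# Nonmetric MDS: place samples in 2-D so the rank order of distances is kept.
nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed", random_state=0)
coords = nmds.fit_transform(dissimilarity)
print(coords)  # similar samples end up close together on the 2-D map
```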

To hear more about Becky’s journey to OSU, her experiences as a first-generation college student, and the fascinating role of Omega-3s in marine ecosystems, be sure to tune in this Sunday October 9th at 7pm on KBVR.

This article was written by Joseph Valencia.

Violence and Masculinity in Film

After a long summer hiatus, Inspiration Dissemination is back on the airwaves and your podcast platforms this week! Kicking off our Fall quarter lineup is Andrew Herrera, MA candidate with Jon Lewis in the School of Writing, Literature, and Film here at Oregon State University.

Herrera’s research might sound like a dream come true to some: “I study movies, honestly.”

For Herrera it really is a dream come true – he grew up with a lifelong love of film, inspired by watching movies with his mother as a child, the same movies that she had grown up with. But it was after seeing Darren Aronofsky's 2010 hit film Black Swan that he knew studying film was going to be a career for him. The psychological horror stars Natalie Portman as a dancer in a production of Swan Lake and follows her descent into madness as she struggles with a rival dancer. Herrera recalls that after seeing the film in theaters he sat in the car for several hours, just thinking about what he'd seen. This was around the time he learned that he could actually study film as an academic pursuit, and he ended up writing about Black Swan for a literature class, comparing and contrasting it with The Strange Case of Dr. Jekyll and Mr. Hyde.

Andrew Herrera, MA candidate in SWLF.

He eventually finished his Bachelor's degree in English Literature here at Oregon State University and decided to stay and pursue a Master's in Film Studies. His thesis focuses on the themes of three films by acclaimed Danish director Nicolas Winding Refn: Drive, Only God Forgives, and Bronson. Herrera is looking at the three films through the lens of masculinity, gender performativity, and violence – all three center on male characters engaged in violent trajectories. Herrera argues in part that the films present masculinity as a kind of performance or even a very literal costume, as in Drive (Ryan Gosling's character is known for his iconic white jacket, which sports a scorpion design and which he is only seen wearing when committing acts of violence). In Bronson, the removal of weakness and femininity through violence and fighting leads to a rebirth of masculinity, while Only God Forgives features an almost Oedipal protagonist (also played by Ryan Gosling) who eventually cuts open the womb of his dead mother in a bid to assert control over his own masculinity. Herrera is also interested in the intersection of masculinity and queerness in media, and how these themes show up explicitly or implicitly in these and other films.


To hear more about these movies, the way masculinity is portrayed in film and its cultural impacts, and Herrera’s research, tune in to Inspiration Dissemination this Sunday evening at 7 PM at KBVR 88.7 FM or listen live online at https://kbvrfm.orangemedianetwork.com/. If you missed the live episode don’t forget to check out the podcast, now available wherever you get your podcasts.

From A(lgorithms) to Z(O-1 proteins): A Computer Scientist’s Journey into the Lab

By Grace Deitzler

Improvements in DNA sequencing technology have allowed scientists to dig deeper than ever before into the intricacies of the microbes that inhabit our gut, also called the gut microbiome. Massive amounts of data – on the scale of petabytes – have accumulated as labs and institutes across the globe sequence the gut microbiome in an effort to learn more about its inhabitants and how they contribute to human health. But now that we have all of this data (and more accumulating all the time), the challenge becomes making sense of it.

This is a challenge that Christine Tataru, a rising fifth year PhD student in the Department of Microbiology, is tackling head-on. “My research is trying to understand what a ‘healthy’ gut microbiome actually looks like, how it ‘should’ look, and to do so in a way that is integrative,” she explains. 

A woman with long hair in a red and white striped shirt sits at a computer.
Christine Tataru, fifth year PhD student in Maude David’s lab.

An integrative approach looks at all of the processes and relationships occurring between the trillions of microorganisms in our gut and the cells of our body. Previous microbiology dogma focused on the behavior and impact of single species such as pathogens, but as we learn more about microbiomes, this approach becomes limiting. There are a vast number of relationships that can occur between microbes and human cells, and there are many different lenses through which we can look at this system: taking a census of which microbes are present; tracking the genes that are present rather than just the microbes (which tells us about the functions that might be carried out); and measuring which proteins or metabolites are actually present, whether created by the bacteria or the host. Each piece of the puzzle gives us a glimpse of the massively complex system that is the gut microbiome.

“It’s difficult for a human brain to keep track of these relationships and sources of variations, so I use computer algorithms to try to get a picture of what is happening, and what that might mean for health.” 

It’s an approach that makes sense for the Stanford-trained computer-scientist-turned-biologist. Christine recalls a deep learning class in college in which a natural language processing algorithm on the whiteboard struck her with inspiration: what if instead of being applied to words, this algorithm could be applied to gut microbiomes? The thought stuck with her and when she came to OSU to pursue her PhD, she already had a clear goal in mind for what she wanted to do.

The natural language processing and interpretation algorithm treats words in a document as discrete entities and looks for patterns and relationships between words to gain context and "understand" the contents. A computer can't really understand what words mean linguistically, with all the nuances that natural language presents, but it is really good at looking for patterns. It can look at which words occur together frequently, which words never occur together, and which words share a 'social network' – words that don't appear together but appear with the same other words. Christine has developed a way to apply this algorithm to large gut microbiome datasets, identifying which microbes frequently appear together, which don't, and which share 'social networks'. This produces clusters of microbes, or what she refers to as 'topics', which can then be interpreted by humans to try to understand how these clusters relate to certain aspects of health. You can read more about this method in her recent PLOS Computational Biology publication here.
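Her published method is more sophisticated than this, but the flavor of 'finding topics of co-occurring microbes' can be sketched with an off-the-shelf topic model borrowed from text analysis. Everything below, from the counts to the number of topics, is invented purely for illustration:

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

# Toy data: rows are stool samples, columns are counts of four microbes.
# In the language analogy, samples play the role of documents and microbes
# play the role of words.
counts = np.array([
    [50, 45,  2,  1],
    [60, 40,  0,  3],
    [ 1,  2, 70, 55],
    [ 0,  5, 65, 60],
])

# Ask for two "topics": groups of microbes that tend to occur together.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
sample_topics = lda.fit_transform(counts)

print(lda.components_)  # per-topic microbe weights: which microbes co-occur
print(sample_topics)    # per-sample topic mixture: which topics dominate each sample
```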

It's quite the challenging undertaking: no one has taken this type of approach before, and even once the clusters are generated, we still need to be able to interpret what they mean – why is it interesting or important that these microbes occur with each other and also correlate with these genes or metabolites? Biologically, what does it actually mean?

The question of biological meaning prompted Christine to pivot to a more traditional ‘wet lab’ biology approach. “Who gave this computer scientist a pipette,” she jokes. But to be perfectly honest, it makes a lot of sense: who better to investigate the hypotheses that can be generated by computers than the scientist who wrote the code?

Taking the ‘integrative approach’ to the next level, she now works on recapitulating the environment of the gut microbiome on a chip in the lab. The organ-on-a-chip system is a fairly new approach to studying biological mechanisms in a way that better mimics the naturally occurring environment. In Christine’s case, she is using a ‘gut on a chip’, which is made of a thin piece of silicone with input and output channels. The silicone is split by a microporous membrane in such a way that two different kinds of cells can be grown, one on the top layer and one on the bottom. What makes this system unique as compared to traditional cell culture is that the channels and membrane allow for constant flow of growth media, which physically simulates the flow of blood over the cells. It can also mimic peristalsis, which is the stretching and relaxing of intestinal cells that helps push food and nutrients through the digestive tract. It’s a sophisticated system, and one that allows her a high degree of control over the environment. She can use this system to mimic Inflammatory Bowel Disease, and then add in specific microbes or combinations of microbes to see how the gut cells respond, using findings from her algorithm results to inform what kinds of additions might have anti-inflammatory effects.

Christine in a biosafety hood, preparing gut-chips for experiments.

This innovative approach provides Christine another lens through which to view the relationship between the gut microbiome and health. Though she will be finishing her doctorate at the end of the year, the curiosity doesn’t end there – “Broadly, my life goal to some extent has always been to make ways for people to help people.” Whether that’s pipeline and methods development or building the infrastructure to study complex biological relationships, Christine’s innovation-driven approach is sure to lead to huge strides in our understanding of how the tiny living things in our gut influence our health, behavior, and mood.

Tune in at 7 PM this Sunday evening on KBVR 88.7 or stream online to hear more about her research and how she ended up here at OSU!

The non-Ghostbusting Venkman: a virus that “eats” marine bacteria

Have you ever considered that a virus that eats bacteria could potentially have an effect on global carbon cycling? No? Me neither. Yet, our guest this week, Dr. Holger Buchholz, a postdoctoral researcher at OSU, taught me just that! Holger, who works with Drs. Kimberly Halsey and Stephen Giovannoni in OSU’s Department of Microbiology, is trying to understand how a bacteriophage (a bacteria-eating virus), called Venkman, impacts the metabolism of marine bacterial strains in a clade called OM43.

Bacteria in the OM43 clade are methylotrophs; in other words, these bacteria eat methanol, a type of volatile organic compound. It is thought that the methanol the OM43 bacteria consume is a by-product of photosynthesis by algae. In fact, OM43 bacteria are more abundant in coastal waters and are particularly associated with phytoplankton (algae) blooms. While this relationship has been shown in the marine environment before, there are still a lot of unknowns surrounding the exact dynamics. For example, how much methanol do the algae produce, and how much of this methanol do the OM43 bacteria in turn consume? Is the ocean a sink or a source of methanol for the atmosphere? Given that methanol is a carbon compound, these processes likely affect the global carbon cycle in some way; we just do not know how much yet. And methanol is just one of many volatile organic compounds (VOCs) that scientists think are important in the marine ecosystem – and they are probably consumed by bacteria too!

Depiction of the carbon cycle within the marine food web. DOM means Dissolved Organic Material; POM stands for Particulate Organic Material. These refer to all the things bound within cells that get released when, for example, viruses destroy cells.

All of this gets even more complicated by the fact that a bacteriophage, named Venkman, infects OM43 bacteria. If you are a fan of Ghostbusters and your mind is conjuring the image of Bill Murray in tan coveralls at the sound of the name Venkman, then you are not at all wrong. During his PhD at the University of Exeter, part of Holger's research was to isolate a bacteriophage that infects OM43 bacteria (which he successfully did). As a result, Holger and his advisor (Dr. Ben Temperton, who is a big Ghostbusters fan) got to name the bacteriophage, and they called it Venkman. Holger's current work at OSU is to figure out how the Venkman bacteriophage affects methanol metabolism in OM43 bacteria and how the virus influences methanol production in algae. Does the virus increase the bacteria's methanol metabolism? Decrease it? Or does nothing happen at all? At this point, Holger is not entirely sure what he is going to find, but whatever the answer, it will have implications for the amount of carbon in the oceans – which is why this work is being conducted.

Holger is currently in the process of setting up experiments to answer these questions. He has been at OSU since February 2022 and has three years of funding from the Simons Foundation to conduct this work. Join us live on Sunday at 7 pm PST on 88.7 KBVR FM or https://kbvrfm.orangemedianetwork.com/ to hear more about Holger's research and how a chance encounter with a marine biologist in Australia set him on his current career path! Can't make it live? Catch the podcast after the episode on your preferred podcast platform!

Spaghetti & Networks: Oodles of Nodes

Picture a bowl of spaghetti and meatballs: pristine noodles drenched in rich tomato sauce, topped with savory meatballs. Now imagine you're only allowed to eat one noodle and one meatball, and you're tasked with finding the very best, most interesting bite out of this bowl of spaghetti. It might sound absurd, but replace spaghetti with 'edges' and meatballs with 'nodes' and you've got a network.

An image of a network from Nolan’s recent publication. The lines are ‘edges’ and the dots are ‘nodes’.

Computational biologists like our guest this week use networks to uncover meaningful relationships, or the tastiest spaghetti noodle and meatball, between biological entities.
Joining us this week is Nolan Newman, a PhD candidate in the College of Pharmacy under PI Andriy Morgun. Nolan's research lies at the intersection of math, statistics, computer science, and biology. He is studying how networks, such as covariation networks, can be used to find relationships and correlations between genes, microbes, and other factors in massive datasets that compare thousands of biological entities or more. With datasets this large and complex, it can be difficult to pare down to just the important or interesting relationships – like trying to scoop a single bowl of spaghetti from a giant tray at a buffet, and then further narrowing it down to pick just one interesting noodle.

Nolan Newman, PhD candidate


Nolan is further interested in how different statistical thresholds and variables change how the networks 'look'. If only noodles covered in sauce are considered 'interesting', then all of the sauce-less noodles are out of the running. But what if noodles only count as 'sauce-covered' when they are 95% or more covered? Could you be missing out on perfectly delicious, interesting noodles by applying this constraint?
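Here is a stripped-down sketch of that threshold question in Python (generic code, not Nolan's pipeline; the data and the 0.95 versus 0.80 cutoffs are made up): compute every pairwise correlation, then count how many 'noodles' survive under a stricter versus a looser cutoff.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 20 samples by 6 features (think genes or microbes).
data = rng.normal(size=(20, 6))
data[:, 1] = data[:, 0] + 0.1 * rng.normal(size=20)   # feature 1 closely tracks 0
data[:, 3] = -data[:, 2] + 0.5 * rng.normal(size=20)  # feature 3 loosely tracks 2

corr = np.corrcoef(data, rowvar=False)   # 6 x 6 correlation matrix
upper = np.triu(np.abs(corr), k=1)       # count each pair once, drop self-correlations

for cutoff in (0.95, 0.80):
    edges = np.argwhere(upper >= cutoff)
    print(f"cutoff {cutoff}: {len(edges)} edges kept -> {edges.tolist()}")
```

Tightening the cutoff keeps only the strongest relationships in the network; loosening it lets weaker (but possibly still interesting) ones back in.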


If you're left scratching your head and a little hungry, fear not. We'll chat about all things computational biology, networks, making meaning out of chaos, and why hearing loss prompted Nolan to begin a career in science, all on this week's episode of Inspiration Dissemination. Catch the episode live at 7 PM PST on 88.7 FM or https://kbvrfm.orangemedianetwork.com/, or catch the podcast afterwards on any podcast platform.

AI that benefits humans and humanity

When you think about artificial intelligence or robots in the everyday household, your first thought might be that it sounds like science fiction – like something out of the 1999 cult classic film "Smart House". But it's likely you have some of this technology in your home already – if you own a Google Home, Amazon Alexa, Roomba, smart watch, or even just a smartphone, you're already plugged into this network of AI in the home. This technology can offer great benefits to its users, from simply asking Google to set an alarm to wake you up the next day, to wearable smart devices that collect health data such as heart rate. AI is also currently being used to improve assistive technology, meaning technology used to support disabled or elderly individuals. However, the rapid explosion in development and popularity of this tech also brings risks to consumers: there isn't great legislation yet about the privacy of, say, healthcare data collected by such devices. Further, as we discussed with another guest a few weeks ago, there is the issue of coding ethics into AI – how can we as humans program robots so that they learn to operate in an ethical manner? Who defines what that is? And on the human side – how do we ensure that human users of such technology can actually trust it, especially if it will be used in a way that could benefit the user's health and wellness?

Anna Nickelson, a fourth-year PhD student in Kagan Tumer's lab in the Collaborative Robotics and Intelligent Systems (CoRIS) Institute in the Department of Mechanical, Industrial and Manufacturing Engineering, joins us this week to discuss her research, which touches on several of these aspects of using technology as part of healthcare. Also a former Brookings Institution intern, Anna incorporates not just the coding of robots but far-reaching policy and legislation goals into her work. Her research is driven by a very high-level goal: how do we create AI that benefits humans and humanity?

Anna Nickelson, fourth year PhD student in the Collaborative Robotics and Intelligent Systems Institute.

AI for social good

When we think about how to create technology that is beneficial, Anna says that there are four major considerations in play. First is the creation of the technology itself – the hardware, the software; how technology is coded, how it’s built. The second is technologists and the technology industry – how do we think about and create technologies beyond the capitalist mindset of what will make the most money? Third is considering the general public’s role: what is the best way to educate people about things like privacy, the limitations and benefits of AI, and how to protect themselves from harm? Finally, she says we must also consider policy and legislation surrounding beneficial tech at all levels, from local ordinances to international guidelines. 

Anna's current research with Dr. Tumer is funded by the NSF AI Institute for Collaborative Assistance and Responsive Interaction for Networked Groups (AI-CARING), an institute through the National Science Foundation that focuses on "personalized, longitudinal, collaborative AI, enabling the development of AI systems that learn personalized models of user behavior…and integrate that knowledge to support people and AIs working together", as per their website. The institute is a collaboration between five universities, including Oregon State University and OHSU. What this looks like for Anna is lots of code writing and simulations studying how AI systems make trade-offs between different objectives. For this she looks at machine learning for decision making in robots, and how multiple robots or AIs can work together towards a specific task without necessarily having to communicate with each other directly. Each robot or AI may have different considerations that factor into how it accomplishes its objective, so part of her goal is to develop a framework for the different individuals to make decisions as part of a group.
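To give a flavor of what 'trading off between different objectives' can look like in code, here is a deliberately tiny sketch. The objectives, the candidate actions, and the weights are all invented for illustration, and Anna's actual multiagent learning methods are far richer than a weighted sum, but it shows the basic tension: change the weights and the preferred decision changes.

```python
# Each candidate action is scored against several objectives at once.
actions = {
    "remind_patient_now":   {"helpfulness": 0.9, "privacy": 0.4, "autonomy": 0.3},
    "remind_patient_later": {"helpfulness": 0.6, "privacy": 0.8, "autonomy": 0.7},
    "do_nothing":           {"helpfulness": 0.1, "privacy": 1.0, "autonomy": 1.0},
}

# One simple way to trade objectives off: a weighted sum of the scores.
weights = {"helpfulness": 0.5, "privacy": 0.3, "autonomy": 0.2}

def score(objective_scores):
    return sum(weights[name] * value for name, value in objective_scores.items())

best = max(actions, key=lambda action: score(actions[action]))
print(best)  # with these weights: "remind_patient_later"
```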

With an undergraduate degree in math, a background in project management in the tech industry, engineering and coding skills, and experience working with a think tank in DC on tech-related policy, Anna is uniquely situated to address the major questions about developing technology for social good in a way that mitigates risk. She came to graduate school at Oregon State with this interdisciplinary goal in mind. Her personal life goal is to get experience in each sector so she can bring in a wide range of perspectives and ideas. "There are quite a few people working on tech policy right now, but very few people have the breadth of perspective on it from the low level to the high level," she says.

If you are interested in hearing more about Anna’s life goals and the intersection of artificial intelligence, healthcare, and policy, join us live at 7 PM on Sunday, May 7th on https://kbvrfm.orangemedianetwork.com/, or after the show wherever you find your podcasts. 

Red, Red, (smoky) Wine

Did you know humans have the ability to "taste" through smelling? Well, we do, through a process called retronasal olfaction. This fancy-sounding term describes one of the ways that food scientists, such as our guest this week, recent M.S. graduate and soon-to-be Ph.D. student Jenna Fryer, study how flavors – tastes perceived through smell – arise and what impact external factors have on them. Specifically, Fryer looks at the ways fires affect the flavor of wine, a particularly timely area of research given the recent wave of devastating wildfires in Oregon.

Fryer at OSU’s vineyard

Having always been interested in food science, Fryer examines the ways smoke penetrates wine grapes. She does this by studying how people taste the smoke and how they can best rid their mouths of the smokiness, because, spoiler, it has a pretty negative impact on the flavor. This research has led her to develop novel ways to describe and standardize certain flavors, such as ashiness and mixed berry, as well as to learn which compounds are the best palate cleansers. She will continue this research in her Ph.D., where she plans to figure out which compounds create that smoky flavor and how best to predict which wines will taste like smoke in the future.

Through this work, Fryer has made some fascinating discoveries, such as how many people can actually detect the smoke flavor (because not everyone can), how best to create an ashy flavor standard (hint: it has to do with a restaurant in the UK and leeks), why red wines are more affected by smoke than white wines, and what the difference is between flavor and taste.

Fryer processing wine samples

Tune in live at 7pm on Sunday April 24th or listen to this episode anywhere you get your podcasts to learn about Fryer’s research! 

And, if you are interested in being a part of a future wine study (and who wouldn’t want to get paid to taste wine), click on this link to sign up!