Lean, Mean, Bioinformatics Machine

Machines take me by surprise with great frequency. – Alan Turing

This week we have Nima Azbijari, a PhD student in the College of Engineering advised by Dr. Maude David in Microbiology, joining us to discuss how he uses machine learning to better understand biology. Before we dig into the research, let’s dig into what exactly machine learning is, and how it differs from artificial intelligence (AI). Both learn patterns from the data they are fed, but AI is typically developed to be interacted with and to make decisions in real time. If you’ve ever lost a game of chess to a computer, that was AI playing against you. But don’t worry: the world champion of Go, a far more complex game, was also beaten by AI. AI utilizes machine learning, but not all machine learning is AI. Kind of like how a square is a rectangle, but not all rectangles are squares. The goal of machine learning is simply to use the data it is fed to improve at a task.

So how exactly does a machine, one of the least biological things on this planet, help us understand biology? 

Ten years ago it was big news that a computer could recognize images of cats, but now photo recognition is quite common. Similarly, Nima uses machine learning with large sets of genomic (genes/DNA), proteomic (proteins), and even gut microbiomic data (symbiotic microbes in the digestive tract) to see whether the computer can predict varying patient outcomes. With computational power, larger data sets and the relationships between the varying kinds of data can be analyzed much more quickly. This is great both for understanding the biological world in which we live and for the potential future of patient care.

How exactly do you teach an old machine a new trick?

First, it’s important to note that he’s using a machine, not magic, and it can be massively time consuming (even for a computer) to run any kind of analysis on every element of a massive data set: potentially millions of computations, or even more. So to isolate only the data that matters, Nima uses graph neural networks to extract the important pieces. Imagine you had a data set about your home, and you counted both the number of windows and the number of blinds and found that they were always the same. You might conclude that you only need to count windows, and that counting blinds tells you nothing new. The same idea works for reducing data to only the components that add meaning.
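As a minimal sketch of that idea in Python (a toy illustration with made-up “house” data, not Nima’s actual pipeline), here is how you might spot and drop a perfectly redundant column:

```python
import pandas as pd

# Toy "house" data set: blinds always match windows, so one column is redundant.
homes = pd.DataFrame({
    "windows": [4, 6, 8, 10],
    "blinds":  [4, 6, 8, 10],
    "doors":   [2, 3, 2, 4],
})

# Pairwise correlations; a coefficient near 1.0 means a column adds no new information.
corr = homes.corr()

# Drop any column that is (near-)perfectly correlated with an earlier one.
redundant = {col for i, col in enumerate(corr.columns)
             for other in corr.columns[:i]
             if abs(corr.loc[col, other]) > 0.999}
print(homes.drop(columns=list(redundant)))  # "blinds" is gone
```

Real biological data sets are far messier, of course, but the principle of pruning features that carry no new information is the same.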

The phrase ‘neural network’ can invoke imagery of a massive computer-brain made of wires, but what does a neural network actually look like? The 1999 movie The Matrix borrowed its name from a mathematical object that contains columns and rows of data, much like the iconic green columns of data from the movie posters. These matrices are useful for storing and computing data sets since they can be arranged much like an Excel sheet, with a column for each patient and a row for each type of recorded data. Nima (or rather, the computer) can then work with that matrix to build a graph. The neural network then determines which data is relevant and can also illustrate connections between the different pieces of data. It is much like how you might be connected to friends, coworkers, and family on a social network, except that here each profile is a compound or molecule and the connections can be any kind of relationship, such as a common reaction between the pair. Unlike a social network, though, no one cares how many degrees from Kevin Bacon they are: the goal isn’t to connect one molecule to another but to identify unknown relationships. Perhaps that makes it more like 23andMe than Facebook.
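To make the matrix-to-network step concrete, here is a hypothetical sketch (invented numbers, with simple correlations standing in for the learned relationships a graph neural network would use):

```python
import numpy as np

# Hypothetical matrix: one row per measurement, one column per patient,
# much like the spreadsheet layout described above.
features = ["gene_A", "gene_B", "microbe_X"]
data = np.array([
    [1.0, 2.1, 3.0, 3.9],  # gene_A across four patients
    [0.9, 2.0, 3.2, 4.1],  # gene_B tracks gene_A closely
    [5.0, 1.0, 4.0, 2.0],  # microbe_X does its own thing
])

# Connect two features with an edge when their profiles track each other,
# the way a social network links related profiles.
corr = np.corrcoef(data)
edges = [(features[i], features[j])
         for i in range(len(features))
         for j in range(i + 1, len(features))
         if abs(corr[i, j]) > 0.9]
print(edges)  # [('gene_A', 'gene_B')]
```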

TLDR

Nima is using machine learning to discover previously unknown relationships between various kinds of human biological data such as genes and the gut microbiome. Now, that’s a machine you don’t need to rage against.

Excited to learn more about machine learning?
Us too. Be sure to listen live on Sunday November 13th at 7PM on 88.7FM, or download the podcast if you missed it. And if you want to stay up to date on Nima’s research, you can follow them on Twitter.

Heat, Hatchlings, and Sea Turtle Survival

A team of researchers makes its way across the beach on this dark night, lighting their way only with starlight and moonlight. It’s high tide on this small island off the coast of Brazil, and the kind of night when green sea turtles love to come ashore to nest. The turtles fall into a trance-like state after wandering around for hours and finally building their nests, and this is when the team approaches. They take a skin sample, place a temperature logger to measure the nest temperature, and tag the turtle with a nail polish marking for future identification. One member of the team is Vic Quennessen (she/they), the subject of our next episode. Vic is a PhD student in the Department of Fisheries, Wildlife, and Conservation Sciences. Quennessen is a computational researcher on the project, but helping out on nights like these is part of the job. Vic’s team collaborates with Projeto TAMAR, a Brazilian nonprofit organization that has worked to preserve and conserve these endangered animals throughout Brazil since the 1980s.

Vic Quennessen releases their first hatchling!

Sea turtles have no sex chromosomes; their sex is instead determined by the environmental temperature during incubation. Eggs subjected to higher temperatures are more likely to produce female hatchlings. The pivotal temperature at which the sex ratio approaches 50/50 is around 29 degrees Celsius, but at just one degree higher, some clutches of eggs produce as many as 90% female hatchlings. Rising temperatures due to climate change have therefore resulted in a worrying oversupply of female hatchlings.
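To see how sharp that transition is, here is an illustrative logistic curve of the kind often used to describe temperature-dependent sex determination. The pivotal temperature comes from the figures above; the steepness is a hypothetical choice tuned so that 30 degrees yields roughly 90% female, not a fitted parameter:

```python
import math

def female_fraction(temp_c, pivotal=29.0, spread=0.45):
    """Illustrative logistic sex-ratio curve.

    pivotal: temperature giving a 50/50 sex ratio (~29 C, per the text above).
    spread: hypothetical steepness so that ~30 C yields roughly 90% female.
    """
    return 1.0 / (1.0 + math.exp(-(temp_c - pivotal) / spread))

for t in (28.0, 29.0, 30.0, 31.0):
    print(f"{t:.1f} C -> {female_fraction(t):.0%} female")
```

A one-degree shift around the pivotal temperature swings the expected ratio from 50/50 to roughly 90% female, which is why even small amounts of warming matter so much.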

Sea turtles are difficult to study due to their long and mysterious life cycles. It is believed that they reach reproductive maturity after around twenty-five years, but only females are readily observed because they return to land to build their nests and lay eggs. In contrast, the males stay out at sea for their entire lives. This complicates any effort to ascertain the true population structure. Sea turtles also live a long time, so there is a lag between changes in the hatchling population and the overall population. Finally, hatchlings lack external reproductive organs or other visible sexual characteristics, so the sex ratios must be estimated using temperature as a surrogate.

Vic has always loved the ocean, and they came to OSU looking to help conserve threatened resources such as fish stocks and sea turtles. While attending UMass Dartmouth for their undergraduate degree, they double majored in computational mathematics and marine biology. Initially these felt like separate interests, until a professor suggested that she apply to a NOAA workshop on marine resources and population dynamics. There she learned that mathematical methods could be part of rigorous modeling efforts in population biology. After a gap year dedicated to science education, Vic made her way to Oregon State for a Master’s in Fisheries Science. Her advisor, Prof. Will White, persuaded her to stay on for a PhD with an opportunity to study her beloved sea turtles.

Sea turtles visit the beaches of more than eighty countries, but Vic’s fieldwork focuses on a population that nests on a small Brazilian island.

Quennessen’s research seeks to predict how the green sea turtle population will be affected by this looming sex imbalance. Vic uses data collected from over 3,000 hatchlings per season, including nest temperature readings as well as counts of nesting females, hatchlings, and captured males. They build a mathematical model to explore possible scenarios for the “mating function”, the unknown relationship between the sex ratio and reproductive success. On the one hand, it is easy to imagine that such a mismatch could reduce the number of mating pairs and lead to a rapid population decline. On the other, it is not well understood how many breeding males are required to sustain the population, and adaptations in mating behavior could slow the decline long enough for the more optimistic climate mitigation scenarios to take effect. In any case, it will take a lot of international cooperation to conserve these ancient marine creatures, which nest on the shores of over 80 countries. Vic’s hope is that a mathematical exploration of this question can help reveal the green sea turtles’ chances of survival and inform these conservation efforts.
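As a toy illustration of what a “mating function” might look like (hypothetical shapes and numbers, not Vic’s actual model), compare a fragile scenario, where breeding success falls in step with the male share, against a flexible one, where a few males suffice:

```python
# Two hypothetical "mating functions" relating the fraction of males in the
# population to the fraction of females that successfully breed each season.

def fragile(male_frac):
    # Success drops as soon as males fall below half the population.
    return min(1.0, male_frac / 0.5)

def flexible(male_frac):
    # A few males can fertilize many females: success saturates once
    # males exceed ~5% of the population.
    return min(1.0, male_frac / 0.05)

for male_frac in (0.5, 0.2, 0.1, 0.02):
    print(f"{male_frac:>4.0%} male: fragile={fragile(male_frac):.0%}, "
          f"flexible={flexible(male_frac):.0%}")
```

Under the flexible curve, even a heavily female-skewed population keeps breeding at full success; under the fragile one, it collapses. Which shape is closer to reality is exactly what the model is built to explore.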

To learn more about Vic’s research and their other interests, including science education and working with CGE, the graduate student union at OSU, tune in Sunday, Nov 6th at 7pm PST on KBVR 88.7 FM or online!

Missed the show? Don’t worry, you can download this episode via your podcast player of choice here.

Spaghetti & Networks: Oodles of Nodes

Picture a bowl of spaghetti and meatballs: pristine noodles drenched in rich tomato sauce, topped with savory meatballs. Now imagine you’re allowed to eat just one noodle and one meatball, and you’re tasked with finding the very best, most interesting bite in the bowl. It might sound absurd, but replace spaghetti with ‘edges’ and meatballs with ‘nodes’ and you’ve got a network.

An image of a network from Nolan’s recent publication. The lines are ‘edges’ and the dots are ‘nodes’.

Computational biologists like our guest this week use networks to uncover meaningful relationships – the tastiest noodle-and-meatball pairing – between biological entities.
Joining us this week is Nolan Newman, a PhD candidate in the College of Pharmacy under PI Andriy Morgun. Nolan’s research lies at the intersection of math, statistics, computer science, and biology. He studies how networks, such as covariation networks, can be used to find relationships and correlations between genes, microbes, and other factors in massive datasets that compare thousands or even millions of biological entities. With datasets this large and complex, it can be difficult to pare down to just the important or interesting relationships – like trying to scoop a single bowl of spaghetti from a giant tray at a buffet, and then narrowing things down further to pick just one interesting noodle.

Nolan Newman, PhD candidate


Nolan is further interested in how the networks ‘look’ when different statistical thresholds and variables are changed. If only noodles covered in sauce are considered ‘interesting’, then all of the sauce-less noodles are out of the running. But what if noodles only count as ‘sauce-covered’ when they are 95% covered or more? Could you be missing out on perfectly delicious, interesting noodles by applying that constraint?
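Here is a small sketch of that threshold effect (random data purely for illustration, not Nolan’s pipeline): the same measurements yield wildly different networks depending on where you set the cutoff.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
data = rng.normal(size=(30, 12))  # 30 hypothetical entities, 12 samples each

# Absolute pairwise correlations, ignoring self-correlation on the diagonal.
corr = np.abs(np.corrcoef(data))
np.fill_diagonal(corr, 0)

# How saucy must a noodle be? Count edges at several cutoffs.
for threshold in (0.95, 0.8, 0.5, 0.2):
    n_edges = int(np.sum(corr > threshold)) // 2  # each pair is counted twice
    print(f"threshold {threshold}: {n_edges} edges")
```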


If you’re left scratching your head and a little hungry, fear not. We’ll chat about all things computational biology, networks, making meaning out of chaos, and how hearing loss prompted Nolan to begin a career in science, all on this week’s episode of Inspiration Dissemination. Catch the episode live Sunday at 7 PM PST on 88.7 FM or https://kbvrfm.orangemedianetwork.com/, or catch the podcast after the episode on any podcast platform.

AI that benefits humans and humanity

When you think about artificial intelligence or robots in the everyday household, your first thought might be that it sounds like science fiction – like something out of the 1999 cult classic film “Smart House”. But it’s likely you have some of this technology in your home already – if you own a Google Home, Amazon Alexa, Roomba, smart watch, or even just a smartphone, you’re already plugged into this network of AI in the home. This technology can offer great benefits to its users, from simply asking Google to set an alarm to wake you up the next day, to wearable smart devices that collect health data such as heart rate. AI is also being used to improve assistive technology, that is, technology that improves the lives of disabled or elderly individuals. However, the rapid explosion in the development and popularity of this tech also brings risks to consumers: there isn’t great legislation yet about the privacy of, say, healthcare data collected by such devices. Further, as we discussed with another guest a few weeks ago, there is the issue of coding ethics into AI – how can we as humans program robots in such a way that they learn to operate in an ethical manner? Who defines what that is? And on the human side, how do we ensure that human users of such technology can actually trust it, especially if it will be used in a way that could benefit the user’s health and wellness?

Anna Nickelson, a fourth-year PhD student in Kagan Tumer’s lab in the Collaborative Robotics and Intelligent Systems (CoRIS) Institute in the Department of Mechanical, Industrial and Manufacturing Engineering, joins us this week to discuss her research, which touches on several of these aspects of using technology in healthcare. A former Brookings Institution intern, Anna incorporates not just the coding of robots but also far-reaching policy and legislation goals into her work. Her research is driven by one very high-level goal: how do we create AI that benefits humans and humanity?

Anna Nickelson, fourth year PhD student in the Collaborative Robotics and Intelligent Systems Institute.

AI for social good

When we think about how to create technology that is beneficial, Anna says there are four major considerations at play. First is the creation of the technology itself – the hardware and the software; how technology is coded, how it’s built. Second is technologists and the technology industry – how do we think about and create technologies beyond the capitalist mindset of what will make the most money? Third is the general public’s role: what is the best way to educate people about things like privacy, the limitations and benefits of AI, and how to protect themselves from harm? Finally, she says we must also consider policy and legislation surrounding beneficial tech at all levels, from local ordinances to international guidelines.

Anna’s current research with Dr. Tumer is funded by the NSF AI Institute for Collaborative Assistance and Responsive Interaction for Networked Groups (AI-CARING), an institute through the National Science Foundation that focuses on “personalized, longitudinal, collaborative AI, enabling the development of AI systems that learn personalized models of user behavior…and integrate that knowledge to support people and AIs working together”, as per their website. The institute is a collaboration between five universities, including Oregon State University and OHSU. For Anna, this looks like lots of code writing and simulations studying how AI systems make trade-offs between different objectives. She looks at machine learning for decision making in robots, and at how multiple robots or AIs can work together towards a specific task without necessarily having to communicate with each other directly. Each robot or AI may have different considerations that factor into how it accomplishes the objective, so part of her goal is to develop a framework for the different individuals to make decisions as part of a group.
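As a minimal sketch of one common way to encode such trade-offs (a weighted scoring of competing objectives; the actions, numbers, and weights here are all hypothetical, not Anna’s actual framework):

```python
# Each agent scores candidate actions against several objectives and picks
# the best weighted combination, with no communication between agents.

actions = {
    "deliver_medication": {"user_benefit": 0.9, "privacy_risk": 0.4, "energy": 0.3},
    "remind_later":       {"user_benefit": 0.5, "privacy_risk": 0.1, "energy": 0.1},
    "do_nothing":         {"user_benefit": 0.0, "privacy_risk": 0.0, "energy": 0.0},
}

def score(objectives, weights):
    # Weighted trade-off: benefit counts for, risk and cost count against.
    return (weights["benefit"] * objectives["user_benefit"]
            - weights["privacy"] * objectives["privacy_risk"]
            - weights["energy"] * objectives["energy"])

# A privacy-conscious weighting prefers the gentler action.
cautious = {"benefit": 1.0, "privacy": 2.0, "energy": 0.5}
best = max(actions, key=lambda a: score(actions[a], cautious))
print(best)  # remind_later
```

Change the weights and a different action wins, which is the crux of the problem: someone has to decide what the weights should be.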

With an undergraduate degree in math, a background in project management in the tech industry, engineering and coding skills, and experience working with a think tank in DC on tech-related policy, Anna is uniquely situated to address the major questions about developing technology for social good in a way that mitigates risk. She came to graduate school at Oregon State with this interdisciplinary goal in mind. Her personal life goal is to gain experience in each sector so she can bring in a wide range of perspectives and ideas. “There are quite a few people working on tech policy right now, but very few people have the breadth of perspective on it from the low level to the high level,” she says.

If you are interested in hearing more about Anna’s life goals and the intersection of artificial intelligence, healthcare, and policy, join us live at 7 PM on Sunday, May 7th on https://kbvrfm.orangemedianetwork.com/, or after the show wherever you find your podcasts. 

Red, Red, (smoky) Wine

Did you know humans have the ability to “taste” through smelling? Well, we do, through a process called retronasal olfaction. This fancy-sounding term describes one of the ways we perceive flavor, and food scientists such as our guest this week – recent M.S. graduate and soon-to-be Ph.D. student Jenna Fryer – study how flavors, or tastes through smell, are perceived and what impact external factors have on them. Specifically, Fryer looks at the ways wildfires affect the flavor of wine, a particularly timely area of research given the recent wave of devastating wildfires in Oregon.

Fryer at OSU’s vineyard

Having always been interested in food science, Fryer examines the ways smoke penetrates wine grapes. She does this by studying how people taste the smoke and how they can best rid their mouths of the smokiness, because, spoiler, it has a pretty negative impact on the flavor. This research has pushed her to develop novel ways to explain and standardize certain flavors, such as ashiness and mixed berry, and to learn which compounds make the best palate cleansers. She will continue this research in her Ph.D., where she plans to figure out which compounds create that smoky flavor and how best to predict which wines will taste smoky in the future.

Through this work, Fryer has made some fascinating discoveries, such as how many people can actually detect the smoke flavor (because not everyone can), how best to create an ashy flavor (hint: it has to do with a restaurant in the UK and leeks), why red wine is more affected by smoke than white wine, and what the difference is between flavor and taste.

Fryer processing wine samples

Tune in live at 7pm on Sunday April 24th or listen to this episode anywhere you get your podcasts to learn about Fryer’s research! 

And, if you are interested in being a part of a future wine study (and who wouldn’t want to get paid to taste wine), click on this link to sign up! 

Nuclear: the history, present, and future of the solution to the energy crisis

In August of 2015, the Animas River in Colorado turned yellow almost overnight. Approximately three million gallons of toxic wastewater were released into the watershed following the breaching of a tailings dam at the Gold King Mine. The acidic drainage led to heavy metal contamination in the river reaching hundreds of times the safe limits allowed for domestic water, with devastating effects on aquatic life as well as the ecosystems and communities surrounding the Silverton and Durango area.

Our guest this week, Nuclear Science and Engineering PhD student Dusty Mangus, counts this close-to-home environmental disaster as a critical moment that inspired his pursuit of an education and career in engineering. “I became interested in the ways that engineering could be used to develop solutions to remediate such disasters,” he recalls.

Following his BS in Engineering from Fort Lewis College in Durango, Colorado, Dusty moved to the Pacific Northwest to pursue his PhD in Nuclear Engineering here at Oregon State, where he works with Dr. Samuel Briggs. His research focuses on applying engineering to one of the biggest problems of our age: energy – and more specifically, nuclear energy. Dusty’s primary focus is on liquid sodium as an alternative coolant for nuclear reactors, and on the longevity of the materials used to construct vessels for such reactors. But before we can get into what that means, we should define a few things: what is nuclear energy? Why is it a promising alternative to fossil fuels? And why does it have such an undeserved bad rap?

Going Nuclear

Nuclear energy comes from breaking apart the nuclei of atoms. The nucleus is the core of the atom and holds an enormous amount of energy. Splitting nuclei, a process called fission, can be used to generate electricity. Nuclear reactors are machines designed to control nuclear fission and use the heat generated by the reaction to power generators, which create electricity. Reactors typically use the element uranium as the fuel source, though other elements such as thorium could also be used. The heat created by fission warms the coolant surrounding the reaction, typically water, which then produces steam. The United States alone has nearly a hundred nuclear reactors, which produce around 20% of the nation’s electricity; the majority of the electricity produced in the US, however, still comes from fossil fuels. In some nations, such as France, this extremely potent energy source supplies the majority of electricity.

One of the benefits of nuclear energy is that, unlike fossil fuels, nuclear reactors do not produce carbon emissions that contribute to the accumulation of greenhouse gases in the atmosphere. In addition, unlike other alternative energy sources, nuclear plants can support the grid 24/7: extreme weather or lack of sunshine does not shut them down. They also have a smaller footprint than, say, wind farms.

However, despite its benefits and usefulness, nuclear energy has a bit of a sordid history, which has led to a persistent, albeit fading, negative reputation. While atomic radiation and nuclear fission were researched and developed starting in the late 1800s, many of the advancements in the technology were made between 1939 and 1945, when development was focused on the atomic bomb. First-generation nuclear reactors were developed in the 1950s and 60s, and several of these reactors ran for close to 50 years before being decommissioned. It was in 1986 that the infamous Chernobyl nuclear disaster occurred: a flawed reactor design led to a steam explosion and fires that released radioactive material into the environment, killing several workers in the days and weeks following the accident as a result of acute radiation exposure. The incident had a decades-long impact on the public perception of nuclear reactor safety, even though it drove significant improvements in reactor safety design.

Nuclear Reactor Safety

Despite the perception formed by the events of Chernobyl and other nuclear reactor meltdowns such as the 2011 disaster in Fukushima, Japan, nuclear energy is actually one of the safest energy sources available to mankind, according to a 2012 Forbes article that ranked energy sources by deaths per unit of energy produced. Perhaps unsurprisingly, coal tops the list, with a global average of 100,000 deaths per trillion kilowatt hours. Nuclear energy is at the bottom of the list with only about 0.1 deaths per trillion kilowatt hours, making it even safer by this metric than natural gas (4,000 deaths), hydro (1,400 deaths), and wind (150 deaths). Modern nuclear reactors are built with passive, redundant safety systems that help avoid the disasters of their predecessors.

Dusty’s research helps to address one of the issues surrounding nuclear reactor safety: the coolant. Typical reactors use water as a coolant: the water absorbs heat from the reaction and then turns to steam. Once water boils at 100 degrees Celsius, heat transfer becomes much less efficient. The workaround is to put the water under high pressure, which raises its boiling point, but this comes with an increased safety risk and a manufacturing challenge: water under high pressure requires large, thick metal vessels to contain it.

Sodium, best known for its role in the inorganic compound we call table salt, is actually a metal. In its liquid phase it looks much like mercury: a shiny, flowing metal. Liquid sodium can be used as a low-pressure, safer coolant that transfers heat efficiently and can keep a reactor core cool without requiring external power. Liquid sodium boils at around 900 degrees Celsius, whereas a nuclear reactor operates in the range of around 300-500 degrees Celsius – meaning that sodium-cooled reactors can operate with a much wider margin of safety at atmospheric pressure than reactors that use conventional water cooling systems.
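A quick back-of-the-envelope comparison using only the figures above (water’s boiling point taken at atmospheric pressure) shows why that margin matters:

```python
# Margin between reactor operating temperature and coolant boiling point
# at atmospheric pressure, using the numbers quoted above.
boiling_point_c = {"water": 100, "sodium": 900}

for coolant, bp in boiling_point_c.items():
    for operating_temp_c in (300, 500):
        margin = bp - operating_temp_c
        state = "safe margin" if margin > 0 else "boiled away"
        print(f"{coolant} at {operating_temp_c} C: {margin:+} C ({state})")
```

Water is hundreds of degrees past boiling at operating temperature unless it is pressurized; sodium still has hundreds of degrees to spare.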

Dusty’s research is helping to push the field of nuclear reactor efficiency and safety into the future. Nuclear energy promises a safer, greener solution to the energy crisis, providing a potent alternative to current fuel sources that generate greenhouse gas emissions. Utilized efficiently, nuclear energy could even power the sequestration of carbon dioxide from the atmosphere, leading to a cleaner, greener future.

Did we hook you on nuclear energy yet? Tune in to the show or catch the podcast to learn more about the history, present and future of this potent and promising energy source!  Be sure to listen live on Sunday January 30th at 7PM on 88.7FM or download the podcast if you missed it.

Happy New Year 2017!

Happy New Year from all of us at Inspiration Dissemination! It’s been a great year with fantastic guests on our program. We’ll be back on the air January 15th with Joe Donovan, who’s working on his MFA in Creative Writing! Stay tuned and stay inspired!

A word cloud of research descriptions from our guests in 2016

No show on February 21st

We will not have a show on February 21st, 2016 due to a broadcast of a Women’s Basketball game.

Tune in on February 28th, 2016 to hear a whale of a tale from Fisheries and Wildlife student Samara Haver.

No Show This Week

We will not have a show on January 17th, 2016 due to the broadcast of an OSU Women’s Basketball game. Tune in on January 24th as we interview History of Science Ph.D. student Edwin Wollert.