
Lasers and lipids: in search of a mechanism for dysferlin

This week on Inspiration Dissemination, we are looking forward to chatting with Andrew Carpenter, a postdoctoral fellow working in the lab of Professor Joe Baio in the School of Chemical, Biological, and Environmental Engineering.

Andrew’s research seeks a better understanding of a protein called dysferlin, which plays a critical role in repairing muscle cells. Muscles undergo constant strain as they expand and contract, leading to tears in the sarcolemma, the thin membrane that surrounds each muscle fiber. Dysferlin is responsible for recruiting vesicles to the site of these tears so that a process called vesicle fusion can take place. Andrew likens this mechanism to using a denim patch to fix a hole in jeans, if the patch could become fully absorbed into the fabric the way vesicles eventually are into the sarcolemma. Dysferlin is clinically important because certain mutations in the gene encoding it cause dysferlinopathies, a group of muscular dystrophies. The symptoms of dysferlinopathy typically include muscle weakness and damage to the musculoskeletal system, especially in the limbs.

Andrew working in the lab

The general importance of dysferlin to cell repair is well-established, but the molecular details of its mechanism of action remain largely unknown. Andrew uses an advanced experimental method called sum-frequency spectroscopy to study the protein at high resolution. The technique points two lasers, one infrared and one visible green, at the sample of interest. Where the beams meet the sample, a third beam of light is generated at the surface, carrying information about the vibrations of the molecules there. Quantum mechanical calculations are then used to interpret the intensity of this light as a function of frequency. In Andrew’s research, a synthetic lipid monolayer serves as an in vitro model of the sarcolemma; he introduces the dysferlin protein either in its healthy form or with various mutations, then uses the spectroscopy data to infer changes in protein orientation and binding. In the future, he intends to correlate his experiments with data from live cells.

Andrew first discovered his fascination with laser instrumentation as an undergraduate at Linfield University. After that, he obtained a PhD in Chemistry at the University of Oregon, where he used small oil droplets called nano-emulsions to study the oil-water interface. His background in physical chemistry and expertise in the sum-frequency spectroscopy method have enabled him to readily adapt to studying biological lipid interfaces. His research, including a recent publication, is currently supported by the National Science Foundation.

To hear more about Andrew’s research journey and the differences and similarities in being a postdoc and a graduate student, tune in after the Super Bowl this Sunday, February 12th, at 7pm on 88.7 FM KBVR.


Krypton-ice: what the noble gases tell us about the ancient climate

Tree rings famously reflect the age of the tree, but they can also encode information about the environmental conditions throughout the organism’s life. A similar principle motivates the study of ice cores – traces of the ancient atmosphere are preserved in the massive ice caps covering Earth’s polar regions.

This Sunday’s guest is Olivia Williams, a graduate student here at Oregon State who is helping to uncover the wealth of climate information harbored by polar ice cores. Olivia is a member of the College of Earth, Ocean and Atmospheric Sciences (CEOAS), where she is advised by Christo Buizert. Their lab uses ice cores to study paleoclimatology and heads the Center for Oldest Ice Exploration (COLDEX), a multi-institution NSF collaboration.

Drilling an ice core in the Arctic or Antarctic is an expensive and labor-intensive process. As a result, once they have been studied by project leads, most American ice core samples are centrally managed by the National Ice Core Lab in Denver, CO and carefully allocated to labs throughout the country. Researchers analyze cross-sections of the larger ice core sample for many geochemical features, including dust records, stable isotopes, and evidence of volcanic eruptions. Determining the historical levels of carbon dioxide, methane, and other greenhouse gases is one application of ice core analysis that yields important insights into climate change.

Olivia’s project focuses on “melt layers”, which are formed by large-scale melting and refreezing events. The frequency and intensity of melt layers help characterize polar summer temperatures, and specifically the number of days above freezing. Typically, researchers use visual examination or optical instruments to locate layers of relatively smooth, bubble-free ice. However, such methods can fail deeper in ice cores, where clathrate ice formed under increased pressure excludes all bubbles. In response, the lab of Jeffrey Severinghaus at the Scripps Institution of Oceanography developed a complementary chemical method. This technique extracts noble gases from the core and compares the ratio of the heavier ones (xenon and krypton) to argon, the lightest of the three. Since the heavier noble gases are more water-soluble, spikes in the relative concentration of krypton and xenon suggest that a melting event occurred.
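For the programmatically inclined, the melt-detection logic can be pictured as flagging depths where the heavy-to-light noble gas ratio spikes above background. This is a hypothetical sketch with made-up numbers, not the lab’s actual analysis pipeline:

```python
# Hypothetical sketch: flagging possible melt layers as spikes in the
# krypton/argon ratio along an ice core. Values are illustrative only.

def flag_melt_layers(kr_ar_ratios, threshold=1.05):
    """Return indices where the Kr/Ar ratio exceeds the background
    mean by the given multiplicative threshold."""
    background = sum(kr_ar_ratios) / len(kr_ar_ratios)
    return [i for i, r in enumerate(kr_ar_ratios) if r > threshold * background]

# Depth-ordered Kr/Ar measurements (made-up numbers): a spike at index 3
ratios = [1.00, 1.01, 0.99, 1.30, 1.00, 1.02]
print(flag_melt_layers(ratios))  # [3]
```

In practice the background level and threshold would come from calibration against cores with known melt history; here both are invented for illustration.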

During a typical day in the lab, Williams takes samples from an ice core stored at -20 °C in a large walk-in freezer and handles them in chilled ethanol baths. She particularly focuses on ice cores from Greenland and on time periods such as the last interglacial period ~120 thousand years ago and the early Holocene ~12 thousand years ago. Since the OSU lab’s noble gas methodology is novel, Olivia’s work involves a lot of designing and troubleshooting of the extraction line that recovers the trapped gases. One time, she even had to commission a scientific glassblower for custom cold traps in the extraction line.

Williams’ interest in geology was impressed upon her at an early age, in part by the influence of her grandfather, a longtime science writer for the Seattle Times. Her grandfather’s love for the geology of the Pacific Northwest inspired her to follow in his footsteps as a science journalist. At Boston University, Olivia initially planned to major in communications, until she took a seminar on interdisciplinary science communication offered by the BU Antarctic Research Lab, together with education and earth sciences majors. This experience helped solidify her interest in geology, and she switched her major to earth sciences. Her senior research project related to nutrient cycling in salt marshes, but she knew that she eventually wanted to work in polar science and paleoclimatology. Besides her research at OSU, Olivia has stayed active in science communication, serving as the outreach chair for the CEOAS graduate student association and helping organize education tables at the Corvallis Farmers’ Market. In the future, Olivia hopes to pursue an academic career and continue research and teaching in the field she loves, but is open to the full range of earth science career paths.

For more on Olivia’s exciting research and to hear what it is like to drill ice from a lava formation, tune in this Sunday, January 22nd at 7PM on KBVR 88.7 FM or look out for the podcast upload on Spotify!

Lean, Mean, Bioinformatics Machine

Machines take me by surprise with great frequency. – Alan Turing

This week we welcome Nima Azbijari, a PhD student in the College of Engineering advised by Dr. Maude David in Microbiology, to discuss how he uses machine learning to better understand biology. Before we dig into the research, let’s dig into what exactly machine learning is, and how it differs from artificial intelligence (AI). Both learn patterns from the data they are fed, but AI is typically developed to be interacted with and to make decisions in real time. If you’ve ever lost a game of chess to a computer, that was AI playing against you. But don’t worry: even the world champion at an even more complex game, Go, was beaten by AI. AI utilizes machine learning, but not all machine learning is AI, kind of like how a square is a rectangle, but not all rectangles are squares. The goal of machine learning is to get better at a task as more data comes in.

So how exactly does a machine, one of the least biological things on this planet, help us understand biology? 

Ten years ago it was big news that a computer was able to recognize images of cats, but now photo recognition is quite common. Similarly, Nima uses machine learning with large sets of genomic (genes/DNA), proteomic (proteins), and even gut microbiomic data (symbiotic microbes in the digestive tract) to see whether the computer can predict varying patient outcomes. With computational power, larger data sets and the relationships between the varying kinds of data can be analyzed more quickly. This is great both for understanding the biological world in which we live and for the potential future of patient care.

How exactly do you teach an old machine a new trick?

First, it’s important to note that he’s using a machine, not magic, and it can be massively time consuming (even for a computer) to run any kind of analysis on every element of a massive data set: potentially millions of computations, or even more. So to isolate only the data that matters, Nima uses graph neural networks to extract the important pieces. Imagine you had a data set about your home, and you counted both the number of windows and the number of blinds and found that they were the same. You might conclude that you only need to count windows, and that counting blinds doesn’t tell you anything new. The same idea works for reducing data to only the components that add meaning.
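The windows-and-blinds intuition is just feature redundancy, which a quick correlation check makes visible. This toy sketch shows only the dimensionality-reduction idea; a real graph neural network learns such structure from data rather than from a hand-computed statistic:

```python
# Toy illustration of the windows-and-blinds idea: if two features are
# perfectly correlated, one of them adds no new information and can be dropped.

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

windows = [2, 4, 6, 8]   # windows counted in four rooms
blinds  = [2, 4, 6, 8]   # blinds counted in the same rooms

# A correlation of 1.0 means the second count is redundant
print(pearson(windows, blinds))  # 1.0
```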

The phrase ‘neural network’ can evoke imagery of a massive computer-brain made of wires, but what does this neural network look like, exactly? The 1999 movie The Matrix borrowed its name from a mathematical object that contains columns and rows of data, much like the iconic green columns of data from the movie posters. These matrices are useful for storing and computing data sets since they can be arranged much like an Excel sheet, with a column for each patient and a row for each type of recorded data. He (or the computer?) can then work with that matrix to develop the neural network graph. The neural network then determines which data is relevant and can also illustrate connections between the different pieces of data. It is much like how you might be connected to friends, coworkers, and family on a social network, except in this case each profile is a compound or molecule and the connections can be any kind of relationship, such as a common reaction between the pair. However, unlike a social network, no one cares how many degrees from Kevin Bacon they are. The goal here isn’t to connect one molecule to another but to identify unknown relationships. Perhaps that makes it more like 23andMe than Facebook.
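To make the matrix-to-network analogy concrete, here is a minimal sketch with invented molecule names, where a nonzero matrix entry records a relationship between a pair:

```python
# Minimal sketch (invented names): rows and columns index molecules, and a
# nonzero entry in the adjacency matrix records a relationship between the
# pair, such as a shared reaction.

molecules = ["glucose", "pyruvate", "lactate"]
adjacency = [
    [0, 1, 0],  # glucose relates to pyruvate
    [1, 0, 1],  # pyruvate relates to glucose and lactate
    [0, 1, 0],  # lactate relates to pyruvate
]

def neighbors(name):
    """Return the molecules connected to the named one."""
    i = molecules.index(name)
    return [molecules[j] for j, v in enumerate(adjacency[i]) if v]

print(neighbors("pyruvate"))  # ['glucose', 'lactate']
```

The same table-like structure scales up: with thousands of rows and columns, the interesting connections are the ones no one has noticed yet.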

TLDR

Nima is using machine learning to discover previously unknown relationships between various kinds of human biological data such as genes and the gut microbiome. Now, that’s a machine you don’t need to rage against.

Excited to learn more about machine learning?
Us too. Be sure to listen live on Sunday November 13th at 7PM on 88.7FM, or download the podcast if you missed it. And if you want to stay up to date on Nima’s research, you can follow them on Twitter.

Heat, Hatchlings, and Sea Turtle Survival


A team of researchers makes its way across the beach on this dark night, lighting their way only with starlight and moonlight. It’s high tide on this small island off the coast of Brazil, and the kind of night when green sea turtles love to come ashore to nest. The turtles fall into a trance-like state after wandering around for hours and finally building their nests, and this is when the team approaches. They take a skin sample, place a temperature logger to measure the nest temperature, and tag the turtle with a nail polish marking for future identification. One member of the team is Vic Quennessen (she/they), the subject of our next episode. Vic is a PhD student in the Department of Fisheries, Wildlife, and Conservation Sciences. Quennessen is a computational researcher on the project, but helping out on nights like these is part of the job. Vic’s team collaborates with Projeto TAMAR, a Brazilian nonprofit organization that has worked to conserve these endangered animals throughout Brazil since the 1980s.

Vic Quennessen releases their first hatchling!

Sea turtles have no sex chromosomes; their sex is instead determined by the environmental temperature during incubation. Eggs subjected to higher temperatures are more likely to produce female hatchlings. The sex ratio of a clutch is around 50/50 at roughly 29 degrees Celsius, but at just one degree higher, some clutches produce up to 90% female hatchlings. As temperatures rise due to climate change, the result has been a worrying oversupply of female hatchlings.
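A back-of-the-envelope way to picture temperature-dependent sex determination is a logistic curve drawn through the two facts above (50% female at about 29 °C, roughly 90% female one degree higher). This is an illustration only, not a published model:

```python
# Hedged illustration, not the published biology: a logistic curve through
# the two anchor points in the text (50% female at ~29 C, ~90% at 30 C).
import math

PIVOTAL_T = 29.0     # deg C, where the sex ratio is 50/50
K = math.log(9)      # steepness chosen so the curve hits 90% at 30 C

def fraction_female(temp_c):
    """Estimated fraction of female hatchlings at a given nest temperature."""
    return 1 / (1 + math.exp(-K * (temp_c - PIVOTAL_T)))

for t in (28.0, 29.0, 30.0):
    print(f"{t} C -> {fraction_female(t):.0%} female")
```

The curve’s steepness is the worrying part: a single degree of warming swings the ratio from balanced to heavily female.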

Sea turtles are difficult to study due to their long and mysterious life cycles. It is believed that they reach reproductive maturity after around twenty-five years, but only females are readily observed because they return to land to build their nests and lay eggs. In contrast, the males stay out at sea for their entire lives. This complicates any effort to ascertain the true population structure. Sea turtles also live a long time, so there is a lag between changes in the hatchling population and the overall population. Finally, hatchlings lack external reproductive organs or other visible sexual characteristics, so the sex ratios must be estimated using temperature as a surrogate.

Vic has always loved the ocean, and they came to OSU looking to help conserve resources that are threatened, such as fish stocks or sea turtles. While attending UMass Dartmouth for their undergraduate degree, they double majored in computational mathematics and marine biology. Initially these felt like separate interests, until a professor suggested that she apply to a NOAA workshop on marine resources and population dynamics. Here she learned that mathematical methods could be a part of rigorous modeling efforts in population biology. After a gap year dedicated to science education, Vic made her way to Oregon State for a Master’s in Fisheries Science. Her advisor, Prof. Will White, persuaded her to stay on for a PhD with an opportunity to study her beloved sea turtles.

Sea turtles visit the beaches of more than eighty countries, but Vic’s fieldwork focuses on a population that nests on a small Brazilian island.

Quennessen’s research seeks to predict how the green sea turtle population will be affected by this looming sex imbalance. Vic uses data collected from over 3,000 hatchlings per season, including nest temperature readings as well as counts of nesting females, hatchlings, and captured males. They build a mathematical model to explore possible scenarios for the “mating function”, the unknown relationship between the sex ratio and reproductive success. On the one hand, it is easy to imagine that such a mismatch could reduce the number of mating pairs and lead to a rapid population decline. On the other, it is not well understood how many breeding males are required to sustain the population, and adaptations in mating behavior could slow the decline long enough for the more optimistic climate mitigation scenarios to take effect. In any case, it will take a lot of international cooperation to conserve these ancient marine creatures. Vic’s hope is that a mathematical exploration of this question could help reveal the chances of survival for the green sea turtles and inform those conservation efforts.
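To give a flavor of what a “mating function” might look like, here is one common choice from the population-modeling literature, the harmonic-mean form, in which matings require both sexes to be available. This is an illustrative sketch, not Vic’s actual model:

```python
# Illustrative only: the harmonic-mean mating function, a standard choice in
# two-sex population models. Matings collapse as either sex becomes scarce,
# even if the total number of adults stays the same.

def harmonic_mean_matings(females, males):
    """Expected number of matings given counts of each sex."""
    if females + males == 0:
        return 0.0
    return 2 * females * males / (females + males)

# Same 100 adults, increasingly skewed sex ratio
print(harmonic_mean_matings(50, 50))  # 50.0
print(harmonic_mean_matings(90, 10))  # 18.0
```

Comparing model runs under different candidate mating functions against observed hatchling numbers is one way such an unknown relationship can be constrained.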

To learn more about Vic’s research and their other interests, including science education and working with CGE, the graduate student union at OSU, tune in Sunday, Nov 6th at 7pm PST on KBVR 88.7 FM or online!

Missed the show? Don’t worry, you can download this episode via your podcast player of choice here.

Spaghetti & Networks: Oodles of Nodes

Picture a bowl of spaghetti and meatballs. There are pristine noodles, drenched in rich tomato sauce, topped with savory meatballs. Now imagine you’re allowed to eat just one noodle and one meatball. You’re tasked with finding the very best, most interesting bite out of this bowl of spaghetti. It might sound absurd, but replace spaghetti with ‘edges’ and meatballs with ‘nodes’ and you’ve got a network.

An image of a network from Nolan’s recent publication. The lines are ‘edges’ and the dots are ‘nodes’.

Computational biologists like our guest this week use networks to uncover meaningful relationships, or the tastiest spaghetti noodle and meatball, between biological entities.
Joining us this week is Nolan Newman, a PhD candidate in the College of Pharmacy under PI Andriy Morgun. Nolan’s research lies at the intersection of math, statistics, computer science, and biology. He studies how networks, such as covariation networks, can be used to find relationships and correlations between genes, microbes, and other factors in massive datasets that compare thousands of biological entities or more. With datasets this large and complex, it can be difficult to pare down to just the important or interesting relationships, like trying to scoop a single bowl of spaghetti from a giant tray at a buffet, and then further narrowing it down to pick just one interesting noodle.

Nolan Newman, PhD candidate


Nolan is further interested in how changing statistical thresholds and variables affects how the networks ‘look’. If only noodles covered in sauce are considered ‘interesting’, then all of the sauce-less noodles are out of the running. But what if noodles only count as ‘sauce-covered’ when they are 95% or more covered? Could you be missing out on perfectly delicious, interesting noodles by applying this constraint?
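The sauce-coverage question maps directly onto edge thresholding. This toy example, with hypothetical correlation values, shows how raising the cutoff changes what the network ‘looks’ like:

```python
# Toy demonstration of thresholding a covariation network: an edge is kept
# only if the correlation between a pair of entities clears the cutoff.
# Correlation values below are hypothetical.

correlations = {
    ("geneA", "microbeX"): 0.97,
    ("geneA", "geneB"): 0.80,
    ("geneB", "microbeY"): 0.60,
}

def build_edges(corrs, cutoff):
    """Keep only the pairs whose correlation meets the cutoff."""
    return [pair for pair, r in corrs.items() if r >= cutoff]

print(len(build_edges(correlations, 0.95)))  # 1 edge survives a strict cutoff
print(len(build_edges(correlations, 0.50)))  # all 3 survive a lenient one
```

A stricter cutoff yields a cleaner but sparser network; the relationships pruned away may include the interesting ones, which is exactly the trade-off Nolan studies.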


If you’re left scratching your head and a little hungry, fear not. We’ll chat about all things computational biology, networks, making meaning out of chaos, and why hearing loss prompted Nolan to begin a career in science, all on this week’s episode of Inspiration Dissemination. Catch the episode live at 7 PM PST on 88.7 FM or at https://kbvrfm.orangemedianetwork.com/, or find the podcast afterward on any podcast platform.

AI that benefits humans and humanity

When you think about artificial intelligence or robots in the everyday household, your first thought might be that it sounds like science fiction – like something out of the 1999 cult classic film “Smart House”. But it’s likely you have some of this technology in your home already – if you own a Google Home, Amazon Alexa, Roomba, smart watch, or even just a smartphone, you’re already plugged into this network of AI in the home. The use of this technology can pose great benefits to its users, spanning from simply asking Google to set an alarm to wake you up the next day, to wearable smart devices that can collect health data such as heart rate. AI is also currently being used to improve assistive technology, or technology that is used to improve the lives of disabled or elderly individuals. However, the rapid explosion in development and popularity of this tech also brings risks to consumers: there isn’t great legislation yet about the privacy of, say, healthcare data collected by such devices. Further, as we discussed with another guest a few weeks ago, there is the issue of coding ethics into AI – how can we as humans program robots in such a way that they learn to operate in an ethical manner? Who defines what that is? And on the human side – how do we ensure that human users of such technology can actually trust them, especially if they will be used in a way that could benefit the user’s health and wellness?

Anna Nickelson, a fourth-year PhD student in Kagan Tumer’s lab in the Collaborative Robotics and Intelligent Systems (CoRIS) Institute in the Department of Mechanical, Industrial and Manufacturing Engineering, joins us this week to discuss her research, which touches on several of these aspects of using technology as part of healthcare. Also a former Brookings Institution intern, Anna incorporates not just the coding of robots but far-reaching policy and legislation goals into her work. Her research is driven by a high-level goal: how do we create AI that benefits humans and humanity?

Anna Nickelson, fourth year PhD student in the Collaborative Robotics and Intelligent Systems Institute.

AI for social good

When we think about how to create technology that is beneficial, Anna says that there are four major considerations in play. First is the creation of the technology itself – the hardware, the software; how technology is coded, how it’s built. The second is technologists and the technology industry – how do we think about and create technologies beyond the capitalist mindset of what will make the most money? Third is considering the general public’s role: what is the best way to educate people about things like privacy, the limitations and benefits of AI, and how to protect themselves from harm? Finally, she says we must also consider policy and legislation surrounding beneficial tech at all levels, from local ordinances to international guidelines. 

Anna’s current research with Dr. Tumer is funded by the NSF AI Institute for Collaborative Assistance and Responsive Interaction for Networked Groups (AI-CARING), an institute through the National Science Foundation that focuses on “personalized, longitudinal, collaborative AI, enabling the development of AI systems that learn personalized models of user behavior…and integrate that knowledge to support people and AIs working together”, as per their website. The institute is a collaboration between five universities, including Oregon State University and OHSU. What this looks like for Anna is lots of code writing and simulations studying how AI systems make trade-offs between different objectives. For this she looks at machine learning for decision making in robots, and how multiple robots or AIs can work together towards a specific task without necessarily having to communicate with each other directly. Each robot or AI may have different considerations that factor into how they accomplish their objective, so part of her goal is to develop a framework for the different individuals to make decisions as part of a group.

With an undergraduate degree in math, a background in project management in the tech industry, engineering and coding skills, and experience working with a think tank in DC on tech-related policy, Anna is uniquely situated to address the major questions about developing technology for social good in a way that mitigates risk. She came to graduate school at Oregon State with this interdisciplinary goal in mind. Her personal life goal is to get experience in each sector so she can bring in a wide range of perspectives and ideas. “There are quite a few people working on tech policy right now, but very few people have the breadth of perspective on it from the low level to the high level,” she says.

If you are interested in hearing more about Anna’s life goals and the intersection of artificial intelligence, healthcare, and policy, join us live at 7 PM on Sunday, May 7th on https://kbvrfm.orangemedianetwork.com/, or after the show wherever you find your podcasts. 

Red, Red, (smoky) Wine

Did you know humans have the ability to “taste” through smelling? Well, we do, through a process called retronasal olfaction. This fancy-sounding process is one of the ways that food scientists, such as our guest this week, recent M.S. graduate and soon-to-be Ph.D. student Jenna Fryer, study how flavors, or tastes through smell, are perceived and what impact external factors have on them. Specifically, Fryer looks at the ways fires affect the flavors of wine, a particularly timely area of research given the recent wave of devastating wildfires in Oregon.

Fryer at OSU’s vineyard

Having always been interested in food science, Fryer examines the ways smoke penetrates wine grapes. She does this by studying how people taste the smoke and how they can best rid their mouths of the smokiness, because, spoiler, it has a pretty negative impact on the flavor. This research has led her to develop novel ways to explain and standardize certain flavors, such as ashiness and mixed berry, and to learn which compounds make the best palate cleansers. She will continue this research in her Ph.D., where she plans to figure out which compounds create that smoky flavor, and how best to predict which wines will taste smoky in the future.

Through this work, Fryer has made some fascinating discoveries, such as how many people can actually detect the smoke flavor (because not everyone can), how best to create an ashy flavor (hint, it has to do with a restaurant in the UK and leeks), why red wine is more affected by smoke than white wines, and what the difference is between flavor and taste. 

Fryer processing wine samples

Tune in live at 7pm on Sunday April 24th or listen to this episode anywhere you get your podcasts to learn about Fryer’s research! 

And, if you are interested in being a part of a future wine study (and who wouldn’t want to get paid to taste wine), click on this link to sign up! 

Nuclear: the history, present, and future of the solution to the energy crisis

In August of 2015, the Animas River in Colorado turned yellow almost overnight. Approximately three million gallons of toxic wastewater were released into the watershed following the breaching of a tailings dam at the Gold King Mine. The acidic drainage led to heavy metal contamination in the river reaching hundreds of times the safe limits allowed for domestic water, with devastating effects on aquatic life as well as the ecosystems and communities surrounding the Silverton and Durango area.

Our guest this week, Nuclear Science and Engineering PhD student Dusty Mangus, counts this close-to-home environmental disaster as a critical moment that inspired his pursuit of an education and career in engineering. “I became interested in the ways that engineering could be used to develop solutions to remediate such disasters,” he recalls.

Following his BS in Engineering from Fort Lewis College in Durango, Colorado, Dusty moved to the Pacific Northwest to pursue a PhD in Nuclear Engineering here at Oregon State, where he works with Dr. Samuel Briggs. His research focuses on applying engineering to one of the biggest problems of our age: energy, and more specifically, nuclear energy. Dusty’s primary focus is on liquid sodium as an alternative coolant for nuclear reactors, and on the longevity of the materials used to construct vessels for such reactors. But before we get into what that means, we should define a few things: what is nuclear energy? Why is it a promising alternative to fossil fuels? And why does it have such an undeserved bad rap?

Going Nuclear

Nuclear energy comes from breaking apart the nuclei of atoms. The nucleus is the core of the atom and holds an enormous amount of energy. Splitting nuclei, a process called fission, can be used to generate electricity. Nuclear reactors are machines designed to control nuclear fission and use the heat it releases to drive generators, which create electricity. Reactors typically use the element uranium as the fuel source, though other elements such as thorium could also be used. The heat created by fission warms the coolant surrounding the reaction, typically water, which then produces steam. The United States alone has around 100 nuclear reactors, which produce about 20% of the nation’s electricity; however, the majority of the electricity produced in the US still comes from fossil fuels. This extremely potent energy source has at times nearly fully powered nations such as France and Lithuania.

One of the benefits of nuclear energy is that unlike fossil fuels, nuclear reactors do not produce carbon emissions that contribute to the accumulation of greenhouse gases in the atmosphere. In addition, unlike other alternative energy sources, nuclear plants can support the grid 24/7: extreme weather or lack of sunshine does not shut them down. They also take up less of a footprint than, say, wind farms.  

However, despite its benefits and usefulness, nuclear energy has a bit of a sordid history, which has led to a persistent, albeit fading, negative reputation. While atomic radiation and nuclear fission were researched and developed starting in the late 1800s, many of the advances in the technology were made between 1939 and 1945, when development was focused on the atomic bomb. First-generation nuclear reactors were developed in the 1950s and 60s, and several of these reactors ran for close to 50 years before decommissioning. In 1986, the infamous Chernobyl nuclear disaster occurred: a flawed reactor design led to a steam explosion and fires that released radioactive material into the environment, killing several workers in the days and weeks following the accident as a result of acute radiation exposure. The incident shaped public perception of reactor safety for decades, even though the lessons learned drove significant improvements in reactor safety design.

Nuclear Reactor Safety

Despite the perception formed by the events of Chernobyl and other nuclear reactor meltdowns such as the 2011 disaster in Fukushima, Japan, nuclear energy is actually one of the safest energy sources available, according to a 2012 Forbes article that ranked the mortality rate per kilowatt-hour of energy from different sources. Perhaps unsurprisingly, coal tops the list, with a global average of 100,000 deaths per trillion kilowatt-hours. Nuclear energy sits at the bottom of the list with only about 0.1 deaths per trillion kilowatt-hours, making it even safer by this metric than natural gas (4,000 deaths), hydro (1,400 deaths), and wind (150 deaths). Modern nuclear reactors are built with passive, redundant safety systems that help avoid the disasters of their predecessors.

Dusty’s research helps to address one of the issues surrounding nuclear reactor safety: the coolant material. Typical reactors use water as a coolant: water absorbs heat from the reaction and turns to steam. Once water boils at 100 degrees Celsius, heat transfer becomes much less efficient; the workaround is to put the water under high pressure, which raises the boiling point. However, this comes with an increased safety risk and a manufacturing challenge: water under high pressure requires large, thick metal vessels to contain it.

Sodium, famous for its role in the inorganic compound we know as table salt, is actually a metal. In its liquid phase it resembles mercury: a silvery, flowing metal. Liquid sodium can be used as a low-pressure, safer coolant that transfers heat efficiently and can keep a reactor core cool without requiring external power. Liquid sodium boils at around 900 degrees Celsius, whereas a nuclear reactor operates in the range of around 300-500 degrees Celsius, meaning sodium-cooled reactors can operate within a much safer range of temperatures at atmospheric pressure compared to reactors that use conventional pressurized-water cooling systems.

Dusty’s research is helping to push nuclear reactor efficiency and safety into the future. Nuclear energy promises a safer, greener solution to the energy crisis, providing a potent alternative to current fuel sources that generate greenhouse gas emissions. Used efficiently, nuclear energy could even power the sequestration of carbon dioxide from the atmosphere, leading to a cleaner, greener future.

Did we hook you on nuclear energy yet? Tune in to the show or catch the podcast to learn more about the history, present and future of this potent and promising energy source!  Be sure to listen live on Sunday January 30th at 7PM on 88.7FM or download the podcast if you missed it.

Happy New Year 2017!

Happy New Year from all of us at Inspiration Dissemination! It’s been a great year with fantastic guests on our program. We’ll be back on the air January 15th with Joe Donovan, who’s working on his MFA in Creative Writing! Stay tuned and stay inspired!


A word cloud of research descriptions from our guests in 2016