
How high’s the water, flood model? Five feet high and risin’

Climate change and its effects on communities and their infrastructure are notoriously difficult to model, yet the stakes are not difficult to grasp. Infrastructure is designed to last for a certain amount of time, called its design life. The design life of a bridge is about 50 years; a building can be designed for 70 years. For coastal communities whose infrastructure was designed to survive severe coastal flooding at the time of construction, what happens if sea level rises during that design life? The flooding it was built to withstand becomes more severe, and the bridge or building might fail.

Most designers and engineers don’t consider the effects of climate change in their designs because those effects are hard to model and carry a great deal of uncertainty.

Kai at Wolf Rock in Oregon.

In comes Kai Parker, a 5th year PhD student in the Coastal Engineering program. Kai is including climate change and a host of other factors in his flood models: waves, tides, storms, atmospheric forcing, streamflow, and many others. He specifically models estuaries (including Coos Bay and Tillamook Bay in Oregon and Grays Harbor in Washington), which extend inland and can have complex geometries. Not only is Kai working to incorporate those natural factors into his flood model, he has also worked with communities to build in their responses to coastal hazards and the factors that matter most to them.

Modeling climate change requires an immense amount of computing power. Kai uses supercomputers at the Texas Advanced Computing Center (TACC) to run a flood model and determine the fate of an estuary and its surroundings. But each run covers only one possible future climate, with one result (this is referred to as a deterministic model). Presenting these results can be misleading, especially if the uncertainty is not properly communicated.

Kai with his hydrodynamic model grid for Coos Bay, Oregon.

In an effort to model more responsibly, Kai has expanded into what is called probabilistic flood modeling, which produces a distribution of probabilities that an event of a given severity will occur. Instead of just one new climate, Kai would model 10,000 climates and determine how likely each outcome is. This technique is frequently used by earthquake engineers and is often carried out with Monte Carlo simulations. Unfortunately, each flood simulation takes time, and it takes more than supercomputing alone to make probabilistic flood modeling a reality.
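To make the idea concrete, here is a minimal Monte Carlo sketch in Python. The toy flood model, the input distributions, and the design threshold are all hypothetical stand-ins, not Kai’s actual hydrodynamic model or data; the point is only to show how sampling many possible climates turns a single deterministic answer into an exceedance probability.

```python
import numpy as np

rng = np.random.default_rng(42)

def flood_model(sea_level_rise, storm_surge, river_flow):
    """Toy stand-in for a full hydrodynamic run: returns a peak water
    level in meters. The real model would simulate the whole estuary."""
    return 1.0 + sea_level_rise + 0.8 * storm_surge + 0.002 * river_flow

n = 10_000  # number of sampled "climates", as mentioned in the text

# Hypothetical input distributions -- a real study would derive these
# from climate projections, tide gauges, and streamflow records.
sea_level_rise = rng.uniform(0.0, 1.5, n)      # meters
storm_surge    = rng.gumbel(0.5, 0.3, n)       # meters
river_flow     = rng.lognormal(5.0, 0.5, n)    # cubic meters per second

peak_levels = flood_model(sea_level_rise, storm_surge, river_flow)

# Probabilistic output: the chance that flooding exceeds a design
# threshold, rather than one deterministic answer.
threshold = 3.0  # meters, hypothetical design flood level
exceedance_prob = np.mean(peak_levels > threshold)
print(f"P(peak water level > {threshold} m) = {exceedance_prob:.3f}")
```

The catch is that each call to the real flood model takes hours, not microseconds, which is why 10,000 runs are out of reach without something faster.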

To increase efficiency, Kai has developed an “emulator”, which uses techniques similar to machine learning to “train” a faster flood model, making Monte Carlo simulation feasible. Kai uses the emulator to predict flood outcomes much like we use our brains to play catch: we are not solving the equations of physics, factoring in wind speed or air temperature, to calculate where the ball will land. Instead we draw on a bank of experience to predict where the ball will land, hopefully in our hands.
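The emulator idea can be sketched the same way. In the illustration below, a Gaussian process regressor from scikit-learn stands in for whatever learning technique the real emulator uses (that choice, like the toy model and its inputs, is an assumption for illustration): a modest number of expensive model runs serve as training data, and the trained emulator is then cheap enough to evaluate across a full Monte Carlo sample.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def expensive_flood_model(x):
    """Placeholder for a full hydrodynamic run (hours on a supercomputer).
    Inputs x = [sea_level_rise, storm_surge]; output = peak water level (m)."""
    return 1.0 + x[:, 0] + 0.8 * x[:, 1] + 0.3 * np.sin(3 * x[:, 1])

# Step 1: run the expensive model a modest number of times for training data.
X_train = rng.uniform([0.0, 0.0], [1.5, 2.0], size=(50, 2))
y_train = expensive_flood_model(X_train)

# Step 2: "train" the emulator on those runs.
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp.fit(X_train, y_train)

# Step 3: the emulator is cheap enough to evaluate for a full Monte Carlo sample.
X_mc = rng.uniform([0.0, 0.0], [1.5, 2.0], size=(10_000, 2))
pred_levels = gp.predict(X_mc)

threshold = 2.5  # meters, hypothetical
print(f"Emulated P(peak level > {threshold} m) = "
      f"{np.mean(pred_levels > threshold):.3f}")
```

The design trade-off is the same one the catch analogy makes: the emulator gives up exact physics in exchange for answers fast enough to repeat thousands of times.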

Kai doing field work at Bodega Bay in California.

Kai grew up in Gerlach, Nevada: population 206. He moved to San Luis Obispo to study civil engineering at Cal Poly SLO, and while studying he worked as an intern at the Bodega Bay Marine Lab; he has been working on the coast ever since. When Kai is not working on his research, he is brewing, climbing rocks, surfing waves, or cooking the meanest soup you’ve ever tasted. Next year, he will move to Chile on a Fulbright grant to apply his emulator techniques to a new hazard: tsunamis.

To hear more about Kai’s research, be sure to tune in to KBVR Corvallis 88.7 FM this Sunday, May 27, at 7 pm, stream the live interview at kbvr.com/listen, or find it in podcast form next week on Apple Podcasts.