Risks to living in California: A geospatial analysis

A wetland in California. Source: http://imagebank.biz/wallpapers/best-wallpaper-of-california-wallpaper-of-morning-early.html

Geological changes to the California landscape are making the state increasingly prone to climate-driven and natural disasters. These changes build up excessive fuel loads for California wildfires and foster weather patterns that spread those fires, weaken flood-control infrastructure such as levee systems, and reduce soil strength, triggering landslides.

For the past seven years, California has seen departures increase and arrivals decrease. Despite the rise in departures, California remains the number one state people migrate to, with approximately 480,204 arrivals in 2018 and 501,023 in 2019 (US Census Bureau). People migrate for many reasons; according to the US Census Bureau, the top four are a better home, establishing a household, family reasons, and a new job. On average, only 0.6% of census participants stated that their reasons for moving were related to climate or natural disasters. So why do extreme climate and natural disasters matter when so few people leave because of them? Because they carry the highest risks to safety, health, and property.

According to FEMA, California has experienced more natural disasters than any other state: 284 federally declared disasters since 1953, most of them fires, floods, and earthquakes. By comparison, Texas ranks second at 255 and Oklahoma third at 173. The California counties with the highest number of disasters lie along the western coast and in the south. Fires account for 73% of the total disasters; floods total nearly 12%, earthquakes and severe storms 4% each, and the remaining declarations (levee breaks, freezing, coastal storms, drought, fishing losses, and hurricanes) nearly 5% collectively. Although the western and southern counties record the most disasters, fires appear in counties across the state and have increased significantly since 1999. Figures 1a, 1b, 1c, and 1d show the counties that experienced each type of disaster. Every county has experienced a disaster since 1953.

Interactive map available on the FEMA website: https://www.fema.gov/data-visualization-summary-disaster-declarations-and-grants

The following annotated bibliography provides information on the relationship between migration-driven population growth and California's increasing number of climate and natural disasters. It also explains why extreme climate and natural disasters are worth tracking and how they can inform development models.


Donner, W., & Rodríguez, H. (2008). Population Composition, Migration and Inequality: The Influence of Demographic Changes on Disaster Risk and Vulnerability. Social Forces, 87(2), 1089–1114. https://doi.org/10.1353/sof.0.0141

Donner and Rodríguez take a comprehensive look at population growth, composition, and distribution in the context of disaster risk and vulnerability in the United States. They state that increasing population and coastal development have eliminated buffer zones along the coast, and that other demographic and environmental changes have systematically exposed greater numbers of people to natural hazards. Socioeconomic status shapes a person's ability to access disaster-resistant housing and/or afford insurance, and in many cases those at a disadvantage are unable to relocate once settled. Donner and Rodríguez stress that assessing vulnerability requires examining complex cooperative arrangements, which means incorporating human ecology into research methods.

Donner and Rodríguez mention a relationship between economic development and adverse geological effects, such as the loss of coastal buffer zones, but overall the paper focuses on how population growth increases disadvantaged communities' vulnerability because they migrate to disaster-prone places. Socioeconomic factors prevent these communities from building disaster-resistant infrastructure. I found this paper useful because it highlights how migration and population growth affect the landscape and disadvantaged communities. The paper does an excellent job of framing the effects of migration as a systemic problem of growth without the resources to accommodate it.

Goodchild, M. F. & Glennon J. A. (2010). Crowdsourcing geographic information for disaster response: a research frontier. International Journal of Digital Earth. 3:3, 231-241. DOI: 10.1080/17538941003759255

Goodchild and Glennon present findings on the usefulness of volunteered geographic information (VGI) for disaster response. They state that geospatial data and tools are not used to their full potential; in many cases, people living far from an impacted area are better informed through the media than those managing the relief effort. VGI provides a near real-time response to disasters, but it is met with some criticism because non-experts perform the GIS compilation, and sites such as OpenStreetMap, Flickr, and Wikimapia are not subject to quality control. The authors discuss several key issues: whether geospatial information is well suited to volunteers, what factors determine the quality of VGI, how researchers can synthesize the gathered data, and what the impacts on society are. The authors also explain why VGI works. Using Tobler's law as a basis, they conclude that many sources of information about the same event provide a check on one another: if the data are unlike what is known of the event area, the information is considered poor quality, and vice versa. The biggest errors people encounter when assessing their risk through VGI are false negatives and false positives. Despite this, case studies show that the speed of information from VGI enables a better emergency response.

The authors make an excellent point about emergency agencies' limited staff and capacity to acquire geographic information. VGI has filled a gap for many members of the public seeking information during a disaster. As the population grows, more and more crowdsourcing mechanisms will emerge to disseminate information, because California's administration cannot handle it alone.

Wu, J. (2014). Urban ecology and sustainability: The state-of-the-science and future directions. Landscape and Urban Planning, 125, 209-221.

Wu's paper states the importance of changing cities and sustainable urban ecology. The author traces how the definition of urban ecology has shifted over time: the field initially encompassed ecosystems and landscapes (circa 1920), then bio-ecologists studied ecology without "humans" (circa 1940), before returning to ecosystems and landscapes again. The definition has since expanded and diversified into sustainability and adaptive-management studies, and the term is now used by geographers, planners, and social scientists. Wu also highlights advances in urban ecology, including the study of how urbanization alters the composition and spatial arrangement of landscape elements. The discussion explores major advances such as changes in the spatial and temporal patterns of urbanization, the effects of urbanization on biodiversity and ecosystem processes, and the impacts of urbanization on ecosystem services and human well-being. Wu considers urban ecology an interdisciplinary and transdisciplinary science that integrates research and practice.

Wu provides a conceptual view of modern urban ecology. The paper not only references multiple urban ecology frameworks but also offers useful points of reference and recommendations for policy and decision makers. Like the previous entries, Wu's paper highlights how urbanization drives changes to the landscape. Most importantly, it charts a path forward to secure urban sustainability.

Millar, C.I. and N.L. Stephenson. (2015). Temperate forest health in an era of emerging megadisturbance. Science 349:823-826.

Millar and Stephenson address the effects of megadisturbances on forest health. Some factors that threaten sustainability are natural, such as temperature, native species, and pathogens; others are anthropogenic stressors such as atmospheric pollution and invasive species. The authors explain the new thresholds for forest health and megadisturbances; specifically, researchers are searching for the limits of forest tolerance. Anthropogenic climate change has increased the severity, frequency, and extent of disturbances, while human population growth has increased demands on forest ecosystem services. Millar and Stephenson developed a model called "Temperate forest response to 21st-century disturbances." Within this model, a forest will respond in one of four ways during regeneration: 1) it will sustain itself and return to its original state, 2) its tree species will change but ecosystem services will be maintained, 3) its tree species will change and ecosystem services will decline, or 4) it will convert to a non-forest type such as shrubland or grassland. Given the outcomes of the model, climate adaptation will require a more comprehensive strategy that includes proactive methods.

I find the four responses significant in this paper because they model recovery options that could help decision makers develop a climate adaptation strategy. As per my introduction, this paper highlights the importance of tracking climate and natural disasters to inform future development. This paper also touches on the influence of anthropogenic stressors on an ecosystem and strength of the landscape.


Map of 210 fire disasters by California county.
210 California fire disasters have occurred since 1953. Shaded areas are reported fires by county; darker areas have more occurrences. View source for interactive map.
SOURCE: FEMA https://www.fema.gov/data-visualization-summary-disaster-declarations-and-grants

California fires, 1989-2019
California fires that occurred from 1989-2019. View source for interactive map.
SOURCE: CAL FIRE, USDA Forest Service Region 5, USDI Bureau of Land Management & National Park Service https://databasin.org/datasets/bf8db57ee6e0420c8ecce3c6395aceeb

Rogan, J., & Franklin, J. (2002). Mapping Wildfire Burn Severity in Southern California Forests and Shrublands Using Enhanced Thematic Mapper Imagery. Geocarto International, 16. https://doi.org/10.1080/10106040108542218

Rogan and Franklin use Landsat ETM+ imagery (30 m spatial resolution) to determine burn severity across several vegetation types, including chaparral shrublands, shrub wetlands, oak woodlands, mixed riparian corridors, coastal sage scrub, and annual grasslands. At the time (2002), there were no published examples of fire severity mapping in southern California. The mapping effort also aids in gathering data on climate impacts over time. The higher spatial resolution increased the accuracy of burn area and perimeter delineation, and the authors were able to correct some topographic illumination effects that caused spectral confusion; for example, it is difficult to distinguish shaded unburned vegetation patches, shaded non-vegetated patches, and burned patches. The authors developed formulas for image processing and spectral mixture analysis. In conclusion, they were able to produce maps with low error and rank burn severity by class.
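Rogan and Franklin's actual method is spectral mixture analysis; as a simpler illustration of how Landsat bands can be turned into a per-pixel severity ranking (not the authors' method), the widely used differenced Normalized Burn Ratio (dNBR) contrasts near-infrared and shortwave-infrared reflectance before and after a fire. The reflectance values below are made-up examples:

```python
# Illustrative sketch: the differenced Normalized Burn Ratio (dNBR), a
# common Landsat-based burn-severity index. Rogan and Franklin used
# spectral mixture analysis; dNBR is shown here only as a simpler,
# widely used alternative for per-pixel severity ranking.

def nbr(nir, swir):
    """Normalized Burn Ratio from near-infrared and shortwave-infrared reflectance."""
    return (nir - swir) / (nir + swir)

def dnbr(pre_nir, pre_swir, post_nir, post_swir):
    """Pre-fire NBR minus post-fire NBR; larger values indicate higher severity."""
    return nbr(pre_nir, pre_swir) - nbr(post_nir, post_swir)

# Hypothetical pixel: healthy vegetation before the fire, charred surface after.
severity = dnbr(pre_nir=0.45, pre_swir=0.15, post_nir=0.20, post_swir=0.30)
print(round(severity, 3))  # → 0.7
```

Healthy vegetation reflects strongly in the near-infrared and weakly in the shortwave-infrared; burning reverses that, so the pre/post difference grows with severity.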

I believe this effort provides a great foundation for advancing geospatial data. The lack of data prior to this study is disturbing considering southern California is prone to disasters such as fire. Determining burn severity plays a key role in land use planning and public knowledge of livable areas.

Larkin, N. K., O’Neill, S. M., Solomon, R., Raffuse, S., Strand, T., Sullivan, D. C., Krull, C., Rorig, M., Peterson, J., Ferguson, S. A. (2009) The BlueSky smoke modeling framework. International Journal of Wildland Fire 18, 906-920. https://doi.org/10.1071/WF07086

Larkin et al. discuss the importance of the BlueSky smoke modeling framework. Smoke lifted into the airshed from fires causes air quality impacts on continental scales, and the public is developing an increased intolerance for smoke. Land-management planners express an urgent need for decision-support tools that incorporate the latest science to help managers understand the impact of fires on air quality. BlueSky is a modular framework that integrates existing datasets and models into a unified structure; it works in real time, and a web-based interactive tool is in development. BlueSky functions as an interface between the model types. Some fire inputs connected to BlueSky are SMARTFIRE, ClearSky, RAZU, FASTRACS, and SMOKEM2. Two case studies were conducted in Washington and Idaho to determine the impacts of the Rex Creek and Frank Church fires. In conclusion, the framework helps organize knowledge of fuels, fire behavior, consumption, emissions, plume rise, dispersion, and smoke.

Impacts on air quality have surfaced as recently as this year due to the California wildfires; smoke forced much of the state to remain indoors during peak days. Creating a unified system that is available to the public is critical for information dissemination. Also, as the authors state, this information helps land managers prioritize vegetation and fuel treatments for prescribed burns.

Keeley, J.E., Safford, H., Fotheringham, C.J., Franklin, J., Moritz, M. (September 2009). The 2007 Southern California Wildfires: Lessons in Complexity, Journal of Forestry, Volume 107. Issue 6. Pages 287–296, https://doi.org/10.1093/jof/107.6.287

Keeley et al. present a case study of the 2007 southern California wildfires. These 28 fires collectively burned 100,000 acres of land and included several megafires. The authors discuss the varying causes, severity, and spread of each fire. Among the findings: prior-year drought conditions contributed to low fuel moisture and fire severity, and there is a lack of understanding of where fuel treatment is effective. The autumn months carry the highest fire risk, and historically fires have appeared most often during this period. Without sufficient data it is difficult to bring stakeholders to a common understanding. The Santa Ana winds did not play a significant role in the severity or spread of these fires, but terrain and lack of access to the fires did. Southern California is at high risk of repeated fires, which compromises the ecosystem's ability to recover: woody ecosystems experienced short fire cycles (5-10 years), not enough time for new growth to build carbohydrate reserves or soil seed banks. Urban fuels, such as homes built with highly flammable materials, contributed to the severity and spread of fires; severe fire can occur even where the canopy is relatively thin because urban structures provide the fuel. Lastly, extensive livestock grazing changed the landscape, providing viable fuel for some of the fires. In conclusion, terrain, drought, fuels (natural and urban), and accessibility all contribute to the southern California fire problem.

What I found interesting about this paper is the recurring theme of fuel. In some cases the fuel was natural; in most cases involving large fires, the fuel was supplied by humans directly (e.g., urban structures) or indirectly (e.g., significant livestock grazing). According to the authors, this means almost all southern California fires are avoidable. This case study supports the need for action plans to address California's fire problem.

Nauslar, N.J., Abatzoglou, J.T., Marsh, P.T. (2018). The 2017 North Bay and Southern California Fires: A Case Study. Fire 2018, 1, 18.

Nauslar et al. provide a technical analysis of the 2017 North Bay and southern California fires. They note that many Californians reside in a wildland-urban interface (WUI) and that WUI populations are expected to increase. Residing in a WUI puts residents at risk of wildfires, and Californians' proximity to fire-prone areas gives fires more favorable conditions to start and expand, particularly in California's Mediterranean environments, which are highly populated and often converted for land use. The case study does not focus on anthropogenic causes but does state that most of the fires studied were human induced. The causes and extent of the fires are analyzed in terms of ground conditions and meteorological conditions, and fires in the North Bay had slightly different causes and extents than fires in southern California. In the north, the fires had a large fuel base due to the Mediterranean climate, extreme drought in previous years, and a lack of precipitation from late rains in the fall of 2017; they spread due to available fuel from shrubs, annual grasses, and urban structures, and climate conditions such as the Diablo and other Foehn winds. The terrain made access difficult. In the south, fires were fueled by the Transverse Ranges and spread for similar reasons; they were also subject to southern climate conditions such as the Santa Ana and offshore winds. Fire damage was extensive and hit record highs, and the full extent of the damage is still under investigation. So far, suppression costs have exceeded $400 million, insurance claims $10 billion, and the overall economic impact, including displacement and evacuation of residents, $85 billion. Some data and relationships were inconclusive: the authors could not determine the effects of climate change on the meteorological conditions that drive the National Weather Service's "Red Flag Warning." Furthermore, the Red Flag Warning system requires modification so that residents can understand the warning program. The authors conclude that better reactionary measures could have reduced some of the fires' impacts, and that extensive work must be done to co-exist with fire and mitigate its effects.

Although the authors highlight anthropogenic influence on the fires, they continue to stress that a better warning system is necessary to mitigate the effects of fires. The data used in this case study are extremely useful, but the opinion the authors offer does not address risks tied to land-use development. In addition to supporting mitigation measures, I believe the data in this paper can inform decision makers about the risks of fires' access to fuel, so that critical changes can be made to where development is planned.


Map of 37 flood disasters by California county.
37 California flood disasters have occurred since 1953. Shaded areas are reported floods by county; darker areas have more occurrences. View source for interactive map.
SOURCE: FEMA https://www.fema.gov/data-visualization-summary-disaster-declarations-and-grants

Florsheim, J. L., and Dettinger, M. D. (2007). Climate and floods still govern California levee breaks, Geophys. Res. Lett., 34, L22403, doi:10.1029/2007GL031702.

Florsheim and Dettinger assembled a 155-year record of levee breaks in California and found that breaks occurred in 25% of the years of the 20th century. They address the relationship between levee breaks, climate phenomena, and geomorphic change due to anthropogenic alterations of major river systems, with the Sacramento-San Joaquin River system as the study area. Levee breaks occurred 12% of the time during climate phenomena such as El Niño and La Niña, and breaks within the Sacramento-San Joaquin River system occurred in 25% of years and clustered during high-flow seasons.

The authors provide an excellent look at a trend that is often overlooked. Resilience to climate change is an emerging factor that needs more attention, and this paper provides further evidence that anthropogenic changes will continue to destabilize the resilience of the California landscape.

Ludy, J., Kondolf, G.M. (2012). Flood risk perception in lands “protected” by 100-year levees. Nat Hazards 61, 829–842. https://doi.org/10.1007/s11069-011-0072-6

Ludy and Kondolf present conclusions from a survey measuring a sample of Sacramento-San Joaquin Delta residents' knowledge of their flood risk behind a 100-year levee. They purposely sampled residents with higher education (advanced degrees) and at least median income ($80 thousand). The survey covered approximately 400 residents of Spanos Park West and measured whether residents believed they were at risk, their level of concern and confidence, perceived water depth and damage, how they ranked flood risk against other disasters, prior exposure to flooding, whether they understood the term "100-year flood," their preparation, and their knowledge of the area. In every instance, residents severely underestimated their flood risk, and some had not purchased flood insurance. Ludy and Kondolf concluded that education level and income had no relationship with perceived risk; rather, residents were unaware because of the flood-zone standards set by the US National Flood Insurance Program (NFIP). NFIP told residents they were not in a flood zone, which is technically true, but residents did not realize they were still at risk of flooding should their levee break. The authors highlight how zone classifications can skew the public's perception of risk, and argue that public education is necessary for people who dwell behind levees.
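The "100-year flood" misunderstanding the survey documents has a simple quantitative core: the label means a 1% chance of exceedance in any given year, and that small annual chance compounds over long occupancy. A short sketch, assuming independent years:

```python
# Probability of experiencing at least one "100-year" (1%-annual-chance)
# flood over a span of years, assuming each year is independent.

def chance_of_flood(annual_probability, years):
    """P(at least one exceedance in n years) = 1 - (1 - p)^n."""
    return 1 - (1 - annual_probability) ** years

# Over a typical 30-year mortgage behind a "100-year" levee:
print(round(chance_of_flood(0.01, 30), 2))  # → 0.26
```

A roughly one-in-four chance over a mortgage term is a very different message than "not in a flood zone," which helps explain why residents underestimated their risk.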

NFIP standards pose a greater problem for current development. As the authors state, misinformation is shared with potential residents of the area, skewing their perception of risk. Allowing further development in these zones is also unsustainable because the homes do not have to adhere to the "100-year flood" levee height standards and are often built below sea level. The survey's conclusions are useful for concerned policy makers and for citizens who live, or are considering living, near a levee.

Rittelmeyer, P. (2020). Socio-cultural perceptions of flood risk and management of a levee system: Applying the Q methodology in the California Delta. Geoforum, 111, 11-23. ISSN 0016-7185. https://doi.org/10.1016/j.geoforum.2020.02.022

Rittelmeyer discusses conflicting interests over natural resources in the Sacramento-San Joaquin Delta. The author uses the Q methodology with a variety of Delta stakeholders to assess how various perceptions of flood risk align with acceptance of a suite of potential adaptation measures. The Q methodology also provides a shared learning process for stakeholders, which can help improve management outcomes; it is designed to identify distinct views, not to derive a single viewpoint from a population. The 33 participants were chosen because they had spoken in public, served on a committee, or acted as an organizational decision-maker on the issue of Delta flood management. The author identified stakeholder statements and compared them to create a basis for interpretation. Four factors were identified: there is no crisis, locals are resilient, nature is resilient, and human ingenuity will prevail. Rittelmeyer concluded that people who feel less control over a hazard tend to perceive its risk as higher than those who think they have control over it.

The Q methodology provides an excellent source of information and a framework for integrating social science into the natural sciences. Additional studies on risk perception are necessary for those living in and migrating to California. I think one of the most important points the author makes is the opportunity the Q methodology creates for a shared experience.

Jibson, R.W., Harp, E.L., Michael, J.A. (2000). A method for producing digital probabilistic seismic landslide hazard maps, Engineering Geology, Volume 58, Issues 3–4, 2000, Pages 271-289, ISSN 0013-7952, https://doi.org/10.1016/S0013-7952(00)00039-9

Jibson et al. combined datasets based on Newmark's permanent-deformation model, applied to the 1994 Northridge, California earthquake. The datasets were digitized and rasterized at 10 m grid spacing in the ARC/INFO GIS platform. Newmark's model treats a landslide as a rigid block that slides on an inclined plane. The datasets include a comprehensive inventory of triggered landslides, about 20 strong-motion records of the mainshock recorded throughout the region, 1:24,000-scale geologic mapping of the region, extensive data on the engineering properties of geologic units, and high-resolution digital elevation models of the topography. The authors used a flow chart of steps to produce a seismic landslide hazard map: compute the static factor of safety, compute the critical acceleration, estimate Newmark displacements, construct a curve estimating the probability of slope failure, and generate maps showing the probability of seismic slope failure. The study showed that most slopes fail within a narrow and relatively low range of displacements. The data are useful for predicting the spatial distribution of shallow landslides and rock and debris falls.
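The first two steps of that flow chart have a closed form in Newmark's sliding-block model; the later steps (the displacement regression and the probability curve) are fitted to the authors' Northridge data and are omitted here. The slope angle and factor of safety below are illustrative values, not taken from the paper:

```python
import math

# Sketch of the first steps of the Newmark sliding-block method:
# from a static factor of safety to the critical acceleration, the
# threshold ground acceleration at which the block begins to slide.
# (Jibson et al.'s later steps -- estimating Newmark displacement from
# strong-motion records and fitting a failure-probability curve -- are
# omitted in this sketch.)

def critical_acceleration(factor_of_safety, slope_deg):
    """Critical acceleration a_c = (FS - 1) * sin(alpha), in units of g.

    Shaking above a_c makes the block slide; a_c <= 0 means the slope
    is statically unstable even without an earthquake.
    """
    return (factor_of_safety - 1) * math.sin(math.radians(slope_deg))

# Hypothetical slope: FS = 1.5, inclined at 30 degrees.
print(round(critical_acceleration(1.5, 30), 3))  # → 0.25 (g)
```

Rasterizing this calculation over the 10 m grid, cell by cell, is what turns the block model into a map-scale hazard product.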

Focusing on the Northridge earthquake provides a unique look at seismic impacts. Land use planners can determine if an area is susceptible to landslides before development. They can also determine the severity of the potential slides. Further research is needed to determine all areas susceptible to earthquake landslides in California.

He, Y. and Beighley, R.E. (2008). GIS‐based regional landslide susceptibility mapping: a case study in southern California. Earth Surf. Process. Landforms, 33: 380-393. https://doi.org/10.1002/esp.1562

He and Beighley highlight the need for GIS-based landslide susceptibility models. Their objectives are to: generate dimensionless indexes that quantify landslide susceptibility within selected indicator categories; develop indexes that weight the influence of selected indicator parameters on landslide occurrence; present a GIS-based mapping approach for landslide susceptibility in southern California; and develop GIS data describing the extent and location of landslides in selected regions of southern California. Such an assessment can inform decision makers of potential risk and support early warning systems, mitigation plans, and land-use restrictions. Urban expansion and climate change both motivate the development of this model. The authors provide a brief history of geomorphological analysis, heuristic models, index or parameter methods, and deterministic or probabilistic models, and choose a GIS-based multivariate statistical approach for their susceptibility assessment. The study region covered three areas: the South Coast, Transverse Ranges, and Peninsular Ranges. This region (8% of California's land mass) offers great differences in elevation and slope and displays a large diversity of weather and climate. The methodology measures the probability that a region will be affected by landslides given a set of environmental conditions. The authors do a great job of providing their formulas for each factor category, their method for determining risk-factor relationships, their susceptibility analysis and validation, and their data sources. In conclusion, landslide-susceptible areas occupy 26% of the total study area. The authors highlight events that can destabilize soil, such as ecosystem damage from wildfires and floods. Some results were inconclusive because the Soil Survey Geographic and State Soil Geographic databases did not cover the entire study area.
I really appreciate the extent of the case study and the development of the methodology. GIS-based landslide models are absolutely necessary for development purposes, and determining the source of susceptibility is likewise necessary for development and land-use restrictions. The authors state that further data are needed to determine why (outside of the seven risk factors) soil strength is weaker in these areas.
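The weighted-index idea behind this kind of susceptibility mapping can be sketched per grid cell as a weighted sum of normalized indicators. The indicator names and weights below are hypothetical illustrations; He and Beighley derive their weights statistically from mapped landslides:

```python
# Hypothetical sketch of a weighted landslide-susceptibility index per
# grid cell. Indicator names and weights are illustrative only, not
# He and Beighley's statistically derived values.

WEIGHTS = {"slope": 0.4, "soil_weakness": 0.3, "precipitation": 0.2, "burned": 0.1}

def susceptibility(cell):
    """Weighted sum of indicator values, each pre-normalized to [0, 1]."""
    return sum(WEIGHTS[name] * cell[name] for name in WEIGHTS)

# Two hypothetical cells: a steep, recently burned slope vs. gentle intact terrain.
steep_burned = {"slope": 0.9, "soil_weakness": 0.8, "precipitation": 0.5, "burned": 1.0}
gentle_intact = {"slope": 0.2, "soil_weakness": 0.3, "precipitation": 0.5, "burned": 0.0}

print(susceptibility(steep_burned) > susceptibility(gentle_intact))  # → True
```

Applying such a function to every raster cell, then thresholding the result into classes, yields the susceptibility map; the hard scientific work is choosing and validating the weights, which is exactly what the paper's multivariate approach addresses.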


Map of 12 earthquake disasters by California county.
12 California earthquake disasters have occurred since 1953. Shaded areas are reported earthquakes by county; darker areas have more occurrences. View source for interactive map.
SOURCE: FEMA https://www.fema.gov/data-visualization-summary-disaster-declarations-and-grants

Petersen, M.D., Cao, T., Campbell, K.W., & Frankel, A.D. (2007). Time-independent and Time-dependent Seismic Hazard Assessment for the State of California: Uniform California Earthquake Rupture Forecast Model 1.0. Seismological Research Letters, 78(1), 99–109. https://doi.org/10.1785/gssrl.78.1.99

Petersen et al. discuss time-dependent and time-independent seismic hazards based on the U.S. Geological Survey and California Geological Survey (USGS-CGS) seismic hazard model. Time-independent models assume earthquake occurrence follows a Poisson process, so results are independent of the time since the last event; time-dependent models assume occurrence follows a renewal model. Assessment and comparison of these models provide a basis for the Regional Earthquake Likelihood Models (RELM) to be developed over the next few years. The time-dependent maps differ by about 10% to 15% from the time-independent maps near A-fault sources (Petersen et al., 2007).
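The time-independent (Poissonian) assumption has a simple closed form: the probability of at least one rupture in a window of t years depends only on the fault's mean recurrence interval T, never on how long ago the last rupture was, which is exactly the memorylessness the renewal models drop. A sketch with illustrative numbers:

```python
import math

# Time-independent (Poissonian) rupture probability: the chance of at
# least one event in the next t years depends only on the mean
# recurrence interval T, not on the time since the last rupture.
# Time-dependent renewal models relax exactly this assumption.

def poisson_rupture_probability(recurrence_interval_years, window_years):
    """P(at least one event in the window) = 1 - exp(-t / T)."""
    return 1 - math.exp(-window_years / recurrence_interval_years)

# A hypothetical fault rupturing every ~150 years on average, over a 30-year window:
print(round(poisson_rupture_probability(150, 30), 3))  # → 0.181
```

Under a renewal model the same fault's 30-year probability would instead rise or fall with elapsed time since the last event, which is why the two map sets diverge near the A-fault sources.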

The authors highlight the importance of using these maps for risk-mitigation decisions regarding building design, insurance rates, land use planning, and public policy, all of which are marked as important in my introduction. Although this paper does not address risk to life and livelihood, it provides an in-depth look at the importance of the best available science for seismic probability. The authors also provide excellent resources for others to repeat the assessment and contribute to further studies.

Wang, Z. (2011). Seismic Hazard Assessment: Issues and Alternatives. Pure Appl. Geophys. 168, 11–25.  https://doi.org/10.1007/s00024-010-0148-3

Wang provides a careful look at the difference between seismic hazard and seismic risk, terms commonly used in engineering design and policy. Wang explains that high seismic hazard does not necessarily mean high seismic risk: a seismic hazard may or may not be mitigable, but seismic risk can always be mitigated or reduced. Estimating hazard and risk also requires different approaches. Seismic hazards are determined by level of severity and by spatial and temporal measurement; seismic risk depends on physical interaction with the hazard, on vulnerability, and on how hazard and vulnerability interact in time and space. Wang warns that seismic risk is complicated and can be expressed in many different ways for different users; for example, seismic risk comparisons can lead engineers to design for higher seismic loads in the Bay Area or inform insurance needs. Wang compared deterministic seismic hazard analysis (DSHA) and probabilistic seismic hazard analysis (PSHA) as approaches to seismic risk, concluding that PSHA is less reliable than DSHA because DSHA accounts for temporal characteristics while PSHA lacks a physical or mathematical basis. Further work on the temporal characteristics of DSHA is needed to advance the practice.

Understanding the difference between seismic hazard and risk is important for engineering design in vulnerable areas such as California's fault zones. This paper provides further evidence that geographic information on seismic activity requires further study and coordination among researchers. While this work is being done, the public remains vulnerable to seismic activity.


Introduction References

Federal Emergency Management Agency (FEMA) https://www.fema.gov/data-visualization-summary-disaster-declarations-and-grants

United States Census Bureau https://www.census.gov/topics/population/migration.html

