Report Shows U.S. Nuclear Regulators Still Ignoring Lessons of Fukushima Disaster
By Christopher Paine
Three years after Japan’s nuclear disaster, U.S. reactors remain vulnerable to runaway hydrogen production and leakage in a severe nuclear accident. They have little or no capacity to safely reduce or vent potentially explosive concentrations of this gas, or to capture its hazardous radioactive constituents before it explodes and contaminates the surrounding region, as occurred at Fukushima in March 2011.
That is the conclusion of a newly released NRDC report, "Preventing Hydrogen Explosions In Severe Nuclear Accidents: Unresolved Safety Issues Involving Hydrogen Generation And Mitigation."
The report musters a multitude of technical evidence showing that the U.S. Nuclear Regulatory Commission (NRC) underestimates the rate, extent and likely impacts of hydrogen production in severe loss-of-coolant accidents, and thus continues to ignore the lessons of Fukushima when it comes to ensuring “defense in depth” against the risks of a hydrogen explosion once a severe accident is in progress.
The report urges the NRC to require more frequent and authentic “leak-rate” tests of reactor containments, and to re-benchmark its computational capability for assessing hydrogen production in severe accidents with data obtained from realistic core damage experiments, as two essential predicates for setting new NRC requirements for U.S. nuclear power stations to minimize hydrogen explosion risk.
The aging fleet of U.S. reactors, which will increasingly operate beyond their initial 40-year license terms, is now facing severe pressures in competitive wholesale power markets, setting up difficult tradeoffs between low-carbon electricity supply, continued commercial viability, and the new investment required to sustain public safety. Many of the oldest nuclear units are General Electric Boiling Water Reactors (BWRs), with undersized Mark I and Mark II primary containments that the NRC has known for decades are especially vulnerable to hydrogen leaks under the elevated pressure conditions expected to occur in severe accidents.
Mark Leyse, the principal author of the report and a technical consultant to NRDC, is critical of the NRC’s apparent willingness to accede to recent licensee requests to further relax and defer requirements for periodic containment pressurization and leak rate testing: He notes that “American BWR Mark I and II containments in particular have performed poorly in leak rate tests, yet the NRC is planning to further extend the permitted intervals between these tests, casting a blind eye toward the hydrogen explosions that occurred in three units of this very design at Fukushima.”
As his report explains in detail, hydrogen is produced in severe loss-of-coolant nuclear accidents when the overheated zirconium alloy tubes that surround the uranium fuel pellets chemically react with steam and undergo rapid oxidation, releasing hydrogen. Above about 1,832°F (1,000°C) this reaction becomes “autocatalytic,” meaning it becomes self-sustaining by virtue of the heat produced by the chemical reaction alone, while the heat from radioactive decay that is responsible for initially heating up the zirconium fuel cladding continues to make a contribution that declines steadily with time from reactor shut-down. When an overheated core reaches this point, it is said to be in a “thermal runaway” condition, capable of producing thousands of kilograms of combustible hydrogen that can leak out and explode.
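For a back-of-envelope feel for those quantities, the zirconium-steam reaction (Zr + 2H2O → ZrO2 + 2H2) can be worked through with simple stoichiometry. The cladding inventory and oxidation fraction below are illustrative assumptions for the sketch, not figures from the NRDC report:

```python
# Back-of-envelope estimate of hydrogen released by the zirconium-steam
# reaction: Zr + 2 H2O -> ZrO2 + 2 H2.

M_ZR = 91.22    # molar mass of zirconium, g/mol
M_H2 = 2.016    # molar mass of hydrogen gas, g/mol

def hydrogen_from_zirconium(zr_kg, fraction_oxidized=1.0):
    """Kilograms of H2 released when a given mass of Zr cladding oxidizes."""
    mol_zr = zr_kg * 1000.0 * fraction_oxidized / M_ZR
    mol_h2 = 2.0 * mol_zr           # stoichiometry: 2 mol H2 per mol Zr
    return mol_h2 * M_H2 / 1000.0   # back to kilograms

# Assumed cladding inventory for a large BWR core (~60 metric tons of
# zirconium alloy); even 50% oxidation yields over a ton of hydrogen.
print(round(hydrogen_from_zirconium(60000, 0.5)))  # ~1326 kg of H2
```

Even under these rough assumptions, the result is consistent with the report's "thousands of kilograms" figure for more complete oxidation.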
Leyse’s investigation found that the NRC’s regulatory passivity is grounded in the computer models it relies on to set safety requirements. These models do not accurately predict the onset of rapid hydrogen production, or the rates of hydrogen production shown in severe fuel damage experiments conducted in the 1980s and 1990s. In short, the NRC seems to be operating with an inadequate technical understanding of the nuclear accident risk it is tasked by statute to minimize.
While most Pressurized Water Reactors (PWRs)—those with the large domed reinforced concrete and steel containments familiar to many Americans as the symbol of nuclear power—can withstand higher containment pressures than BWRs, and have larger volumes in which to disperse hydrogen leaks and thereby potentially avoid detonable concentrations, the report notes that most U.S. reactors “are not equipped to detect and control dangerous concentrations of hydrogen in all the places where it could migrate and explode in a nuclear power plant.” Nor, Leyse points out, has an analysis ever been done of the damage potential of flying objects generated by a hydrogen explosion inside a containment. Yet we know from the Fukushima Daiichi accident that debris propelled by hydrogen detonations caused extensive damage to backup emergency power supplies and hoses that were intended to inject seawater into overheated reactors. Some of the debris dispersed around the site by explosions was highly radioactive, exposing personnel to higher dose rates and setting back their efforts to control the accident.
As previously noted by nuclear safety expert David Lochbaum of the Union of Concerned Scientists, poorly mitigated hydrogen explosion risk presents a serious threat to the so-called “FLEX” strategy for severe accident response proposed by the nuclear industry’s lobbying arm, the Nuclear Energy Institute, after Fukushima, and adopted almost verbatim by the NRC. The FLEX response strategy is essentially an array of remotely stored portable equipment that is supposed to be moved into place by workers in the immediate aftermath of a greater-than-expected triggering event, such as an earthquake, tornado, or flood, which severely damages the backup safety systems of the plant or leads to a complete loss of electrical power, temporarily disabling these systems. Inadequate hydrogen control during a severe accident could render key elements of the FLEX strategy ineffective at the very moment they are most needed.
The report also explores the little-known fact that when confronted with the quantities of hydrogen produced in severe accidents, current token capabilities for hydrogen control are just as likely to trigger a hydrogen detonation as prevent one. For just this reason, NRDC has joined Riverkeeper in calling for the immediate removal of self-actuating Passive Autocatalytic Recombiners (PARs) from the Indian Point nuclear generating station, located 28 miles north of New York City.
However, knowing when to safely operate electrically powered versions of these devices, which can be turned on and off, requires knowing the concentration of hydrogen in their immediate vicinity. But in 2003, the report notes, the NRC took the odd step of reclassifying such monitors as “non-safety related equipment,” meaning the equipment no longer needed to have redundancy, seismic resistance, or an independent train of onsite standby power. Furthermore, NRDC’s investigation found that GE-BWR Mark I and Mark II designs operate with hydrogen monitors installed only in their nitrogen-filled primary containments, not in their reactor buildings. In the Fukushima Daiichi accident, hydrogen from three Mark I units leaked undetected into these buildings and exploded.
The inability of U.S. nuclear operators to monitor hydrogen concentrations in all plant areas where it could migrate during a severe accident is matched by another critical monitoring deficiency: Operators of PWRs lack a sufficient capability to monitor the onset and progression of the nuclear fuel degradation that leads to runaway hydrogen production in an accident. This deficient capability limits operator knowledge of when to transition from emergency operating procedures (EOPs)—intended to prevent fuel damage—to severe accident management guidelines (SAMGs)—intended to stabilize a damaged reactor core with auxiliary ad-hoc cooling measures while preventing significant off-site releases of radionuclide contamination.
Plant operators are supposed to implement SAMGs before the onset of the rapid zirconium-steam reaction, which leads to thermal runaway in the reactor core. Not knowing which regime one is operating in can have severe consequences. For example, PWR operators could end up re-flooding an overheated core simply because they do not know its actual condition. Unintentionally re-flooding an overheated core could generate hydrogen, at a rate as high as 5,000 grams per second, and the containment could be compromised if large quantities of that hydrogen were to detonate, as occurred at Fukushima.
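A rough, well-mixed calculation shows why a generation rate of 5,000 grams per second is so alarming. The building free volume and ambient conditions below are illustrative assumptions (and perfect mixing is itself optimistic; real accidents produce local pockets that become flammable even sooner):

```python
# How quickly could hydrogen generated at 5,000 g/s reach the ~4%
# lower flammability limit in air? Illustrative assumptions: a
# 20,000 m3 free volume, ambient temperature and pressure, and
# perfectly mixed gas.

R = 8.314          # ideal gas constant, J/(mol*K)
M_H2 = 2.016       # molar mass of hydrogen gas, g/mol

def seconds_to_flammable(volume_m3, rate_g_per_s, limit=0.04,
                         temp_k=300.0, pressure_pa=101325.0):
    """Time for well-mixed H2 to reach `limit` volume fraction."""
    h2_volume = limit * volume_m3                 # m3 of H2 required
    mol = pressure_pa * h2_volume / (R * temp_k)  # ideal gas law: n = PV/RT
    return mol * M_H2 / rate_g_per_s              # seconds at the given rate

print(round(seconds_to_flammable(20000, 5000)))  # ~13 seconds
```

Under these assumptions, a flammable mixture forms in a matter of seconds, which underlines why operators need to know the core's condition before re-flooding it.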
The report explains that in PWRs, so-called “core-exit thermocouples”—temperature measuring devices—are the primary equipment that would be used to detect inadequate core cooling and signal the point at which operators should transition from EOPs to SAMGs. However, experimental data demonstrate that core-exit temperature measurements are neither an accurate nor a timely indicator of maximum fuel-cladding temperatures in the core, and hence are an unreliable indicator of the likelihood of significant hydrogen production. In the most realistic severe accident experiment ever conducted—in which an actual reactor core was heated with radioactive decay heat before melting down—core-exit temperatures were measured at approximately 800 degrees when maximum in-core fuel-cladding temperatures exceeded 3300 degrees. Relying on core-exit thermocouple measurements for timely detection of inadequate core cooling or uncovering of the core is neither reliable nor safe.
In the face of the NRC’s inaction on this critical safety matter, the report presents the following six recommendations for actions to reduce the risk of hydrogen explosions in severe nuclear accidents:
The NRC should develop and experimentally validate computer safety models that can conservatively predict rates of hydrogen generation in severe accidents.
The NRC needs to acknowledge that its existing computer safety models under-predict the rates of hydrogen generation that occur in severe accidents. The NRC should conduct a series of experiments with multi-rod bundles of zirconium alloy fuel rod simulators and/or actual fuel rods as well as study the full set of existing experimental data. The NRC’s objective in this effort should be to develop models capable of predicting with greater accuracy the rates of hydrogen generation that occur in severe accidents.
The safety of existing hydrogen recombiners should be assessed, with the use of Passive Autocatalytic Recombiners (PARs) potentially discontinued until technical improvements are developed and certified.
Experimentation and research should be conducted in order to improve the performance of self-actuating PARs so that they will not malfunction and incur ignitions in the elevated hydrogen concentrations that occur in severe accidents. The NRC and European regulators should perform safety analyses to determine if existing PARs should be removed from plant containments—and, if so, whether they should be replaced with electrically powered thermal hydrogen recombiners that have their own independent train of emergency power. The latter course would require operators to have instrumentation capable of providing timely information on the local hydrogen concentrations throughout the containment, so they could deactivate the thermal recombiners when hydrogen concentrations reached the levels at which the recombiners malfunction and incur ignitions.
Existing oxygen and hydrogen monitoring instrumentation should be significantly improved.
In line with the conclusions of the NRC’s own Advisory Committee on Reactor Safeguards (ACRS), the NRC should reclassify oxygen and hydrogen monitors as safety-related equipment, which must undergo full qualification (including seismic qualification), must have redundancy, and must have its own independent train of emergency electrical power.
The current NRC requirement that hydrogen monitors be functional within 90 minutes of emergency cooling water injection into the reactor vessel is clearly inadequate for protecting public and plant worker safety. The NRC should require that, following the onset of an accident, hydrogen monitors be functional within a timeframe that enables immediate detection of quantities of hydrogen indicative of core damage and a potential threat to containment integrity.
As first urged by our colleagues at the Union of Concerned Scientists, the NRC should also require hydrogen monitoring instrumentation to be installed in:
1) BWR Mark I and Mark II secondary containments;
2) fuel-handling buildings of PWRs and BWR Mark IIIs; and
3) any plant structure where it would be possible for hydrogen to enter.
Current core diagnostic capabilities require upgrading to provide plant operators a better signal for when to transition from emergency operating procedures to severe accident management guidelines.
The NRC should require plants to use thermocouples placed at different elevations and radial positions throughout the reactor core to enable plant operators to accurately measure a wide range of temperatures inside the core under both typical and accident conditions. In the event of a severe accident, in-core thermocouples would provide plant operators with crucial information to help them track the progression of core damage and manage the accident, indicating, in particular, the correct time to transition from EOPs to implementing SAMGs.
The NRC should require all nuclear power plants to control the total quantity of hydrogen that could be generated in a severe accident.
The NRC should require all nuclear power plants to operate with combustible gas control systems that would effectively and safely control the total quantity of hydrogen that could potentially be generated in different severe accident scenarios, and to have strategies for venting gas from the inerted primary BWR Mark I and Mark II containments without causing significant radiological releases. These systems should also be capable of preventing local hydrogen concentrations in the containment from reaching levels that could support explosions powerful enough to breach the containment or damage other essential accident-mitigating features. Hydrogen explosions are not expected to occur inside the primary BWR Mark I and Mark II containments, which operate with inerted atmospheres, unless oxygen is somehow present.
The NRC should require licensees who operate nuclear power plants with hydrogen igniter systems to perform analyses demonstrating that these systems would effectively and safely mitigate hydrogen in different severe accident scenarios. Licensees unable to do so would be ordered to upgrade their systems to adequate levels of performance.
The NRC should require that data from leak rate tests be used to help predict the hydrogen leak rates of the primary containment of each BWR Mark I and Mark II licensed by the NRC in different severe accident scenarios.
The NRC should require that data from overall leak rate tests and local leak rate tests—already required by Appendix J to Part 50 for determining how much radiation would be released from the containment in a design basis accident—also be used to help predict hydrogen leak rates for a range of severe accident scenarios involving the primary containments of each GE-BWR Mark I and Mark II licensed by the NRC. If data from an individual leak rate test were to indicate that dangerous quantities of explosive hydrogen gas would leak from a primary containment in a severe accident, the plant owner should be required to repair the containment.
The rationale for this requirement is obvious: Hydrogen explosions, or hydrogen concentrations in the reactor building that pose a detonation risk, can severely inhibit emergency response actions essential to containing the accident. Worse, emergency response actions themselves, such as hooking up portable power equipment, could actually provide the spark for hydrogen explosions in critical areas of the plant.
The NRC should also end its practice of allowing repairs to be made immediately before leak rate tests are conducted to evaluate potential leakage paths, such as containment welds, valves, fittings, and other components that penetrate containment. This “repair before test” practice obviously defeats the nuclear safety objective of providing an accurate statistical sample of actual pre-existing containment leak rates.
Finally, the NRC should reconsider its plan to extend the intervals of overall and local leak rate tests to once every 15 years and 75 months, respectively. The NRC needs to conduct safety analyses that consider BWR Mark I and Mark II primary containments vulnerable to hydrogen leakage. It also seems probable that as old reactors are kept in service beyond their original licensed lifetimes, the intervals between leak rate tests should be shortened rather than extended.
On Thursday, April 22, the world will celebrate Earth Day, the largest non-religious holiday on the globe.
This Earth Day falls at a critical turning point. It is the second Earth Day since the start of the coronavirus pandemic and follows a year of devastating climate disasters, such as the wildfires that scorched California and the hurricanes that battered Central America. But the day's organizers still have hope, and they have chosen a theme to match.
"At the heart of Earth Day's 2021 theme, Restore Our Earth, is optimism, a critically needed sentiment in a world ravaged by both climate change and the pandemic," EarthDay.org president Kathleen Rogers told USA TODAY.
Last Earth Day marked the first time that the holiday was celebrated digitally to prevent the spread of COVID-19. This will largely be the case this year as well.
"Most of our Earth Day events will be virtual with the exception of individual and small group cleanups through our 'Great Global Cleanup' program," EarthDay.org's Olivia Altman told USA TODAY.
Tuesday, April 20: A Global Youth Summit begins at 2:30 p.m. ET featuring young climate activists like Greta Thunberg and Alexandria Villaseñor. This will be followed at 7 p.m. ET by "We Shall Breathe," a virtual summit organized by the Hip Hop Caucus to look at issues like the climate crisis, pollution and the pandemic through an environmental justice lens.
Wednesday, April 21: Beginning at 7 a.m. ET, Education International will lead the "Teach for the Planet: Global Education Summit." Talks will be offered in multiple languages and across multiple time zones to emphasize the importance of education in fighting the climate crisis.
Thursday, April 22: On the day itself, EarthDay.org will host its second-ever Earth Day Live digital event beginning at 12 p.m. ET. This event will feature discussions, performances and workshops focusing on the day's theme of restoring our Earth through natural solutions, technological innovations and new ideas.
"EARTHDAY.ORG looks forward to contributing to the success of this historic climate summit and making active progress to Restore Our Earth," Rogers said in a press release. "We must see every country rapidly raise their ambition across all climate issues — and that must include climate education which would lead to a green jobs-ready workforce, a green consumer movement, and an educated and civically engaged citizenry around the world."
EarthDay.org grew out of the first Earth Day in 1970, which drew 20 million U.S. residents to call for greater environmental protections. The movement has been credited with helping to establish the U.S. Environmental Protection Agency and to pass landmark environmental legislation like the Clean Air and Water Acts. It has since gone on to be a banner day for environmental action, such as the signing of the Paris agreement in 2016. More than one billion people in more than 192 countries celebrate Earth Day each year.
This legacy continues. The organization called the scheduling of Biden's summit a "clear acknowledgement of the power of Earth Day."
"This is a critical stepping stone for the U.S. to rejoin the world in combating the climate crisis. In concert with several planned parallel EARTHDAY.ORG events worldwide, Earth Day 2021 will accelerate global action on climate change," EarthDay.org wrote.
Super-emitters are individual sources such as leaking pipelines, landfills or dairy farms that produce a disproportionate amount of planet-warming emissions, especially methane and carbon dioxide. Carbon Mapper, the non-profit leading the effort, hopes to provide a more targeted guide to reducing emissions by launching special satellites that hunt for sources of climate pollution.
"What we've learned is that decision support systems that focus just at the level of nation states, or countries, are necessary but not sufficient. We really need to get down to the scale of individual facilities, and even individual pieces of equipment, if we're going to have an impact across civil society," Riley Duren, Carbon Mapper CEO and University of Arizona researcher, told BBC News. "Super-emitters are often intermittent but they are also disproportionately responsible for the total emissions. That suggests low-hanging fruit, because if you can identify and fix them you can get a big bang for your buck."
The new project, announced Thursday, is a partnership between multiple entities, including Carbon Mapper, the state of California, NASA's Jet Propulsion Laboratory (JPL) and Planet, a company that designs, builds and launches satellites, according to a press release. The project is being implemented in three stages.
The initial stage, which is already complete, involved the initial engineering development. NASA and Planet will work together in the second stage to build two satellites for a 2023 launch. The third phase will launch an entire constellation of satellites starting in 2025.
The satellites will include an imaging spectrometer built by NASA's JPL, NASA explained in a press release. This is a device that can break down visible light into hundreds of colors, providing a unique signature for chemicals such as methane and carbon dioxide. Most imaging spectrometers currently in orbit have larger pixel sizes, making it difficult to locate emission sources that are not always visible from the ground. However, Carbon Mapper spectrometers will have pixels around 98 feet (30 meters) across, facilitating more detailed pinpointing.
"This technology enables researchers to identify, study and quantify the strong gas emission sources," JPL Scientist Charles Miller said in the press release.
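The "unique signature" idea can be illustrated with a toy spectral matcher: each gas absorbs light in a characteristic pattern of wavelength bands, and an observed spectrum is compared against reference signatures. The band values below are invented for illustration and bear no relation to real methane or CO2 spectra:

```python
# Toy sketch of spectral fingerprinting: match an observed absorption
# spectrum against reference signatures by normalized correlation.
# All numbers here are fictional, for illustration only.

def best_match(observed, signatures):
    """Return the gas whose unit-normalized signature correlates best."""
    def norm(v):
        mag = sum(x * x for x in v) ** 0.5
        return [x / mag for x in v]
    obs = norm(observed)
    scores = {gas: sum(a * b for a, b in zip(obs, norm(sig)))
              for gas, sig in signatures.items()}
    return max(scores, key=scores.get)

signatures = {                      # fictional per-band absorption values
    "methane":        [0.1, 0.9, 0.2, 0.8, 0.1],
    "carbon dioxide": [0.7, 0.1, 0.8, 0.1, 0.3],
}
print(best_match([0.12, 0.85, 0.25, 0.75, 0.15], signatures))  # methane
```

A real imaging spectrometer does this per pixel across hundreds of bands, which is what lets it attribute a plume to a specific gas and a specific facility.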
Once the data is collected, Carbon Mapper will make it available to industry and government actors via an open data portal to help repair leaks.
"These home-grown satellites are a game-changer," California Governor Gavin Newsom said of the project. "They provide California with a powerful, state-of-the-art tool to help us slash emissions of the super-pollutant methane — within our own borders and around the world. That's exactly the kind of dynamic, forward-thinking solution we need now to address the existential crisis of climate change."
By Jenna McGuire
Commonly used herbicides across the U.S. contain highly toxic undisclosed "inert" ingredients that are lethal to bumblebees, according to a new study published Friday in the Journal of Applied Ecology.
The study reviewed several herbicide products and found that most contained glyphosate, an ingredient best recognized from Roundup products and the most widely used herbicide in the U.S. and worldwide.
While the devastating impacts of glyphosate on bee populations are more broadly recognized, the toxicity levels of inert ingredients are less understood because they are not subjected to the same mandatory testing by the U.S. Environmental Protection Agency (EPA).
"Pesticides are manufactured and sold as formulations that contain a mixture of compounds, including one or more active ingredients and, potentially, many inert ingredients," explained the Center for Food Safety in a statement. "The inert ingredients are added to pesticides to aid in mixing and to enhance the products' ability to stick to plant leaves, among other purposes."
The study found that these inert substances can be highly toxic and even block bees' breathing capacity, essentially causing them to drown. While researchers found that some of the combinations of inert ingredients had no negative impacts on the bees, one of the herbicide formulations killed 96% of the bees within 24 hours.
According to the abstract of the study:
Bees exhibited 94% mortality with Roundup® Ready‐To‐Use® and 30% mortality with Roundup® ProActive®, over 24 hr. Weedol® did not cause significant mortality, demonstrating that the active ingredient, glyphosate, is not the cause of the mortality. The 96% mortality caused by Roundup® No Glyphosate supports this conclusion.
"This important new study exposes a fatal flaw in how pesticide products are regulated here in the U.S.," said Jess Tyler, a staff scientist at the Center for Biological Diversity. "Now the question is, will the Biden administration fix this problem, or will it allow the EPA to continue its past practice of ignoring the real-world harms of pesticides?"
According to the Center for Food Safety, there are currently 1,102 registered formulations that contain the active ingredient glyphosate, each with a proprietary mixture of inert ingredients. In 2017, the group filed a legal petition calling for the EPA to force companies to provide safety data on pesticide formulations that include inert ingredients.
"The EPA must begin requiring tests of every pesticide formulation for bee toxicity, divulge the identity of 'secret' formulation additives so scientists can study them, and prohibit application of Roundup herbicides to flowering plants when bees might be present and killed," said Bill Freese, science director at the Center for Food Safety. "Our legal petition gave the EPA a blueprint for acting on this issue of whole formulations. Now they need to take that blueprint and turn it into action, before it's too late for pollinators."
Roundup — also linked to cancer in humans — was originally produced by agrochemical giant Monsanto, which was acquired by the German pharmaceutical and biotech company Bayer in 2018.
The merger of the two companies was condemned by environmentalists and food safety groups who warned it would cultivate the greatest purveyor of genetically modified seeds and toxic pesticides in the world.
Reposted with permission from Common Dreams.
By Ayesha Tandon
New research shows that lake "stratification periods" – a seasonal separation of water into layers – will last longer in a warmer climate.
These longer periods of stratification could have "far-reaching implications" for lake ecosystems, the paper says, and can drive toxic algal blooms, fish die-offs and increased methane emissions.
The study, published in Nature Communications, finds that the average seasonal lake stratification period in the northern hemisphere could last almost two weeks longer by the end of the century, even under a low emission scenario. It finds that stratification could last over a month longer if emissions are extremely high.
If stratification periods continue to lengthen, "we can expect catastrophic changes to some lake ecosystems, which may have irreversible impacts on ecological communities," the lead author of the study tells Carbon Brief.
The study also finds that larger lakes will see more notable changes. For example, the North American Great Lakes, which house "irreplaceable biodiversity" and represent some of the world's largest freshwater ecosystems, are already experiencing "rapid changes" in their stratification periods, according to the study.
As temperatures rise in the spring, many lakes begin the process of "stratification." Warm air heats the surface of the lake, heating the top layer of water, which separates out from the cooler layers of water beneath.
The stratified layers do not mix easily and the greater the temperature difference between the layers, the less mixing there is. Lakes generally stratify between spring and autumn, when hot weather maintains the temperature gradient between warm surface water and colder water deeper down.
Dr Richard Woolway from the European Space Agency is the lead author of the paper, which finds that climate change is driving stratification to begin earlier and end later. He tells Carbon Brief that the impacts of stratification are "widespread and extensive," and that longer periods of stratification could have "irreversible impacts" on ecosystems.
For example, Dr Dominic Vachon – a postdoctoral fellow from the Climate Impacts Research Centre at Umea University, who was not involved in the study – explains that stratification can create a "physical barrier" that makes it harder for dissolved gases and particles to move between the layers of water.
This can prevent the oxygen from the surface of the water from sinking deeper into the lake and can lead to "deoxygenation" in the depths of the water, where oxygen levels are lower and respiration becomes more difficult.
Oxygen depletion can have "fatal consequences for living organisms," according to Dr Bertram Boehrer, a researcher at the Helmholtz Centre for Environmental Research, who was not involved in the study.
Lead author Woolway tells Carbon Brief that the decrease in oxygen levels at deeper depths traps fish in the warmer surface waters:
"Fish often migrate to deeper waters during the summer to escape warmer conditions at the surface – for example during a lake heatwave. A decrease in oxygen at depth will mean that fish will have no thermal refuge, as they often can't survive when oxygen concentrations are too low."
This can be very harmful for lake life and can even increase "fish die-off events," the study notes.
However, the impacts of stratification are not limited to fish. The study notes that a shift to earlier stratification in spring can also encourage communities of phytoplankton – a type of algae – to grow sooner, and can put them out of sync with the species that rely on them for food. This is called a "trophic mismatch."
Prof Catherine O'Reilly, a professor of geography, geology and the environment at Illinois State University, who was not involved in the study, adds that longer stratified periods could also "increase the likelihood of harmful algae blooms."
The impact of climate change on lakes also extends beyond ecosystems. Low oxygen levels in lakes can enhance the production of methane, which is "produced in and emitted from lakes at globally significant rates," according to the study.
Woolway explains that higher levels of warming could therefore create a positive climate feedback in lakes, where rising temperatures mean larger planet-warming emissions:
"Low oxygen levels at depth also promotes methane production in lake sediments, which can then be released to the surface either via bubbles or by diffusion, resulting in a positive feedback to climate change."
Onset and Breakup
In the study, the authors determine historical changes in lake stratification periods using long-term observational data from some of the "best-monitored lakes in the world" and daily simulations from a collection of lake models.
They also run simulations of future changes in lake stratification period under three different emission scenarios, to determine how the process could change in the future. The study focuses on lakes in the northern hemisphere.
The figure below shows the average change in lake stratification days between 1900 and 2099, compared to the 1970-1999 average. The plot shows historical measurements (black), and the low emission RCP2.6 (blue), mid emissions RCP6.0 (yellow) and extremely high emissions RCP8.5 (red) scenarios.
Change in lake stratification duration compared to the 1970-1999 average, for historical measurements (black), the low emission RCP2.6 (blue), mid emissions RCP6.0 (yellow) and extremely high emissions RCP8.5 (red) scenarios. Credit: Woolway et al (2021).
The plot shows that the average lake stratification period has already lengthened. However, the study adds that some lakes are seeing more significant impacts than others.
For example, Blelham Tarn – the most well-monitored lake in the English Lake District – is now stratifying 24 days earlier and maintaining its stratification for an extra 18 days compared to its 1963-1972 averages, the study finds. Woolway tells Carbon Brief that as a result, the lake is already showing signs of oxygen depletion.
Climate change is increasing average stratification duration in lakes, the findings show, by moving the onset of stratification earlier and pushing the stratification "breakup" later. The table below shows projected changes in the onset, breakup and overall length of lake stratification under different emission scenarios, compared to a 1970-1999 baseline.
The table shows that even under the low emission scenario, the lake stratification period is expected to be 13 days longer by the end of the century. However, in the extremely high emissions scenario, it could be 33 days longer.
The table also shows that stratification onset has changed more significantly than stratification breakup. The reasons why are revealed by looking at the drivers of stratification more closely.
Warmer Weather and Weaker Winds
The timing of stratification onset and breakup in lakes is driven by two main factors – temperature and wind speed.
The impact of temperature on lake stratification is based on the fact that warm water is less dense than cool water, Woolway tells Carbon Brief:
"Warming of the water's surface by increasing air temperature causes the density of water to decrease and likewise results in distinct thermal layers within a lake to form – cooler, denser water settles to the bottom of the lake, while warmer, lighter water forms a layer on top."
This means that, as climate change causes temperatures to rise, lakes will begin to stratify earlier and remain stratified for longer. Lakes at higher latitudes are also likely to see greater changes in stratification, Woolway tells Carbon Brief, because "the prolonging of summer is very apparent in high latitude regions."
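The density-temperature relationship behind this layering can be sketched numerically. The short script below uses a standard empirical fit for fresh water (it is an illustration, not the model used in the study) to show that water is densest near 4C and becomes lighter as it warms, which is why a warm surface layer "floats" on the cooler water below:

```python
def water_density(temp_c: float) -> float:
    """Approximate density of fresh water (kg/m^3) at a given temperature (deg C),
    using a common empirical polynomial fit. An illustrative sketch only,
    not the lake model used in the study."""
    t = temp_c
    return 1000.0 * (
        1.0 - (t + 288.9414) / (508929.2 * (t + 68.12963)) * (t - 3.9863) ** 2
    )

# Density peaks near 4C and falls as the water warms,
# so warmer surface water sits on top of the cooler depths.
for t in (4, 10, 25):
    print(f"{t:>2} degC -> {water_density(t):8.2f} kg/m^3")
```

Even the small density differences this produces (a fraction of a percent between 10C and 25C) are enough to suppress vertical mixing once a lake has stratified.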
The figure below shows the expected increase in stratification duration from lakes in the northern hemisphere under the low (left), mid (center), and high (right) emission scenarios. Deeper colors indicate a larger increase in stratification period.
Expected increase in stratification duration in lakes in the northern hemisphere under the low (left), mid (center) and high (right) emissions scenarios. Credit: Woolway et al (2021).
The figure shows that the expected impact of climate change on stratification duration becomes more pronounced at higher northern latitudes.
The second factor is wind speed, Woolway explains:
"Wind speed also affects the timing of stratification onset and breakdown, with stronger winds acting to mix the water column, thus acting against the stratifying effect of increasing air temperature."
According to the study, wind speed is expected to decrease slightly as the planet warms. The authors note that the expected changes in near-surface wind speed are "relatively minor" compared to the likely temperature increase, but add that they may still cause "substantial" changes in stratification.
The study finds that air temperature is the most important factor behind when a lake will begin to stratify. However, when looking at stratification breakup, it finds that wind speed is a more important driver.
Meanwhile, Vachon says that wind speeds also have implications for methane emissions from lakes. He notes that stratification prevents the methane produced at the bottom of the lake from rising and that, when the stratification period ends, methane is allowed to rise to the surface. However, according to Vachon, the speed of stratification breakup will affect how much methane is released into the atmosphere:
"My work has suggested that the amount of accumulated methane in bottom waters that will be finally emitted is related to how quickly the stratification break-up occurs. For example, a slow and progressive stratification break-up will most likely allow water oxygenation and allow the bacteria to oxidise methane into carbon dioxide. However, a stratification break-up that occurs rapidly – for example after storm events with high wind speed – will allow the accumulated methane to be emitted to the atmosphere more efficiently."
Finally, the study finds that large lakes take longer to stratify in spring and typically remain stratified for longer in the autumn – due to their higher volume of water. For example, the authors highlight the North American Great Lakes, which house "irreplaceable biodiversity" and represent some of the world's largest freshwater ecosystems.
These lakes have been stratifying 3.5 days earlier every decade since 1980, the authors find, and their stratification onset can vary by up to 48 days between some extreme years.
O'Reilly tells Carbon Brief that "it's clear that these changes will be moving lakes into uncharted territory" and adds that the paper "provides a framework for thinking about how much lakes will change under future climate scenarios."
Reposted with permission from Carbon Brief.
By Robert Glennon
Interstate water disputes are as American as apple pie. States often think a neighboring state is using more than its fair share from a river, lake or aquifer that crosses borders.
Currently the U.S. Supreme Court has on its docket a case between Texas, New Mexico and Colorado and another one between Mississippi and Tennessee. The court has already ruled this term on cases pitting Texas against New Mexico and Florida against Georgia.
Climate stresses are raising the stakes. Rising temperatures require farmers to use more water to grow the same amount of crops. Prolonged and severe droughts decrease available supplies. Wildfires are burning hotter and lasting longer. Fires bake the soil, reducing forests' ability to hold water, increasing evaporation from barren land and compromising water supplies.
As a longtime observer of interstate water negotiations, I see a basic problem: In some cases, more water rights exist on paper than as wet water – even before factoring in shortages caused by climate change and other stresses. In my view, states should put at least as much effort into reducing water use as they do into litigation, because there are no guaranteed winners in water lawsuits.
Dry Times in the West
The situation is most urgent in California and the Southwest, which currently face "extreme or exceptional" drought conditions. California's reservoirs are half-empty at the end of the rainy season. The Sierra snowpack sits at 60% of normal. In March 2021, federal and state agencies that oversee California's Central Valley Project and State Water Project – regional water systems that each cover hundreds of miles – issued "remarkably bleak warnings" about cutbacks to farmers' water allocations.
The Colorado River Basin is mired in a drought that began in 2000. Experts disagree as to how long it could last. What's certain is that the "Law of the River" – the body of rules, regulations and laws governing the Colorado River – has allocated more water to the states than the river reliably provides.
The 1922 Colorado River Compact allocated 7.5 million acre-feet (one acre-foot is roughly 325,000 gallons) to California, Nevada and Arizona, and another 7.5 million acre-feet to Utah, Wyoming, Colorado and New Mexico. A treaty with Mexico secured that country 1.5 million acre-feet, for a total of 16.5 million acre-feet. However, estimates based on tree-ring analysis indicate that the river's actual average annual flow over the last 1,200 years is roughly 14.6 million acre-feet.
The inevitable train wreck has not yet happened, for two reasons. First, Lakes Mead and Powell – the two largest reservoirs on the Colorado – can hold a combined 56 million acre-feet, roughly four times the river's annual flow.
But diversions and increased evaporation due to drought are reducing water levels in the reservoirs. As of Dec. 16, 2020, both lakes were less than half full.
Second, the Upper Basin states – Utah, Wyoming, Colorado and New Mexico – have never used their full allotment. Now, however, they want to use more water. Wyoming has several new dams on the drawing board. So does Colorado, which is also planning a new diversion from the headwaters of the Colorado River to Denver and other cities on the Rocky Mountains' east slope.
Utah Stakes a Claim
The most controversial proposal comes from one of the nation's fastest-growing areas: St. George, Utah, home to approximately 90,000 residents and lots of golf courses. St. George has very high water consumption rates and very low water prices. The city is proposing to augment its water supply with a 140-mile pipeline from Lake Powell, which would carry 86,000 acre-feet per year.
Truth be told, that's not a lot of water, and it would not exceed Utah's unused allocation from the Colorado River. But the six other Colorado River Basin states have protested as though St. George were asking for their firstborn child.
In a joint letter dated Sept. 8, 2020, the other states implored the Interior Department to refrain from issuing a final environmental review of the pipeline until all seven states could "reach consensus regarding legal and operational concerns." The letter explicitly threatened a high "probability of multi-year litigation."
Utah blinked. Having earlier insisted on an expedited pipeline review, the state asked federal officials on Sept. 24, 2020 to delay a decision. But Utah has not given up: In March 2021, Gov. Spencer Cox signed a bill creating a Colorado River Authority of Utah, armed with a $9 million legal defense fund, to protect Utah's share of Colorado River water. One observer predicted "huge, huge litigation."
How huge could it be? In 1930, Arizona sued California in an epic battle that did not end until 2006. Arizona prevailed by finally securing a fixed allocation from the water apportioned to California, Nevada and Arizona.
Litigation or Conservation
Before Utah takes the precipitous step of appealing to the Supreme Court under the court's original jurisdiction over disputes between states, it might explore other solutions. Water conservation and reuse make obvious sense in St. George, where per-person water consumption is among the nation's highest.
St. George could emulate its neighbor, Las Vegas, which has paid residents up to $3 per square foot to rip out lawns and replace them with native desert landscaping. In April 2021 Las Vegas went further, asking the Nevada Legislature to outlaw ornamental grass.
The Southern Nevada Water Authority estimates that the Las Vegas metropolitan area has eight square miles of "nonfunctional turf" – grass that no one ever walks on except the person who cuts it. Removing it would reduce the region's water consumption by 15%.
Water rights litigation is fraught with uncertainty. Just ask Florida, which thought it had a strong case that Georgia's water diversions from the Apalachicola-Chattahoochee-Flint River Basin were harming its oyster fishery downstream.
That case extended over 20 years before the U.S. Supreme Court closed the final chapter in April 2021. The court applied a procedural rule that places the burden on plaintiffs to provide "clear and convincing evidence." Florida failed to convince the court and walked away with nothing.
Robert Glennon is a Regents Professor and Morris K. Udall Professor of Law & Public Policy, University of Arizona.
Disclosure statement: Robert Glennon received funding from the National Science Foundation in the 1990s and 2000s.
Reposted with permission from The Conversation.