
Uranium Mining's Toxic Legacy: Why the U.S. Risks Repeating Mistakes

Radiation area from Horseshoe Mesa uranium mine tailings at Grand Canyon's South Rim. Al_HikesAZ / Flickr

By Stephanie Malin

Uranium—the raw material for nuclear power and nuclear weapons—is having a moment in the spotlight.

Companies such as Energy Fuels, Inc. have played well-publicized roles in lobbying the Trump administration to reduce federal protection for public lands with uranium deposits. The Defense Department's Nuclear Posture Review calls for new weapons production to expand the U.S. nuclear arsenal, which could spur new domestic uranium mining. And the Interior Department is advocating more domestic uranium production, along with other materials identified as "critical minerals."

What would expanded uranium mining in the U.S. mean at the local level? I have studied the legacies of past uranium mining and milling in Western states for over a decade. My book examines dilemmas faced by uranium communities caught between harmful legacies of previous mining booms and the potential promise of new economic development.

These people and places are invisible to most Americans, but they helped make the U.S. an economic and military superpower. In my view, we owe it to them to learn from past mistakes and make more informed and sustainable decisions about possibly renewing uranium production than our nation made in the past.


Mining Regulations Have Failed to Protect Public Health

Today most of the uranium that powers U.S. nuclear reactors is imported. But many communities still suffer the impacts of uranium mining and milling that occurred for decades to fuel the U.S.-Soviet nuclear arms race. These include environmental contamination, toxic spills, abandoned mines, under-addressed cancer and disease clusters, and illnesses that citizens link to uranium exposure despite federal denials.

As World War II phased into the Cold War, U.S. officials rapidly increased uranium production from the 1940s to the 1960s. Regulations were minimal to nonexistent and largely unenforced, even though the U.S. Public Health Service knew that exposure to uranium had caused potentially fatal health effects in Europe, and was monitoring uranium miners and millers for health problems.

Today the industry is subject to regulations that address worker health and safety, environmental protection, treatment of contaminated sites and other considerations. But these regulations lack uniformity, and enforcement responsibilities are spread across multiple agencies.

This creates significant regulatory gaps, which are worsened by a federalist approach to regulation. In the 1970s the newly created Nuclear Regulatory Commission initiated an Agreement States program, under which states take over regulating many aspects of uranium and nuclear production and waste storage. To qualify, state programs must be "adequate to protect public health and safety and compatible with the NRC's regulatory program."

The Orphan uranium mine on the South Rim of the Grand Canyon operated from 1956 to 1969 and is now a radioactive waste site. Alan Levine, CC BY

Today 37 states have joined this program and two more are applying. Many Agreement States struggle to enforce regulations because of underfunded budgets, lack of staff and anti-regulatory cultures. These problems can lead to piecemeal enforcement and reliance on corporate self-regulation.

For example, budget cuts in Colorado have forced the state to rely frequently on energy companies to monitor their own compliance with regulations. In Utah, the White Mesa Mill—our nation's only currently operating uranium mill—has a record of persistent problems related to permitting, water contamination and environmental health, as well as to tribal sacred lands and artifacts.

Neglected Nuclear Legacies

Uranium still affects the environment and human health in the West, but its impacts remain woefully under-addressed. Some of the poorest, most isolated and ethnically marginalized communities in the nation are bearing the brunt of these legacies.

There are approximately 4,000 abandoned uranium mines in Western states. At least 500 are located on land controlled by the Navajo Nation. Diné (Navajo) people have suffered some of the worst consequences of U.S. uranium production, including cancer clusters and water contamination.

A 2015 study found that about 85 percent of Diné homes are still contaminated with uranium, and that tribe members living near uranium mines have more uranium in their bones than 95 percent of the U.S. population. Unsurprisingly, President Donald Trump's decision to reduce the Bears Ears National Monument has reinvigorated discussion over ongoing impacts of uranium contamination across tribal and public land.

Despite legislation such as the Radiation Exposure Compensation Act of 1990, people who lived near uranium production or contamination sites often became forgotten casualties of the Cold War. For instance, Monticello, Utah, hosted a federally owned uranium mill from 1942 to 1960. Portions of the town were even built from tailings left over from uranium milling, which we now know were radioactive. This created two Superfund sites that were not fully remediated until the late 1990s.

Monticello residents have dealt with cancer clusters, increased rates of birth defects and other health abnormalities for decades. Although the community has sought federal recognition and compensation since 1993, its requests have been largely ignored.

Today tensions over water access and its use for uranium mining are creating conflict between regional tribes and corporate water users around the North Rim of the Grand Canyon. Native residents, such as the Havasupai, have had to defend their water rights and fear losing access to this vital resource.

Uranium Production Is a Boom-and-Bust Industry

Like any economic activity based on commodities, uranium production is volatile and unstable. The industry has a history of boom-bust cycles. Communities that depend on it can be whipsawed by rapid growth followed by destabilizing population losses.

The first U.S. uranium boom occurred during the early Cold War and ended in the 1960s, when oversupply triggered a bust. A second boom began later in that decade, after the federal government authorized private commercial investment in nuclear power. But the Three Mile Island (1979) and Chernobyl (1986) disasters ended this second boom.

Uranium prices soared once again from 2007 to 2010. But the 2011 tsunami and meltdown at Japan's Fukushima Dai-ichi nuclear plant sent prices plummeting once again as nations looked for alternatives to nuclear power.

U.S. uranium production, 1949-2011

Companies like Energy Fuels maintain—especially in public meetings with uranium communities—that new production will lead to sustained economic growth. This message is powerful: it builds support, sometimes in the very communities that have suffered most from past practices.

But I have interviewed Westerners who worry that as production methods become more technologically advanced and mechanized, energy companies may increasingly rely on bringing in out-of-town workers with technical and engineering degrees rather than hiring locals—as has happened in the coal industry. And the core tensions of boom-bust economic volatility and instability persist.

Uranium production advocates contend that new "environmentally friendly" mills and current federal regulations will adequately protect public health and the environment. Yet they offer little evidence to counter White Mesa Mill's poor record.

In my view, there is little evidence that new uranium production would be more reliably regulated or economically stable today than in the past. Instead, I expect that the industry will continue to privatize profits as the public absorbs and subsidizes its risks.

Reposted with permission from our media associate The Conversation.
