Smoggy downtown Houston. Kyle Jones / Flickr

Most EPA Pollution Estimates Are Unreliable, So Why Is Everyone Still Using Them?

By Rachel Leven

Engineer Jim Southerland was hired by the U.S. Environmental Protection Agency (EPA) in 1971 to join the nascent war on air pollution. He came to relish the task, investigating orange clouds from an ammunition plant in Tennessee and taking air samples from strip mines in Wyoming. Among his proudest accomplishments: helping the agency develop a set of numbers called emission factors—values that enable regulators to estimate atmospheric discharges from power plants, oil refineries, chemical plants and other industrial operations.


By the time Southerland left the EPA in 1996, he was "frustrated and ticked off," he said, because the numbers he had helped develop were being misused. The original aim had been to paint a broad-brush picture of pollution. Instead, the numbers—meant to represent average emissions from industrial activities—were incorporated into permits stipulating how much pollution individual facilities could release. This happened despite EPA warnings that about half of these sites would discharge more than the models predicted. "These factors were not intended for permits," said Southerland, now retired and living in Cary, North Carolina.

Emission factors have proliferated since Southerland's time; the EPA now uses 22,693 of them. The agency itself admits most are unreliable: It rates about 62 percent as "below average" or "poor." Nearly 22 percent aren't rated at all. Only about 17 percent earned grades of "average" or better, and just one in six has ever been updated. Common problems abound, such as poor accounting for emissions from aging equipment.

The upshot: in some cases, major polluters are using flawed numbers to calculate emissions of substances such as benzene, a carcinogen, and methane, a powerful greenhouse gas. Regulators at times are flying blind. The factors color everything we know about air quality and many of the decisions the EPA and state environmental agencies make, from risk assessment to rulemaking.

In an email, an EPA spokeswoman told the Center for Public Integrity that the agency has been working on the problem for a decade. "EPA believes it is important to develop emissions factors that are of high quality and reliable," she wrote.

Some experts, however, say the agency hasn't done enough. The unreliability of the numbers has been flagged over a period of decades by the EPA's own internal watchdog and other government auditors. "This is what tells you what's being put in the air and what you're breathing," said Eric Schaeffer, former head of civil enforcement at the EPA and now executive director of the Environmental Integrity Project, an advocacy group. "You don't want those numbers to be wrong."

Accuracy Questions

Emission factors are based on company and EPA measurements as well as external studies. They are plugged into equations to estimate total emissions from industrial activities, such as the burning of coal in boilers.
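
The arithmetic itself is simple. The EPA's AP-42 compendium expresses the general form as E = A × EF × (1 − ER/100): total emissions equal an activity rate multiplied by the emission factor, discounted by the efficiency of any pollution controls. The sketch below illustrates that calculation; the function name and every number in it are hypothetical, chosen only for illustration.

```python
# A minimal sketch of how an emission factor is used, following the general
# AP-42 form E = A x EF x (1 - ER/100). All numeric values below are
# hypothetical, chosen only to illustrate the arithmetic.

def estimate_emissions(activity_rate, emission_factor, control_efficiency_pct=0.0):
    """Estimate total emissions from one activity.

    activity_rate          -- e.g., tons of coal burned per year
    emission_factor        -- pollutant released per unit of activity (e.g., lb/ton)
    control_efficiency_pct -- percent of emissions removed by control equipment
    """
    return activity_rate * emission_factor * (1 - control_efficiency_pct / 100)

# Hypothetical coal boiler: 10,000 tons burned in a year, a factor of
# 38 lb of sulfur dioxide per ton, and a scrubber rated at 90 percent removal.
annual_lb = estimate_emissions(10_000, 38, 90)
print(f"Estimated SO2 emissions: {annual_lb:,.0f} lb/year")  # -> 38,000 lb/year
```

The fragility is built in: the factor is an average, so any facility whose equipment leaks more than the average, or whose controls perform below their rated efficiency, will emit more than this calculation reports.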

As early as the 1950s, regulators in places like Los Angeles were using emission factors to try to pinpoint the origins of dangerous smog episodes. The numbers allowed them to avoid "time-consuming, expensive testing programs and extensive surveys of individual sources," according to a 1960 paper by the Los Angeles County Air Pollution Control District.

In 1965, the U.S. Public Health Service—which regulated air pollution at the time—released its first comprehensive list of factors, a document the agency would label "AP-42" in a 1968 update. The EPA, created two years later, kept revising the estimates as they became more widely used in emission inventories depicting pollution levels and sources around the country.

The EPA knew early on there were problems with the numbers. In 1989, for example, the Office of Technology Assessment—a now-defunct, nonpartisan science adviser to Congress—reported many U.S. metropolitan areas had not met their goals for controlling smog-forming ozone in part because of inaccurate emission inventories. In 1990 amendments to the Clean Air Act, Congress gave the agency six months to make sure all emissions contributing to ozone formation were assigned up-to-date, accurate factors, and directed the EPA to review the numbers every three years thereafter.

The EPA missed both deadlines. It has failed to do at least some of the three-year reviews. It claims to have created all the necessary ozone-related factors, but questions about their accuracy remain.

For decades, government watchdogs, including the EPA's Office of Inspector General, have pointed out deficiencies in the factors, which drive actions ranging from enforcement cases to the drafting of regulations. "We believe the status of emission factor development … is a significant weakness that impedes achievement of major air program goals," the IG wrote in a 1996 report. The EPA's dependence on industry studies because of funding constraints could result in factors that minimized pollution, it warned. The U.S. General Accounting Office—now the Government Accountability Office—reported in 2001 that polluters rely on the estimates even though "facilities' actual emissions can, and do, vary substantially from the published factors." The EPA's IG came back with a targeted reproach in 2014, questioning the validity of factors used to estimate methane emissions from some pipelines.

Still, there was little movement. Though emission factors are recognized as crucial tools for understanding air quality and underpinning inventories, they tend to be forgotten. "That foundation is buried to such an extent that it's not often appreciated," said David Mobley, who worked on emission factors in the 1990s. "The urgency is rarely there."

Vehicles travel along Highway 225 near Shell Oil Company's refinery and petrochemical facility in Deer Park, Texas. texasfreeway.com

Test Case in Houston

Accurate pollution data matters. Consider what happened in the ozone-plagued city of Houston, a hub of oil refining and chemical manufacturing.

The city had been using emission inventories to guide its ozone-control strategy. Air monitoring by researchers in 2000 found levels of volatile organic compounds, or VOCs—highly reactive ozone precursors such as benzene—were 10 to 100 times higher than previous estimates. The study—conducted by what was then the Texas Natural Resource Conservation Commission, the EPA and more than 40 other public, private and academic institutions—singled out as culprits VOCs such as ethylene, a flammable gas used mainly in the production of plastics.

Houston, it turned out, had focused on controlling the wrong emissions from the wrong sources to lower its ozone levels, said Daniel Cohan, an associate professor of environmental engineering at Rice University. The city changed course, expanding VOC monitoring and developing rules to reduce emissions. Ozone production rates dropped by up to 50 percent in six years, Cohan and his colleagues found in a follow-up study. The study showed that reliance on emission factors alone is a bad idea, Cohan said. "We need scientists to measure these pollutants in the air to find out how much is really being emitted," he said.

The underestimation problem surfaced at individual facilities as well, including Shell's 1,500-acre petrochemical complex in the Houston suburb of Deer Park. A study begun by the City of Houston and the EPA in 2010 showed levels of benzene wafting from one Shell tank were 448 times higher than what the relevant emission factor had predicted. The discrepancy led to an EPA enforcement action; in a consent decree, Shell agreed to pay a $2.6 million fine and spend $115 million to control pollution from flaring—the burning of gas for economic or safety reasons—and other activities. Shell did not respond to requests for comment, but a spokeswoman told the Houston Chronicle in 2013 "the provisions of the settlement are consistent with Shell Deer Park's objectives and ongoing activities to reduce emissions at the site and upgrade our flaring infrastructure."

Despite the findings of these studies and others, the EPA didn't update emission factors for the U.S. refinery and petrochemical sector until 2015, seven years after Houston had petitioned the agency to do so and two years after it was sued by environmental justice groups.

Unreliable Methane Estimates

The low-balling of pollution isn't limited to toxic chemicals. Many emission factors used to estimate releases of methane—a potent greenhouse gas associated with oil and natural-gas development—are "far too low," said Robert Howarth, an ecology and environmental biology professor at Cornell University. Identifying how much methane these operations discharge can help scientists calculate the impact of natural gas—which in 2016 displaced coal as the nation's biggest source of electric power generation—on global warming. This is crucial to preventing "runaway climate change," Howarth said.

Much remains unknown. A 2015 study sponsored by the Environmental Defense Fund found methane releases from oil and gas production and processing in the Barnett Shale Formation in northern Texas were 90 percent higher than what the EPA's Inventory of U.S. Greenhouse Gas Emissions had estimated.

About a third of the factors used to estimate pipeline leaks and other natural-gas emissions in the most recent inventory, for 2015, are based on a 1996 study by the EPA and an industry group then known as the Gas Research Institute. The EPA's IG found in 2014 "there was significant uncertainty in the study data," meaning the EPA's assumptions on the amount of methane that spews from pipelines "may not be valid."

The harm caused by faulty estimates extends beyond oil and gas. An emission factor designed to estimate ammonia releases from poultry farms, for example, "is probably far too low," according to a report by the Environmental Integrity Project. These emissions contribute to problems like algae blooms, which can spread rapidly and kill marine life in waterways like the Chesapeake Bay.

'Pandora's Box of Problems'

The EPA, according to its spokeswoman, has begun executing a plan to improve the science that underlies emission factors and review the estimates more frequently. Among the changes: some companies now must report pollution data electronically to the agency.

The Trump administration proposed slashing the EPA's budget by 31 percent for fiscal year 2018, although Congress has so far extended existing funding levels through a series of short-term resolutions. Progress on emission factors will hinge on "available resources," the EPA spokeswoman wrote in an email, declining to specify a deadline for the project.

The agency said it does not plan to restrict emission factors to their original purpose—informing pollution inventories. That means, for example, that the numbers will still be used in permits.

Many in industry are fine with that. When the EPA asked in a 2009 Federal Register notice for suggestions on how to improve the system, companies from electric power generators to auto manufacturers argued for the status quo, saying emission factors were sometimes their only data option. Trade groups like the American Petroleum Institute and the American Chemistry Council argued their members should not be penalized if the EPA discovered a deficient factor had caused a permit to underestimate pollution. API said it worried that additional industry data supplied to the EPA to help it improve the numbers "could be misused for enforcement or other purposes." Neither group responded to requests for comment.

Public health advocates, on the other hand, want more. Some companies game the system to avoid EPA permitting fees and civil penalties, said Neil Carman, clean air director for the Lone Star Chapter of the Sierra Club in Austin. "We don't know what the emissions really are," he said. "It's a real Pandora's box of problems."

Carman and other advocates say they understand emission factors will have to be used in some circumstances, and that some types of pollution can be estimated with reasonable accuracy. They also maintain, however, that air monitoring should be more widely deployed. "Where you can do direct monitoring of emissions, that should be required," said Schaeffer, of the Environmental Integrity Project.

Schaeffer faults the EPA for giving some companies an out. It allows operators of power plants, for example, to choose between using continuous monitoring to measure fine particles, or a combination of quarterly testing and emission factors. Some of these plants already have monitoring systems installed, Schaeffer said, but "it's easier to mask noncompliance using emission factors."

Shining a 'Bright Light' on Pollution

California's Bay Area Air Quality Management District changed its approach after studies showed leaks from oil refineries in the area—known as fugitive emissions—were likely underrepresented in emission factors. "We decided, based on that information, that we needed additional ways to better identify fugitive emissions and to shine a bright light on those fugitive emissions," said Eric Stevenson, the district's director of meteorology, measurement and rules.

In 2016, the district urged refineries to install "open path" monitoring systems—which use beams of light to detect the presence of gases like benzene—and make the data available to the public in real time. Chevron installed such a system on the perimeter of its refinery in Richmond, California, in 2013.

The company didn't respond to specific questions about the monitoring but said its focus "on running the refinery efficiently and investing in new technologies" has significantly reduced air pollution since the 1970s. Denny Larson, executive director of the Community Science Institute (CSI) for Health and Justice, an environmental group that helps the public test for pollution, said the system in Richmond shows levels of chemicals in the air at a given moment and can alert residents to emission spikes that can trigger asthma attacks and other serious health problems.

"It's showing lots of pollution has been flying under the radar that's extremely toxic and problematic," Larson said. "We can prove what we've always known."

Reposted with permission from our media associate The Center for Public Integrity.
