This blog was written by John D. Wilson, former Deputy Director for Regulatory Policy at the Southern Alliance for Clean Energy.
Guest Blog | September 16, 2015 | Energy Policy, Fossil Gas

Reviewing the media coverage of EPA’s proposed natural gas emission standard rule in September, I was struck by some findings that echoed research from a decade ago. It seems that emissions of natural gas, like many other pollutants from the oil, gas, chemical, and refining industries, are often systematically underreported.
Often, EPA and industry emission estimates are based on very old studies that looked at facilities operating under “typical” conditions. And by “typical” we mean the plant was fairly new and the manager was ready for the measurements. Even those measurements weren’t necessarily ideal. And there is no incentive for companies to apply new technology to find and control emissions that aren’t included in their annual reports.
Enter the research team led by Colorado State University. Their study reports that only 7% of natural gas (methane) emissions from gathering and processing facilities were reported by industry. EPA’s more comprehensive inventory assumes higher numbers, but still misses roughly half of the emissions.
How does EPA miss roughly half the methane emissions from natural gas gathering and processing facilities?
Many people may imagine that sensitive technical instruments are used to measure air pollution, but that is often not the case. For natural gas leaks all along the production and distribution chain, what’s called the “average emission factor approach” is still the main tool for estimating how many tons of pollution are leaking.
While EPA does recognize some recent improvement in industry reporting, the emission factors are outdated and rarely updated to reflect findings from new measurement technologies or simply a better awareness of how leaks happen. For example, one set of suggested emission factors for each piece of equipment at a natural gas gathering or processing facility was published by EPA back in 1995, most likely relying on data gathered prior to 1990.
EPA’s emission inventory for natural gas (methane, specifically) acknowledges that its data are outdated, but only in the sense that industry practices have improved since the data were collected. There is no recognition that the initial estimates may have been flat-out wrong.
Here is EPA’s 2015 statement on the emissions factors used for natural gas production and processing:
“… key data on emissions from many sources are from a 1996 report containing data collected in 1992. Since the time of this study, practices and technologies have changed. While this study still represents best available data for some emission sources, using these emission factors alone to represent actual emissions without adjusting for emissions controls would in many cases overestimate emissions. As updated emission factors reflecting changing practices are not available for most sources, the 1992 emission factors continue to be used for many sources for all years of the Inventory, but they are considered to be potential emissions factors, representing what emissions would be if practices and technologies had not changed over time.” (emphasis added)
Each year, when EPA reports methane emissions data for the oil, gas, and petrochemical industry, some parts of the report may be very high quality, but much of it may simply be the number of pieces of equipment multiplied by factors set in the 1990s, then reduced to account for improvements since then … but never increased to account for missed sources or other problems identified in subsequent research.
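To make that arithmetic concrete, here is a minimal sketch of what a bottom-up, emission-factor-style calculation looks like. The equipment counts, factors, and control-reduction percentage are hypothetical placeholders, not EPA’s actual values; the point is only that the inventory is a multiplication exercise, not a measurement.

```python
# Minimal sketch of a bottom-up "average emission factor" inventory.
# All numbers below are hypothetical placeholders, not EPA's actual
# factors; only the structure of the calculation is the point.

# Hypothetical equipment counts at a gathering/processing facility
equipment_counts = {
    "compressors": 4,
    "pneumatic_controllers": 60,
    "valves": 1200,
}

# Hypothetical 1990s-era emission factors (metric tons CH4 per unit per year)
emission_factors = {
    "compressors": 5.0,
    "pneumatic_controllers": 0.3,
    "valves": 0.01,
}

# Hypothetical downward adjustment for emission controls adopted since the
# factors were developed (e.g., a 20% reduction). Note that nothing in this
# approach adjusts upward for leak sources the original studies missed.
control_reduction = 0.20

potential = sum(
    equipment_counts[kind] * emission_factors[kind] for kind in equipment_counts
)
reported = potential * (1 - control_reduction)

print(f"Potential emissions:  {potential:.1f} t CH4/yr")
print(f"Reported (adjusted):  {reported:.1f} t CH4/yr")
```

Nothing in a calculation like this ever compares the result to air measurements; discrepancies only surface when someone, like the Colorado State team, goes to the field with instruments.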
In short, we can’t rely on EPA’s emission inventories to reflect up-to-date information. And industry-reported data is even worse: assuming that the research team led by Colorado State University is on the mark, the gathering and processing sectors of the natural gas industry fail to report an estimated 2.2 million metric tons of leaks per year to EPA’s Greenhouse Gas Reporting Program.
This should not be news.
While it may seem astounding that half or even 93% of emissions can be missing from the data we use to study air pollution and design effective regulations, the problem is familiar. Back in 2004, I co-authored a report looking at underreported emissions from refineries and chemical plants. Our organizations (the Environmental Integrity Project, or EIP, and GHASP, now Air Alliance Houston) estimated the reporting shortfall to be ‘only’ 151,388 metric tons per year.
Our report was met with skepticism from industry at the time. Industry experts in the Houston area had accepted the underlying findings, but when we extrapolated them nationwide, industry claimed the extrapolation wasn’t based on sound engineering. Presumably the leaking valves in Houston, Texas operated differently from those in Louisville, Kentucky.
An interesting lesson from the work in Houston was that it was relatively easy to find and stop previously underestimated leaks and other sources of pollution. Because the estimated leak rates were so low, industry didn’t think there was a problem. Once confronted with independently gathered evidence, industry found it relatively easy to apply new technologies to find leaks, and new practices to reduce other problems at the plants. Not knowing was certainly easier than knowing and acting, but solving many of these problems didn’t involve much real “pain.”
In fact, reducing leaks may often be easier than measuring the emissions from the leaks. As one recent controversy illustrates, it can be hard to get agreement among the experts as to when a particular technology is providing the correct data. In Houston, even with six years of intense industry, academic, and regulatory scrutiny of the problem, the inventories were slow to improve. A 2006 research summary emphasized the problem with respect to highly-reactive volatile organic compounds (HRVOC) emissions:
“The TexAQS 2000 study established that inventories underestimated emission fluxes of HRVOC from petrochemical facilities by one to two orders of magnitude (Ryerson et al., 2003.) [Remarkably] … appreciation of this finding has not been reflected in the inventory evolution. … the latest available emission inventories still underestimate HRVOC emissions … by at least an order of magnitude.” (emphasis added)
I’m told by colleagues that after much science and, sadly, litigation, Houston’s emission inventories have fewer discrepancies with the measured air pollution levels. In large part, industry has found ways to cost-effectively stop the leaks. And from a national perspective, EPA was slow to react, but finally, a decade later, an EPA guideline revision demonstrated the validity of the EIP-GHASP report findings.
Let’s hope the natural gas industry doesn’t take a decade to respond to this problem. It is clear – regardless of which numbers are used – that the natural gas industry can and should do more to stop natural gas leaks. Accurate measurement and strong regulation are both essential to that process: they make sure every company and every worker in the industry is held to the same high standards, so that no company can undercut the others for short-run gain.