TL;DR — A report published by the U.S. Government Accountability Office recommends that the EPA and regional air quality monitoring agencies take steps to modernize their air quality monitoring infrastructure and more effectively manage air quality in the 21st century. By implementing emerging technologies such as low-cost sensors, air quality monitoring agencies can mitigate some of the challenges they face associated with decreased funding and aging infrastructure on the path towards cleaner air.
U.S. Government Accountability Office highlights how low-cost sensors can be used to modernize the national air quality monitoring system
In November 2020, the United States Government Accountability Office (GAO) released a report entitled “Air Pollution: Opportunities to Better Sustain and Modernize the National Air Quality Monitoring System”. The report evaluates the state of the national air quality monitoring network operated by regional government agencies with oversight from the U.S. Environmental Protection Agency (USEPA), concluding with recommendations for how these agencies should proceed to modernize their infrastructure. The GAO’s recommendations emphasize the importance of embracing emerging technologies such as low-cost sensors, for reasons we explore further below.
A brief history of air quality monitoring in the United States
As the United States embraced industry as a workhorse of its economy in the 20th century, citizens were alarmed by the negative health effects of the air pollution that came with industrialization. Increases in deaths from respiratory and cardiovascular disease were observed across the country as smog, acid rain, and ground-level ozone became problems of national concern.
In response to these public health concerns, in 1970 Congress passed the Clean Air Act (CAA) and established the National Ambient Air Quality Standards (NAAQS), setting in motion a long process of improving the nation’s air quality. The CAA continues to be the primary regulatory impetus behind government-funded air quality monitoring programs in the United States today.
Under the CAA, the USEPA sets maximum allowable air pollutant levels, with compliance assessed at the regional level. States with nonattainment areas must submit written plans, known as state implementation plans, detailing how they will reduce air pollution to meet the standards.
Under this model, the equipment used to monitor and regulate regional air pollution compliance is managed locally with federal oversight by the USEPA. State and local agencies are responsible for designing regional air quality monitoring networks and implementing action plans, and also have budgetary responsibility for the purchase, operation, and maintenance of equipment and other necessary infrastructure.
When the CAA was passed in 1970, regional air quality management agencies began to deploy federal reference method (FRM) and federal equivalent method (FEM) monitoring equipment in a pattern roughly matching the geography of the regional compliance zones. The resulting patchwork of monitors across the country allows agencies to monitor and regulate pollutants in these zones.
This network of FRMs and FEMs has accomplished exactly what it was designed to do. Largely due to the framework established by the CAA, aggregate national emissions of the six criteria pollutants (particulate matter, ozone, lead, carbon monoxide, sulfur dioxide, and nitrogen dioxide) dropped by an average of 73 percent between 1970 and 2017. And while the public and private sectors’ costs to meet CAA requirements are significant, at roughly $65 billion annually, the estimated benefits in terms of reductions in air pollution-related premature death and illness, improved economic welfare, and better environmental conditions are valued at approximately $2 trillion.
A legacy air quality monitoring network faces 21st-century challenges
While the national air quality monitoring infrastructure precipitated by the CAA continues to provide valuable information on regional air quality trends for regulatory purposes, the GAO’s report highlights a range of challenges facing air quality managers in the 21st century. These range from budgetary challenges to more fundamental issues like the fact that our understanding of air pollution’s public health impact is very different today than it was 50 years ago when the CAA was passed.
One major challenge facing the current national air quality monitoring system is a decline in funding which, coupled with the high cost of operating a network of FRMs or FEMs, can make sustaining a network difficult. A single FRM or FEM can cost tens of thousands of dollars. Overhead, including personnel costs, is often even higher, typically accounting for the largest line item in an air quality management district’s budget. In California, for example, publicly available agency budgets indicate that the annual cost to operate a single FRM can reach hundreds of thousands of dollars once overhead and other soft costs are accounted for.
According to research by the GAO, state and local governments are responsible for funding an average of 75% of their air quality management programs. Federal funding through grants is reported to have decreased by approximately $4 million each year between 2004 and 2019.
Figure: Funding for State and Local Air Quality Agencies through EPA Grants, 2004–2019
USEPA officials, state and local agencies, and regional air quality associations across the country identified low funding as a challenge in operating ambient air monitoring systems in their reports to the GAO. Such difficulties have only been exacerbated by the COVID-19 pandemic’s impacts on the global economy.
“[T]he economic effects of the Coronavirus Disease 2019 (COVID-19) pandemic on state and local agencies’ budgets will likely be dramatic and are already being felt.” (GAO report, p. 27)
Aging equipment presents another challenge for air quality management agencies in the United States. Many agencies were found to be using equipment 15 to 20 years old, despite the average seven-year lifetime of such equipment. Aging equipment can be cumbersome to use and costly to maintain; one state even reported having to source replacement parts on eBay after the manufacturer discontinued them.
Aging infrastructure also tends to have higher operational costs. Though purchasing new equipment is expensive, it can be even more costly to keep an outdated air quality monitoring station up and running. In addition to operational and maintenance costs, these technologies are often filter-based, meaning staff must manually collect filters at the site and send them to a laboratory for analysis.
Outdated monitoring equipment can also come with the risk of lost or invalidated data. The GAO found that poor calibration at certain sites in 2015 and 2016 led to the invalidation of multiple states’ ozone data for these years (more information on calibration’s role in air quality monitoring is available here, in the context of wildfire smoke monitoring). Inadequate conditions in data shelters, such as those caused by malfunctioning air conditioning systems, can affect data as well.
Shifting composition and understanding of priority air pollutants
Even as progress is made on traditional criteria air pollutants, new health impacts of various pollutants are constantly being identified. While FRMs and FEMs have proven highly effective in making progress toward the NAAQS established in 1970, they are not as effective when it comes to monitoring many pollutants that modern-day public health experts recommend we prioritize.
Carbon monoxide, for example, is a criteria pollutant in the NAAQS, but has been found at levels well below NAAQS requirements for years and poses a less significant public health threat than in 1970. However, because it is a criteria pollutant under the NAAQS, state and local agencies are still required to operate expensive, specialized equipment to monitor CO to meet stringent USEPA data requirements for criteria pollutants, despite the minimal public health value gained from this data.
With issues like carbon monoxide and acid rain on the decline in North America, many of today’s most concerning air quality threats are more localized in nature, requiring higher-resolution monitoring infrastructure to address effectively. Some of the air quality threats with the most significant impact on public health today include total volatile organic compounds (tVOCs) from industrial sites and particulate matter (PM) from point and mobile sources, as well as from natural hazards such as wildfires.
Sparse spatiotemporal coverage
The U.S. air quality monitoring network was designed to identify regional, year-over-year trends, not to track local, real-time air pollution. Consequently, the data from this network is limited both spatially and temporally, making it of limited use for protecting human health in real time.
Spatially, because of the varied (and often sparse) density of government monitors in different regions, FRMs and FEMs may miss air pollution hotspots or other localized trends. Rural counties, for example, tend to be overlooked by the FRM network—the GAO’s report found that two-thirds of counties (2,120 of 3,142) in the United States had no regulatory ambient air quality monitoring infrastructure at all in 2019.
Temporally, the FRM network misses air pollution hotspots by design, as their purpose is to improve air quality year-over-year and not to identify exceptional air pollution events. 1-in-6 day sampling is the standard for filter-based FRM samplers, so the data from these monitors cannot provide insight into hourly or even daily air quality trends. Federal equivalent method (FEM) devices can help fill in these temporal gaps through higher-resolution sampling, but they can still be cost-prohibitive compared to low-cost sensors.
From a public health perspective, real-time data is essential for swift action during emergency air pollution events such as wildfires or industrial facility failures. Evidence continues to accumulate that even short-term exposure to pollutants like particulate matter can cause lasting damage. In light of this evidence, air quality managers should prioritize investments in the tools they need to effectively manage air pollution at the neighborhood level, where human health impacts occur.
Increased public accountability
Lastly, government agencies face increased accountability to a data-driven, informed public that is actively involved in keeping their local air clean. Because citizens can now access traffic and weather information at the neighborhood level, there is an expectation that air pollution data be available at this level as well. Where government agencies fail to provide this information, community groups are increasingly taking matters into their own hands.
These community-driven networks can be extremely beneficial to local communities and are often implemented in collaboration with local government agencies. Brightline Defense, for example, operates its network independently of the government-run air quality monitoring system in San Francisco but maintains strong ties with local environmental agencies. Air quality management agencies would be wise to partner with community-led monitoring projects to provide constituents with the most complete picture of air quality in their region.
The GAO’s recommendation for modernizing the national air quality monitoring infrastructure: Supplement with low-cost sensors
The GAO concludes that in the face of these challenges, the USEPA and the regional agencies responsible for monitoring air quality across the country should evaluate and adopt low-cost, continuous monitoring equipment to supplement existing government monitoring systems. Low-cost air sensors can be deployed across geographical regions at scale, with lower upfront and operational costs compared to FRMs and FEMs. This makes them a strong complementary technology that can “fill in the gaps” left by traditional technology at a reasonable cost.
Bringing low-cost sensors and government-grade monitors under the same jurisdiction is an approach we refer to as Air Quality Monitoring 2.0. The Breathe London project is a prime example of this model.
By integrating local, real-time air quality data from more than 130 sensors with reference data from the London Air Quality Network (LAQN), one of the densest and most advanced metropolitan monitoring networks in the world, London became the first city to holistically integrate low-cost air sensors with its existing air quality infrastructure.
Government agencies also use low-cost monitoring equipment to capture the granular air quality data needed to identify air pollution hotspots, allowing air quality managers to consider trends that occur between government monitoring sites when looking at the bigger picture of air quality. The Denver Department of Public Health and Environment, for example, has employed low-cost sensors to identify air pollution hotspots and monitor air quality trends at the neighborhood level.
As low-cost air sensor technology and performance standards continue to evolve, we expect to see more air quality management agencies around the world adopt low-cost air sensors for these or other use cases.
Obstacles to the adoption of low-cost sensors
While low-cost sensors hold a lot of promise, they are still a relatively new technology, and a lot of education is needed before they can be effectively implemented at scale. The GAO found that the main information needs of researchers and air quality managers include:
- Accepted and cost-effective applications of sensors
- Proper sensor calibration
- Proper siting of sensors
The report notes a need for greater education not only about the sensors themselves but also about the challenges they can help resolve.
“Air quality managers, researchers, and the public have needs for additional information about real-time, local-scale pollution; air toxics; persistent and complex pollution; and using emerging air quality measurement technologies.” (GAO report, p. 57)
Both monitoring agencies and the general public need more information on low-cost air sensors, especially with respect to suitable use cases for sensors and best practices for the interpretation of data from these networks.
The GAO report highlights concerns from air quality officials that the public will misinterpret uncalibrated data and question discrepancies between uncalibrated sensors and government monitoring equipment. That is why it is essential that the importance of calibration be recognized and communicated to the public as these sensors are adopted at scale.
Figure: PM2.5 data collected from a low-cost sensor on the GAO building and an EPA-overseen monitor located 2 miles away, illustrating the importance of calibration.
You can find more information on the proper calibration and assessment of data from low-cost air sensors in Clarity’s Calibration Guide, available here.
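To illustrate the basic idea behind calibration, a common starting point is to collocate a low-cost sensor with a reference monitor and fit a simple correction model against the reference data. The sketch below is a minimal, hypothetical example using an ordinary least squares fit on made-up PM2.5 values; real-world calibrations (including those described in Clarity’s guide) are typically more sophisticated and often account for temperature, humidity, and pollutant cross-sensitivities.

```python
import numpy as np

# Hypothetical collocation data: hourly PM2.5 (µg/m³) from a low-cost
# sensor and a nearby reference (FEM) monitor over the same period.
sensor = np.array([12.1, 18.4, 25.0, 31.2, 40.5, 22.3, 15.8, 28.9])
reference = np.array([9.0, 13.5, 18.2, 22.8, 29.5, 16.4, 11.6, 21.0])

# Fit reference ≈ slope * sensor + intercept via ordinary least squares.
slope, intercept = np.polyfit(sensor, reference, deg=1)

def calibrate(raw_pm25):
    """Apply the fitted linear correction to raw sensor readings."""
    return slope * np.asarray(raw_pm25) + intercept

corrected = calibrate(sensor)
print(f"slope={slope:.2f}, intercept={intercept:.2f}")
```

Once fitted on collocated data, the same correction can be applied to subsequent readings from that sensor, with periodic re-collocation to check that the relationship has not drifted.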
Data accuracy is another important factor to understand when using low-cost sensors. Data can be influenced by the presence of other pollutants or even by environmental temperature or humidity. Sensor data accuracy can be assessed using a variety of statistical tools, which we have provided an overview of here.
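As a sketch of what such an assessment can look like, the hypothetical example below computes three metrics commonly used to summarize agreement between collocated sensor and reference measurements: mean bias, root-mean-square error (RMSE), and the coefficient of determination (R², here the squared Pearson correlation). The data values are invented for illustration.

```python
import numpy as np

def evaluate_sensor(sensor, reference):
    """Summarize agreement between collocated sensor and reference data."""
    sensor = np.asarray(sensor, dtype=float)
    reference = np.asarray(reference, dtype=float)
    bias = float(np.mean(sensor - reference))                  # mean bias (µg/m³)
    rmse = float(np.sqrt(np.mean((sensor - reference) ** 2)))  # root-mean-square error
    r2 = float(np.corrcoef(sensor, reference)[0, 1] ** 2)      # squared Pearson correlation
    return {"mean_bias": bias, "rmse": rmse, "r_squared": r2}

# Hypothetical hourly PM2.5 readings (µg/m³).
sensor_pm25 = [11.0, 16.5, 24.8, 30.1, 39.0, 21.2]
reference_pm25 = [10.2, 15.0, 22.5, 28.4, 36.8, 19.9]
metrics = evaluate_sensor(sensor_pm25, reference_pm25)
print(metrics)
```

A high R² with a consistent positive or negative bias, for instance, suggests the sensor tracks pollution trends well but needs a calibration offset, whereas a low R² points to a more fundamental measurement problem.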
Making cleaner air a permanent part of the United States’ future
The GAO report’s recommendation for the supplementation of existing government monitors with low-cost sensors presents an exciting vision for the future of air quality monitoring in the United States.
As low-cost sensors become more common, we’ll see the number of air quality data points increase by orders of magnitude. Dozens or even hundreds of sensors can be deployed for the equivalent cost of a single FRM or FEM, allowing data to be collected at a more localized level across the country. Rather than relying solely on regional monitoring sites, agencies can pick up changes in pollutant levels in real time, enabling quick and effective decision-making when it comes to protecting air quality and public health.
“Many state and local officials we interviewed said that they saw potential for using low-cost sensors in the future for applications, such as identifying ideal locations for regulatory monitors, locating pollution hotspots, supporting community-based monitoring initiatives, expanding air toxics monitoring, addressing citizen concerns and questions, and tracking wildfire smoke.” (GAO report, p. 70)
The report concludes with a call to action for the USEPA to produce a plan for the modernization of air quality monitoring infrastructure in the United States. In its written comments on the report, the EPA agreed with this recommendation. We look forward to seeing this plan published for the benefit of air quality managers, researchers, and the public alike.
Putting the GAO’s recommendations into action: how to get started with a low-cost air sensor network
While low-cost sensors are a powerful tool in any air quality manager’s toolkit, there is no need to implement a sweeping network of sensors right off the bat. Low-cost sensors have lower upfront and operational costs than traditional monitoring systems, so agencies can begin implementing them as pilots or neighborhood-scale deployments with an extremely low initial investment. Many air quality managers opt to start with a smaller pilot to get familiar with low-cost air sensor technology before scaling the solution across their jurisdiction.
Many factors come into play when planning and implementing a low-cost sensor network. To help government agencies better understand best practices for the use of low-cost sensors, Clarity has released a guide available for download here: Maximize Your Air Quality Budget in a Post-COVID World: A Guide to Leveraging Low-cost Sensors for Air Quality Monitoring 2.0.