Technical Papers

U.S. National PM2.5 Chemical Speciation Monitoring Networks—CSN and IMPROVE: Description of networks

Pages 1410-1438 | Received 15 Aug 2013, Accepted 15 Aug 2014, Published online: 14 Nov 2014

Abstract

The U.S. Environmental Protection Agency (EPA) initiated the national PM2.5 Chemical Speciation Monitoring Network (CSN) in 2000 to support evaluation of long-term trends and to better quantify the impact of sources on particulate matter (PM) concentrations in the size range below 2.5 μm aerodynamic diameter (PM2.5; fine particles). The network peaked at more than 260 sites in 2005. In response to the 1999 Regional Haze Rule and the need to better understand the regional transport of PM, EPA also augmented the long-existing Interagency Monitoring of Protected Visual Environments (IMPROVE) visibility monitoring network in 2000, adding nearly 100 additional IMPROVE sites in rural Class 1 Areas across the country. Both networks measure the major chemical components of PM2.5 using historically accepted filter-based methods. Components measured by both networks include major anions, carbonaceous material, and a series of trace elements. CSN also measures ammonium and other cations directly, whereas IMPROVE estimates ammonium assuming complete neutralization of the measured sulfate and nitrate. IMPROVE also measures chloride and nitrite. In general, the field and laboratory approaches used in the two networks are similar; however, there are numerous, often subtle differences in sampling and chemical analysis methods, shipping, and quality control practices. These could potentially affect merging the two data sets when used to understand better the impact of sources on PM concentrations and the regional nature and long-range transport of PM2.5. This paper describes, for the first time in the peer-reviewed literature, these networks as they have existed since 2000, outlines differences in field and laboratory approaches, provides a summary of the analytical parameters that address data uncertainty, and summarizes major network changes since the inception of CSN.

Implications

Two long-term chemical speciation particle monitoring networks have operated simultaneously in the United States since 2001, when the EPA began regular operations of its PM2.5 Chemical Speciation Monitoring Network (IMPROVE began in 1988). These networks use similar field sampling and analytical methods, but there are numerous, often subtle differences in equipment and methodologies that can affect the results. This paper describes these networks since 2000 (inception of CSN) and their differences, and summarizes the analytical parameters that address data uncertainty, providing researchers and policymakers with background information they may need (e.g., for 2018 PM2.5 designation and State Implementation Plan process; McCarthy, 2013) to assess results from each network and decide how these data sets can be mutually employed for enhanced analyses. Changes in CSN and IMPROVE that have occurred over the years also are described.

Introduction

Particulate matter (PM) is emitted into the air from a wide variety of anthropogenic and biogenic sources. Particles can be emitted directly into the air (primary particles) or formed in the atmosphere (secondary particles) from gaseous precursor emissions that nucleate to form new particles or condense on existing particles (Seinfeld and Pandis, Citation1998). The resulting chemical composition of ambient particles is complex, due to the many different sources of primary particles and precursor gases and their subsequent chemical reactions and physical interactions in air. Understanding the sources of this complex mixture of ambient PM is important because PM has been shown to have adverse effects on human health, degrade visibility, increase the acidity of lakes and streams, damage materials and crops, and impact global climate (U.S. Environmental Protection Agency [EPA], 2009a). Linking sources to ambient PM concentrations and ultimately to health effects and visibility degradation is achieved, in part, by understanding the chemical composition of atmospheric PM and how it varies temporally and spatially (e.g., Hopke, Citation1991; Schauer et al., Citation1996; Hopke, Citation2003; Brook et al., Citation2004; Watson et al., Citation2008; Solomon et al., Citation2008; Solomon and Hopke, Citation2008; Solomon et al., Citation2012). However, near the end of the 20th century detailed information on particle composition was lacking for many locations around the country. To fill this gap, the EPA and the National Park Service (NPS) established or expanded national chemical speciation monitoring networks to collect atmospheric particulate matter at urban sites (EPA) with a focus on linking sources to health effects (Federal Register, Citation1997) and at rural to pristine sites (Class I areas; NPS) with a focus on linking sources to visibility degradation (Federal Register, Citation1999). 
Of interest here are the networks related to collecting particles with aerodynamic diameters (AD) less than 2.5 μm (fine or PM2.5), as particles in this size range have been associated with adverse health effects and visibility degradation (EPA, Citation2009a; Harrison and Yin, Citation2000; McClellan et al., Citation2004; Kampa and Castanas, Citation2008; Solomon et al., Citation2012).

The two monitoring programs are referred to as the EPA’s PM2.5 National Chemical Speciation Network (CSN) (EPA, Citation1999a) and the Interagency Monitoring of Protected Visual Environments (IMPROVE) network (IMPROVE, Citation2012a). These programs were designed to obtain information about the spatial and temporal chemical composition of ambient fine particles. Due to the regional nature of fine particles and the impact of regional PM on urban PM levels (and vice versa), the EPA realized the need to have a combined data set that includes both CSN (urban) and IMPROVE (rural) data. This combined data set has been used by the EPA to develop effective emissions control strategies for fine particles in urban areas, including the impacts of short- and long-range transport. While similar in design, the science and policy objectives of the individual programs influenced their specific sampling and analysis approaches.

Variations in design and operation between these networks have resulted in potential differences in the measured values. Understanding these differences is critical to using the combined data set and comparing results among sites in these two networks.

Data from these networks have been used in State Implementation Plans (SIPs) for more than a decade. Shortly, the EPA and the states will begin the PM2.5 designation and SIP process for 2018 attainment (McCarthy, Citation2013), and the information presented here should be useful for the analyses needed to support relevant policy objectives. The specific objectives of this paper are to (1) describe the ambient measurement and laboratory chemical analysis methods employed in these networks since 2000; (2) describe the related quality assurance procedures employed; (3) report observed quality assurance results (e.g., blanks, minimum detection limits [MDLs], and uncertainty); and (4) summarize changes in the networks that have occurred since 2000, when CSN began operations.

Network Descriptions

The EPA’s CSN Program was established through promulgation of the 1997 PM2.5 National Ambient Air Quality Standards (NAAQS) (Federal Register, Citation1997). The CSN began as a small pilot network of 13 sites operating from February through July 2000. As the program grew, some sites were specified as part of the National Air Monitoring Stations (NAMS) network designed to track long-term trends in PM2.5 mass and composition to validate the efficacy of emission reductions strategies and support implementation of the NAAQS. These sites were initially referred to as Speciation Trends Network (STN) sites. Other sites were specified as part of the State and Local Air Monitoring Stations (SLAMS) network established primarily to support NAAQS implementation, with state, local, and tribal (SLT) air quality agencies determining the need for length of operation. Note that NAMS or STN is a subset of SLAMS. Both the STN and SLAMS chemical speciation sites are now referred to as the Chemical Speciation Network, designed to support the additional objectives of health and exposure research studies and programs aimed at improving environmental welfare (e.g., visibility degradation).

In the fall of 2000, CSN began a network expansion from its original 13 sites, and by October 2005 there were 54 STN, 181 SLAMS and Tribal sites, and 28 SLAMS that had IMPROVE monitors (IMPROVE Protocol Sites, defined later) (EPA, Citation2005a). All STN sites are located in urban areas, but a few of the SLAMS sites are in nonurban areas. In early 2005, the EPA began reducing the number of speciation monitors in the CSN network due to Congressional budget reductions at EPA. In April 2006, the network consisted of 54 STN, 159 SLAMS sites, 28 SLAMS IMPROVE protocol sites, and three Tribal sites (EPA, Citation2006). In 2012, there were 51 STN, 127 SLAMS sites, no Tribal sites, and 28 SLAMS IMPROVE protocol sites (EPA, Citation2013). Figure 1 indicates the CSN site locations as of 2006 and as of 2012.

Figure 1. CSN and IMPROVE site locations are illustrated as of (a) 2006 and (b) 2012. Colocated: sites where a network had side-by-side samplers to estimate precision. Carbon backup filter: IMPROVE sites (6 in 2006 and 12 in 2012) that had a QFF backup filter to estimate the OC artifact (see text for details). All CSN sites had backup QFF for OC artifact once the IMPROVE methods (URG 3000N sampler and IMPROVE_A analysis protocol) were implemented, between 2007 and 2009; prior to implementation of the URG 3000N no sites included backup QFF. An interactive IMPROVE map with a complete site list can be found at http://views.cira.colostate.edu/fed/Tools/SiteBrowser.aspx. CSN site listing can be found at EPA (Citation2013).


The IMPROVE monitoring program was established in 1985 to aid in the development of federal implementation plans and SIPs for the protection of visibility in Class I areas as stipulated in the 1977 amendments to the Clean Air Act. The network began collecting samples in 1988 with 20 sites (Eldred et al., Citation1990). At that time, IMPROVE had a long-term goal of monitoring for regional haze in all visibility-protected Federal Class I areas as stipulated in the 1999 Regional Haze Rule (Federal Register, Citation1999). By 1999, the IMPROVE network had increased to 30 IMPROVE sites in Class I areas and 40 IMPROVE Protocol sites, that is, sites that are part of the IMPROVE database but are funded separately by a federal land manager, state agency, or other entity. IMPROVE sites and IMPROVE Protocol sites are operationally identical in every respect, with the only differences being their source of funding and, in some cases, their proximity to Class I areas. With implementation of the Regional Haze Rule and the EPA’s desire to combine CSN and IMPROVE data, the IMPROVE network was expanded by spring 2004 to 110 IMPROVE sites in Class I areas and 57 Protocol sites. All but a few IMPROVE sites are located in Class I areas where aerosol concentrations are typically much lower than urban levels. Twenty-eight of the IMPROVE Protocol sites are operated at state SLAMS sites, as noted earlier, and the SLAMS data are also included in the EPA’s Air Quality System (AQS), the EPA’s repository of ambient air quality data (EPA, Citation2012a). The number of IMPROVE sites has remained roughly the same since 2004. Figure 1 indicates the IMPROVE site locations as of 2006 and as of 2012.

Similar time-integrated, filter-based approaches and laboratory analytical methods are used in both networks; each measures PM2.5 mass and its major components including sulfate, nitrate, ammonium (CSN only), organic carbon and elemental carbon (OC and EC, respectively), and a number of elements found in crustal and industrial sources, including, but not limited to, Mg, Al, Si, S, Ca, Ti, V, Fe, and Zn. However, due to the different policy objectives of the two networks, differences, sometimes subtle in nature, exist in the sampling and analytical procedures employed (IMPROVE, Citation2012a; Koutrakis, Citation1998; Koutrakis, Citation1999; EPA, Citation1999a). Choices in specific measurement and analytical methods also were made based on funding, available technology, and the desire for long-term monitoring (Koutrakis, Citation1998; Koutrakis, Citation1999; EPA, Citation1999). Current information relating to CSN and IMPROVE measurements is provided at their respective websites (IMPROVE, Citation2012a; EPA, Citation2012b).

Tables 1–7 and Figures 2 and 3 present various aspects of the two measurement programs, showing the subtle and not so subtle differences among the methods employed. Despite these many differences, results from the two networks typically agree well for high-concentration and nonlabile species such as sulfate.

Table 1. Design specifications for IMPROVE and CSN samplers, the latter employed in EPA Speciation Trends Network

Table 2. Filter medium (as manufacturer and model), acceptance tests, and sample preparation

Table 3. Analytical procedures by filter type and component. All measured species are listed in

Table 4. Operating conditions for the original CSN and the current CSN and IMPROVE OC/EC analysis protocols

Table 5. Denuder characteristics by sampler type. Nylon channel in all samplers; see Figures 2 and 3

Table 6. Sampling handling, shipping conditions, holding times, and laboratory storage requirements.

Table 7. Filter characteristics, flow rates, and face velocities

Figure 2. CSN MetOne SASS sampler schematic. RAAS and MASS samplers were only used through early 2009; their schematics are available at Solomon et al. (Citation2000) and pictures, including SASS, are available at EPA (Citation2012c).


Figure 3. IMPROVE sampler schematic. Pictures are available at EPA (Citation2012c).


  • Table 1 provides design specifications for IMPROVE and the three STN samplers. See Figures 2 (MetOne) and 3 (IMPROVE) for sampler schematics.

  • Table 3 summarizes analytical laboratory methods by filter type and chemical component.

  • Table 2 summarizes filter media, acceptance tests, and sample preparation procedures.

  • Table 4 lists the original CSN and current CSN and IMPROVE operating conditions for the OC/EC analysis protocols.

  • Table 5 describes denuders used to remove acidic gases in the channel that contains the nylon filter. See Figures 2 and 3.

  • Table 6 provides a summary of sample handling, shipping conditions, holding times, and laboratory storage requirements.

  • Table 7 describes the filter characteristics, flow rates, and face velocities for both networks.

Sampling methods

CSN and IMPROVE samplers both employ filter-based, time-integrated (24-hr average) sampling approaches that can provide a nearly complete mass balance via subsequent laboratory chemical analysis of the collected PM. For the collection of nitrate, both networks use denuders and reactive nylon filters to minimize sampling artifacts caused by nitrate volatilization (Hering et al., Citation1988; Hering and Cass, Citation1999). Carbon artifacts are accounted for by using various blank correction approaches (Kim et al., Citation2005; Chow et al., Citation2008; Watson et al., Citation2009; Chow et al., Citation2010; Maimone et al., Citation2011), as discussed later. Therefore, these samplers are designed to minimize sampling artifacts for nitrate while allowing flexibility for estimating organic carbon artifacts. Additional details regarding CSN and IMPROVE samplers are given in Table 1 and in standard operating procedures (RTI, Citation2013; IMPROVE, Citation2012b).

CSN

Three prototype and commercially available chemical speciation samplers were initially evaluated for use in the CSN (Solomon et al., Citation2000; Solomon et al., Citation2003). The prototype samplers were the MetOne SASS (Spiral Ambient Speciation Sampler), the URG MASS (University Research Glassware Mass Aerosol Speciation Sampler), and the Thermo Anderson RAAS (Reference Ambient Air Sampler). Photographs and schematics of the three samplers can be found in the literature (Solomon et al., Citation2000) and on the EPA’s Technology Transfer Network web site (EPA, Citation2012c). All three samplers, one at each of the three sites, also were included in the 2001–2003 CSN–IMPROVE comparison, which will be the focus of a subsequent paper. R&P samplers (model 2300; Rupprecht & Patashnick, Co., now Thermo-Scientific) were not available at the time of initial testing, and therefore were not included in the CSN–IMPROVE comparison study. In addition, few CSN sites used R&P samplers (7% in 2006, none in 2012) and they are not discussed in this paper.

Figure 2 shows a schematic drawing of the MetOne sampler, which as of January 2007 was used at 84% of the CSN sites. Given the high percentage of MetOne samplers at that time, the EPA made a decision to convert the entire network to MetOne samplers for collecting PM on the Teflon and nylon filters. This conversion was completed by mid-2009. The original MetOne sampler consisted of five sampling channels. A second version (SuperSASS) was developed with eight channels to allow for sequential sampling, which as of the beginning of 2014 had not been implemented. In both the SASS and the SuperSASS, each channel operates at a flow rate of 6.7 L/min and each has its own sampling module designed for 47-mm diameter filters. In the SASS, three channels (SuperSASS, six channels, with two sets used for sequential sampling) are normally used for CSN sampling, with one each for Teflon, nylon, and quartz-fiber filters (QFFs). In both samplers, three channels employ mass flow controllers, allowing flow control to within 1% or better, although corrections are needed for variations in temperature and pressure to obtain volumetric flow. The additional channels in both samplers employ critical flow orifices for flow control (Baker and Pouchot, Citation1983; Hinds, Citation1999) and are available for replicate sampling or other uses. Each sampling module uses a sharp cut cyclone (SCC 2.141) that has similar performance characteristics to the EPA Well Impact Ninety-Six (WINS) impactor used in the PM2.5 FRM (Peters et al., Citation2001). Sampling modules consisting of the filter cassettes and denuders are shipped to and from the laboratory with each sample.
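As a worked illustration of the flow figures above, the sketch below converts a 24-hr sample on one 6.7 L/min SASS channel into a sampled air volume and an ambient concentration. The filter mass is illustrative, and the sketch ignores the temperature/pressure corrections the networks apply to obtain volumetric flow; it is not network code.

```python
# Sketch: sampled volume and concentration for one MetOne SASS channel.
# The 6.7 L/min flow comes from the text; the 48.2 ug loading is
# illustrative. Temperature/pressure corrections are omitted.

def sampled_volume_m3(flow_lpm: float, duration_min: float) -> float:
    """Air volume pulled through one channel, in cubic meters."""
    return flow_lpm * duration_min / 1000.0  # 1 m^3 = 1000 L

def concentration_ug_m3(filter_mass_ug: float, volume_m3: float) -> float:
    """Ambient concentration from net mass collected on the filter."""
    return filter_mass_ug / volume_m3

volume = sampled_volume_m3(6.7, 24 * 60)            # 24-hr CSN sample
print(round(volume, 3))                              # ~9.648 m^3
print(round(concentration_ug_m3(48.2, volume), 2))   # ~5.0 ug/m^3
```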

The Anderson RAAS has four available flow channels, two each under two identical cyclone/manifold sets. The total flow rate through the inlet is 48 L/min and through each manifold is 24 L/min. Like the SASS, three channels are used for CSN sampling. The fourth channel is available for other needs, such as collecting a replicate sample. One channel under each manifold operates at 16.7 L/min and each includes a Teflon filter, with one used for mass and elemental analysis and the other for replicate measurements. The second channel under each manifold operates at 7.3 L/min. One channel includes the nylon filter, which is preceded by an MgO-coated annular denuder to remove acidic gases, and the other includes a QFF. All channels employ mass flow control and all use 47-mm diameter filters. Sampling modules consisting of the filter cassettes and denuders are shipped to and from the laboratory with each sample.

The URG MASS 400/450 sampling system uses two stand-alone samplers operating at 16.7 L/min, each with mass flow control. Each sampler includes a filter holder for 47-mm diameter filters and both samplers employ the WINS impactor identical to that used in the PM2.5 FRM program. The MASS 450 contains a single QFF for OC and EC analysis. The MASS 400 includes a sodium carbonate coated annular denuder upstream of a Teflon–nylon sequential filter pack. Nitrate that can volatilize from the front Teflon filter during sampling is captured and measured on the backup nylon filter (Hering and Cass, Citation1999; Solomon et al., Citation2000). Therefore, for this sampler only, nitrate is the sum of nitrate measured on the Teflon front and nylon backup filter. Mass, trace elements, and then ions in that sequence are measured on the Teflon filter. As a result, lower nitrate values are observed on the URG relative to the other samplers, since nitrate can volatilize off the Teflon filter during analysis by energy-dispersive x-ray fluorescence (EDXRF), which occurs prior to analysis for ions (Solomon et al., Citation2000; Solomon et al., Citation2003, Yu et al., Citation2006).

In February 2007, CSN began transitioning sampling on QFF for carbonaceous components to the URG 3000N sampler (EPA, Citation2007b; EPA, Citation2012e). This sampler is virtually identical to the sampler used by IMPROVE for collecting OC and EC, except that the critical orifice was replaced with a mass-flow controller and filters are shipped differently between the laboratory and the field, as described later. QFF collected by the URG 3000N are analyzed by the IMPROVE_A method. This transition was completed by late 2009, and all carbon analyses are now done using the URG 3000N/IMPROVE_A protocols.

CSN samplers are operated by state, local, and tribal (SLT) agency personnel. RTI International (Research Triangle Park, NC) is currently the laboratory responsible for filter preparation, shipping, chemical analysis, and database management for all STN sites, with the exception of carbon analysis, which since 2009 has been performed by the Desert Research Institute (DRI) using the IMPROVE_A method. Primarily RTI, but also others, are responsible for these tasks at SLAMS sites. These include state laboratories in California, Oregon, and Nevada; the first two handle California and Oregon sites, respectively, and the latter handles SLAMS sites in the state of Texas. Samples are collected once every 3 days at STN sites and once every 3 or 6 days at SLAMS sites; the latter schedule is defined by the SLT agency responsible for the sampling site. Samples collected by CSN can remain on the sampler for up to 3 days (EPA, Citation2012d). All CSN filters remain in their sampling modules or filter holders, which are sealed at both ends, placed in sealed antistatic Ziploc plastic bags, and shipped in coolers with blue ice by overnight carrier service. The temperature target at receipt is not to exceed 4°C, although this has frequently been exceeded by a few degrees, particularly in the summer months. A flag is noted in the database for samples arriving at greater than 4°C. Before and after sampling, filters remain in clear polystyrene petri dishes (Millipore PetriSlides or equivalent) that are stored at reduced temperatures (<0°C). Petri slides are used as they come from the manufacturer, without further cleaning, but are visually inspected for debris or deformities. Petri slides are not reused in CSN. Additional details about the CSN sampling and analysis protocols can be found at the AMTIC web site (EPA, Citation2012b).

IMPROVE

All IMPROVE and IMPROVE Protocol sites use the same sampler design, as shown in Figure 3 and summarized in Table 1. The sampler consists of three separate modules for PM2.5 and a fourth module for PM10. Each module uses a critical orifice to control flow, and contains four solenoid valves and corresponding filter holders to allow for a blank sample to be collected and/or for sequential operation. One of the PM2.5 modules uses a 25-mm diameter Teflon filter for mass, light absorption, and trace elements; a second uses a 37-mm nylon filter for ions, which is preceded by a sodium carbonate-coated Al annular denuder; and a third uses a 25-mm QFF for organic and elemental carbon. Finally, a 25-mm Teflon filter is used in the PM10 module for PM10 mass.

IMPROVE samplers are operated by local operators at each site, often National Park Service (NPS) personnel or state agency personnel. Currently, IMPROVE samples are collected once every 3 days, to be consistent with CSN, whereas prior to 2000 they were collected only on every Wednesday and Saturday. Filters are allowed to be left on the sampler for up to 7 days after collection (IMPROVE, Citation2012c). IMPROVE filters are prepared at University of California (UC) Davis, with the exception of QFF filters that are prefired at the DRI and shipped to UC Davis prior to sampling. Filters are sent between the field sites and UC Davis in their sampling cassettes sealed in Ziploc plastic bags at ambient temperature. Shipping is by UPS or by standard U.S. mail in plastic containers designed for durability but not thermal stability. Following sampling, QFF filters for organic carbon analysis are shipped between UC Davis and DRI at reduced temperatures, where the QFF are stored below 0°C and analyzed. Nylon filters are shipped in Petri dishes at ambient temperatures by U.S. mail to RTI for analysis of ions. Petri dishes are used initially as they come from the manufacturer, without further cleaning but are visually inspected for debris or deformities. Used petri slides for nylon filters are washed in ethanol and reused with no contamination observed. Other petri slides are not reused. Additional details are given in the IMPROVE Quality Assurance Project Plan (Flocchini et al., Citation2002) and in IMPROVE standard operating procedures (IMPROVE, Citation2012b).

Analytical methods

Similar analytical methods are employed by the CSN and IMPROVE monitoring programs, as summarized in Table 3. PM2.5 mass is measured gravimetrically by weighing conditioned filters before and after sampling. Water-soluble ionic species are extracted from the nylon filter (the Teflon filter in the case of the URG MASS sampler) and determined by ion chromatography (IC). Organic and elemental carbon are determined by thermal optical analysis (TOA) from QFF, and up to 48 trace elements have been measured by EDXRF from the Teflon filter used first for mass determination. Organic species are not measured routinely in either network. Numerous subtle and not so subtle differences exist between the methods employed in the two networks with regard to the analytical methods, preparation procedures, sample storage, shipping, and methods for determining and correcting for blanks and interferences.

Additional details for the CSN analytical methods are given in Solomon et al. (Citation2001) and EPA (Citation1998) and in specific laboratory standard operating procedures (SOPs) for mass and chemical speciation (RTI, Citation2008, Citation2009a, Citation2009b, Citation2009c, Citation2009d, Citation2009e, Citation2009f, Citation2011).

IMPROVE mass and chemical analyses are conducted at UC Davis, RTI, and DRI. Mass, elemental, and light absorption analyses are conducted at UC Davis, ions are analyzed at RTI, and OC/EC analysis is at DRI. Additional details for IMPROVE analytical procedures are given in the IMPROVE Quality Assurance Project Plan (Flocchini et al., Citation2002) and standard operating procedures (IMPROVE, Citation2012b).

The remainder of this section gives brief summaries of the methods employed by both programs and addresses some of the differences.

Acceptance testing

Acceptance testing of new filter lots is conducted to ensure filter quality and consistency over time. Acceptance tests and sample preparation procedures are summarized in Table 2 for both networks (DRI, Citation2005; RTI, Citation2011; IMPROVE, Citation2012d). Differences between acceptance testing protocols exist but likely make little difference in the measurements, since quality assurance procedures require the collection and analysis of blanks throughout the sampling and analysis process.

For CSN, Teflon filters are visually inspected and a subset (10) of filters analyzed for trace elements. If the average blank level (μg/cm2) for any element is greater than three times the variability (3σ) of the MDL, then an intercept is included in the EDXRF calibration equation for that element. The same intercepts are used until a new batch of filters is received; the 10 filters do not constitute a fixed percentage of filters.
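The blank-screening rule above can be sketched in a few lines. This is an illustrative rendering of the stated decision rule only (mean blank loading greater than 3σ of the MDL triggers a calibration intercept); the function name, data, and σ value are hypothetical, not RTI's actual acceptance code.

```python
# Sketch of the CSN Teflon-filter acceptance rule described in the text:
# if the average blank loading (ug/cm^2) for an element exceeds three
# times the MDL variability (3*sigma), an intercept is added to the
# EDXRF calibration for that element. All values are illustrative.
from statistics import mean

def needs_intercept(blank_loadings_ug_cm2, mdl_sigma):
    """True if this element's mean blank exceeds 3x the MDL variability."""
    return mean(blank_loadings_ug_cm2) > 3.0 * mdl_sigma

# Illustrative loadings from the 10 blank filters analyzed per batch
blanks = [0.012, 0.010, 0.015, 0.011, 0.013,
          0.009, 0.014, 0.012, 0.010, 0.011]
print(needs_intercept(blanks, mdl_sigma=0.003))  # mean 0.0117 > 0.009 -> True
```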

CSN nylon filters received from the manufacturer are washed for 24 hr in several changes of DI water (RTI, Citation2009b). The washing solution for 2 out of every 100 filters is analyzed for ionic species to ensure they are below acceptance levels as given in Table 2 (EPA, Citation2000).

CSN QFF are heat treated to remove OC associated with new filters from the manufacturer (Table 2). This process lowers the blank level to below the MDL. However, while heat treating removes the initial OC artifact, QFF medium continues to sorb organic gases until reaching pseudo-equilibrium. The overall artifact is both passive and active, depends on the sampled volume and several other factors, and can reach the equivalent of 4 μg/m3 for CSN filters sampled with the MetOne sampler (McDow and Huntzicker, Citation1990; EPA, Citation1998; Solomon et al., Citation2000; Turpin et al., Citation1994; Turpin et al., Citation2000; Maimone et al., Citation2011).
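One common way the OC artifact is estimated, referenced above and discussed later in the text, is to subtract OC measured on a backup QFF (a gas-adsorption estimate) from the front-filter OC. The sketch below shows only that simple subtraction; the networks' actual blank corrections are more nuanced (e.g., Maimone et al., 2011), and the concentrations here are illustrative.

```python
# Sketch of a backup-filter OC artifact correction. The backup QFF is
# assumed to collect adsorbed organic gases but negligible particles,
# so its OC approximates the positive artifact on the front filter.

def artifact_corrected_oc(front_oc_ug_m3: float, backup_oc_ug_m3: float) -> float:
    """Front-filter OC minus backup-filter OC, clamped at zero."""
    return max(front_oc_ug_m3 - backup_oc_ug_m3, 0.0)

print(round(artifact_corrected_oc(6.5, 1.2), 2))  # 5.3
```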

All Teflon filters used in IMPROVE are visually inspected for imperfections. For each new lot, the pressure drop at 22.8 L/min is checked in the laboratory using a manometer and a subset (10) of filters is weighed to verify the correct weight range of each new batch of filters. Finally, selected filters are analyzed by EDXRF, proton elastic scattering analysis (PESA, discontinued January 2011), and hybrid integrating plate/sphere analysis (HIPS) to check for contamination.

New lots of IMPROVE nylon filters also are visually inspected and the pressure drop is measured similarly to the Teflon filters. A subset of filters is analyzed for sulfate, nitrate, nitrite, and chloride contamination to ensure they are below acceptance levels as given in Table 2.

IMPROVE QFF filters are visually inspected and heat treated to reduce initial blank levels, similar to CSN. However, while filters in both networks are heated at the same temperature, the time is slightly different, as is how the filters are treated and stored after heating (Table 2). A subset of filters is selected after the initial cleaning and analyzed for OC and EC to ensure blank values are within acceptable levels.

Mass

Gravimetric analysis is used in both networks to obtain PM2.5 mass (Table 3). IMPROVE also measures PM10 mass by this method. An estimate of coarse particle mass (particles in the size range between 2.5 and 10 μm AD) can be obtained by difference between simultaneously measured PM10 and PM2.5 mass concentrations. Collection of PM10 also provides a quality control check on the PM2.5 mass, since PM2.5 mass cannot be greater than PM10 mass.
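The difference calculation and its consistency check can be sketched directly; the concentrations below are illustrative 24-hr averages, and the QC handling (raising an error) is a hypothetical choice for the example, not IMPROVE's actual validation logic.

```python
# Sketch of IMPROVE coarse mass by difference with the PM2.5 <= PM10
# consistency check described in the text. Inputs are ug/m^3.

def coarse_mass(pm10_ug_m3: float, pm25_ug_m3: float) -> float:
    """PM10 minus PM2.5; PM2.5 exceeding PM10 fails the QC check."""
    if pm25_ug_m3 > pm10_ug_m3:
        raise ValueError("QC failure: PM2.5 mass exceeds PM10 mass")
    return pm10_ug_m3 - pm25_ug_m3

print(round(coarse_mass(18.4, 11.1), 1))  # 7.3 ug/m^3 of 2.5-10 um mass
```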

CSN Teflon filters used to obtain PM mass are equilibrated for 24 hr before each weighing, both pre- and post-sampling, to minimize mass variation due to particle-bound water (Table 2). CSN adheres to the FRM PM2.5 mass analysis protocol to obtain concentrations in units of micrograms per cubic meter (Federal Register, Citation1997; EPA, Citation1998; RTI, Citation2008). IMPROVE equilibrates Teflon filters for mass at a slightly different temperature and relative humidity (RH) range, but only for several minutes (Table 2). Unpublished laboratory tests of exposed Teflon filters at UC Davis, as well as EPA audits, have demonstrated that filters collected at rural and remote monitoring sites appear to equilibrate to laboratory conditions within a few minutes (IMPROVE, Citation2012d).

Ions

Ion chromatography (IC) is used in both networks to determine concentrations of major water-soluble ions in the PM collected on the nylon filter; for the nylon filter used in the URG sampler, only nitrate is determined (). Ions also were measured on the Teflon front filter collected by the URG MASS; in this case, total nitrate equaled the sum of nitrate measured on the Teflon front filter and the nylon backup filter. Both networks measure sulfate and nitrate. CSN also measures ammonium, sodium, and potassium ions, whereas IMPROVE also measures chloride and nitrite ions and estimates ammonium based on fully neutralized sulfate ((NH4)2SO4) and nitrate (NH4NO3). RTI analyzes filters for ions for both networks (RTI, Citation2009c, Citation2009d; IMPROVE, Citation2005). The same IC conditions are used in both networks, and both networks extract nylon filters in deionized (DI) water, except for the CSN URG nylon backup filter, which is extracted in IC eluent. The extraction conditions, however, differ slightly: IMPROVE extracts at room temperature, whereas CSN extracts at reduced temperature to stabilize semivolatile species such as nitrate and ammonium. Nevertheless, high extraction efficiencies have been demonstrated for both programs (Clark, Citation2002a). Differences in ion analysis approaches are summarized in . Use of nylon as the collection medium results in a potential negative artifact for ammonium (Solomon et al., Citation2000; Yu et al., Citation2006).
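IMPROVE's ammonium estimate follows directly from the stoichiometry of (NH4)2SO4 (2 mol NH4+ per mol SO42−) and NH4NO3 (1 mol NH4+ per mol NO3−). A minimal sketch of this full-neutralization arithmetic (the function name is ours; this illustrates the standard molar-mass calculation, not IMPROVE's production code):

```python
# Molar masses (g/mol)
M_NH4, M_SO4, M_NO3 = 18.04, 96.06, 62.00

def estimated_ammonium(sulfate_ugm3, nitrate_ugm3):
    """Estimate NH4+ (ug/m3) assuming sulfate is fully neutralized as
    (NH4)2SO4 and nitrate as NH4NO3; coefficients are ~0.375 for
    sulfate and ~0.291 for nitrate."""
    return (2 * M_NH4 / M_SO4) * sulfate_ugm3 + (M_NH4 / M_NO3) * nitrate_ugm3
```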

Carbon

Thermal optical analysis is used to determine concentrations of organic and elemental carbon in both networks (). The two programs initially used different temperature protocols and optical approaches, the latter to correct for OC charring. The different temperature protocols are shown in . The different optical approaches occur during the He temperature steps (Chow et al., Citation1993; Chow et al., Citation2001; Chow et al., Citation2004; Chow et al., Citation2005a; Chow et al., Citation2007; Peterson and Richards, Citation2002; Watson et al., Citation2005; RTI, Citation2009e). Beginning in 2008 and finishing in 2009, CSN transitioned to the IMPROVE_A analysis protocol to obtain consistency between the networks. Prior to 2009, CSN used thermal optical transmittance (TOT) (Peterson and Richards, Citation2002), whereas IMPROVE used thermal optical reflectance (TOR) (Chow et al., Citation1993). Differences due to use of TOT versus TOR have been observed (Chow et al., Citation2004). Beginning with samples collected in January 2005, IMPROVE provides both TOR and TOT results (DRI, Citation2005; Chow et al., Citation2007). Several studies have been conducted comparing the IMPROVE, CSN, and in some cases other protocols for OC and EC (Chow et al., Citation2001; Schmid et al., Citation2001; Conny et al., Citation2003; Chow et al., Citation2004; Watson et al., Citation2005; Cheng et al., Citation2011).

The variation between the two approaches for OC measured on typical urban and rural samples ranges from about 10 to 30%; for EC the difference can be up to a factor of 2 or more for the different analytical protocols used in the comparison studies (Solomon et al., Citation2000; Chow et al., Citation2001; Schmid et al., Citation2001; Solomon et al., Citation2003; Chow et al., Citation2004; Watson et al., Citation2005; Chow et al., Citation2008; Cheng et al., Citation2011). Total carbon (TC), equal to the sum of OC and EC, usually agrees to within 10–15%. Agreement for TC indicates that differences in the analytical approaches for OC and EC are likely due to how the two protocols determine the split between OC and EC and/or correct for OC charring (i.e., CSN TOT approach vs. IMPROVE TOR approach).

IMPROVE also analyzes the PM2.5 Teflon filters for optical absorbance (1/m) of the particles collected on the filters by HIPS (Bond et al., Citation1999). Optical absorbance is not measured in CSN. Absorbance by the collected particles is important for estimating light extinction, a critical parameter for quantifying visibility, IMPROVE’s primary policy objective. The absorbance is divided by an empirically determined absorptivity coefficient (m2/μg) to convert it to black carbon (BC) in concentration units of μg/m3, although a range of values exists for this conversion (Bond and Bergstrom, Citation2006). BC measured optically is similar to, but not the same as, EC measured thermally (Fehsenfeld et al., Citation2004). In HIPS, the filters are exposed to 633-nm light from a helium-neon (HeNe) laser, and the light transmitted through the sample is collected with a photodiode detection system. This measurement provides a quality control check on the absorptivity coefficient and/or elemental carbon measurements, the latter because elemental carbon is typically the predominant light-absorbing particulate species in the atmosphere.
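The unit arithmetic of the conversion is simple: an absorption coefficient in 1/m divided by a mass absorptivity in m2/μg yields μg/m3. A sketch (the absorptivity value in the example is purely illustrative; as noted above, a range of values exists for this coefficient):

```python
def bc_from_absorption(b_abs_per_m, absorptivity_m2_per_ug):
    """Convert particle light absorption (1/m) to a black carbon
    concentration (ug/m3) by dividing by an empirical mass
    absorptivity (m2/ug); the coefficient is uncertain, so the
    result inherits that uncertainty."""
    return b_abs_per_m / absorptivity_m2_per_ug

# Example: with an illustrative absorptivity of 1e-5 m2/ug (10 m2/g),
# an absorption of 1e-5 1/m corresponds to 1 ug/m3 of BC.
```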

Trace elements

Energy-dispersive x-ray fluorescence is used by both programs to quantify trace elements (Na–Pb) in PM2.5 collected on Teflon filters () (RTI, Citation2009f; IMPROVE, Citation2012e). IMPROVE also has used PESA to measure hydrogen (IMPROVE, Citation2012f), but this analysis was discontinued in 2011 for budgetary reasons. The CSN program initially reported 48 elements (reduced to 33 elements in 2009), while IMPROVE reports 25, focusing only on those that are reliably measured above the MDL of the instrument. As noted in , prior to December 2001, IMPROVE used proton induced x-ray emission (PIXE) for Na–Mn and EDXRF for Fe–Pb (IMPROVE, 2006). The switch to only EDXRF provides lower MDLs for most elements of interest, as well as ensuring better sample preservation for reanalysis (PIXE can damage filters). Still, EDXRF is not sufficiently sensitive for many of the elements measured by both networks, so typically only about 10–15 elements are reported above their detection limit for the primarily urban CSN and about 15–20 elements for the primarily rural IMPROVE network. The higher filter loading (volume sampled/area of filter, m3/cm2) combined with differences in analytical parameters results in better ambient MDLs for IMPROVE, which allows more species to be measured above their detection limits.

Different EDXRF instruments are used by the two programs. In addition, to maintain sample throughput, CSN uses two EDXRF instruments and laboratories, one operated by RTI and the other by Chester LabNet (Portland, OR), the latter analyzing only 10% of the samples.

The RTI EDXRF and Chester LabNet laboratories analyze filters from the long-term trend networks (STN and NCORE; EPA, Citation2005c; Scheffe et al., Citation2009). Filters collected in California, Oregon, and Texas are analyzed by state and local district laboratories (California Air Resources Board, DRI, and Oregon Department of Environmental Quality, South Coast Air Quality Management District) (Smiley, Citation2013). RTI uses Thermo Scientific QuantX EDXRF analyzers, each equipped with high-flux rhodium anode x-ray tubes and an electronically cooled lithium-drifted silicon detector (RTI, Citation2009f). Elements are analyzed using one of five different excitation conditions defined by different anode operating conditions and x-ray filters. Chester LabNet uses similar anodes and detector systems, though there are slight differences in the excitation conditions relative to those used by RTI (Chester, Citation2009). While each instrument is calibrated to meet quality control (QC) criteria individually, between-instrument and between-laboratory variations can introduce additional uncertainty in the CSN EDXRF results (Smiley, Citation2013).

Prior to 2011, IMPROVE used a Cu anode EDXRF system to quantify elements with atomic weights from Na through Fe, and an Mo anode for elements with atomic weights from Ni through Pb. Prior to 2005 the Cu anode system utilized a helium purge to remove air around the sample, thus minimizing interferences due to the argon spectral peak. Beginning with samples collected in January 2005, a vacuum system was introduced because He diffused through the Be window of the SiLi detector, changing the calibration and eventually destroying the detector. A second (nominally identical) vacuum-based EDXRF system was introduced in October 2005 to increase laboratory capacity. In 2011, the aging Cu and Mo anode systems were replaced by PANalytical Epsilon 5 EDXRF instruments. With the Epsilon 5 system, elements are analyzed using one of seven different excitation conditions defined by different exposure times and secondary x-ray targets. Data advisories addressing the impacts of changes in EDXRF hardware and procedures are available from IMPROVE (IMPROVE, Citation2012g). CSN has similar information available in annual data reports (http://www.epa.gov/ttn/amtic/specdat.html), special study reports (TTN, Citation2012a), and the CSN newsletters (2004–2009; http://www.epa.gov/ttn/amtic/spenews.html).

Self-absorption corrections for particle size and filter loading were applied by both networks, but a detailed description of the approaches employed is beyond the scope of this paper. For IMPROVE, self-absorption corrections were applied to the EDXRF data prior to 2011 (Dzubay and Nelson, Citation1975; IMPROVE, Citation1997). The correction assumes a homogeneously distributed sample deposit on the filter and incorporates particle-size and mass-loading effects. Self-absorption is strongest for the lightest elements, such as Na and Al, and decreases substantially for heavier elements. The correction was discontinued with the introduction of the new PANalytical Epsilon 5 instruments, which provide much-improved sensitivity for the lighter elements where self-absorption is greatest.
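For a homogeneous deposit, the classic attenuation correction (the form underlying Dzubay and Nelson, 1975) multiplies the measured intensity by x/(1 − e^(−x)), where x is the product of a mass attenuation coefficient and the areal loading. A simplified sketch under that assumption, ignoring the particle-size term that the network corrections also incorporate (parameter names and values are ours):

```python
import math

def self_absorption_factor(mu_cm2_per_ug, areal_mass_ug_per_cm2):
    """Multiplicative correction true/measured for a homogeneously
    distributed deposit: measured/true = (1 - exp(-x))/x with
    x = mu * m. The factor approaches 1.0 for thin deposits or
    heavy elements (small mu) and grows for the lightest elements,
    such as Na and Al, where attenuation is strongest."""
    x = mu_cm2_per_ug * areal_mass_ug_per_cm2
    if x < 1e-9:
        return 1.0  # negligible attenuation
    return x / (1.0 - math.exp(-x))
```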

Similar to the IMPROVE methodology, the CSN self-absorption correction assumes a homogeneous, uniformly distributed sample and incorporates particle-size and mass-loading effects. The CSN procedure is a modification of the protocol used by the EPA with the Thermo Scientific QuantX spectrometers (Dzubay and Wilson, 1974; EPA, Citation1999c). It corrects for signal loss due to scattering and absorption by calculating an absorption correction factor that relates measured intensities from a homogeneous sample to those from a thin-film standard. Self-absorption corrections vary by sample composition but in general are similar to those in the IMPROVE network.

EDXRF round-robin performance evaluations of CSN and IMPROVE laboratories were conducted by the EPA (Smiley, Citation2013). Comparability between laboratories for many high-concentration elements (e.g., Fe, Ca, K, Zn) was considered acceptable. The poorest agreement was for Al and Na. This is expected, since light elements produce a relatively poor instrument response and typically require correction for self-absorption, both of which can lead to higher uncertainty for these elements. Furthermore, the two networks use different approaches to correct for self-absorption as noted earlier, though the uncertainties seem to agree reasonably well (EPA, Citation2011).

Measurement Uncertainty

As in all measurement programs, measurement uncertainty in CSN and IMPROVE includes both random and systematic errors. Precision is a measure of random error; it is usually determined for the combined field and laboratory operations by collecting colocated replicate samples or by propagating individual instrument errors through all components of the measurement process. Accuracy is a measure of systematic error, the difference between a “true value” and the estimate of that value based on the measurement. Assessing overall measurement accuracy is not possible for most PM measurements, since traceable standards that can be applied in the field are not available (Fehsenfeld et al., Citation2004). However, analytical laboratory measurement accuracy is reported, and general values are provided in Fehsenfeld et al. (Citation2004). Factors affecting overall uncertainty include the coupled effects of measurement precision, interferences, measurement and analysis artifacts, and MDL. Estimating these parameters is critical to understanding the validity of a measurement.
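As a concrete example of estimating precision from colocated replicates, one common convention (similar to that used in EPA's ambient monitoring regulations, and not necessarily the exact formula either network reports) expresses precision as a coefficient of variation built from paired relative differences:

```python
import math

def collocated_cv_percent(pairs):
    """Precision (coefficient of variation, %) from colocated sampler
    pairs (a, b): each pair's relative difference is taken against the
    pair mean, and the divisor 2n reflects that each pair difference
    carries the variance of two independent measurements."""
    d = [(a - b) / ((a + b) / 2.0) for a, b in pairs]
    return 100.0 * math.sqrt(sum(x * x for x in d) / (2 * len(d)))
```

For example, a single pair of 11 and 9 μg/m3 gives a 20% relative difference and a CV of about 14%.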

A number of studies have been conducted to better understand uncertainty, and the factors influencing it, in CSN and IMPROVE measurements. A selection of papers published since about 2000, that is, those most relevant to the current networks, is cited herein to provide information about uncertainties associated with measuring the chemical components of PM; however, a comprehensive review and evaluation of published results is beyond the scope of this paper. Some studies examined uncertainty issues in only one network, whereas others examined them between networks. These studies include the initial CSN intercomparisons (Solomon et al., Citation2000; Coutant and Stetzer, Citation2001; Solomon et al., Citation2003); estimates of the overall measurement uncertainty for mass or one or more chemical components for CSN and IMPROVE (Solomon et al., Citation2000; Chow et al., Citation2001; Solomon et al., Citation2003; Ashbaugh et al., Citation2004a, Citation2004b; Chow et al., Citation2004; Watson et al., Citation2005; White et al., Citation2005; Yu et al., Citation2006; Smiley, Citation2013; Hyslop and White, Citation2008b); and estimates of different uncertainty components (blanks, MDL, sampling artifacts) for OC, mass, ions, and elements by EDXRF (Iyer et al., Citation2000; Gutknecht et al., Citation2001; Kim et al., Citation2001; Lewtas et al., Citation2001; Fehsenfeld et al., Citation2004; Subramanian et al., Citation2004; Gego et al., Citation2005; Kim et al., Citation2005; Gutknecht et al., Citation2006; Zhao et al., Citation2006; Chow et al., Citation2008; Hyslop and White, Citation2008a; Watson et al., Citation2009; Chow et al., Citation2010; Gutknecht et al., Citation2010; Cheng et al., Citation2011; Maimone et al., Citation2011; Malm et al., Citation2011).
Several special EPA Office of Air Quality Planning and Standards (OAQPS) studies also were conducted to examine: sampling artifacts associated with the QFF medium caused by a specific filter cassette (Clark, Citation2002b); the effect of QFF filter type (manufacturer) on collection and analysis of OC and EC (Peterson et al., Citation2007); the extraction efficiency of ions from nylon filters using water rather than a basic solution, the latter as used historically (Clark, Citation2002a; Clark, Citation2003); the efficacy and capacity of different diffusion denuder coatings for the removal of acid gases (Fitz, Citation2002); and the reanalysis of existing filters for independent interlaboratory comparisons and sample storage stability testing (Coutant and Stetzer, Citation2001; TTN, Citation2012b). Approaches used to estimate uncertainty for the CSN and IMPROVE networks based on CSN and IMPROVE measurements are summarized in the next few sections, including estimates of sampling artifacts, blank estimates, and MDL. Data completeness, an important parameter with regard to data validity, also is summarized for both networks.

CSN and IMPROVE Quality Assurance Programs

A rigorous quality assurance program (EPA, Citation1998; Flocchini et al., Citation2002; EPA, Citation2012d; EPA, Citation2012f; Smiley, Citation2013), as conducted by EPA for CSN and IMPROVE, helps to ensure the delivery of high-quality data with known uncertainty to data users, such as state and local agencies, the NPS, and other stakeholders. These groups use the data to develop strategies to protect public health and welfare from the adverse effects of air pollution. Quality assurance helps to identify factors that can increase uncertainty and, once minimized, ensures those uncertainty levels are maintained by establishing a set of performance criteria or data quality objectives (DQOs) (EPA, Citation1999b; White et al., Citation2005). For historical purposes, lists the performance criteria recommended for the CSN and IMPROVE networks in developing the joint networks (Koutrakis, Citation1999; Solomon et al., Citation2000, Solomon et al., Citation2003).

Table 8. Recommended performance criteria used to develop data quality objectives for CSN

QA field and laboratory programs

The EPA conducts system and performance audits for field and laboratory measurements in accordance with EPA Quality Assurance Project Plans for CSN and IMPROVE (EPA, Citation2000; Flocchini et al., Citation2002; EPA, Citation2012f). Field and laboratory audits include both system and performance audits. Laboratory audit results are available in EPA (Citation2011). Field audit results are not currently posted for public access, although RTI is beta testing an input to the EPA Air Quality System (AQS) for field audit results.

Field audits

Quality assurance project plans (QAPP) for CSN (EPA, Citation2000) and IMPROVE (Flocchini et al., Citation2002) field sampling operations provide guidance and detailed direction on field audits and procedures. The updated CSN QAPP (EPA, Citation2012f) and IMPROVE QAPP (IMPROVE, Citation2002) are currently the overarching quality assurance documents for CSN and IMPROVE’s external EPA field audits. Additional individual guidance documents are available for CSN (EPA, Citation2012b) and IMPROVE (IMPROVE, Citation2012e). A limited number of independent field audits were conducted by the EPA Office of Radiation and Indoor Air (ORIA), Radiation and Indoor Environments National Laboratory (R&IENL), Las Vegas, NV (western sites), and by OAQPS (eastern sites). Based on guidance in the CSN QAPP (EPA, Citation2000), comprehensive field audits were to occur at approximately 25% of the CSN sites annually. This percentage was never realized due to EPA travel budget reductions in 2001, so the EPA began to aggressively train SLT agency and NPS personnel to conduct the field audits. SLT and NPS audits are conducted by groups within these organizations that are independent of field sampling operations. Audits by EPA personnel were eliminated by 2010.

Field systems audits consist of six parts: (1) identifying the organization of and responsibilities for site operations, (2) evaluating safety, (3) validating that site location and samplers meet 40 CFR Part 58, Appendices A and E, siting criteria, (4) confirming that the monitoring area and logbooks are maintained, (5) validating proper sample handling and chain of custody, and (6) ensuring that proper procedures are being followed for onsite storage and shipping. Performance audits, conducted simultaneously with systems audits, consist of validating measurement quality objectives (MQOs) (; EPA, Citation2012f) against traceable reference standards.

Table 9. Measurement quality objectives for the CSN and IMPROVE networksa

Laboratory audits

The EPA Office of Radiation and Indoor Air, National Analytical Radiation Environmental Laboratory (NAREL) provides annual performance evaluation (PE) samples to laboratories that analyze CSN and IMPROVE filters. These samples consist of filters and solutions with known quantities of the analyte spiked onto the filters, using NAREL laboratory analysis results as the reference (EPA, Citation2012b). NAREL also collects colocated ambient filters using the SuperSASS, which are then distributed with blank filters for analysis of mass (Teflon filters), ions (nylon and Teflon filters), carbon (QFF), and trace elements (Teflon filters). Separate anion and cation solutions of known concentrations and uncertainty also are provided for IC analysis. NAREL also sends performance audit (PA) samples to the analytical laboratories. These include NIST-traceable metal weights for gravimetric analysis and sucrose-spiked QFF for TOA analysis. For trace elements, Teflon filters with ambient PM are circulated that have been independently analyzed by EPA’s National Exposure Research Laboratory (NERL) EDXRF facility. Once the audited laboratories have analyzed the filters, they are returned to NERL for reanalysis to ensure that the concentrations of elements on the filters have not changed due to handling and shipping. On-site laboratory technical systems audits (TSA) also are conducted by NAREL personnel. Results from system and performance laboratory audits are available (EPA, Citation2011).

Artifact and blank corrections

A blank is a relatively constant error affecting the measurement value and can be evaluated with a control sample in which all steps of the analysis are performed in the absence of the actual sample. The blank value is then applied as a correction (subtraction) to the measurement. Three types of blank filters have been collected for these programs: laboratory, trip, and field. Each type helps to identify potential areas of contamination and is part of the standard EPA quality assurance/quality control programs.

Artifacts can be positive or negative and are variable errors that affect the measurement and may not always be properly evaluated with a blank. When sampling atmospheric particles, artifacts are typically due to collection of gases on a sampling substrate or volatilization of sample already collected. Determining the impact of an artifact on the measurement may be complicated (e.g., Chow, Citation1995; Hering and Cass, Citation1999; Solomon et al., Citation2000; Turpin et al., Citation2000; Kim et al., Citation2001; Pang et al., Citation2002; Ashbaugh et al., Citation2004a, Citation2004b; Subramanian et al., Citation2004; Chow et al., Citation2005a, Citation2005b; Kim et al., Citation2005; White et al., Citation2005; Yu et al., Citation2006; Watson et al., Citation2009; Chow et al., Citation2010; Maimone et al., Citation2011). Sampling artifacts are primarily associated with semivolatile components in organic carbon and ammonium nitrate. For example, issues associated with sampling OC on QFFs include adsorption and absorption of gases on both collected particles and the quartz-fiber filter material (positive sampling artifacts), which may occur passively during storage, transport, and handling or actively during sampling. Conversely, volatilization of semivolatile organic compounds (SVOC) from collected particles (negative sampling artifacts) can occur during sampling, with a likely dependence on flow rate or pressure drop, temperature, and RH (McDow and Huntzicker, Citation1990). Negative artifacts also can occur during transport and storage after collection. Both positive and negative OC sampling artifacts are further influenced by ambient concentrations and species of volatile organic compounds (VOC) and SVOC, by filter lot, filter preparation (thermal cleaning temperature and duration), filter holder material, and other variables.

Several alternative techniques have been suggested to estimate the OC artifact. Examples include: use of a single QFF backup filter behind the primary QFF, as in IMPROVE, or use of a QFF behind a Teflon filter; use of a denuder (e.g., activated-carbon-impregnated strips or an XAD-coated glass annular denuder) in front of a reactive backup filter (e.g., carbon- or XAD-impregnated glass-fiber filter) behind the primary QFF; and application of a linear regression method in which OC (y-axis) is plotted versus PM2.5 mass as obtained on a Teflon filter (Peters et al., Citation2000; Turpin et al., Citation2000; Solomon et al., Citation2000; Lewtas et al., Citation2001; Tolocka et al., Citation2001; Subramanian et al., Citation2004; Kim et al., Citation2005; Watson et al., Citation2009; Chow et al., Citation2010; Maimone et al., Citation2011). Other statistical methods also have been investigated for CSN (Kim et al., Citation2001).

Loss of semivolatile ammonium nitrate also has been well documented (e.g., Hering and Cass, Citation1999; Clark, Citation2002b; Pang et al., Citation2002; Ashbaugh et al., Citation2004a; White et al., Citation2005; Yu et al., Citation2006; Solomon et al., Citation2008, and references within). Ammonium nitrate artifact (loss), related to nitrate only, is minimized in these networks by the use of a denuder-reactive filter (nylon) pair (; Solomon et al., Citation1992, Solomon et al., Citation2003; Hering et al., Citation2004; Chow et al., Citation2005b). The denuder removes the gas-phase interferences, primarily nitric acid, and the nylon filter captures any nitrate volatilized (HNO3) from the particles during sampling (Solomon et al., Citation1992, Solomon et al., Citation2003; Hering et al., Citation2004; Chow et al., Citation2005b).

The CSN and IMPROVE programs use different strategies to account for blanks and artifacts associated with the filter material, filter handling, transport, storage, and laboratory contamination, with the greatest difference associated with OC. These different strategies for OC were part of the reason CSN switched to IMPROVE protocols for sampling and analysis of organic and elemental carbon, as noted above. We next describe the analytical protocols and highlight the differences, providing data for 2006 and 2012 in .

Table 10. Tripa and field-filter blank values for ionic species by year and sampler

Table 11. Minimum detection limits for CSN and IMPROVE mass and species for 2006 and 2012

Table 12. Carbon artifact concentrations based on nominal flow rates for CSN and IMPROVE and assuming a 24-hr sample: for CSN, average of trip and field blank samples; for IMPROVE, median of QFF after-filter values

Table 13. Overview of MDL calculations for CSNa and IMPROVE

CSN

Field and trip blanks have been collected at rates of 10% and 3%, respectively, resulting in more than 1000 trip and field blanks per year for each species. Approaches for collecting trip and field blanks in CSN are similar (EPA, Citation2000). Trip blanks are not removed from their containers in the field, while cartridges containing field blanks are installed in the sampler for a few minutes, sealed, and then stored in the sampler housing until the samples are retrieved. Based on CSN field and trip blank data collected since 2000, there is no statistical difference between these two types of blanks for the major species, so the data sets are merged to estimate blank or artifact values for each component by sampler type ( and ). For this reason, and for budgetary reasons, CSN stopped collecting and analyzing trip blanks in 2013. Field and trip blank values, along with various other operational and QC data, are summarized in the CSN semiannual and annual data reports posted on AMTIC (Technology Transfer Network [TTN], Citation2012a).

CSN concentration data available in AQS are not corrected for measured blank or artifact values. As noted earlier, field and trip blanks were collected for only a small fraction of the total number of samples. Thus, in the absence of a unique blank value for each sampling event, blank correction requires assumptions about the distribution of blank levels across sites and averaging times and differs by species. Most importantly, CSN data users’ needs differ (e.g., spatially, local to national; averaging time, daily, seasonal, annual; planning, current vs. future estimates), and this approach allows data users the flexibility to apply blank/artifact corrections to meet their own data analysis and policy objectives. CSN semiannual and annual data summary reports show low blank values for ionic species (sulfate, nitrate, ammonium, sodium, and potassium) as measured on nylon filters (), below-MDL values for EC as measured on QFF (), and for most trace elements as measured by EDXRF on Teflon filters. The latter have been below MDL since late 2002 to early 2003, when filter contamination issues for a few specific elements were resolved. Gravimetric mass, on the other hand, is subject to relatively frequent outliers above the MDL, probably due to contamination during sampling and handling. Blank values for organic carbon (including subfractions) are almost always above the MDL of the analysis method, due to sorption of gas-phase carbon compounds onto the quartz-fiber filter medium as well as onto previously collected particles (“organic artifact”), as discussed earlier. High blank values also can occur episodically for virtually any species. For example, during 2001 and 2002, sodium ion blanks were unusually high due to high sodium levels associated with nylon filters as received from the manufacturer.
This situation was addressed by modifying the SOP to include washing new nylon filters prior to use (RTI, Citation2009b). Teflon filters contaminated with excessive manufacturing debris apparently caused another episode of high blank values for gravimetric mass. This was addressed by flagging suspicious data and increasing the rate of laboratory QC, including increased duplicate weighings: all filters were weighed in duplicate prior to being used for field sampling, and after field sampling, replicates were performed at a rate of 30%. After several years with no further filter batch problems detected, the rate of QC replicates was reduced to the baseline of 10%.

Average OC artifact values (average of trip and field samples) for CSN by sampler type for 2006 and 2012, and seasonal averages for each year, are presented in . However, trip and field blanks may not be good estimates of the true OC artifact, since blanks experience only passive sorption; active positive and negative sampling artifacts are not reflected in the blank data (e.g., Maimone et al., Citation2011). Also not considered are positive and negative artifacts due to the filters remaining in the samplers for up to 3 days after collection. Since the artifact equilibrium changes with a variety of physical (e.g., pressure drop across the filter) and environmental conditions (e.g., temperature [T], RH, SVOC concentration) (Turpin et al., Citation2000), the OC artifact associated with QFF field blanks varies not only by sampler type but also by season and from year to year, with the MetOne and Anderson samplers having similar levels, both higher than URG and IMPROVE.

When the EPA changed from the MetOne sampler to the URG 3000N for collecting PM on QFF, CSN started collecting dynamic blanks that could be used for blank correction. The new approach employs a backup QFF behind the front QFF, with the assumption that the same artifact is observed on both filters (Subramanian et al., Citation2004; Watson et al., Citation2009; Maimone et al., Citation2011). CSN results in AQS are reported without the dynamic OC blank correction, although the dynamic blank results are posted in AQS and can be used to make the correction. lists typical OC artifact corrections for each of the carbon components reported by CSN (ng/m3). Dynamic blanks for OC are collected at all sites, rather than at 6 to 12 sites (depending on the year) as done for IMPROVE. To achieve this within budget, the number of field blanks collected in the rest of the network was reduced from 10% to 5% (EPA, Citation2007c).

IMPROVE

The IMPROVE program collects field blanks at a rate of approximately 1% for Teflon filters and 2 to 5% for nylon filters and QFFs. IMPROVE field blanks are installed in an unused sampling port or in a port in a separate module. Filters are left in the unused port during the entire sampling period, as opposed to CSN, where they are installed in the sampler for only a few minutes.

The IMPROVE program applies blank and artifact corrections to data prior to calculating atmospheric concentrations (i.e., at the μg/filter level). Thus, data available on the IMPROVE website have been blank corrected. Blank values for mass and elements as measured on Teflon filters are typically below detection limits so no correction is applied. Sulfate, chloride, nitrite, and nitrate as measured on nylon filters are corrected using data obtained from field blanks. OC/EC artifacts are estimated using the dynamic blank, an active rather than a passive approach, as described later. Seasonal (e.g., March, April, May) median artifact correction values were applied to the concentration data for IMPROVE prior to June 2002. Since then, monthly artifact correction values are applied to the concentration data.

lists typical average blank values for 2006 and 2012 and seasonal averages for each year for the ionic species reported by IMPROVE. Although IMPROVE applies these corrections prior to calculating ambient concentrations, the values in the table have been divided by 33 m3, the nominal volume collected by an IMPROVE sampler (i.e., 22.8 L/min for a 24-hr duration), to allow direct comparison with CSN results.
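The blank correction and volume normalization described above can be sketched in a few lines of code. This is an illustrative sketch only, not network code: the filter loadings and blank values are hypothetical, and the median-based monthly correction mirrors the approach described in the text.

```python
# Sketch of an IMPROVE-style blank correction and volume normalization;
# filter loadings and blank values below are hypothetical.
from statistics import median

FLOW_L_PER_MIN = 22.8                  # nominal IMPROVE flow rate
DURATION_MIN = 24 * 60                 # 24-hr sample duration
NOMINAL_VOLUME_M3 = FLOW_L_PER_MIN * DURATION_MIN / 1000.0  # 32.832 m3 (~33 m3)

def blank_corrected_conc(sample_ug_per_filter, monthly_blanks_ug_per_filter):
    """Subtract the monthly median field-blank loading, then convert to ug/m3."""
    correction = median(monthly_blanks_ug_per_filter)
    return (sample_ug_per_filter - correction) / NOMINAL_VOLUME_M3

# Hypothetical nitrate loadings (ug/filter) for one month's field blanks:
blanks = [0.45, 0.52, 0.38, 0.61, 0.49]
conc = blank_corrected_conc(12.0, blanks)  # ug/m3
```

In practice each sample's measured volume would replace the nominal volume; the nominal 33 m3 is used here only for comparability, as in the text.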

Carbon artifact in IMPROVE is estimated from a backup QFF placed immediately downstream of the primary QFF, which is a dynamic blank rather than the static blank used in CSN (Turpin et al., Citation2000; Subramanian et al., Citation2004; Maimone et al., Citation2011). Prior to 2008, a backup QFF was collected with each sample at only six geographically dispersed sites. Beginning in 2008, the number of sites was expanded to 12. As described earlier, it is assumed that the second QFF collects the same fraction of gas-phase quartz-adsorbable SVOC and VOC as the front filter, and therefore provides an estimate of the front QFF OC artifact (Subramanian et al., Citation2004; Watson et al., Citation2009; Maimone et al., Citation2011). lists typical OC artifact corrections for each of the carbon components reported by IMPROVE, adjusted for sample volume as described already for ions. EC artifact corrections are applied, but they are typically near zero and are not listed here.

On a sample-by-sample basis, the median backup QFF should provide a reasonable estimate of the OC artifact on the primary QFF if both filters reach equilibrium, which is often the situation in urban atmospheres (Subramanian et al., Citation2004), although studies need to be conducted to ensure this is true in more remote areas where IMPROVE sites are located. However, averaging across multiple sites located in different parts of the country where the OC artifact also can vary significantly by season may bias the correction for samples that are not close to the average (Maimone et al., Citation2011).

Minimum detectable limits (MDL)

CSN and IMPROVE use slightly different approaches to calculate MDLs, as summarized in and below. Hyslop and White (Citation2008a) examined the variability in MDL values between the two networks using colocated measurements within each network for six of the trace elements (Ti, Mn, Cu, As, Se, and Pb) measured by EDXRF. Analytical MDL values (ng/cm2) were similar, indicating that the analytical methods have similar sensitivities for the species examined, although this was before IMPROVE changed to the PANalytical EDXRF instrument. Similar analytical sensitivities (ng/cm2) are observed for ion chromatography, since the same laboratory analyzes samples from both networks. Since CSN switched to the IMPROVE protocol by the end of 2009 for TOA, the analytical MDL values are the same for OC and EC. As described in the following, differences are observed between the networks when MDL values are given in concentration units.

CSN

Between the start of the program in 2000 and June 2003, a single set of MDL values (μg/m3) was reported for each species and sampler type. These MDL values are given in the CSN data summary reports (TTN, Citation2012a) and were provided in response to data users’ inquiries. The original MDL estimates were established in several ways: For OC/EC, the instrument manufacturer’s published detection limit was used; for gravimetric mass, the balance resolution was taken as the limit of detection; for ions, the standard deviation of 7 to 10 low-level replicate measurements multiplied by the appropriate Student’s t factor was used for the MDL (Federal Register, Citation2014); and for EDXRF, MDL values were determined by Chester LabNet according to Method IO 3.3 (with a multiplier of 2) (EPA, Citation1999c).

After June 2003, MDLs are determined for each analyzer separately, and data records in AQS report instrument-specific MDL values with each concentration; these values can therefore vary significantly from filter to filter. Individual MDL values are based on three times the standard deviation of a large set of laboratory blanks, low-level samples, or equivalent (Federal Register, Citation2013; Federal Register, Citation2014; EPA, Citation1999c). MDL values are reevaluated and updated annually or when an instrument has undergone major maintenance that could affect its performance. AQS does not include a data field to identify the instrument used to analyze each sample.
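The two CSN MDL formulas described above (the Student's t procedure used for ions in the original estimates, and the post-2003 three-sigma approach) can be sketched as follows. The data values are hypothetical; the t factor of 3.143 is the one-tailed 99% Student's t value for 6 degrees of freedom, corresponding to the 7-replicate case of the 40 CFR Part 136 Appendix B procedure.

```python
# Sketch of the two CSN MDL estimators described in the text;
# replicate and blank values are hypothetical.
from statistics import stdev

T_99_DF6 = 3.143  # one-tailed 99% Student's t, 6 degrees of freedom (n = 7)

def mdl_students_t(low_level_replicates):
    """MDL per 40 CFR Part 136 App. B: t factor times the standard
    deviation of low-level replicates (n = 7 assumed in this sketch)."""
    assert len(low_level_replicates) == 7
    return T_99_DF6 * stdev(low_level_replicates)

def mdl_three_sigma(blanks_or_low_level):
    """Post-June-2003 CSN approach: three times the standard deviation
    of a set of laboratory blanks or low-level samples."""
    return 3.0 * stdev(blanks_or_low_level)
```

For 8 to 10 replicates the t factor would change with the degrees of freedom; the value above applies only to the 7-replicate case.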

CSN does not apply a separate flag for values below detection limit but posts the value along with its detection limit for each record in AQS. This allows the user to use whatever procedures for handling values below detection limits that they find appropriate. Negative concentration values are automatically set to zero when reporting to AQS.

A summary of average MDL values (ng/m3) for the CSN SASS sampler for 2006 and 2012 is provided in and results for all years since 2004 at TTN (TTN, Citation2012a). Method detection limits for OC, EC, and carbon fractions changed for CSN during this period, as shown in , due to the change from the CSN OC/EC method to the IMPROVE OC/EC method. In a few cases, CSN and IMPROVE MDL values appear higher in 2012 than in 2006. As noted earlier, MDL values are calculated based on data obtained from field and trip blanks or from the standard deviation of low-level samples, both of which can change over time. Replacing an old instrument also can result in changes to the MDL, as noted by IMPROVE; that is, changing XRF units resulted in a few species having slightly higher MDL values, although many had lower MDL values. As noted for CSN, major maintenance also can result in small changes to MDL values over time. Slight differences in analytical methods and sampling protocols also can account for differences in MDL values between the networks. Most importantly, both networks monitor and report these changes as described already. In some cases, MDL values appear significantly lower for one network than for the other.

IMPROVE

Individual MDLs are provided with each concentration value in the IMPROVE database. A concentration is determined to be significantly different from zero only if it is greater than the MDL. For IC and TOR OC/EC analyses, the MDL corresponds to twice the standard deviation of the field blanks or backup QFF filters, respectively. For mass and light absorption, the MDL corresponds to twice the analytical precision determined by laboratory blanks (mass) or low-absorbing controls (light absorption). Prior to 2011, the MDL for each element for PESA and EDXRF (the latter using the Cu and Mo anode XRF systems described previously) was based on system blanks, specifically 3.29 times the square root of the background counts under the region that would have been occupied if the element were present in the sample (IMPROVE, Citation1997). Beginning in 2011, the MDL for EDXRF (using the PANalytical Epsilon 5) is determined as the 95th percentile of the intensity measured on a set of typically 50 to 100 field blanks (IMPROVE, Citation2014a).
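The two IMPROVE MDL estimators described above can likewise be sketched in code. The field-blank values are hypothetical, and the nearest-rank percentile used here is one common convention; the interpolation actually used by the network may differ.

```python
# Sketch of the IMPROVE MDL estimators described in the text;
# field-blank data are hypothetical.
from statistics import stdev

def mdl_two_sigma(field_blanks):
    """IC and TOR OC/EC: twice the standard deviation of field blanks
    (or backup QFF filters, for OC/EC)."""
    return 2.0 * stdev(field_blanks)

def mdl_95th_percentile(blank_intensities):
    """Post-2011 EDXRF: 95th percentile of intensities measured on a
    set of typically 50-100 field blanks (nearest-rank convention)."""
    ordered = sorted(blank_intensities)
    rank = max(0, int(round(0.95 * len(ordered))) - 1)
    return ordered[rank]
```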

Lower MDL values in ambient-concentration units (μg/m3) are observed for IMPROVE relative to CSN because of the higher areal mass loading achieved by the IMPROVE sampler relative to the MetOne SASS (; Hyslop and White, Citation2008a). This corresponds to a potential sensitivity improvement of almost a factor of 6 (in m3 of air sampled per cm2 of filter), often needed by IMPROVE due to the lower ambient PM concentrations in pristine areas. Larger filter areas and lower flow rates were employed in CSN to avoid filter clogging at the higher PM concentrations observed in urban locations. The effect of this theoretical advantage in sensitivity between the networks is an important issue that will be investigated in a later paper. The ability to detect and quantify low levels of elements observed in remote areas and used as tracer species for certain sources (e.g., V and Se for oil and coal combustion) could significantly impact the ability of receptor models to apportion ambient PM to its sources (Flanagan et al., Citation2014).

Data completeness

Completeness describes the amount of valid data obtained from a measurement system compared to the amount that was expected to be obtained if all samples were collected successfully. It is an important quality assurance indicator and provides a measure of data validity. For example, when determining compliance for PM10, a minimum of 75% of the scheduled PM10 samples per quarter is required for compliance calculations (Federal Register, Citation2006, Part 50). Table 14 provides percent data completeness for major PM components for both CSN and IMPROVE for two representative years, 2006 and 2012. Data completeness for both networks was in the range of 90–100%.
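The completeness statistic and the 75% quarterly criterion reduce to simple arithmetic; a minimal sketch, with hypothetical sample counts:

```python
# Sketch of the data-completeness calculation described above.
def percent_complete(valid_samples, scheduled_samples):
    """Percent of scheduled samples that produced valid data."""
    return 100.0 * valid_samples / scheduled_samples

def meets_75_percent_criterion(valid_samples, scheduled_samples):
    """Quarterly 75% completeness requirement (e.g., for PM10 compliance)."""
    return percent_complete(valid_samples, scheduled_samples) >= 75.0

# Hypothetical quarter: 27 valid of 30 scheduled samples -> 90% complete
quarterly = percent_complete(27, 30)
```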

Table 14. Data completeness estimates for CSN and IMPROVE for 2006 and 2012 by year and analytical method

Precision based on colocated data

Whole-system precision estimates based on colocated sampling are obtained for both networks according to the procedure described in the Federal Register (Citation2006, Part 58, App. A, section 4.3).

Colocated CSN samplers have been operated at six STN sites since the beginning of the program, accounting for between 3% and 5% of the entire network or roughly 12% of the STN sites. The full network percentage has increased slightly as network reductions have taken place. Colocated samplers at a given site are operated by the same operator and use filters prepared and shipped at the same time but in separate containers. No effort is made to randomize the filter pairs between laboratory instruments. Precision estimates for CSN based on colocated sampling results are calculated annually and are summarized for the major species in the CSN annual data summary reports (EPA, Citation2012g). At each of the six colocated CSN sites, the primary sampler operates on a 1-in-3 day schedule, while most of the colocated samplers operate on a 1-in-6 day schedule.

Precision results, expressed as percent average relative difference between colocated samplers, are presented in Table 15 for CSN for 2006 and 2012 for components that are typically at least 3 times above their MDL; other elements are not reliably quantified and are not included in the table. Precision results for CSN mass, ionic species (by IC), and carbon components are around 20 ± 5%. Trace elements by EDXRF are mostly in the range from 25% to 40%. Light elements analyzed by EDXRF, such as sodium and aluminum, show relatively poor precision compared with heavier elements due to self-absorption as described previously (see, e.g., Dzubay and Nelson, Citation1975; Berry et al., Citation1969).
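One common form of the colocated-precision statistic named above (percent average relative difference) can be sketched as follows; the exact formulation in 40 CFR Part 58, Appendix A may differ in detail, and the concentration pairs below are hypothetical.

```python
# Sketch of a percent-average-relative-difference precision statistic
# for colocated sampler pairs; data are hypothetical.
def percent_avg_relative_difference(pairs):
    """Mean absolute relative difference between colocated measurement
    pairs (x, y), relative to the pair mean, expressed in percent."""
    diffs = [abs(x - y) / ((x + y) / 2.0) for x, y in pairs]
    return 100.0 * sum(diffs) / len(diffs)

# Hypothetical colocated sulfate concentrations (ug/m3):
pairs = [(10.0, 11.0), (5.0, 4.6), (8.2, 8.0)]
precision = percent_avg_relative_difference(pairs)
```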

Table 15. Precision estimates for major components as percent average relative difference between colocated samplers

The IMPROVE sampler includes four filter holders per module (either A, B, C, or D; see ) that have separate solenoid valves to allow for sequential sampling over multiple sampling days. Prior to 2003, one of the four filter sets was used to obtain a colocated sample for the specific filter type assigned to that module. Beginning in 2003, IMPROVE installed single colocated modules (either A, B, C, or D) at different sites. Phoenix has all four modules colocated. There are seven duplicate modules of each type (A, B, C, and D), which is equivalent to having colocated samplers at 4% of the ˜170 IMPROVE sites.

For major components, precision estimates in IMPROVE range from 3% for sulfate to 130% for the third elemental carbon fraction. Table 15 presents colocated precision estimates as percent average relative difference determined from IMPROVE samples collected in 2006 and 2012. Colocated precision tends to improve with increasing data completeness, is typically better when the analysis is performed on the whole filter instead of a fraction of the filter, and is better for species that are predominantly in the fine size fractions (Hyslop and White, Citation2008b).

CSN and IMPROVE Program Histories

Changes to network field sampling and analytical procedures can cause a step change in reported data; therefore, it is important to identify when these changes took place. Several network changes took place in IMPROVE before 2000, and these are noted in the IMPROVE newsletters (IMPROVE, Citation2014b). Table 16 provides a brief history of the networks since 2000, summarizes field and laboratory changes in both networks, and indicates the reason for each change. Major changes are highlighted in the following, with possible implications of each major change for long-term network results.

Table 16. Startup history and summary of changes in field and laboratory network operations since 2000 when CSN began initial operations

Between 2005 and 2006, CSN underwent a reduction in the number of monitoring locations due to reduced funding. Most of the reductions occurred at SLAMS sites and at locations where PM concentrations no longer exceeded the PM NAAQS. However, the most significant change to CSN began in early 2007 with replacement of the CSN carbon measurement approach with the IMPROVE carbon measurement approach, albeit with minor exceptions as described earlier (EPA, Citation2007a, Citation2007b; EPA, Citation2009b; EPA, Citation2012e). The replacement was completed in October 2009 (EPA, Citation2009b). In 2008, CSN also began transitioning to the IMPROVE_A analysis method for carbon, which was completed by 2009. These changes were implemented to obtain consistency between the two networks for carbon. A reduction in the collection of network field blanks was applied to offset the increased cost of analyzing the additional backup QFF used to estimate the OC artifact.

As sites changed from the CSN OC/EC approach to the IMPROVE approach, a step change is likely to be observed, potentially with slightly lower OC and higher EC; total carbon will likely change little (Chow et al., Citation2001, Citation2004; Schmid et al., Citation2001; Conny et al., Citation2003; Watson et al., Citation2005; Cheng et al., Citation2011). This change also resulted in better precision and lower MDL values for OC, EC, and OC fractions, as seen in when comparing 2006 to 2012. Since the change from the CSN to the IMPROVE approach for OC and EC took place over about 2.5 years, a few sites at a time, the Carbon Conversion Site Lists (EPA, Citation2012e) will need to be reviewed on a site-by-site basis when examining long-term trends for OC and EC during the period from mid-2007 through nearly the end of 2009. However, the consistency achieved between the networks for understanding regional impacts of carbon in urban areas and vice versa was deemed sufficient reason for the change.

Several small changes have taken place in CSN since 2000, including switching to a smaller shipping container and reducing the number of trace elements determined by EDXRF from 48 to 33 (see ), since many of the elements removed were below their MDL. Both of these were implemented to reduce cost but should have no long-term impact on the results. In 2009 CSN also adjusted the attenuation factor used for four low-molecular-weight elements. This resulted in a small step change to the reported concentrations of these elements.

IMPROVE has also had several changes since 2000, although most have had only minor and typically positive impacts on results. At the end of 2001, the analysis method for elements from Na to Mn was changed from PIXE to EDXRF with a Cu anode, and at the end of 2002 EDXRF run times were standardized at 1000 s. These changes resulted in improved and more consistent detection limits for elements of interest. At the beginning of 2005, the carbon analysis instrument was replaced with a new model, resulting in a small change to the IMPROVE analysis approach (IMPROVE_A); details of this change can be found in Chow et al. (Citation2007). In 2008 the number of sites with QFF backup filters was increased from six to 12, providing better spatial coverage and geographic diversity of the data used to correct for potential positive and negative OC sampling artifacts. In 2011 the EDXRF was replaced with a new instrument (PANalytical Epsilon 5); the list of elements reported remained unchanged. Also in 2011, the PESA analysis for hydrogen was discontinued due to budget limitations. The hydrogen data from PESA were used almost exclusively for quality control checks, so eliminating the measurement had essentially no effect on results. Better precision and lower MDL values were realized for a number of elements due to the switch to the PANalytical XRF system. Improvements were observed for sulfur and the soil-related elements required by the Regional Haze Rule, as well as for lighter elements such as sodium, aluminum, and silicon. Some degradation of precision and MDL values occurred for a few trace elements, such as zinc and selenium.

Summary and Conclusion

This paper describes the CSN and IMPROVE networks in detail through 2012 with minor updates through the beginning of 2014. It outlines the differences in the field and laboratory approaches and summarizes the analytical parameters that affect data quality.

In general, the field and laboratory approaches used in the CSN and IMPROVE networks are similar; however, there are many subtle differences in the sampling and analysis methods employed, as well as shipping, blank/artifact estimation and correction approaches, and estimation of uncertainties. These differences may be important depending on the application of the results and related policy objectives of the study using these data. Several changes in measurement and analytical protocol also have occurred since 2000. While some of these changes may result in step changes in the long-term database, they have been identified and in general will have minimal impact on use of these data.

A recent assessment of the CSN network (Landis, Citation2014) examined a range of parameters that would suggest which sites to remove from the CSN network beginning in early 2015 due to reduced budgets. The assessment considered, for example, the number of sites within a geographic area, how PM correlated among those sites, and whether the sites were within an attainment area. Fifty-three sites were recommended for defunding. Other recommended network reductions include eliminating CSN mass measurements, beginning in July 2014, at the 119 sites where mass is measured by the FRM; reducing sampling frequency to 1 in 6 days at all but NCore sites (EPA, Citation2005c; Scheffe et al., Citation2009) and STN sites; reducing carbon blanks from 10% to 5%; and eliminating carbon dynamic blanks, currently collected at 5% of sites. Implementation of the recommendations is scheduled for early 2015, if approved and after all stakeholders have provided input. While these recommendations are not final, the impact of these potential reductions will not be known for several years.

Future papers will directly compare results from colocated measurements at three urban and three rural sites, with each site having one of the three CSN samplers and an IMPROVE sampler, and will examine whether there are practical differences when using the same data sets in the same receptor modeling application.

Acknowledgment

The U.S. Environmental Protection Agency through its Office of Research and Development assisted in the preparation of this article. It has been subjected to the agency’s administrative review and approved for publication. Mention of trade names or commercial products does not constitute endorsement or recommendation for use.

Funding

Preparation of this paper was partially supported under an RTI Professional Development Award.

Additional information

Notes on contributors

Paul A. Solomon

Paul A. Solomon is a Senior Research Scientist at the U.S. Environmental Protection Agency, Office of Research and Development, National Exposure Research Laboratory, Las Vegas, NV.

Dennis Crumpler

Dennis Crumpler is an environmental engineer at the U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Ambient Air Monitoring Group, Research Triangle Park, NC.

James B. Flanagan

James B. Flanagan is a former environmental scientist at RTI International, Research Triangle Park, NC.

R.K.M. Jayanty

R.K.M. Jayanty is the senior fellow and program manager for the CSN program at RTI International, Research Triangle Park, NC.

Ed E. Rickman

Ed E. Rickman is the database manager at RTI International, Research Triangle Park, NC.

Charles E. McDade

Charles E. McDade is the principal investigator for the IMPROVE program at Crocker Nuclear Laboratory, University of California, Davis, CA.

References

  • Ashbaugh, L.L., and R.A. Eldred. 2004a. Loss of particle nitrate from Teflon sampling filters: Effects on measured gravimetric mass in California and in the IMPROVE Network. J. Air Waste Manage. Assoc. 54(1): 93–104. doi:10.1080/10473289.2004.10470878
  • Ashbaugh, L.L., C.E. McDade, W.H. White, P. Wakabayashi, J.L. Collett, Jr., and Y. Xiao-Ying. 2004b. Efficiency of IMPROVE network denuders for removing nitric acid. In Proceedings, Regional and Global Perspectives on Haze: Causes, Consequences and Controversies, 32-1–32-8. Pittsburgh, PA: Air & Waste Management Association.
  • Baker, W.C., and J.F. Pouchot. 1983. The measurement of gas flow, Part I. J. Air Waste Manage. Assoc. 33(1): 66–72. doi:10.1080/00022470.1983.10465548
  • Berry, P.F., T. Furuta, and J.R. Rhodes. 1969. Particle size effects in x-ray spectrometry. In Advances in X-Ray Analysis, ed. C.S. Barrett, 612–632. New York, NY: Plenum Press.
  • Bond, T.C., T.L. Anderson, and D.E. Campbell. 1999. Calibration and intercomparison of filter-based measurements of visible light absorption by aerosols. Aerosol Sci. Technol. 30(6): 582–600. doi:10.1080/027868299304435
  • Bond T.C., and R.W. Bergstrom. 2006. Light absorption by carbonaceous particles: An investigative review. Aerosol Sci. Technol. 40(1): 27–67. doi:10.1080/02786820500421521.
  • Brook, J.R., E. Vega, J.G. Watson, J.M. Hales, and G.M. Hidy. 2004. Chapter 7: Receptor methods. In Particulate Matter Science for Policy Makers—A NARSTO Assessment, Part 1, 235–281. Cambridge, UK: Cambridge University Press.
  • Cheng, Y., F.K. Duan, K.B. He, M. Zheng, Z.Y. Du, Y.L. Ma, and J.H. Tan. 2011. Intercomparison of thermal-optical methods for the determination of organic and elemental carbon: Influences of aerosol composition and implications. Environ. Sci. Technol. 45(23): 10117–10123. doi:10.1021/es202649g
  • Chester. 2009. Standard Operating Procedure XR-002.04. Analysis of Elements in Air Particulate by X-Ray Fluorescence (Kevex 770 & 772), US EPA IO 3.3. Chester LabNet, Tigard, OR. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/ChesterXR00204.pdf (accessed May 29, 2014).
  • Chow, J.C. 1995. Measurement methods to determine compliance with ambient air quality standards for suspended particles. J. Air Waste Manage. Assoc. 45(5): 320–382. doi:10.1080/10473289.1995.10467369
  • Chow, J.C., J.G. Watson, L.C. Pritchett, W.R. Pierson, C.A. Frazier, and R.G. Purcell. 1993. The DRI Thermal/Optical Reflectance carbon analysis system: Description, evaluation and applications in U.S. air quality studies. Atmos. Environ. 27A(8): 1185–1201. doi:10.1016/0960-1686(93)90245-T
  • Chow, J.C., J.G. Watson, D. Crow, D.H. Lowenthal, and T.M. Merrifield. 2001. Comparison of IMPROVE and NIOSH carbon measurements. Aerosol Sci. Technol. 34(1): 23–34. doi:10.1080/02786820119073
  • Chow, J.C., J.G. Watson, L.W.A. Chen, W.P. Arnott, H. Moosmuller, and K.K. Fung. 2004. Equivalence of elemental carbon by Thermal/Optical Reflectance and Transmittance with different temperature protocols. Environ. Sci. Technol. 38(16): 4414–4422. doi:10.1021/es034936u
  • Chow, J.C., J.G. Watson, L.-W.A. Chen, G. Paredes-Miranda, M.-C.O. Chang, D.L. Trimble, K.K. Fung, H. Zhang, and J.Z. Yu. 2005a. Refining temperature measures in thermal/optical carbon analysis. Atmos. Chem. Phys. 5(4): 2961–2972. http://www.atmos-chem-phys.net/5/2961/2005/acp-5-2961-2005.pdf (accessed May 29, 2014). doi:10.5194/acp-5-2961-2005
  • Chow, J.C., J.G. Watson, D.H. Lowenthal, and K.L. Magliano. 2005b. Loss of PM2.5 nitrate from filter samples in Central California. J. Air Waste Manage. Assoc. 55: 1158–1168. doi:10.1080/10473289.2005.10464704
  • Chow, J.C., J.G. Watson, L.-W.A. Chen, M.C.O. Chang, N.F. Robinson, D. Trimble, and S. Kohl. 2007. The IMPROVE_A temperature protocol for thermal/optical carbon analysis: Maintaining consistency with a long-term database. J. Air Waste Manage. Assoc. 57(9): 1014–1023. doi:10.3155/1047-3289.57.9.1014
  • Chow, J.C., P. Doraiswamy, J.G. Watson, L.-W.A. Chen, S.S.H. Ho, and D.A. Sodeman. 2008. Advances in integrated and continuous measurements for particle mass and chemical composition. J. Air Waste Manage. Assoc. 58(2): 141–163. doi:10.3155/1047-3289.58.2.141
  • Chow, J.C., J.G. Watson, L.-W.A. Chen, J. Rice, and N.H. Frank. 2010. Quantification of PM2.5 organic carbon sampling artifacts in US networks. Atmos. Chem. Phys. 10:5223–5239. http://www.atmos-chem-phys.net/10/5223/2010/acp-10-5223-2010.pdf (accessed May 29, 2014). doi:10.5194/acp-10-5223-2010
  • Clark, M.S. 2002a. Nylon filter extraction study July 30, 2002. Technical memorandum to James Homolya, EPA, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/nylfilstd.pdf (accessed May 29, 2014).
  • Clark, M.S. 2002b. PM2.5 quartz filter experiments. Technical memorandum to James Homolya, EPA, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttnamti1/files/ambient/pm25/spec/carboncsr.pdf (accessed May 29, 2014).
  • Clark, M.S. 2003. Nylon® filter extraction study #2. Technical memorandum to James Homolya, EPA, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/nylstud2.pdf (accessed May 29, 2014).
  • Conny, J.M., D.B. Klinedinst, S.A. Wight, and J.L. Paulsen. 2003. Optimizing thermal-optical methods for measuring atmospheric elemental (black) carbon: A response surface study. Aerosol Sci. Technol. 37:703–723. doi:10.1080/02786820300920
  • Coutant, B., and S. Stetzer. 2001. Evaluation of PM2.5 speciation sampler performance and related sample collection and stability issues, Final report. EPA, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/minitrends.pdf (accessed May 29, 2014).
  • DRI. 2005. Standard operating procedure for the determination of carbon fractions in particulate matter using the IMPROVE_A heating protocol on a Sunset Laboratory dual-mode analyzer. IMP_A Carbon Analysis of PM-SSL. Desert Research Institute, Reno, NV. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/RTIIMPROVEACarbonAnalysisSSLSOP.pdf (accessed May 29, 2014).
  • Dzubay, T.G., and R.O. Nelson. 1975. Self absorption corrections for X-ray fluorescence analysis of aerosols. In Advances in X-Ray Analysis, Vol. 18, ed. W.L. Pickles, C.S. Barrett, J.B. Newkirk, and C.O. Rund, 619–631. New York, NY: Plenum Publishing Corporation.
  • Eldred, R.A., T.A. Cahill, L.K. Wilkinson, P.J. Feeney, J.C. Chow, and W.C. Malm. 1990. Measurement of fine particles and their chemical components in the NPS/IMPROVE Networks. In Transactions, Visibility and Fine Particles, ed. C.V. Mathai, 187–196. Pittsburgh, PA: Air & Waste Management Association.
  • Federal Register. 1997. National Ambient Air Quality Standards for Particulate Matter: Final rule. 40 CFR Parts 50, 53, and 58, July 18. http://www.epa.gov/ttn/oarpg/t1/fr_notices/pmnaaqs.pdf (accessed May 29, 2014).
  • Federal Register. 1999. Part II, Environmental Protection Agency, 40 CFR Part 51, Regional Haze Regulations, Final rule, July 1. http://www.epa.gov/ttn/oarpg/t1/fr_notices/rhfedreg.pdf (accessed May 29, 2014).
  • Federal Register. 2006. National Ambient Air Quality Standards for Particulate Matter: Final rule. 40 CFR Parts 50, 53, and 58, Volume 62 (138). Part 50, Part 53—Ambient Air Monitoring Reference and Equivalent Methods. http://www.ecfr.gov/cgi-bin/text-idx?SID=6786c9c05698e1df6a087fee4f1c0734&node=40:6.0.1.1.1& gn=div5 (accessed May 29, 2014). Part 58—Ambient Air Quality Surveillance, http://www.ecfr.gov/cgi-bin/text-idx?SID=6786c9c05698e1df6a087fee4f1c0734&node=40:6.0.1.1.6&rgn=div5 (accessed May 29, 2014).
  • Federal Register. 2013. Ambient Air Monitoring Reference and Equivalent Methods, 40 CFR Parts 53 - 59, Chapter 1, Subchapter C—Air programs. http://www.gpo.gov/fdsys/pkg/CFR-2013-title40-vol6/pdf/CFR-2013-title40-vol6.pdf (accessed May 29, 2014).
  • Federal Register. 2014. Appendix B to Part 136—Definition and procedure for the determination of the method detection limit—Revision 1.11. 40 CFR Ch. I (7–1–12 Edition).
  • Fehsenfeld, F.C., D. Hastie, J.C. Chow, and P.A. Solomon. 2004. Particle and gas measurements. In Particulate Matter Science for Policy Makers, A NARSTO Assessment, ed. P.H. McMurry, M.F. Shepherd, J.S. Vickery, 159–189. Cambridge, UK: Cambridge University Press.
  • Fitz, D. 2002. Evaluation of diffusion denuder coatings for removing acid gases from ambient air. U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttnamti1/files/ambient/pm25/spec/denudr.pdf (accessed May 29, 2014).
  • Flocchini, R., L. Ashbaugh, R. Eldred, P. Feeney, S. Ixquiac, P. Beveridge. 2002. Quality assurance guidance document revision 0.0. IMPROVE: Interagency Monitoring of Protected Visual Environments, Quality assurance project plan, OAQPS Category 1 QAPP. UC Davis, Davis, CA. http://vista.cira.colostate.edu/improve/publications/QA_QC/IMPROVE_QAPP_R0.pdf (accessed May 29, 2014).
  • Flanagan, J., L. Michael, R.K.M. Jayanty, K. Baumann, and P.A. Solomon. 2014. National PM2.5 monitoring networks in the United States—CSN/STN and IMPROVE: Intercomparison of PMF receptor modeling results using data from colocated IMPROVE and CSN PM2.5 chemical speciation samplers. In preparation.
  • Gego, E. L., P.S. Porter, J.S. Irwin, C. Hogrefe, and S.T. Rao. 2005. Assessing the comparability of ammonium, nitrate and sulfate concentrations measured by three air quality monitoring networks. Pure Appl. Geophys. 162:1919–1939. doi:10.1007/s00024-005-2698-3
  • Gutknecht, W.F., J.A. O’Rourke, J.B. Flanagan, W.C. Eaton, M.R. Peterson, and L.C. Greene. 2001. Research to investigate the source(s) of high field blanks for Teflon® PM2.5 filters. Report Number RTI/07565/019-01FR, EPA Contract No. 68-D-99-0013, prepared by Research Triangle Institute, Center for Environ. Measurement & Quality Assurance, Research Triangle Park, NC, for EPA, Research Triangle Park, NC.
  • Gutknecht, W.F., J.B. Flanagan, and A. McWilliams. 2006. Harmonization of interlaboratory X-ray fluorescence measurement uncertainties. Research Triangle Institute, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/xrfdet.pdf (accessed May 29, 2014).
  • Gutknecht, W., J. Flanagan, A. McWilliams, R.K.M. Jayanty, R. Kellogg, J. Rice, P. Duda, and R.H. Sarver. 2010. Harmonization of uncertainties of X-ray fluorescence data for PM2.5 air filter analysis. J. Air Waste Manage. Assoc. 60(2): 184–194. doi:10.3155/1047-3289.60.2.184
  • Harrison, R.M., and J. Yin. 2000. Particulate matter in the atmosphere: Which particle properties are important for its effects on health? Sci. Total Environ. 249(1–3): 85–101. doi:10.1016/S0048-9697(99)00513-6
  • Hering, S.V., D.R. Lawson, P.A. Solomon, et al. 1988. The nitric acid shootout: Field comparison of measurement methods. Atmos. Environ. 22(8): 1519–1539. doi:10.1016/0004-6981(88)90379-4
  • Hering, S.V., and G.R. Cass. 1999. The magnitude of bias in the measurement of PM2.5 arising from volatilization of particulate nitrate from Teflon filters. J. Air Waste Manage. Assoc. 49:725–733. doi:10.1080/10473289.1999.10463843
  • Hering, S., P.M. Fine, C. Sioutas, P.A. Jaques, J.L. Ambs, O. Hogrefe, and K.L. Demerjian. 2004. Field assessment of the dynamics of particulate nitrate vaporization using differential TEOM and automated nitrate monitors. Atmos. Environ. 38: 5183–5192. doi:10.1016/j.atmosenv.2004.02.066
  • Hinds, W.C. 1999. Aerosol Technology: Properties, Behavior, and Measurement of Airborne Particles. New York, NY: John Wiley & Sons.
  • Hopke, P.K., ed. 1991. Receptor Modeling for Air Quality Management. Amsterdam, The Netherlands: Elsevier.
  • Hopke, P.K. 2003. Recent developments in receptor modeling. J. Chemometrics 17(5): 255–265. doi:10.1002/cem.796
  • Hyslop, N.P., and W.H. White. 2008a. An empirical approach to estimating detection limits using colocated data. Environ. Sci. Technol. 42(14): 5235–5240. doi:10.1021/es7025196
  • Hyslop, N.P., and W.H. White. 2008b. An evaluation of interagency monitoring of protected visual environments (IMPROVE) colocated precision and uncertainty estimates. Atmos. Environ. 42(11): 2691–2705. doi:10.1016/j.atmosenv.2007.06.053
  • IMPROVE. 1997. SOP 301, IMPROVE standard operating procedures for X-ray fluorescence analysis, 2/4/97. Colorado State University, Boulder, CO. http://vista.cira.colostate.edu/improve/Publications/SOPs/ucdavis_sops/sop301.pdf (accessed May 29, 2014).
  • IMPROVE. 2002. IMPROVE: Interagency Monitoring of Protected Visual Environments quality assurance project plan OAQPS category 1 QAPP. Colorado State University, Boulder, CO. http://vista.cira.colostate.edu/improve/Publications/QA_QC/IMPROVE_QAPP_R0.pdf (accessed May 28, 2014).
  • IMPROVE. 2005. Standard operating procedures for National Park Service filter preparation, extraction, and anion analysis. Colorado State University, Boulder, CO. http://vista.cira.colostate.edu/improve/Publications/SOPs/RTI_SOPs/RTI_IonSOP102605.pdf (accessed May 29, 2014).
  • IMPROVE. 2006. Report IV, November. Colorado State University, Boulder, CO. http://vista.cira.colostate.edu/improve/Publications/Reports/2006/2006.htm (accessed May 29, 2014).
  • IMPROVE. 2012a. Interagency Monitoring of Protected Visual Environments. Colorado State University, Boulder, CO. http://vista.cira.colostate.edu/improve/ (accessed May 29, 2014).
  • IMPROVE. 2012b. IMPROVE standard operating protocols. Colorado State University, Boulder, CO. http://vista.cira.colostate.edu/improve/Publications/IMPROVE_SOPs.htm (accessed May 29, 2014).
  • IMPROVE. 2012c. SOP 201, Sampler maintenance by site operators. Colorado State University, Boulder, CO. http://vista.cira.colostate.edu/improve/Publications/SOPs/ucdsop.asp (accessed May 29, 2014).
  • IMPROVE. 2012d. SOP 251, Sample handling. Colorado State University, Boulder, CO. http://vista.cira.colostate.edu/improve/Publications/SOPs/UCDavis_SOPs/SOP251_Sample_Handling_2013.pdf (accessed May 29, 2014).
  • IMPROVE. 2012e. Quality assurance for the IMPROVE network. Colorado State University, Boulder, CO. http://vista.cira.colostate.edu/improve/Data/QA_QC/qa_qc_Branch.htm (accessed May 29, 2014).
  • IMPROVE. 2012f. SOP 326, PIXE and PESA analysis. Colorado State University, Boulder, CO. http://vista.cira.colostate.edu/improve/Publications/SOPs/ucdsop.asp (accessed May 29, 2014).
  • IMPROVE. 2012g. IMPROVE Data advisories. Colorado State University, Boulder, CO. http://vista.cira.colostate.edu/improve/Data/QA_QC/Advisory.htm (accessed May 29, 2014).
  • IMPROVE. 2014a. IMPROVE standard operating procedure for the X-ray fluorescence analysis of aerosol deposits on PTFE filters (with PANalytical Epsilon 5), SOP 301, Version 2, January 17, 2014. Colorado State University, Boulder, CO. http://vista.cira.colostate.edu/improve/Publications/SOPs/ucdavis_sops/sop301_XRF_2014.pdf (accessed May 29, 2014).
  • IMPROVE. 2014b. IMPROVE quarterly newsletters. Colorado State University, Boulder, CO. http://vista.cira.colostate.edu/improve/Publications/news_letters.htm (accessed May 29, 2014).
  • Iyer, H., P. Patterson, and W.C. Malm. 2000. Sampling duration calculations. J. Air Waste Manage. Assoc. 50(5): 888–893. doi:10.1080/10473289.2000.10464118
  • John, W., and G. Reischl. 1980. A cyclone for size-selective sampling of ambient air. J. Air Pollut. Control Assoc. 30(8): 872–876. doi:10.1080/00022470.1980.10465122
  • Kampa, M., and E. Castanas. 2008. Human health effects of air pollution. Environ. Pollut. 151(2):362–367. doi:10.1016/j.envpol.2007.06.012
  • Kenny, L.C., and R.A. Gussman. 2000. A direct approach to the design of cyclones for aerosol-monitoring applications. J. Aerosol Sci. 31(12): 1407–1420. doi:10.1016/S0021-8502(00)00047-1
  • Kenny, L.C., R.A. Gussman, and M.B. Meyer. 2000. Development of a sharp-cut cyclone for ambient aerosol monitoring applications. Aerosol Sci. Technol. 32(4): 338–358. doi:10.1080/027868200303669
  • Kenny, L., G. Beaumont, A. Gudmundsson, A. Thorpe, and W. Koch. 2005. Aspiration and sampling efficiencies of the TSP and louvered particulate matter inlets. J. Environ. Monit. 7(5): 481–487. doi:10.1039/b419001g
  • Kim, B.M., J. Cassmassi, H. Hogo, and M.D. Zeldin. 2001. Positive organic carbon artifacts on filter medium during PM2.5 sampling in the South Coast Air Basin. Aerosol Sci. Technol. 34(1): 35–41. doi:10.1080/027868201300081941
  • Kim, E., P.K. Hopke, and Y. Qin. 2005. Estimation of organic carbon blank values and error structures of the speciation trends network data for source apportionment. J. Air Waste Manage. Assoc. 55(8):1190–1199. doi:10.1080/10473289.2005.10464705
  • Koutrakis, P. 1998. Recommendations of the Expert Panel on the EPA Speciation Network. Prepared for Office of Air and Radiation, OAQPS, Research Triangle Park, NC. http://www.epa.gov/ttnamti1/files/ambient/pm25/spec/speciate.pdf (accessed May 29, 2014).
  • Koutrakis, P. 1999. Recommendations of the Expert Panel on the EPA Speciation Network (Draft 7/12/99). Research Triangle Institute, Research Triangle Park, NC. http://www.epa.gov/ttnamti1/files/ambient/pm25/spec/lvpanel.pdf (accessed May 29, 2014).
  • Landis, B. 2014. CSN network assessment. ORD sponsored webinar, May 1, 2014. Research Triangle Park, NC. http://epa.gov/ncer/events/calendar/2014/may01/presentation.pdf. CSN assessment—Decision matrix for 53 low value sites recommended for defunding. http://epa.gov/ncer/events/calendar/2014/may01/defund-list.pdf (accessed May 29, 2014).
  • Lewtas, J., Y. Pang, D. Booth, S. Reimer, D.J. Eatough, and L.A. Gundel. 2001. Comparison of sampling methods for semi-volatile organic carbon associated with PM2.5. Aerosol Sci. Technol. 34(1): 9–22. doi:10.1080/02786820118935
  • Maimone, F., B.J. Turpin, P. Solomon, Q. Meng, A.L. Robinson, R. Subramanian, and A. Polidori. 2011. Correction methods for organic carbon artifacts when using quartz-fiber filters in large particulate matter monitoring networks: The regression method and other options. J. Air Waste Manage. Assoc. 61(6): 696–710. doi:10.3155/1047-3289.61.6.696
  • Malm, W.C., B.A. Schichtel, and M.L. Pitchford. 2011. Uncertainties in PM2.5 gravimetric and speciation measurements and what we can learn from them. J. Air Waste Manage. Assoc. 61(11): 1131–1149.
  • McCarthy, G. 2013. Initial area designations for the 2012 Revised Primary Annual Fine Particle National Ambient Air Quality Standard. U.S. Environmental Protection Agency, Washington DC. http://www.epa.gov/pmdesignations/2012standards/docs/april2013guidance.pdf (accessed May 29, 2014).
  • McClellan, R., B. Jessiman, P. McMurry, M. Shepherd, and J. Vickery. 2004. Health context for management of particulate matter. In Particulate Matter Science for Policy Makers—A NARSTO Assessment, ed. P.H. McMurry, M.F. Shepherd, and J.S. Vickery, 69–101. Cambridge, UK: Cambridge University Press.
  • McDade, C.E., A.M. Dillner, and H. Indresand. 2009. Particulate matter sample deposit geometry and effective filter face velocities. J. Air Waste Manage. Assoc. 59(9): 1045–1048. doi:10.3155/1047-3289.59.9.1045
  • McDow, S.R., and J.J. Huntzicker. 1990. Vapor adsorption artifact in the sampling of organic aerosol: Face velocity effects. Atmos. Environ. 24:2563–2571. doi:10.1016/0960-1686(90)90134-9
  • Pang, Y., N.L. Eatough, J. Wilson, and D.J. Eatough. 2002. Effect of semi-volatile material on PM2.5 measurement by the PM2.5 Federal Reference Method sampler at Bakersfield, California. Aerosol Sci. Technol. 36:289–299. doi:10.1080/027868202753504489
  • Peters, A.J., D.A. Lane, L.A. Gundel, G.L. Northcott, and K.C. Jones. 2000. A comparison of high volume and diffusion denuder samplers for measuring semivolatile organic compounds in the atmosphere. Environ. Sci. Technol. 34:5001–5006. doi:10.1021/es000056t
  • Peters, T.M., R.A. Gussman, L.C. Kenny, and R.W. Vanderpool. 2001. Evaluation of PM2.5 size selectors used in speciation samplers. Aerosol Sci. Technol. 34(5): 422–429. doi:10.1080/027868201750172833
  • Peterson, M.R., J. Smiley, S. Taylor, Jr., R.L. Hines, and M.H. Richards. 2007. Effects of quartz filter type on sampling and organic carbon/elemental carbon analysis of PM2.5. RTI International, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/20070103quartzfiltercomparison.pdf (accessed May 29, 2014).
  • Peterson, M.R., and M.H. Richards. 2002. Thermal-optical-transmittance analysis for organic, elemental, carbonate, total carbon, and OCX2 in PM2.5 by the EPA/NIOSH method. In Proceedings, Symposium on Air Quality Measurement Methods and Technology—2002, ed. E.D. Winegar and R.J. Tropp, 83-1–83-19. Pittsburgh, PA: Air & Waste Management Association.
  • RTI. 2008. Standard operating procedure for PM2.5 Gravimetric analysis, Revision 7. RTI International, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/RTIGravMassSOPFINAL.pdf (accessed May 29, 2014). Updates may be available at http://www.epa.gov/ttn/amtic/specsop.html (accessed May 29, 2014).
  • RTI. 2009a. Standard operating procedures for coating aluminum honeycomb denuders with magnesium oxide. RTI International, Research Triangle Park, NC. Updates may be available at http://www.epa.gov/ttn/amtic/specsop.html (accessed May 29, 2014).
  • RTI. 2009b. Standard operating procedure for cleaning nylon filters used for collection of PM2.5 material. RTI International, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/pm25nyloncleaningsop.pdf (accessed May 29, 2014). Updates may be available at http://www.epa.gov/ttn/amtic/specsop.html (accessed May 29, 2014).
  • RTI. 2009c. Standard operating procedure for PM2.5 cation analysis. RTI International, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/pm25cationsop.pdf (accessed May 29, 2014). Updates may be available at http://www.epa.gov/ttn/amtic/specsop.html (accessed May 29, 2014).
  • RTI. 2009d. Standard operating procedure for PM2.5 anion analysis. RTI International, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/pm25anionsop.pdf (accessed May 29, 2014). Updates may be available at http://www.epa.gov/ttn/amtic/specsop.html (accessed May 29, 2014).
  • RTI. 2009e. Standard operating procedure for the determination of organic, elemental, and total carbon in particulate matter using a thermal/optical transmittance carbon analyzer. RTI International, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/RTIOCECSOP.pdf (accessed May 29, 2014). Updates may be available at http://www.epa.gov/ttn/amtic/specsop.html (accessed May 29, 2014).
  • RTI. 2009f. Standard operating procedure for the X-ray fluorescence analysis of PM2.5 deposits on Teflon filters. RTI International, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/pmxrfsop.pdf (accessed May 29, 2014). Updates may be available at http://www.epa.gov/ttn/amtic/specsop.html (accessed May 29, 2014).
  • RTI. 2011. Standard operating procedure for procurement and acceptance testing of Teflon, nylon, and quartz filters. RTI International, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/RTIFilterProcurementSOPFINAL.pdf (accessed May 29, 2014). Updates may be available at http://www.epa.gov/ttn/amtic/specsop.html (accessed May 29, 2014).
  • RTI. 2013. Chemical speciation—Field standard operating procedures. RTI International, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/spectraining.html (accessed May 29, 2014). Updates may be available.
  • Schauer, J.J., W.F. Rogge, M.A. Mazurek, L.M. Hildemann, G.R. Cass, and B.R.T. Simoneit. 1996. Source apportionment of airborne particulate matter using organic compounds as tracers. Atmos. Environ. 30(22): 3837–3855. doi:10.1016/1352-2310(96)00085-4
  • Scheffe, R.D., P.A. Solomon, R. Husar, et al. 2009. The National Ambient Air Monitoring Strategy: Rethinking the role of national networks. J. Air Waste Manage. Assoc. 59:579–590. doi:10.3155/1047-3289.59.5.579
  • Schmid, H.P., L. Laskus, H.J. Abraham, U. Baltensperger, V.M. H. Lavanchy, M. Bizjak, P. Burba, H. Cachier, D. Crow, J.C. Chow, T. Gnauk, A. Even, H.M. ten Brink, K.P. Giesen, R. Hitzenberger, C. Hueglin, W. Maenhaut, C.A. Pio, J. Puttock, J.P. Putaud, D. Toom-Sauntry, and H. Puxbaum. 2001. Results of the “Carbon Conference” international aerosol carbon round robin test: Stage 1. Atmos. Environ. 35(12): 2111–2121. doi:10.1016/S1352-2310(00)00493-3
  • Seinfeld, J.H., and S.N. Pandis. 1998. Atmospheric Chemistry and Physics: From Air Pollution to Climate Change. New York, NY: John Wiley and Sons.
  • Smiley, J. 2013. Experimental inter-comparison of speciation laboratories—Study #8. Research Triangle Park, NC, Technical Memorandum to Dennis Crumpler, June 26, 2013. EPA, Office of Air Quality Planning and Standards. http://www.epa.gov/ttn/amtic/files/ambient/pm25/qa/MultilabSpeciationPT2012.pdf (accessed May 29, 2014). Previous reports are available at http://www.epa.gov/ttn/amtic/pmspec.html (accessed May 29, 2014).
  • Solomon, P.A., L. Salmon, T. Fall, and G.R. Cass. 1992. The spatial and temporal distribution of atmospheric nitric acid and particulate nitrate concentrations in Los Angeles. Environ. Sci. Technol. 26(8): 1594–1601. doi:10.1021/es00032a016
  • Solomon, P.A., W. Mitchell, D. Gemmill, M.P. Tolocka, G. Norris, R. Wiener, S. Eberly, J. Rice, J. Homolya, R. Scheffe, R. Vanderpool, R. Murdoch, S. Natarajan, and E. Hardison. 2000. Evaluation of PM2.5 chemical speciation samplers for use in the EPA National PM2.5 Chemical Speciation Network. Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttnamti1/files/ambient/pm25/spec/fourcty.pdf (accessed May 29, 2014).
  • Solomon, P.A., M.P. Tolocka, G. Norris, and M. Landis. 2001. Chemical analysis methods for atmospheric aerosol components. In Aerosol Measurement: Principles, Techniques, and Application, ed. P. Baron and K. Willeke, 261–293. New York, NY: John Wiley & Sons.
  • Solomon, P.A., K. Baumann, E. Edgerton, R. Tanner, D. Eatough, W. Modey, H. Maring, D. Savoie, S. Natarajan, M.B. Meyer, and G. Norris. 2003. Comparison of integrated samplers for mass and composition during the 1999 Atlanta Supersites project. J. Geophys. Res. 108(D7): 8423. doi:10.1029/2001JD001218
  • Solomon, P.A., and P.K. Hopke. 2008. Introduction: special issue supporting key scientific and policy- and health-relevant findings from EPA’s Particulate Matter Supersites Program and related studies: An integration and synthesis of results. J. Air Waste Manage. Assoc. 58(2): 137–139. doi:10.3155/1047-3289.58.2.137
  • Solomon, P.A., P.K. Hopke, J. Froines, and R. Scheffe. 2008. Key scientific and policy- and health-relevant findings from EPA’s Particulate Matter Supersites Program and related studies: An integration and synthesis of results. J. Air Waste Manage. Assoc. 58(13): S1–S92. doi:10.3155/1047-3289.58.13.S-1
  • Solomon, P.A., M. Costantini, T.J. Grahame, M.E. Gerlofs-Nijland, F. Cassee, A.G. Russell, J.R. Brook, P.K. Hopke, G. Hidy, R.F. Phalen, P. Saldiva, S. Ebelt Sarnat, J.R. Balmes, I.B. Tager, H. Özkaynak, S. Vedal, S.S.G. Wierman, and D.L. Costa. 2012. Air pollution and health: Bridging the gap from sources to health outcomes: Conference summary. Air Qual. Atmos. Health 5(1): 9–62. doi:10.1007/s11869-011-0161-4
  • Subramanian, R., A.Y. Khlystov, J.C. Cabada, and A.L. Robinson. 2004. Positive and negative artifacts in particulate organic carbon measurements with denuded and undenuded sampler configurations. Aerosol Sci. Technol. 38(Suppl. 1): 27–48. doi:10.1080/02786820390229354
  • Technology Transfer Network. 2012a. Chemical speciation—Data management and reporting. Technology Transfer Network Ambient Monitoring Technology, AMTIC. EPA, OAQPS, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/specdat.html (accessed May 29, 2014).
  • Technology Transfer Network. 2012b. Chemical speciation—Special studies. Technology Transfer Network Ambient Monitoring Technology, AMTIC, EPA, OAQPS, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/specstud.html (accessed May 29, 2014).
  • Tolocka, M.P., P.A. Solomon, W. Mitchell, G.A. Norris, D.B. Gemmill, R.W. Wiener, R.W. Vanderpool, J.B. Homolya, and J. Rice. 2001. East versus west in the U.S.: Chemical characteristics of PM2.5 during the winter of 1999. Aerosol Sci. Technol. 34(1): 88–96. doi:10.1080/02786820118957
  • Turpin, B.J., S.V. Hering, and J.J. Huntzicker. 1994. Investigation of organic aerosol sampling artifacts in the Los Angeles Basin. Atmos. Environ. 28: 3061–3071. doi:10.1016/1352-2310(94)00133-6
  • Turpin, B.J., P. Saxena, and E. Andrews. 2000. Measuring and simulating particulate organics in the atmosphere: Problems and prospects. Atmos. Environ. 34(18): 2983–3013. doi:10.1016/S1352-2310(99)00501-4
  • U.S. Environmental Protection Agency. 1998. Quality assurance guidance document 2.12, Monitoring in PM2.5 ambient air using designated reference or Class I equivalent methods. U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/qa/m212covd.pdf (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 1999a. Particulate matter (PM2.5) speciation guidance, Final draft, October 7, 1999. U.S. Environmental Protection Agency, Monitoring and Quality Assurance Group, Emissions, Monitoring, and Analysis Division, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/specfinl.pdf (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 1999b. Data quality objectives for the trends component of the PM2.5 Speciation Network. U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/dqo3.pdf (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 1999c. Compendium of methods for the determination of inorganic compounds in ambient air compendium method IO-3.3 Determination of metals in ambient particulate matter using X-ray fluorescence (XRF) spectroscopy. U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttnamti1/files/ambient/inorganic/mthd-3-3.pdf (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 2000. Quality assurance guidance document, Final quality assurance project plan: PM2.5 Speciation Trends Network field sampling. U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/1025sqap.pdf (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 2005a. PM2.5 Speciation Network Newsletter, Issue 4, October. U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/spnews4.pdf (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 2005b. PM2.5 Speciation Network Newsletter, Issue 2, January. U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/spnews2.pdf (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 2005c. Draft, National Ambient Air Monitoring Strategy. U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/monitorstrat/naamstrat2005.pdf (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 2006. PM2.5 Speciation Network Newsletter, Issue 5, April. U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/spnews5.pdf (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 2007a. EPA PM2.5 Chemical Speciation Network Carbon Sampler Replacement Program: Phase I. U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/EPA_Carbon_report%20ARS%20JRICE%2010-10-07.pdf (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 2007b. URG 3000N phase I carbon implementation report, August 2007. Prepared by U.S. Environmental Protection Agency, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/EPA_Carbon_report%20ARS%20JRICE%2010-10-07.pdf (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 2007c. Field/trip blank collection for carbon in the PM2.5 Chemical Speciation Network (CSN). Changes with the new URG3000N carbon measurement system, Note to URG3000N Phase 1 Implementation Group 1, July 18. U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttnamti1/files/ambient/pm25/spec/blank-Schedule.pdf (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 2009a. Integrated science assessment for particulate matter (EPA/ 600/R-08/139F). U.S. Environmental Protection Agency, Office of Research and Development, Research Triangle Park, NC. http://cfpub.epa.gov/ncea/cfm/recordisplay.cfm?deid=216546 (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 2009b. URG3000N phase II carbon implementation report. Prepared by U.S. Environmental Protection Agency, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/URG3000NPhaseIIReportFinal.pdf (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 2011. PM 2.5 Speciation—Lab audit reports and assessments. EPA, National Air and Radiation Environmental Laboratory, Montgomery, AL. http://www.epa.gov/ttn/amtic/pmspec.html (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 2012a. Technology Transfer Network (TTN), Air Quality System (AQS). U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttn/airs/airsaqs/ (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 2012b. Technology Transfer Network Ambient Monitoring Technology Information Center, Chemical Speciation. U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttnamti1/speciepg.html (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 2012c. CSN and IMPROVE air sampler photographs and diagrams. U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/pm25pict.pdf (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 2012d. Quality assurance guidance document, Quality assurance project plan: PM2.5 chemical speciation sampling at Trends, NCore, Supplemental and Tribal Sites (An update to the PM2.5 Speciation Trends Network Field Sampling QAPP, December 2000), EPA-454/B-12-003. U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/CSN_QAPP_v120_05-2012.pdf (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 2012e. Chemical speciation—URG3000N carbon implementation. U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttn/amtic/specurg3000.html (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 2012f. Quality assurance project plan, Chemical speciation of PM2.5 filter samples. U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC. Rpt. RTI/0212053/01QA, May. http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/qapp.pdf (accessed May 29, 2014).
  • U.S. Environmental Protection Agency. 2013. Master CSN site list including trends 2013. U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards, Research Triangle Park, NC. http://www.epa.gov/ttnamti1/files/ambient/pm25/spec/MasterCSNListforAMTIC050313.xlsx (accessed May 29, 2014).
  • Watson, J.G., J.C. Chow, and L.-W.A. Chen. 2005. Summary of organic and elemental carbon/black carbon analysis methods and intercomparisons. Aerosol Air Qual. Res. 5(1): 65–102. http://aaqr.org/VOL5_No1_June2005/6_AAQR-05-06-OA-0006_65-102.pdf (accessed May 29, 2014).
  • Watson, J.G., L.W. Chen, J.C. Chow, P. Doraiswamy, and D.H. Lowenthal. 2008. Source apportionment: Findings from the U.S. Supersites Program. J. Air Waste Manage. Assoc. 58(2): 265–288. doi:10.3155/1047-3289.58.2.265
  • Watson, J.G., J.C. Chow, L.W. Chen, and N.H. Frank. 2009. Methods to assess carbonaceous aerosol sampling artifacts for IMPROVE and other long-term networks. J. Air Waste Manage. Assoc. 59:898–911. doi:10.3155/1047-3289.59.8.898
  • Watson, J.G., and J.C. Chow. 2011. Ambient aerosol sampling. In Aerosol Measurement: Principles, Techniques and Applications, ed. P. Kulkarni, P.A. Baron, and K. Willeke, 3rd ed., 591–614. Hoboken, NJ: John Wiley & Sons.
  • White, W.H., L.L. Ashbaugh, N.P. Hyslop, and C.E. McDade. 2005. Estimating measurement uncertainty in an ambient sulfate trend. Atmos. Environ. 39(36): 6857–6867. doi:10.1016/j.atmosenv.2005.08.002
  • Yu, X.Y., T. Lee, B. Ayres, S.M. Kreidenweis, W. Malm, and J.L. Collett. 2006. Loss of fine particle ammonium from denuded nylon filters. Atmos. Environ. 40(25): 4797–4807. doi:10.1016/j.atmosenv.2006.03.061
  • Zhao, W.X., P.K. Hopke, G. Norris, R. Williams, and P. Paatero. 2006. Source apportionment and analysis on ambient and personal exposure samples with a combined receptor model and an adaptive blank estimation strategy. Atmos. Environ. 40(20): 3788–3801. doi:10.1016/j.atmosenv.2006.02.027
