UPDATES is a monthly email newsletter on stormwater management, assessment (including monitoring), and maintenance research at St. Anthony Falls Laboratory and the University of Minnesota.
Standard and Uniform Stormwater Runoff Monitoring
October 2011 (volume 6 - issue 9)
Contributed by Masoud Kayhanian, Department of Civil and Environmental Engineering, University of California - Davis (recipient of the 2010 J.S. Braun/Braun Intertec Professorship from the University of Minnesota)
As part of NPDES permit requirements, many entities must monitor the quality of stormwater runoff to ensure protection of receiving waters. Each year, tremendous resources are devoted to collecting samples and performing chemical analyses to meet regulatory requirements. Unfortunately, data collection is typically not performed in a standard and uniform manner on either a national or regional scale. Applying non-standard water quality sampling and analysis methodologies creates regulatory problems when computing total maximum daily loads (TMDLs). For example, discrepancies in quality assurance/quality control (QA/QC) were observed among different monitoring programs in a review of the nationwide NPDES database (Pitt et al., 2005). A similar problem has been recognized in the current international database for stormwater best management practices (BMPs), since the data in the database come from different sources with a wide range of QA/QC procedures (http://www.bmpdatabase.org). Hence, using such data may require prior screening and evaluation, which is not possible when the necessary information is unavailable.
The variation in monitoring data can be partially attributed to differing analytical techniques, use of both older and newer analytical instrumentation, and lack of adequate laboratory and field QA/QC implementation. These problems were recognized during the early phase of the California Statewide Stormwater Runoff Characterization Study (Kayhanian et al., 2002a). To make the statewide monitoring consistent, prevent errors, and increase the scientific validity of monitoring data, a stormwater guidance manual and several tools were developed. The guidance document and related tools improved the quality of collected runoff data and allowed comparison on a statewide basis. The tools were also found to save costs, since chemical analysis was performed only on samples found to be representative. The guidance manual and other tools can be found on the Caltrans web site at http://www.dot.ca.gov/hq/env/stormwater/special/newsetup/index.htm#wq_ha... A brief description and the primary objective of the guidance manual and each related tool are presented below.
Stormwater monitoring guidance manual: Designed to make monitoring consistent and standard throughout the state. This document provides guidance on all aspects of stormwater runoff monitoring, including planning, site selection, sample collection, sample preservation and processing, analytical methods, sample analysis, data validation, and data reporting.
Comprehensive data reporting protocol: Developed to report the site, event, and sample characteristics of the collected monitoring data uniformly. Site data include the physical characteristics of the site, such as drainage area size, number of highway lanes, annual average daily traffic, and land use. Event data include total rainfall, maximum rain intensity, antecedent dry period, etc. Sample data include information on the physical, chemical, and biological characteristics of the runoff samples.
Hydrologic utility: Designed to collect and report hydrologic data and to ensure sample representativeness. The hydrologic utility software serves two purposes: (1) it allows monitoring personnel to assess representativeness, and (2) it standardizes the calculation of important storm and sampling parameters, such as total flow volume, total event rainfall, rain intensity, and estimated percent capture.
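As a rough illustration of the kind of calculation such a utility standardizes, the sketch below integrates a hydrograph to get total event flow volume and estimates percent capture as the share of that volume passing while the sampler was running. The function names, the trapezoidal approximation, and the coarse handling of the sampling-window edges are assumptions for illustration, not the Caltrans implementation.

```python
def event_volume(times_min, flows_lps):
    """Total flow volume (liters) from a hydrograph given as paired lists of
    time (minutes) and flow rate (liters/second), by trapezoidal integration."""
    return sum(
        (flows_lps[i] + flows_lps[i + 1]) / 2 * (times_min[i + 1] - times_min[i]) * 60
        for i in range(len(times_min) - 1)
    )

def percent_capture(times_min, flows_lps, start_min, end_min):
    """Percentage of the event's flow volume that passed while the sampler
    was running. Only hydrograph points inside the sampling window are
    integrated; partial intervals at the window edges are ignored, so this
    is an approximation."""
    total = event_volume(times_min, flows_lps)
    window = [(t, f) for t, f in zip(times_min, flows_lps) if start_min <= t <= end_min]
    ts, fs = zip(*window)
    return 100.0 * event_volume(list(ts), list(fs)) / total

# Hypothetical 40-minute storm, sampled between minutes 10 and 30:
pc = percent_capture([0, 10, 20, 30, 40], [0, 5, 10, 5, 0], 10, 30)
print(round(pc, 1))  # 75.0
```

A real utility would also handle multiple runoff peaks, rain-gauge data, and partial boundary intervals; the point here is only that a shared, scripted calculation gives every monitoring site the same answer for the same hydrograph.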
Automated data validation and electronic data delivery software: Designed to check all laboratory and field data for errors against the standard QA/QC procedures specified in the guidance manual. This automated program permits quick and efficient evaluation of laboratory data against data quality objectives and standard measures of data quality, and provides extensive error-checking for a standard set of possible analytical or data transcription errors. The resulting error-free, validated data are then transferred into the stormwater quality database for further use and data analysis.
Data analysis tool: Developed to uniformly analyze analytical data that contain non-detects. Depending on the constituent and point of sampling, a number of chemical analyses may be reported as non-detects (NDs). Even though these data are below detection limits, they still provide useful information. Substituting an arbitrarily selected value, such as zero, the reporting limit, or half the reporting limit, for non-detects artificially biases the statistics (Kayhanian et al., 2002b; Helsel, 1990). A review of available science-based statistical methods showed that regression on order statistics (ROS) and maximum likelihood estimation (MLE) are superior for data containing NDs (Shumway et al., 2002). Based on this assessment, a data analysis software tool compatible with Microsoft Excel was prepared using the ROS technique.
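The core idea of ROS can be sketched in a few lines: fit a regression of the log-transformed detected values against normal scores, then impute the non-detects from the fitted line instead of substituting an arbitrary constant. The sketch below is a minimal version for a single detection limit, using Weibull plotting positions; the function name `simple_ros` is hypothetical and this is not the actual Caltrans Excel tool, which handles more general censoring.

```python
import math
from statistics import NormalDist, mean, stdev

def simple_ros(detects, n_censored):
    """Regression on order statistics (ROS) for a data set censored at a
    single detection limit (illustrative sketch, not the Caltrans tool).

    Detected values are log-transformed and regressed against normal scores;
    the non-detects are then imputed from the fitted line, and summary
    statistics are computed on the combined (imputed + detected) data."""
    n = len(detects) + n_censored
    # Weibull plotting positions i/(n+1); with one detection limit the
    # non-detects occupy the lowest n_censored ranks.
    z = [NormalDist().inv_cdf((i + 1) / (n + 1)) for i in range(n)]
    z_cens, z_det = z[:n_censored], z[n_censored:]
    logs = sorted(math.log(v) for v in detects)
    # Ordinary least-squares fit: log(conc) = intercept + slope * z
    zbar, lbar = mean(z_det), mean(logs)
    slope = sum((a - zbar) * (b - lbar) for a, b in zip(z_det, logs)) \
          / sum((a - zbar) ** 2 for a in z_det)
    intercept = lbar - slope * zbar
    imputed = [math.exp(intercept + slope * a) for a in z_cens]
    combined = imputed + list(detects)
    return mean(combined), stdev(combined)

# Hypothetical data: 4 detects above a 50 ng/L limit plus 6 non-detects.
m, s = simple_ros([60, 75, 90, 120], n_censored=6)
```

Because the imputed values follow the distribution implied by the detected data rather than a fixed constant, the resulting mean and standard deviation are far less sensitive to the analyst's choice of substitution rule.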
Two examples are presented below to illustrate the impact of non-standard, non-uniform monitoring and related QA/QC on analytical results, data analysis, and data evaluation/interpretation.
Example 1: TSS measurement
Two standard methods are available to measure total suspended solids (TSS): (1) Standard Method 2540-D and (2) U.S. EPA Method 160.2. There is a slight difference between the two methods and, depending on the nature of the runoff sample, measured values can vary substantially. On one occasion, for example, 11 runoff samples were each split into three representative subsamples and sent to three different laboratories (Lab A, Lab B, and Lab C) for TSS measurement without specifying the method of analysis. The results of the TSS measurements by the three labs are shown in Figure 1. As can be seen, the difference in TSS results for certain samples can be as much as sixfold. The large variation in measured values is partially related to the non-uniform method of analysis, which in turn may be influenced by other factors, including particle size distribution and inadequate mixing (Kayhanian, 2005; Kayhanian et al., 2008). Using a single standard method for TSS measurement across a monitoring program will eliminate this problem.
Figure 1: TSS measurement of the same runoff samples by three different laboratories without specifying a standard (ASTM or U.S. EPA) method.
Example 2: The issue of non-detects and load calculation
This example illustrates the variation in computed pollutant loading when the data contain a large number of non-detects (NDs). For this example we use Hg monitoring data collected from highways in the San Francisco Bay Area. As part of this monitoring study, 42 storm events were analyzed over a three-year period. Hg was detected in only 21.4 percent of the samples (the detection limit was 50 ng/L); the rest were reported as non-detects. Both conventional methods (ignoring NDs, ND = DL, ND = ½ DL, ND = 0) and a science-based statistical method (regression on order statistics, ROS) were employed to compute the mean and standard deviation of the Hg data. The results of this statistical analysis are shown in Table 1. As shown, the mean values computed by the commonly used conventional methods range from 17 to 100 ng/L. The impact can be further illustrated with a load calculation for highway land use within the shaded watershed in the San Francisco Bay Area (see Figure 2a). As shown in Figure 2b, differences in calculated load of up to 82 percent can be observed. This large variation in load calculation can be problematic when allocating loads to implement total maximum daily load (TMDL) regulations, particularly when the monitoring data within the watershed are not collected and analyzed uniformly. Adopting a science-based statistical data analysis technique will eliminate part of this inherent problem.
Table 1: Mean and standard deviation for total Hg using different methods of statistical analysis
| Statistical Method | Mean (ng/L) | Standard Deviation (ng/L) |
| --- | --- | --- |
| ND = DL | 57.2 | 22.0 |
| ND = ½ DL | 36.4 | 29.7 |
| ND = 0 | 16.7 | 38.5 |

Note: sample size = 42; percent detected = 21.4; ng/L = nanograms per liter; detection limit = 50 ng/L.
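The spread in Table 1 comes entirely from how the non-detects are treated before averaging. The sketch below compares the conventional substitution rules on a censored data set; the detected concentration values are illustrative (the actual Hg observations are not published here), and only the sample counts and detection limit mirror the study.

```python
from statistics import mean

def substitution_means(detects, n_censored, dl):
    """Mean concentration under the conventional non-detect substitution
    rules compared in Table 1. The detected values passed in are
    illustrative; only the counts and detection limit mirror the study."""
    rules = {
        "ignore NDs": list(detects),
        "ND = DL": list(detects) + [dl] * n_censored,
        "ND = DL/2": list(detects) + [dl / 2] * n_censored,
        "ND = 0": list(detects) + [0.0] * n_censored,
    }
    return {name: mean(values) for name, values in rules.items()}

# 9 detects out of 42 samples (about 21 percent detected), DL = 50 ng/L:
means = substitution_means([55, 60, 70, 80, 90, 100, 120, 150, 200], 33, 50.0)
for name, m in means.items():
    print(f"{name}: {m:.1f} ng/L")
```

Even with made-up concentrations, the four rules spread the mean over several-fold, which is exactly the kind of spread Table 1 shows for the real data; a load estimate (mean concentration times runoff volume) inherits this spread directly.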
(a) Highway land use within the shaded watershed.
(b) Load allocation based on different statistical methods.
Figure 2: Estimated total Hg load for highway land use within the shaded watershed using different statistical analysis methods.
- Helsel, D. R. (1990). "Less than Obvious." Environmental Science and Technology, Vol. 24, No. 12, 1767-1774.
- Kayhanian, M. (2005). "Advanced Stormwater Runoff Characterization." 10th International Conference on Urban Storm Drainage, August 21-26, Copenhagen, Denmark.
- Kayhanian, M., E. Rasa, A. Vichare, and J. E. Leatherbarrow (2008). "Utility of Suspended Solid Measurements for Stormwater Runoff Treatment." Environmental Engineering, Vol. 134, No. 9, 712-721.
- Kayhanian, M., J. Johnston, H. Yamaguchi, and S. Borroum (2002a). "Caltrans Storm Water Management Program." Stormwater, Vol. 2, No. 2, 52-67. http://www.stormh2o.com/march-april-2001/caltrans-stormwater-management....
- Kayhanian, M., A. Singh, and S. Meyer (2002b). "Impact of Non-Detect in Water Quality Data on Estimation of Constituent Mass Loading." Water Science and Technology, Vol. 45, No. 9, 219-225.
- Pitt, R., A. Maestre, R. Morquecho, T. Brown, T. Schueler, K. Cappiella, and P. Sturm (2005). Evaluation of NPDES Phase 1 Municipal Stormwater Monitoring Data. University of Alabama and the Center for Watershed Protection.
- Shumway, R. H., A. S. Azari, and M. Kayhanian (2002). "Statistical Approaches to Estimating Mean Water Quality Concentrations with Detection Limits." Environmental Science and Technology, Vol. 36, No. 15, 3345-3353.
We want to hear from you!!!
Let us know your thoughts, experiences, and questions by posting a comment. To get you thinking, here are a few questions:
- Why do we expect to see a large variation in TSS measurement for urban stormwater runoff?
- Is it really a big deal and what can be done about it?