Technology Selection

The appropriate mix of measurement technologies for a Triad project depends on several factors.

One of the critical steps in designing a cost-effective and technically defensible Triad-based data collection program is selecting the appropriate combination of measurement technologies to deploy. This selection is one outcome of the systematic planning process. It is important that suitable technical expertise be available to support it, preferably an analytical chemist familiar both with standard fixed-laboratory techniques applicable to the contaminants of concern and with non-standard field-deployable methods that can potentially be used for generating real-time information.

The types of technologies under consideration may include methods that are not widely available commercially, either because they are proprietary or because service providers have limited experience deploying them. In such cases, it may be necessary to involve vendors and/or service providers at some level in the selection process. The vendor or service provider may be the only source of the technology-specific performance information required to properly compare alternatives. In addition, consultation with the vendor or service provider will be useful if site-specific method modifications are required to optimize technology performance. Finally, the vendor or service provider may be aware of technology-specific deployment needs that are important when selecting measurement technologies and designing data collection programs.

There are combinations of contaminants of concern and cleanup levels that are problematic for currently available real-time measurement technologies. Tritium and thorium-230 are examples of radionuclides for which rapid-turnaround or field-deployable technologies are not currently available. Examples in the chemical world include more obscure compounds and analyses such as bioassays. For these cases, surrogate parameters for which real-time methods are available can potentially be used to facilitate real-time decision-making as part of a dynamic work strategy. Surrogates are readily measurable parameters, compounds, or elements whose presence and concentration have a strong relationship with contaminants of concern that are important for decision-making but are more difficult to identify or quantify with real-time techniques.

A basic example of a surrogate arises when soil contamination is associated with a soil layer or type that is visually distinct from uncontaminated soils. This situation can occur when contamination resulted from the dumping of contaminated media or is associated with buried disposal areas. In this example, visual media characteristics observed by the experienced eye of a field geologist may provide an effective, cost-efficient surrogate for soil samples as a means of guiding characterization and/or remediation work. Another example arises when multiple contaminants are consistently collocated, with one or more amenable to real-time measurement and usable to support in-field decisions even if they are not of primary risk concern. An example would be using an XRF at a site where beryllium and lead are commingled, with beryllium being the primary risk driver. While an XRF would not be effective for beryllium, it could be for lead. In this case, XRF results for lead could potentially be used as a surrogate for beryllium to support decisions pertaining to beryllium.
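
To make the lead-for-beryllium example concrete, the sketch below fits a simple linear relationship to hypothetical collocated field XRF lead and fixed-laboratory beryllium results and translates an assumed beryllium action level into a lead screening threshold. All data values, the 4.0 mg/kg action level, and the assumption of a strong linear relationship are illustrative; an actual surrogate demonstration would also need to address regression uncertainty, sample support, and regulatory acceptance.

    # Sketch: establishing a field surrogate relationship from collocated samples.
    # Hypothetical data: split samples analyzed for lead by field XRF and for
    # beryllium by a fixed laboratory. A strong correlation lets the beryllium
    # action level be translated into a lead screening threshold.

    from statistics import correlation, mean

    pb_xrf = [120, 250, 310, 480, 620, 800, 950, 1100]   # field XRF lead (mg/kg)
    be_lab = [1.1, 2.4, 2.9, 4.6, 6.0, 7.7, 9.3, 10.8]   # lab beryllium (mg/kg)

    r = correlation(pb_xrf, be_lab)

    # Ordinary least-squares fit: be = slope * pb + intercept
    x_bar, y_bar = mean(pb_xrf), mean(be_lab)
    slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(pb_xrf, be_lab))
             / sum((x - x_bar) ** 2 for x in pb_xrf))
    intercept = y_bar - slope * x_bar

    be_action_level = 4.0                                 # hypothetical cleanup level (mg/kg)
    pb_threshold = (be_action_level - intercept) / slope

    print(f"correlation r = {r:.2f}")
    print(f"XRF lead threshold of ~{pb_threshold:.0f} mg/kg corresponds to "
          f"the {be_action_level} mg/kg beryllium action level")

In practice a conservative threshold (for example, one based on a lower prediction bound rather than the fitted line itself) would typically be negotiated with regulators before the surrogate is used for real-time decisions.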

The potential applications of data produced by alternative measurement techniques are important to the selection process. Generic applications, arranged in order of increasing analytical quality requirements, are as follows (a simple screening sketch follows the list):

  • Reliably identify the presence or absence of classes of contaminants. This is the simplest application: the result is a conclusion that a class of contaminants (e.g., PCBs or pesticides) is or is not potentially present, with no conclusions drawn about contaminant concentrations. Technologies capable of making this determination can be used to narrow the list of contaminants of concern for specific areas or waste streams. Examples include direct push conductivity probes for subsurface plume identification, gamma walkover surveys for surface radionuclide contamination, and ion probes or paper strips to detect nitrites/nitrates.


  • Reliably identify the presence of a particular contaminant above an upper threshold. In this case, the assumption is that a method can reliably identify the presence or absence of a particular contaminant (e.g., lead) at a level that is higher than cleanup requirements. Technologies capable of making this determination can be used for hot spot identification, to support contaminant boundary delineation, and potentially to satisfy waste acceptance criteria.


  • Reliably identify the presence or absence of a particular contaminant around cleanup guideline concentration levels. In this case, the assumption is that a method can reliably determine that a particular contaminant is or is not present above cleanup requirements. Technologies capable of making this determination can be used to support remediation decision-making and demonstrate compliance with cleanup requirements.


  • Reliably quantify the level of a particular contaminant with some known level of analytical quality (e.g., precision and bias) down to levels below cleanup requirements. Technologies with these characteristics produce data that can potentially be used for risk assessments (baseline and post-remediation), documenting closure conditions, etc.
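
The sketch below turns the four generic applications above into a rough screening heuristic: given a method's detection limit, quantitation limit, and approximate relative uncertainty near the action level, it suggests the most demanding application the method might support. The function name, parameters, and numeric cutoffs are illustrative assumptions, not regulatory criteria.

    # Sketch: rough mapping from method performance to the generic applications above.
    # The numeric cutoffs are illustrative assumptions, not regulatory criteria.

    def candidate_application(detection_limit: float,
                              quantitation_limit: float,
                              relative_uncertainty: float,
                              action_level: float) -> str:
        """Suggest the most demanding generic application a method might support.

        detection_limit, quantitation_limit, and action_level share the same units;
        relative_uncertainty is the approximate relative standard deviation near
        the action level (e.g., 0.3 for +/-30%).
        """
        if quantitation_limit <= 0.5 * action_level and relative_uncertainty <= 0.20:
            return "quantification at levels below cleanup requirements"
        if detection_limit <= action_level and relative_uncertainty <= 0.30:
            return "presence/absence around the cleanup level"
        if detection_limit <= 5 * action_level:
            return "presence above an upper threshold (hot spot identification)"
        return "contaminant class identification only"

    # Hypothetical field XRF for lead screened against a 400 mg/kg action level.
    print(candidate_application(detection_limit=20, quantitation_limit=60,
                                relative_uncertainty=0.15, action_level=400))

In practice this kind of screening would be refined during systematic planning and, where needed, confirmed by a demonstration of method applicability.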

As discussed earlier, the expected site-specific performance of any particular measurement technology is as dependent on the site-specific distribution of contamination and media characteristics as it is on the generic performance characteristics of the technology itself. The CSM plays an important role in predicting measurement technology performance, since the CSM captures the understanding about the spatial distribution of contamination and nature of contaminated media. As one moves through the characterization and remediation process, the CSM is constantly evolving and becoming more accurate in this regard. Selecting appropriate real-time measurement techniques and predicting their performance becomes progressively easier as the cleanup process proceeds.

Where there is significant uncertainty about the potential performance of particular techniques or the need to modify a particular technique to meet site-specific requirements, a demonstration of method applicability is essential. This type of study can be conducted as a stand-alone exercise or integrated with planned characterization activities. This is one of the reasons for introducing a Triad approach as early into the characterization and remediation process as possible. Fielding alternative measurement technologies in a limited fashion as part of site assessment or remedial investigation activities can pave the way to more effective and comprehensive deployment during later stages of work.

The ideal data collection technology is one that provides inexpensive, highly confident (i.e., definitive) real-time results for all contaminants of concern at concentrations well below their action levels. For the vast majority of settings, an ideal data collection technology does not exist. The Triad challenge is to cost-effectively manage decision uncertainty, as a product of relational, analytical, and sampling uncertainty, with the measurement technologies that are available. Several key points need to be remembered:

  • Both standard fixed-laboratory methods and field-deployable systems often provide a range of analytical quality depending on a number of factors that project managers can control. The issue is not just selecting a particular measurement technology; it is also determining how much should be invested to achieve particular levels of analytical quality.

  • Increasing analytical quality comes at a price. Conversely, steps can be taken that may produce slightly lower analytical quality for a particular method but significantly reduce costs.

  • Sampling uncertainty is inversely related to sample numbers and overall data collection costs. Increasing sample numbers reduces sampling uncertainty.

  • For both sampling and analytical uncertainty, there are diminishing returns on additional investment. In other words, investing a lot more in analytical quality will generally not produce an equivalent reduction in analytical uncertainty.
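
The last point, diminishing returns, follows from the familiar square-root behavior of the standard error of a mean: if results are roughly independent, uncertainty in the estimated mean falls as one over the square root of the number of samples, so each additional batch of samples buys less than the previous one. The standard deviation used below is an illustrative assumption.

    # Sketch: diminishing returns on sampling. With roughly independent results,
    # the standard error of the estimated mean falls as 1/sqrt(n), so each added
    # batch of samples reduces uncertainty less than the previous one.

    from math import sqrt

    field_sd = 100.0   # assumed variability of results (e.g., mg/kg); illustrative

    previous_se = None
    for n in (4, 8, 16, 32, 64):
        se = field_sd / sqrt(n)
        gain = "" if previous_se is None else f"  (improvement: {previous_se - se:.1f})"
        print(f"n = {n:3d}  standard error ~ {se:5.1f}{gain}")
        previous_se = se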

The optimal mix of measurement technologies depends on a number of factors that include:

  • Cost. Both per-unit sampling and analysis costs and deployment costs (i.e., mobilization, demobilization, and regulatory acceptability expenses) are factors in selecting measurement technologies. As the amount of data collection increases, fixed costs associated with technology deployment become less of an issue. For projects with relatively fixed budgets, unit analytical costs become critical when managing sampling uncertainty. Lower per-unit costs mean more data points can be collected, which usually translates directly into lower sampling uncertainty and potentially better decisions (a budget comparison sketch follows this list).


  • Turn-Around Times. Turn-around times are a critical factor for dynamic work strategies. Turn-around requirements are site- and decision-specific. They can range from a few minutes or hours in the case of remediation support, to a few days for large Triad-based characterization programs.


  • Throughput Rates. Sample or measurement throughput rates are an important factor because they are linked to method costs, turn-around times, analytical quality, and daily sample capacity.


  • Logistical Support Requirements. Logistical support requirements can be an issue for those methodologies that have specific logistical needs. These include dedicated power sources, climate-controlled conditions, and special operator licenses (e.g., an XRF with a sealed radioactive source).


  • Operational Constraints. Examples of operational constraints that may be factors in technology selection are the ability of a technology to be deployed across a wide range of temperature, dust, and humidity levels, to be fielded in wet conditions, or to handle saturated or partially-saturated media (e.g., moist soils or sediments, snow, standing water conditions, etc.).


  • Interference Potential. This issue is related to the specificity/selectivity of proposed methods in the context of interfering factors, and the potential for interfering factors to be present at a particular site. Non-specificity is not necessarily an issue if the data user understands how to interpret the results or if all responding analytes are of interest (such as the daughter products of pesticides). Non-selective methods are not a problem if it can be shown that interferences are not present or expected. If interferences are present, there may be requirements for cleanup during sample preparation that will affect overall method cost, turn-around times, and throughput.


  • Application Requirements. Earlier in this section generic applications of analytical methods were discussed (e.g., contaminant class identification, hot spot identification, cleanup level confirmation, quantitative results that can be fed to data analyses). Site-specific application requirements (e.g., detection limits, the need for quantitative data as opposed to qualitative or semi-quantitative results, etc.) are critical to appropriate technology selection.


  • Spatial Contaminant Distribution. The site-specific usefulness of any analytical technology, standard or alternative, for supporting decisions is tightly coupled with the expected spatial distribution of contamination relative to the cleanup level and the method's detection/quantitation limits. This means that a technology that provides predictive data in one setting may not provide equally useful information in another. For example, a technology with precision too poor to support decision-making when concentrations are close to cleanup levels may in fact be perfect for a site (or other portions of the same site) where contamination is either well below cleanup requirements or far above them (illustrated in the misclassification sketch following this list). This is where the CSM becomes important from a technology selection perspective.


  • Relative Analytical Quality. Relative analytical quality assumes that more than one option exists for obtaining analytical data. The options may include completely different analytical techniques (e.g., ASV vs. XRF vs. AA vs. ICP for metals) or the same technique with different levels of sample preparation and QA/QC. Data sets of less rigorous analytical quality are justified only if they provide some project benefit in return (e.g., reduced per-sample costs, improved decision-making, or the availability of real-time results).


  • Regulatory Acceptability. In a perfect world, measurement technology selection would be based solely on expected technical performance in meeting project objectives. However, the selection process is at least partly subjective, and the ultimate site-specific performance of any particular technology will not be completely known until the project is complete. Regulatory acceptance of non-standard techniques will be required, and, much as with project decision-making, a weight-of-evidence approach will likely be the most effective way to gain acceptance. Contributing evidence includes precedent of successful technology use at similar sites, wide commercial availability of the proposed technique, third-party reports evaluating technology performance under standardized conditions, inclusion of the technique in compendia of recognized methods, and, if necessary, site-specific demonstrations of method applicability.


  • Multiple Contaminants. The presence of multiple contaminants can complicate technology selection. Many cheaper, field-deployable technologies either have very analyte-specific performance (e.g., great for lead, lousy for beryllium) or are non-specific but capable of identifying the presence of classes of contaminants (e.g., immunoassay kits). If standard laboratory analyses are the only option for one or more contaminants of concern at a site and there are no suitable surrogates, or if the required decisions demand a level of specificity that field methods cannot provide, the range of cost-effective options may be significantly limited. Alternatively, if a good surrogate amenable to real-time techniques can be identified for decision-making purposes, multiple contaminants may not be a critical technology selection issue.
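
The cost bullet above notes that lower per-unit costs can buy down sampling uncertainty. The sketch below compares two hypothetical options under a fixed analytical budget, assuming spatial and analytical variances simply add and that uncertainty in the decision-unit mean scales as one over the square root of the number of measurements; the costs, standard deviations, and method labels are illustrative assumptions.

    # Sketch: fixed-budget comparison of a cheaper field method and a costlier
    # fixed-laboratory method. Assumes spatial and analytical variances add and
    # that uncertainty in the decision-unit mean scales as 1/sqrt(n).
    # All numbers are hypothetical.

    from math import sqrt

    budget = 20000.0        # analytical budget ($)
    spatial_sd = 120.0      # short-scale spatial variability (mg/kg)

    options = {
        "field XRF": {"unit_cost": 25.0, "analytical_sd": 40.0},
        "fixed lab": {"unit_cost": 250.0, "analytical_sd": 10.0},
    }

    for name, o in options.items():
        n = int(budget // o["unit_cost"])
        combined_sd = sqrt(spatial_sd ** 2 + o["analytical_sd"] ** 2)
        se_of_mean = combined_sd / sqrt(n)
        print(f"{name}: n = {n}, standard error of the mean ~ {se_of_mean:.1f} mg/kg")

Under these assumptions the cheaper, noisier method yields a lower overall uncertainty on the decision-unit mean because spatial variability, not analytical error, dominates; this is the trade-off described in the Cost and Relative Analytical Quality bullets.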
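
The spatial contaminant distribution bullet can also be quantified: a method whose precision looks marginal near the action level may still classify locations correctly when concentrations sit well below or well above it. The sketch below assumes normally distributed measurement error with a constant 30% relative standard deviation around a hypothetical 400 mg/kg action level; both numbers are illustrative.

    # Sketch: how precision interacts with where concentrations sit relative to
    # the action level. Assumes normally distributed measurement error with a
    # constant relative standard deviation; all numbers are illustrative.

    from statistics import NormalDist

    action_level = 400.0   # hypothetical cleanup level (mg/kg)
    rsd = 0.30             # assumed relative standard deviation of the field method

    for true_conc in (100.0, 350.0, 450.0, 2000.0):
        sd = rsd * true_conc
        dist = NormalDist(true_conc, sd)
        # Probability a single result lands on the wrong side of the action level.
        if true_conc < action_level:
            p_wrong = 1.0 - dist.cdf(action_level)
        else:
            p_wrong = dist.cdf(action_level)
        print(f"true = {true_conc:6.0f} mg/kg  P(wrong side of action level) ~ {p_wrong:.2f}")

With these assumptions, roughly a third of single measurements taken near the action level land on the wrong side of it, while results far above or below are classified correctly almost every time, which is why the CSM's picture of where concentrations lie matters for technology selection.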

There is truly a wide range of options for providing real-time measurement support, and the list is continuously growing. Several resources contain useful information for project managers and technical staff as they search for appropriate techniques. These include technology listings maintained by the USEPA at http://clu-in.org/fate and by the Federal Remediation Technology Roundtable at http://www.frtr.gov/site/.




