1 August 2005
Crisp, clean, no nicotine
Integrating water quality remote monitoring systems.
By Martin Harmless
Poisoning the enemy's wells is an age-old tactic of waging war, denying drinking water to the populace and their livestock. And the threat of biological, chemical, and radiological weapons against society is just as real today.
Just watch the news: terrorists attempted to poison the water supply of the U.S. Embassy in Rome with potassium cyanate. Two American citizens with connections to Al Qaeda planned to attack water supplies in the West. Three alerts have occurred in the Saginaw Bay area. Threats have surfaced in France. Authorities have also thwarted other attempts that haven't reached the public.
People tend to think first of water sources, such as reservoirs, rivers, and wells, when discussing contamination of drinking water. But the sources are not the most vulnerable component of drinking water systems. The U.S. Government Accountability Office (GAO) determined, through dialogue with experts representing water utilities, that the most vulnerable component of a drinking water system is the distribution system. The vast array of potential water quality contaminants includes those we can't detect directly in near real time with existing sensor technology. We're making strides toward detecting and identifying biological contaminants directly in near real time, but the technology is not yet commercialized. Some families of chemical contaminants require expensive, maintenance-heavy, and delicate instruments to detect and identify them.
The U.S. Environmental Protection Agency (EPA) acknowledges that the practical way to detect the threat of contamination in drinking water is to remotely monitor a combination of traditional and nontraditional parameters and report abnormal changes in their values. Contaminants we can't detect and measure directly, we can often detect indirectly. The introduction of nicotine to chlorinated drinking water will affect the chlorine content, causing it to drop from its normal range of values. Nicotine is a highly toxic chemical you can't detect or identify in near real time without expensive, maintenance-heavy instrumentation, yet a drop in chlorine will indicate its presence.
You can also take a two-tiered approach to cost-effectively provide the most information about water quality and systems operations. Monitoring points in the first tier can monitor a water quality baseline and report changes with a few low-maintenance sensors. The second tier will include an enhanced set of sensing technology, including specialized analyzers, gas chromatographs, and mass spectrometers, in an attempt to identify the contaminant or at least the type of contaminant present. You can deploy the first-tier systems in sufficient numbers to cover the whole distribution system. You can deploy the second-tier systems strategically to provide validation and verification of a detected threat.
The goal is to develop an enhanced system involving this approach that will give utilities the most information for the money. The system will use advanced software to produce dependable information that is complete enough to interact with and guide any emergency response to detect contaminants in drinking water systems.
The system and process
To remotely monitor water quality in a distribution system, you'll need to systematically integrate several processes. A complete integrated water quality remote monitoring system facilitates detection and reporting of contamination with acquisition, analysis, management, and communication of data. The data consists of the values of physical and chemical parameters the sensors acquire in near real time. To enhance this system, follow this process:
- Conduct identification and analysis of threat scenarios to reach consensus on the most likely threat scenarios.
- Select the appropriate parameters you want to monitor based on an analysis of water chemistry as it relates to its interaction with the potential contaminants identified by the EPA and others.
- Do an evaluation to determine the value of various parameters you want to monitor as they contribute to an understanding of the threat scenarios.
- Perform research to identify sensors available to monitor selected parameters. Evaluate sensors for accuracy, sensitivity, reliability, and maintenance requirements (frequency of cleaning, calibration, electrolyte service, replacement, value of the information, and cost).
- Select parameters suitable for first-tier and second-tier monitoring applications.
- Couple sensors with distributed data processing and communications to create an intelligent network capable of detecting and reporting the evidence of contamination.
- Analyze available software technologies that can interpret data from sensors to determine the threat level and, potentially, the threat type.
- Finalize the monitoring solution. Review sensor and parameter selection based on the information each parameter contributes to the process of water quality threat analysis and the generation of definitive reporting.
- Network the system among the two tiers to develop a mesh network of intelligent nodes.
- Make sure the system can accept new technologies as they come online.
We use various types of chlorine and chlorine compounds in drinking water distribution systems to maintain disinfection. The chlorine oxidizes and kills biological contaminants in the water, so drinking water utilities must maintain chlorine residual throughout the distribution system to prevent bacteria, viruses, protozoa, and fungi from repopulating the water. Chlorine is an aggressive oxidizer and readily oxidizes many chemicals, as well as biota. Introducing bacteria or nicotine into the water will cause the chlorine content to drop due to its oxidation of the introduced contaminant. The chlorine becomes bound up chemically in the process and is no longer available for further oxidation. Therefore, you can measure the impact of biological and certain chemical contamination on chlorine and infer the presence of the contaminant. The reduction in chlorine does not allow you to identify the contaminant but only detect it indirectly.
Oxidation reduction potential (ORP) is a measurement of the potential of the water to oxidize (remove electrons from) or reduce (add electrons to) chemicals. ORP levels generally track with the chlorine concentration. A positive millivolt reading for ORP will reflect a high level of chlorine. As the chlorine becomes unavailable during oxidation, the ORP will drop. As long as there is a sufficient amount of chlorine to oxidize contaminants, there is some defense against the addition of certain types of contamination.
Testing in a closed loop testing facility showed chlorine and ORP reacted strongly to the injections of nicotine, aldicarb, arsenic trioxide, and E. coli. The chlorine oxidized these contaminants, and the ORP dropped accordingly, validating the chlorine data. The success in detecting a wide range of contaminants suggests monitoring these two parameters provides a solution for detecting the evidence of contamination. Some scientists believe ORP alone offers an effective first-tier monitoring solution, but they go on to say water quality surveillance benefits from additional information other sensors generate as well. This suggests chlorine and ORP should be primary components of the water quality baseline or the first tier.
Remote monitoring of water quality provides the utility operator with readily available information about the status of the system from an operational and a water-quality standpoint. A drinking water distribution system generally includes distribution piping, pump stations, and water towers. The complexity of the systems varies with size and age. Some utilities do not currently have any remote monitoring and must take grab samples throughout the system to measure chlorine on a daily basis. Some utilities have installed sensors in the field to monitor chlorine and pH along with the status of pumps and liquid levels at pump stations and water towers. While the locations of these monitoring points are appropriate to monitor system operations, they might not be appropriate to provide complete coverage of the system to monitor water quality and detect contaminants. To obtain complete coverage of the distribution system to monitor water quality, you might need to add sensors at existing monitoring points. You might also need to establish monitoring points with necessary telemetry.
Interviews within utility operations have revealed a need for water quality monitoring throughout distribution systems for both operations and water quality issues. Systems must be affordable and easy to maintain in order to get adequate coverage and vigilance. Budgets are already constrained, and when faced with high system costs and added expenses for technicians, chemicals, and power required to operate and maintain complex systems, those responsible will look for more efficient alternatives. The challenge is to provide cost-effective remote monitoring systems that detect the majority of contaminants, produce useful near-real-time information about utility operations and water quality, and provide adequate geographical coverage while meeting the budgetary restraints of the utilities.
Good data produces good information. It is imperative to choose the best sensors for the application to give accurate readings at the lowest possible levels of detection. The sensors must first have the needed sensitivity and accuracy, and they must be reliable. You should evaluate periodic maintenance requirements. You'll need to replace some sensors every three or four months, and for pH and chlorine sensors, you should calibrate monthly. Have confidence in the data, as you'll base your decisions on an analysis of reported values. The best systems in existence will not make good decisions from analyzing bad data.
Once you've selected accurate, sensitive, and reliable sensors, acquire the signals they generate and convert them into useful data. A lot of sensors generate an analog output in milliamps or millivolts. Convert this analog data to digital to facilitate analysis and communications. Perform this conversion with the best resolution possible to maintain the accuracy and integrity of the data. Microprocessors acquire the data by taking a reading from the sensors on a periodic basis. Using distributed processing creates efficient, intelligent networks that perform analyses and data management locally and communicate critical information.
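The analog-to-digital conversion described above can be sketched in a few lines. This is a minimal illustration, assuming a 4-20 mA chlorine sensor with a 0.0-5.0 mg/L calibrated span and a 16-bit converter; the ranges, bit depth, and function names are illustrative assumptions, not values from the article.

```python
# Sketch: scaling a raw ADC count from a hypothetical 4-20 mA chlorine
# sensor into engineering units (mg/L). Higher ADC resolution preserves
# the accuracy and integrity of the data, as the article notes.

ADC_BITS = 16                  # assumed converter resolution
ADC_MAX = (1 << ADC_BITS) - 1  # full-scale count
I_MIN, I_MAX = 4.0, 20.0       # mA current-loop range
CL_MIN, CL_MAX = 0.0, 5.0      # assumed calibrated chlorine span, mg/L

def counts_to_ma(counts: int) -> float:
    """Map a raw ADC count onto the 4-20 mA loop current."""
    return I_MIN + (counts / ADC_MAX) * (I_MAX - I_MIN)

def ma_to_mg_per_l(current_ma: float) -> float:
    """Linearly scale loop current to chlorine concentration."""
    frac = (current_ma - I_MIN) / (I_MAX - I_MIN)
    return CL_MIN + frac * (CL_MAX - CL_MIN)

# A mid-scale count maps to roughly 12 mA, or about 2.5 mg/L.
print(round(ma_to_mg_per_l(counts_to_ma(ADC_MAX // 2)), 2))
```

A real deployment would add per-sensor calibration offsets from the monthly calibration cycle rather than relying on the nominal linear scaling shown here.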
Adjust the sampling rate to give the right number of data points: sample too often, and there will be too much data to manage efficiently; not often enough, and you might miss a change in water quality. Store the data in a database for management and to allow historical analysis. Display current values of the parameters you're monitoring in a web-based format. A data acquisition and management device with its own Internet protocol address maximizes efficiency.
The user can access the data by connecting to the Internet protocol address of each site through either the Internet or a local area network. The embedded software in the data acquisition and management device analyzes the data for compliance with set limits and sends a notification if any parameter is out of its acceptable range. The data acquisition and management device stores the data until you download it remotely or locally into a database for analysis. Ideally, you'll store the data from all parameters you're monitoring to capture one year's data. You'll calculate hourly, daily, weekly, and monthly averages of the values to determine the baseline or normal range of each parameter for each site during various times of the year. This baseline becomes the basis to which you compare current values. The values of some parameters might not change much over time; others might change significantly due to seasonal variations, weather, and operational adjustments.
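The baseline calculation described above can be sketched as follows. This is a minimal illustration, assuming readings tagged by calendar month and a normal range defined as the mean plus or minus a tolerance factor times the standard deviation; the data layout, the factor k, and the function name are assumptions for illustration, not the article's method.

```python
# Sketch: deriving a per-site seasonal baseline (normal range) from a
# year of stored readings, grouped by calendar month.
from collections import defaultdict
from statistics import mean, stdev

def monthly_baselines(readings, k=3.0):
    """readings: iterable of (month, value) pairs for one site and one
    parameter. Returns {month: (low, high)} -- the normal range, taken
    here as mean +/- k standard deviations (an assumed definition)."""
    by_month = defaultdict(list)
    for month, value in readings:
        by_month[month].append(value)
    baselines = {}
    for month, values in by_month.items():
        m, s = mean(values), stdev(values)
        baselines[month] = (m - k * s, m + k * s)
    return baselines

# Example: chlorine (mg/L) runs higher in winter than in summer, so each
# month gets its own range rather than one year-round limit.
winter = [(1, v) for v in (2.0, 2.1, 1.9, 2.0)]
summer = [(7, v) for v in (1.5, 1.6, 1.4, 1.5)]
ranges = monthly_baselines(winter + summer)
```

In practice the database would supply hourly, daily, weekly, and monthly averages, and the ranges would be reviewed by an operator before being used for alerting.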
The source of drinking water can affect how much the water quality changes throughout the year. Surface water sources such as rivers and reservoirs are much less static in water quality than are underground sources. Well water tends to be fairly constant in water quality, though minor variations will occur. Surface waters are subject to the impacts of seasonal changes. Turbidity (cloudiness), temperature, and chemical contamination are affected by seasonal variations. Precipitation runoff, itself affected by land-use practices such as farming and construction, can contribute to the degradation of surface water sources. Eroded soil, pesticides, herbicides, fertilizers, and biodegradable vegetable matter can place additional demands upon the treatment plant. Water temperature will reflect seasonal temperature fluctuations, which will affect the water's ability to hold gases like oxygen and chlorine in solution. The warmer the water, the less gas it can hold. Most utilities use more chlorine during the warmer months for this reason.
Select allowable ranges of values for the baseline based on the normal variations. The database will allow you to determine the ranges from historical values you capture over time. Knowing the normal ranges of each parameter at each site, and adjusting the acceptable ranges to match them through the year, is a good way to look for abnormal water quality and the evidence of contamination. You should get a notification whether a parameter goes slightly or significantly out of its normal range, but the notification should reflect the significance of the change. A slight variation from normal warrants investigation. A significant change, one that causes the water to exceed drinking water standards, requires a response to determine the cause and mitigate the problem. Make the adjustments to the acceptable ranges manually and remotely, or automatically, as the microprocessor consults the library for its site.
Utilities might want to have a computer continuously monitoring the remote sites. Software on the computer will enunciate any alerts or alarms, calling the operator's attention to the situation. The software will also report system malfunctions, send emails and pages to selected persons, require acknowledgement of the notification, record the action taken, and print the activity log for the event.
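The alert workflow above -- notify, require acknowledgement, record the action taken -- can be sketched as a small event log. The class and method names here are hypothetical, not part of any real SCADA or alarm-management package; sending email and pager notifications is stubbed out as a comment.

```python
# Sketch: an alarm log that requires operator acknowledgement and
# records the action taken, mirroring the workflow in the article.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Event:
    site: str
    message: str
    raised: datetime = field(default_factory=datetime.now)
    acknowledged: bool = False
    action_taken: str = ""

class AlarmLog:
    def __init__(self):
        self.events = []

    def raise_alarm(self, site, message):
        """Record an alert; a real system would also send email and
        pages to selected persons here."""
        event = Event(site, message)
        self.events.append(event)
        return event

    def acknowledge(self, event, action):
        """Require acknowledgement and record the action taken, so the
        activity log for the event can be printed later."""
        event.acknowledged = True
        event.action_taken = action

log = AlarmLog()
ev = log.raise_alarm("Tower 3", "chlorine below seasonal range")
log.acknowledge(ev, "dispatched technician; flushed main")
```

The unacknowledged-events list gives the monitoring computer a natural way to re-page an operator until someone responds.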
Behind the byline
Martin Harmless is president of Clarion Sensing Systems in Indianapolis, Indiana.
Biological warfare: serious business
Releasing toxic chemicals, biological agents, or radioactive substances into drinking water systems presents a significant threat to military installations and densely populated areas of the country. In recognition of this threat, recent legislation has established goals and objectives to protect and ensure safe drinking water. Presidential Decision Directive 63 (PDD-63) designated the water sector as critical infrastructure susceptible to physical and cyber attacks. PDD-63 outlines a coordinated, collaborative effort between all levels of government and the private sector to address and solve issues related to protecting critical infrastructure. Title IV of the Public Health Security and Bioterrorism Preparedness and Response Act of 2002 is entitled "Drinking Water Security and Safety." Sections 1433, 1434, and 1435 are entitled "Terrorist and other Intentional Acts," "Contaminant Prevention, Detection, and Response," and "Supply Disruption Prevention, Detection and Response," respectively. Section 1433 says each community water system serving more than 3,300 persons must "conduct an assessment of the vulnerability of its system to a terrorist attack or other intentional act intended to substantially disrupt the ability of the system to provide a safe and reliable supply of drinking water." Section 1434 requires the U.S. Environmental Protection Agency and the Centers for Disease Control to conduct a review of "current and future methods to prevent, detect, and respond to the intentional introduction of chemical, biological, or radiological contaminants into community water systems and source water for community water systems."