Turbidity

(endorsed 2011)

Guideline

Chlorine-resistant pathogen reduction: Where filtration alone is used as the water treatment process to address identified risks from Cryptosporidium and Giardia, it is essential that filtration is optimised and consequently the target for the turbidity of water leaving individual filters should be less than 0.2 NTU, and should not exceed 0.5 NTU at any time.

Disinfection: A turbidity of less than 1 NTU is desirable at the time of disinfection with chlorine unless a higher value can be validated in a specific context.

Aesthetic: Based on aesthetic considerations, the turbidity should not exceed 5 NTU at the consumer’s tap.

General description

Turbidity is a measure of the light-scattering property of water caused by the presence of fine suspended matter such as clay, silt, plankton and other microscopic organisms. The degree of scattering depends on the amount, size and composition of the suspended matter. At low levels, turbidity can only be detected by instruments, but at higher levels the water has a “muddy” or “milky” appearance clearly visible to the naked eye. As a guide, water with a turbidity of 5 Nephelometric Turbidity Units (NTU) appears slightly muddy or milky in a glass, while at >60 NTU, it is not possible to see through the water. “Crystal-clear” water usually has a turbidity of less than 1 NTU.

There are three distinct aspects to turbidity to be considered within the catchment-to-consumer risk management framework:

  • the use of turbidity as a measure to provide assurance of the optimal operation of filter performance, where filtration is used to address identified risks associated with chlorine-resistant pathogens in the source water;

  • the impact of turbidity on the efficacy of disinfection processes;

  • the effect that turbidity has on the aesthetics of the treated water.

Measurement

For laboratory-based analyses, the ratio-recording nephelometric turbidity meter is the preferred method for turbidity measurement, as it can compensate for the effect of dissolved colour. Results are expressed in NTU and are calibrated against a prepared formazin standard (APHA 2130B, 2005). The detection limit is about 0.1 NTU.

When using turbidity for accurate monitoring of filter performance (i.e. where filtration is the only water treatment process to remove chlorine-resistant pathogens), it is recommended that on-line, continuously reading turbidity meters be installed on the outlet of each individual filter, in addition to any on-line turbidity meter installed on the combined filter outlet. It is prudent to link the turbidity meter outputs into plant SCADA and/or alarm systems, to ensure that immediate action is taken when filtered water turbidity above the set target is detected. This intensity of operational monitoring is strongly recommended to ensure that any performance issues related to individual filters are detected and addressed proactively (USEPA 2004, Mosse 2009). Particle counters are used for the same purpose of filter optimisation, but the results are too dependent on the particular equipment and its mode of operation to allow general guidance equivalent to that given for turbidity.
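The per-filter alarm logic described above can be illustrated as a simple threshold check on each individual filter's on-line turbidity reading. This is a minimal sketch only; the function and variable names are hypothetical, with the thresholds taken from this guideline (a target below 0.2 NTU, never exceeding 0.5 NTU).

```python
# Minimal sketch of per-filter turbidity alarm classification.
# Names are illustrative; thresholds are the targets stated in this guideline.
TARGET_NTU = 0.2  # filtered water target for individual filters
MAX_NTU = 0.5     # value that should not be exceeded at any time

def classify_reading(ntu: float) -> str:
    """Classify a single on-line turbidity reading from an individual filter."""
    if ntu > MAX_NTU:
        return "critical"   # above 0.5 NTU: immediate investigative action
    if ntu > TARGET_NTU:
        return "warning"    # above the 0.2 NTU target: check filter performance
    return "ok"

# Hypothetical readings from three individual filter outlets
readings = {"filter_1": 0.12, "filter_2": 0.27, "filter_3": 0.61}
alarms = {name: classify_reading(ntu) for name, ntu in readings.items()}
```

In practice this classification would sit inside the plant SCADA/alarm system rather than a standalone script, so that an operator response is triggered as soon as a reading crosses a threshold.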

While real-time monitoring of the turbidity trends generated from the on-line instruments is crucial in determining the instantaneous performance of the plant, and therefore the safety of the water, longer-term monitoring is beneficial to demonstrate the need for continuous improvement and maintenance activities such as filter inspections, optimised backwash and other process procedures.

Treatment of drinking water

Pathogen reduction

Chlorine-based disinfection is effective only against bacterial and most viral pathogens. At the doses typically applied in water treatment, chlorine is not effective against the protozoan pathogen Cryptosporidium, and has only a limited effect on Giardia in the absence of large filtered water storages to provide adequate contact time for effective disinfection. Cryptosporidium oocysts are quite small (4-6 µm) and will readily pass through a conventional media filter in the absence of effective coagulation and flocculation. Filtration combined with effective coagulation, flocculation and clarification can be used as a barrier for Cryptosporidium and other protozoan pathogens. In many cases, coagulation-assisted clarification and filtration may be the only existing treatment barrier to protozoan pathogens.

In the absence of reliable real-time pathogen detection methodologies, continuous turbidity monitoring is considered the best available surrogate for assessing filter performance.

Many studies have investigated the relationship between pre-treatment turbidity, turbidity reduction (or particle removal) via filtration, and pathogen reduction. It has been demonstrated in pilot scale trials that a change in filter effluent turbidity from 1.0 through 0.5 to 0.3 NTU would not significantly improve the reliability of pathogen control. However, by setting filter effluent turbidity goals below 0.2 NTU, significant improvements in microbial quality could be obtained (Xagoraraki et al. 2004). The USEPA identified that turbidity limits of 0.15 NTU from individual filters with an upper limit of 0.3 NTU provided a substantial improvement in removal of Cryptosporidium compared to its previous limits of 0.3 NTU, with an upper limit of 1 NTU (USEPA 2006).

Targets for filtered water turbidity should be based on the pathogen risks in the raw water; for example, surface run-off from a catchment with significant sewage inputs or dairy farms would have tighter turbidity targets than a catchment without such impacts. Therefore, when setting turbidity targets for filtered water, raw water quality and treatment capabilities need to be aligned to manage any potential health risks. The United States Environmental Protection Agency Long Term 2 Enhanced Surface Water Treatment Rule (USEPA 2006) and the Drinking-water Standards for New Zealand (NZ-MOH 2008) directly relate raw water quality to the setting of filtered water turbidity targets.

Where a given water supply system risk assessment identifies a significant risk associated with protozoan pathogens, and a high level of operational monitoring of turbidity and any associated adjustment or maintenance of coagulation, flocculation, clarification and filtration processes or facilities are not considered practical, then alternative processes (e.g. ultraviolet radiation disinfection) may need to be applied to ensure the identified risk is adequately addressed.

Catchment management and source protection can be good enough to obviate the need for water treatment to remove and/or inactivate protozoan pathogens. Exclusion of contamination from humans and domesticated animals in run-off from catchments and source areas generally leads to only minimal risk from protozoan pathogens in the Australian context, and specific treatment to remove protozoa is not required. In many cases, however, catchments and sources are not sufficiently managed and protected to ensure safe drinking water without additional treatment.

Where water is harvested from partly protected catchments and sources with a relatively low level of contamination, protozoan pathogens can be removed adequately by conventional treatment alone. Conventional treatment involves the addition of coagulants, removal of solids using clarifiers such as sedimentation, solids contact or dissolved air flotation, and removal of the remaining solids in clarified water in media filters, followed by chlorine-based disinfection. Such treatment is widely used and technically capable of reducing turbidity to below 0.2 NTU, but requires close operator attention and continuous monitoring as discussed above.

Where water is harvested from sources with significant risks of contamination with protozoan pathogens, filtration to 0.2 NTU alone may not reduce the risk from protozoan pathogens to acceptable levels. Other treatment, such as membrane filtration, or disinfection by ultraviolet radiation or ozonation, may be needed.

In most cases, the turbidity of the filtered water during ripening periods after filter backwash may exceed 0.3 NTU. It is considered best practice to limit these short spikes in turbidity to no longer than 15 minutes. Spikes above 0.3 NTU represent periods of increased risk, and appropriate risk management practices should be employed, such as running ripening water to waste or optimising filter backwash processes.
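The 15-minute spike limit above can be checked against logged filter-effluent readings. The sketch below assumes evenly spaced readings (here one per minute); the function name and data are hypothetical illustrations, with the 0.3 NTU and 15-minute figures taken from this guideline.

```python
# Sketch: check post-backwash turbidity spikes above 0.3 NTU against the
# 15-minute best-practice limit. Assumes evenly spaced readings; names are
# illustrative only.
SPIKE_NTU = 0.3
MAX_SPIKE_MINUTES = 15

def longest_spike_minutes(readings_ntu, interval_minutes=1):
    """Return the longest continuous run of readings above SPIKE_NTU, in minutes."""
    longest = run = 0
    for ntu in readings_ntu:
        run = run + interval_minutes if ntu > SPIKE_NTU else 0
        longest = max(longest, run)
    return longest

# Hypothetical log: 10 minutes of ripening water above 0.3 NTU, then normal
# operation - within the 15-minute best-practice limit.
ripening = [0.45] * 10 + [0.15] * 50
within_limit = longest_spike_minutes(ripening) <= MAX_SPIKE_MINUTES
```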

Turbidity added after treatment can arise from the use of lime to raise the final pH of the water. This turbidity is unlikely to have an associated pathogen risk.

Disinfection

High turbidity has been shown to shield microorganisms from the action of disinfectants (Katz 1986). Low turbidity, however, is no guarantee that water is free from pathogenic microorganisms.

If the turbidity in a water supply exceeds 1 NTU, adequate disinfection may be more difficult to maintain, but may nevertheless be achievable.

Where water that is to be disinfected has not been previously filtered, it is desirable that the turbidity be less than 1 NTU at the time of disinfection, subject to the type of disinfectant being used. For example, disinfection using ultraviolet light is likely to remain effective at turbidities above 1 NTU, providing transmission is maintained, whereas the effectiveness of chlorine-based disinfectant can be affected above 1 NTU.

If water of a higher turbidity is to be disinfected, then validation work should be undertaken to demonstrate that disinfection of water under such conditions is effective.

Disinfection is discussed in more detail in Information Sheet 1 Disinfection of drinking water.

Aesthetics

Turbidity has an impact on the aesthetic acceptability of water. Many consumers relate the appearance of water to its safety, and turbid or coloured water is interpreted as being unsafe to drink. Turbidity must therefore be maintained as low as possible through to the point of supply to consumers.

Passage of water through a distribution system can also lead to an increase in turbidity, generally as a result of the resuspension of fine sediments settled over a long period of time, or from the breakdown of pipe materials or biofilms lining the walls of the pipes. While the associated health risk is generally minimal, it may be significant in poorly maintained systems, as some biofilms are known to harbour living microorganisms. Therefore turbidity in the distribution system can be also used as an indicator of good distribution management practices.

Health considerations

Consumption of highly turbid waters is not necessarily a health hazard, but may constitute a health risk if the suspended particles harbour pathogenic microorganisms capable of causing disease in humans, or if the particles have adsorbed toxic organic or inorganic compounds.

For a treatment system designed for chlorine-resistant pathogen reduction via filtration only, detection of increases in the turbidity of filtered water above 0.5 NTU should trigger investigative action. Major filtration failures should be referred to the relevant health authority or drinking water regulator to assess the potential health risk.

Turbidity can have a significant impact on the microbiological quality of drinking water. High turbidity interferes with both the detection and the disinfection of pathogens, as microorganisms can adsorb onto particulate matter and be shielded from disinfectants. Turbidity may also promote bacterial growth if the particles provide a source of nutrients.

It is important to recognise the sources of suspended or particulate matter in water, and the potential associated risks to human health. Particulate matter from multi-use surface catchments often contains human pathogens. The poor management of turbid water events is a significant factor in many waterborne disease outbreaks (Hrudey and Hrudey 2004).

References

APHA Method 2130B (2005). Turbidity: nephelometric method. Standard Methods for the Examination of Water and Wastewater, 21st edition. American Public Health Association, Washington.

Hrudey SE, Hrudey EJ (2004). Safe Drinking Water – Lessons from recent outbreaks in affluent countries. IWA Publishing, London.

Katz EL (1986). The stability of turbidity in raw water and its relationship to chlorine demand. Journal of the American Water Works Association, 78:72–75.

Mosse P (2009). Longterm filter monitoring and safe drinking water – data from two Australian water utilities. Water Works – Official Journal of the Water Industry Operators Association, June 2009, Water Industry Operators Association, Shepparton, p8-11.

NZ-MOH (New Zealand Ministry of Health) (2008). Drinking-water Standards for New Zealand 2005 (revised 2008). Ministry of Health, Wellington.

USEPA (2004). Long Term 1 Enhanced Surface Water Treatment Rule – Turbidity Provisions – Technical Guidance Manual.

USEPA (2006). National Primary Drinking Water Regulations: Long Term 2 Enhanced Surface Water Treatment Rule: Final Rule.

Xagoraraki I, Harrington GW, Assavasilavasukul P, Standridge JH (2004). Removal of emerging pathogens and pathogen indicators. Journal of the American Water Works Association, 96(5):102-113.

Australian Drinking Water Guidelines 6 2011, v3.9