Due to extensive purification of ground water or surface water, pathogenic microorganisms and indicator organisms are absent at detectable levels in treated drinking water. To ensure that the microbial safety of drinking water is maintained during distribution, the two most important control measures are the physical integrity of the distribution system and a continuously high water pressure. Together, these measures prevent the intrusion of contaminated water carrying pathogenic microorganisms into the distribution system. If these two requirements are not met, a chlorine residual in the distribution system may protect the drinking water to some extent against intrusion of microorganisms. Drinking water in the Netherlands is distributed without a chlorine disinfection residual, which requires extra vigilance during repair works or incidents. If small cracks are present and the water pressure is reduced, for example due to a pipe break or pressure transients, ground water may leak into the distribution system [1]. Contamination may also occur during repair works or incidents, despite the hygienic procedures that are in place. Sewage and drinking water pipes are often buried close to each other. For that reason, the soil and ground water next to drinking water pipes contain high numbers of microorganisms (median values): fecal coliforms (2–59 × 10¹ MPN/100 mL; 2 × 10¹–1 × 10² MPN/100 g), Clostridium perfringens (5 × 10¹–1 × 10³ cfu/100 mL; 1 × 10¹–1 × 10³ cfu/100 g), Bacillus subtilis (1.3 × 10⁶ cfu/100 mL; 1.3 × 10⁸ cfu/100 g) and coliphages (1 × 10⁴ pfu/100 mL; absent in soil) [1]. Intrusion of ground water into the distribution system may thus lead to contamination of the drinking water and distribution system with enteric pathogens. Even intrusion of a small volume may be of concern, as several enteric pathogens are highly infectious [3].
Several studies have shown that incidents or repair works in the drinking water distribution system are associated with an elevated risk of gastrointestinal disease. A cohort study in Norway compared households downstream of a main break or maintenance works with unexposed households and showed that exposed households reported 1.58 times more gastrointestinal illness [5]. A cohort study in Sweden showed that households were 2.0 times as likely to report vomiting complaints and 1.9 times as likely to report acute gastrointestinal illness after a pipe break or works on the distribution system [6]. Identified risk factors included the presence of sewage pipelines at the same level as drinking water pipelines in the trench. Flushing was also associated with an elevated risk, leading to the conclusion that current safety measures might not be sufficient to eliminate the risk of gastrointestinal disease. Households in Canada consuming tap water reported 19% to 34% more gastrointestinal illness than households receiving water bottled directly at the production plant, tap water treated with reverse osmosis, or spring water [7]. In addition, a Swedish study using routinely available data showed that the incidence of Campylobacter infection was positively associated with the average water-pipe length per person, suggesting contamination of the water during distribution [9]. Both studies suggest that the water was contaminated with pathogens during distribution, even though the distribution systems met the microbiological standards (monitoring of indicator bacteria). In contrast, no evidence of contamination during distribution was found in two blinded studies: no difference in the risk of waterborne gastroenteritis was found between households in Melbourne [10] or Iowa [11] using a real or sham water treatment unit installed in the kitchen. In a systematic review, a significant association between gastrointestinal disease and tap water versus point-of-use treated tap water was found for non-blinded studies. However, this difference disappeared when only blinded studies were included [12].
To estimate the potential health impact of contamination of the drinking water distribution system, the QMRA (Quantitative Microbial Risk Assessment) approach has been applied [13]. A QMRA of negative pressure transients showed that the health risk was mainly affected by the duration of a negative pressure event [14] and by the number of nodes drawing negative pressure. The concentration of the contaminant (norovirus was used in both studies) was not critical, probably due to the high infectivity of the virus [3]. In a QMRA of the health impact of contamination during repair works, the concentration of the contaminant, the time of day at which the valves were opened after repairs, and the time of consumption were the most important parameters [17]. Additionally, the type of pathogen and its specific dose–response relationship strongly affect the resulting infection risk [18]. For example, ingestion of roughly 10–100× more bacteria than protozoa, or 100–1000× more than viruses, is required to become infected [19]. Together, these studies suggest a (potential) role for loss of physical integrity, pipe breaks and repair works of the drinking water distribution system in the transmission of gastrointestinal diseases.
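As an illustration of this dose–response reasoning, a minimal sketch using the single-parameter exponential model P_inf = 1 − exp(−r·dose); the r values below are purely illustrative assumptions, not the fitted parameters from the cited studies:

```python
import math

def p_infection_exponential(dose, r):
    """Exponential dose-response model: probability of infection after
    ingesting `dose` organisms, with pathogen-specific parameter r."""
    return 1.0 - math.exp(-r * dose)

# Illustrative (assumed) parameters: a 100x difference in r translates
# directly into a 100x difference in the dose required for the same
# infection risk, mirroring the virus-vs-bacterium gap described above.
r_virus, r_bacterium = 1e-1, 1e-3

for name, r in [("virus", r_virus), ("bacterium", r_bacterium)]:
    d50 = math.log(2) / r  # dose giving a 50% infection probability
    print(f"{name}: dose for 50% infection risk = {d50:.0f} organisms")
```

This also shows why the contaminant concentration can be non-critical for a highly infectious virus: with a large r, even small ingested doses already give a substantial infection probability.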
Flushing and, in the case of a persistent microbial contamination in the distribution system, shock chlorination are often used to clean the distribution system after maintenance work or pipe breaks. However, to our knowledge, only limited research has been performed on the efficacy of flushing and shock chlorination against microorganisms in the water and in the pipe wall biofilm. Flushing experiments in a pilot distribution system used sand particles with a diameter of 0.25–4 mm and showed that a threshold velocity of 0.8–0.9 m/s was required for efficient removal [22]. In the same study the results were extrapolated to the removal of microorganisms. Chlorination was tested in a batch reactor, in which the microorganisms were present in the water but not in a biofilm. Flushing (0.8 m/s) did not remove Bacillus subtilis from the biofilm of a cement-lined ductile iron pipe, whereas shock chlorination (200 mg/L, 2 h) in the same pilot distribution system led to a modest 1.2–1.4 log removal from the biofilm [23]. However, removal from the water was not tested.
After cleaning the distribution system by flushing or chlorination, drinking water companies want to put the distribution system back into service as quickly as possible. Current practice is that the cleaned area can be put back into service once monitoring of the water shows that the fecal indicators E. coli and enterococci are absent. To ensure consumer safety and to be able to restore normal water distribution, it is important that monitoring can be performed quickly after the cleaning regime is finished and that monitoring is optimized so that the probability of detecting fecal indicators (if present) is as high as possible. Currently, a sample is taken from a convenient tap in the vicinity of the cleaned network, but information about the impact of the time and place of sampling on the probability of picking up residual fecal indicators is lacking.
In this paper we used a model pipe-loop system, in which a biofilm was cultured, to study (i) the influence of the waiting time between flushing of the system and sampling the water and (ii) the effect of the distance between the sampling point and the point of contamination. This knowledge is used to underpin a sampling strategy after flushing in which the waiting time is as short as possible without decreasing the probability of detecting the fecal indicators. We also extend the knowledge on the efficacy of flushing and shock chlorination in removing a microbial contamination from drinking water pipes. Both flushing and shock chlorination were tested in a model pipe-loop system containing pipes with a cultured biofilm or pipes with an old, natural biofilm that were excavated from the drinking water distribution system. Flushing was tested under several conditions (flushing velocity and flushed water volume) and chlorination was tested for several contact times. Finally, we describe the application of these results to real-life situations.
2. Materials and Methods
Experiments were carried out with Escherichia coli (E. coli WR1, NCTC 13167), Enterococcus faecium (WR63, NCTC 13169), Clostridium perfringens (D10, NCTC 13170) spores, somatic coliphage phiX174 (ATCC 13706-B1) and bacteriophage MS2 (ATCC 15597-B1). E. coli is the most commonly studied microorganism and also the one most often used as a bacterial indicator of fecal pollution in drinking water. Bacteriophages were used in this study as surrogates for enteric viruses, because of their similar morphology and biological properties [24]. The efficacy of chlorine disinfection of the distribution system depends on several factors, including the chlorine concentration, the contact time and the type of microorganism. Therefore, several types of microorganisms were selected, representing bacteria (E. coli), bacterial spores (C. perfringens spores) and viruses (the bacteriophage phiX174), and also spanning a spectrum of chlorine sensitivity, from sensitive (E. coli) to insensitive (spores of C. perfringens). For analysis of the optimal sampling strategy, the fecal indicator bacteria E. coli and enterococci were used. These two indicators are routinely monitored and monitoring is compulsory after works in the distribution system. A water sample of 100 mL should be negative for both bacteria.
E. coli and Enterococcus faecium, used to test the optimal sampling strategy (experiments A1–A4, Table 1), were grown in mineral medium supplemented with glucose and potassium nitrate, and with glucose and brain heart infusion broth, respectively. Growth at 22 °C was monitored by streak-plating on Lab Lemco Agar (LLA) plates. When the maximum colony count was reached, the bacterial suspensions were stored at 4 °C until use. One suspension of each bacterium was prepared and used in all experiments. Shortly before each experiment the colony counts were determined on a specific medium: Laurylsulphate agar (LSA) for E. coli and Slanetz and Bartley agar (S&B) for Enterococcus faecium. The E. coli colony counts were determined according to NEN-EN-ISO 9308-1 using membrane filtration (0.45 μm pore size) or streak-plating of the sample on LSA plates. Agar plates were incubated for 5 h at 25 °C followed by 14 h at 36 °C. The Enterococcus faecium colony counts were determined on S&B agar after 48 h at 36 °C, according to NEN-EN-ISO 7899-2. MS2 F-specific bacteriophages were purchased from GAP EnviroMicrobial Services. The number of MS2 was determined according to the double-agar-layer method in NEN-ISO 10705-1. In short, decimal dilutions of the sample containing MS2 were mixed with the host bacterium Salmonella typhimurium WG49 in log phase and semi-solid Tryptone-Yeast Extract-Glucose agar. The mixture was immediately spread on an agar plate and allowed to solidify. Incubation was performed at 36 °C for 24 h. The heterotrophic plate count (HPC) was determined according to NEN-EN-ISO 6222: the sample was mixed with dissolved plate count agar, poured into a Petri dish, and incubated at 22 °C for 68 h.
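All of these colony and plaque counts come from decimal dilution series; the back-calculation to a concentration is the standard one. A minimal sketch (the function name and the example counts are our own, for illustration only, not part of the cited ISO standards):

```python
def count_per_ml(colonies, dilution_exponent, volume_plated_ml):
    """Back-calculate cfu (or pfu) per mL of the original sample.

    colonies          : colonies or plaques counted on the plate
    dilution_exponent : n for a 10^-n decimal dilution
    volume_plated_ml  : volume of that dilution plated, in mL
    """
    return colonies * 10 ** dilution_exponent / volume_plated_ml

# Example: 42 colonies on a plate of the 10^-4 dilution, 1 mL plated
print(count_per_ml(42, 4, 1.0))  # 420000.0 cfu/mL

# Example: 10 colonies after membrane filtration of 100 mL undiluted sample
print(count_per_ml(10, 0, 100.0))  # 0.1 cfu/mL, i.e. 10 cfu/100 mL
```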
For the flushing and chlorination experiments (experiments B1–B5 and C1–C3, Table 2 and Table 3), E. coli bacteria were grown in Lab Lemco Broth (LLB) for 72 h at 36 °C. The growth medium was removed by washing the bacteria three times in sterile tap water. The number of E. coli bacteria in the suspension was determined on LSA agar plates as described above. For each experiment the bacteria were freshly prepared. C. perfringens D10 was cultured on Perfringens agar base (PAB), after which the colonies were aseptically transferred to sterile tap water. To induce sporulation, the bacteria were incubated for two weeks at 36 °C, after which the spores were stored at 4 °C. One spore suspension was prepared and used for all experiments. A few days before each experiment the number of spores in the suspension was determined according to NEN-ISO 6461. In short, prior to enumeration the sample was heated at 60 °C for 30 min to kill vegetative bacteria. The colony count was determined using membrane filtration or streak-plating on PAB agar plates at 36 °C for 24 to 48 h. phiX174 somatic coliphages were grown by infecting an E. coli WG5 culture with phiX174 for 5 h at 36 °C. The growth medium was removed by centrifugation and ultrafiltration. One suspension of phiX174 coliphages was prepared and used for all experiments. A few days before each experiment the number of phiX174 in the suspension was determined according to NEN-ISO 10705-2. The sample was mixed with E. coli WG5 in log phase and semi-solid Modified Scholtens Agar, spread immediately on an agar plate and allowed to solidify. Incubation was performed at 36 °C for 24 h.
To determine the adenosine triphosphate (ATP) concentration in the samples, luciferin and luciferase were added. In the presence of ATP, luciferin is degraded by luciferase; the light produced during this process was measured in a luminometer. For measuring the iron concentration, the water sample was treated with nitric acid according to NEN-EN-ISO 15587-2:2002. The concentration of the released iron was determined using an inductively coupled plasma mass spectrometer (ICP-MS).
Analyses of the biofilm were performed by swabbing roughly 7 cm of pipe wall biofilm (all sides); the first 1–2 cm of pipe wall biofilm at the start of the pipe segment was not swabbed. Multiple sterile cotton swabs were used for swabbing, and for each pipe the swabbed surface area was calculated. The swabs were pooled in 40 mL sterile tap water and the biomass was released from the swabs by low-energy sonication using a Branson Sonifier ultrasonic cell disruptor for 2 min at 40 kHz and 90 W power output (equivalent to 45% amplitude). During sonication the mixture was kept on ice. The required parameters were determined in the resulting water sample.
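Counts measured in the pooled 40 mL swab suspension can be expressed as a surface density on the pipe wall. A sketch under the geometry described above (55 mm inner diameter, roughly 7 cm swabbed on all sides); the count value and the assumption of complete recovery from the swabs are illustrative:

```python
import math

def biofilm_density(count_per_ml, suspension_volume_ml, swabbed_area_cm2):
    """Convert a concentration measured in the swab suspension to a
    surface density on the pipe wall (e.g. cfu/cm^2), assuming all
    swabbed biomass ended up in the suspension."""
    total_recovered = count_per_ml * suspension_volume_ml
    return total_recovered / swabbed_area_cm2

# Swabbed area: inner circumference x swabbed length (5.5 cm ID, 7 cm length)
area_cm2 = math.pi * 5.5 * 7
print(round(biofilm_density(250, 40.0, area_cm2), 1))  # ≈ 82.7 cfu/cm^2
```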
Die-off kinetics of all microorganisms were determined in tap water to which microorganisms were added at concentrations similar to those in the pipe-loop system. The water was incubated for 24–72 h at 22 °C, and the number of microorganisms was determined several times during this period.
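Die-off of this kind is commonly summarized by a first-order decay constant obtained from a log-linear fit of the counts over time. A stdlib-only sketch, with made-up data points for illustration (the paper does not specify its fitting procedure):

```python
import math

def first_order_decay_rate(times_h, counts):
    """Least-squares slope of ln(count) vs time, returned as the
    first-order die-off constant k (per hour): N(t) = N0 * exp(-k t)."""
    ys = [math.log(c) for c in counts]
    n = len(times_h)
    mean_t = sum(times_h) / n
    mean_y = sum(ys) / n
    num = sum((t - mean_t) * (y - mean_y) for t, y in zip(times_h, ys))
    den = sum((t - mean_t) ** 2 for t in times_h)
    return -num / den

# Illustrative counts halving every 24 h over a 72 h incubation
times = [0, 24, 48, 72]
counts = [1e6, 5e5, 2.5e5, 1.25e5]
k = first_order_decay_rate(times, counts)
print(round(k, 4))  # 0.0289 per hour (half-life 24 h: ln 2 / 24)
```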
In some of the experiments an artificial biofilm was cultured in the pipe-loop system prior to the experiment. To achieve a reproducible biofilm, acetate (NaCH3COO, 10 µg C/L), nitrate (KNO3, 2 µg N/L) and phosphate (KH2PO4, 0.1 µg P/L) were added to the water. After continuous circulation of the water at 0.1 m/s in the pilot system for two weeks, the biofilm and water were analyzed for ATP and HPC. The system was drained and flushed at low speed to remove remaining nutrients before the experiment was started.
Three slightly different model pipe-loop systems were built (Figure 1). All systems were 20 m long and consisted of PVC pipes with an outer diameter of 63 mm and an inner diameter of 55 mm. Taps for water sampling were placed at several points along the 20 m system. A flow meter was placed immediately after the flushing pump. After each experiment the biofilm was removed from the complete system by flushing successively with an SDS solution, tap water, a citric acid solution and tap water. In addition, the long stretches of pipe were replaced with new pipes. Depending on the specific research questions, small alterations were made to the pipe-loop system, as described below.
For determining the optimal sampling strategy (experiments A1–A4), the system in Figure 1a was used. Sampling taps were placed after 1, 5, 10 and 20 m. For all experiments a biofilm was cultured. Next, the system was drained and artificially contaminated water or sand was added to the system immediately after the flushing pump and flow meter. The contaminated water or sand was left in the drained pipe for one hour to allow the microorganisms to attach to the biofilm. Flushing was performed according to Table 1. This procedure was chosen to mimic repair works on the distribution system as closely as possible (i.e., the opening of a drained pipe, and contaminated water or sand that may enter and reside for some time in the pipe before flushing is started). After flushing, the system was closed, without any circulation or flow of the water, to mimic the withholding time of the water in the pipe before the sample is taken; water samples were taken after 1, 2, 6 and 24 h at the four sampling taps. At the end of the experiment the system was drained and the biofilm was sampled at 1, 10 and 20 m to determine the number of microorganisms present in the biofilm. The efficacy of the different flushing regimes in this pipe-loop system was compared by calculating a mass balance. It was assumed that before flushing all microorganisms were present in the water and absent from the biofilm. Numbers found in the water and biofilm samples 24 h after flushing were extrapolated to the entire water phase and pipe wall. From these numbers the log removal of the microorganisms was calculated.
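The mass-balance calculation described here can be sketched as follows; the organism numbers below are invented placeholders, not measured values from the experiments:

```python
import math

def log_removal(n_before, n_after):
    """Log10 removal, given total organism numbers before and after
    flushing (water plus biofilm, extrapolated to the whole system)."""
    return math.log10(n_before / n_after)

n_dosed = 1e8            # assumed to start entirely in the water phase
n_water_after = 2e4      # counts extrapolated to the entire water phase
n_biofilm_after = 8e4    # counts extrapolated to the entire pipe wall

print(round(log_removal(n_dosed, n_water_after + n_biofilm_after), 2))  # 3.0
```

Summing the water and biofilm compartments before taking the log is what makes this a mass balance: organisms merely relocated from water to biofilm do not count as removed.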
To determine the efficacy of flushing and chlorination in removing microorganisms from water and biofilm (experiments B1–B5), the system in Figure 1b was used. A large number of removable pipe segments were included, with ball valves on both sides. Individual segments could be taken out for analyses and replaced by a new segment, without loss of water or the need to drain the system. A small circulation reservoir, circulation pump and flow meter were included in the system. During the entire experiment the water was circulated at 0.1 m/s (the average flow velocity in the Dutch distribution system) using the circulation pump. For all experiments a biofilm was cultured. Microorganisms were added to the water and circulated for 24 h before the system was flushed according to Table 2. Only the pipes between the flushing pump and the drain to the sewage system were flushed. The parts necessary for circulation (pump, reservoir, flow meter) were blocked by closed valves. After flushing, samples were taken and the water in the entire system was circulated for 30 min to ensure mixing of microorganisms throughout the water phase. Shock chlorination was started by adding concentrated sodium hypochlorite to the circulation reservoir up to a concentration of 10 mg/L in the water of the entire system. The free chlorine concentration was measured after 20 min and after 1, 3, 6 and 24 h (Hach, LCK310). During the flushing and chlorination procedures the water and biofilm were sampled at different time points, depending on the experiment: after 3, 6, 10 and 15 flushing volumes (the volume of the pipe segment that is flushed; i.e., 3 flushing volumes means that the volume of the pipe segment was replaced three times) and after 1, 6 and 24 h contact time with chlorine.
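For orientation, the flushing time implied by a given number of flushing volumes follows directly from the flushed pipe length and the flushing velocity: each volume replacement takes length/velocity seconds. A sketch using the pipe-loop dimensions given above (20 m, 55 mm inner diameter) and, as an assumed example velocity, the 0.8 m/s threshold from the literature cited earlier:

```python
import math

def pipe_volume_l(pipe_length_m, inner_diameter_m):
    """Water volume of a cylindrical pipe, in litres."""
    return math.pi * (inner_diameter_m / 2) ** 2 * pipe_length_m * 1000

def flush_time_s(n_volumes, pipe_length_m, velocity_m_s):
    """Seconds needed to replace the pipe volume n_volumes times."""
    return n_volumes * pipe_length_m / velocity_m_s

print(round(pipe_volume_l(20, 0.055), 1))  # 47.5 L for the 20 m loop
print(flush_time_s(3, 20, 0.8))            # 75.0 s for 3 flushing volumes
print(flush_time_s(15, 20, 0.8))           # 375.0 s for 15 flushing volumes
```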
To test the efficacy of flushing and chlorination on real-world pipes (PVC) with a natural biofilm (experiments C1–C3), the system of Figure 1b was adapted so that pipes excavated from a distribution system could be incorporated in the pipe-loop system. The pipes were obtained from different Dutch drinking water companies and were 35, 51 and 27 years old. Five meters of pipe, in segments of 50 cm, were excavated under hygienic procedures. The segments were closed with a cap, packaged in plastic bags, transported at 4 °C to the laboratory and incorporated within 6 h in the pilot distribution system. Before incorporation, the sawdust was gently removed and the segments were incorporated with the same orientation (flow direction, and top and bottom side of the pipe) as in the distribution system. Only the incorporated part was flushed (Figure 1c). No biofilm was grown in the remaining 15 m of pipe that was not flushed, to ensure that only the effect of a natural biofilm was monitored. After incorporation of the real-world pipe segments, the system was circulated for 4 days with location-specific drinking water. After 4 days the water was refreshed, microorganisms were added and the water was circulated for 24 h at 0.1 m/s. Flushing and chlorination were performed as described above (Table 3