From Exit to Entry: Long-term Survival and Transmission of Salmonella

Salmonella spp. are a leading cause of human infectious disease worldwide and pose a serious health concern. While we have an improving understanding of pathogenesis and the host-pathogen interactions underlying the infection process, comparatively little is known about the survival of pathogenic Salmonella outside their hosts. This review focuses on three areas: (1) in vitro evidence that Salmonella spp. can survive for long periods of time under harsh conditions; (2) observations and conclusions about Salmonella persistence obtained from human outbreaks; and (3) new information revealed by genomic- and population-based studies of Salmonella and related enteric pathogens. We highlight the mechanisms of Salmonella persistence and transmission as an essential part of their lifecycle and a prerequisite for their evolutionary success as human pathogens.


Introduction
The evolutionary success of bacterial pathogens depends on their ability to colonize and cause disease in susceptible hosts. Equally important is how effectively these pathogens are transmitted between hosts. For human-specific or human-adapted pathogens, such as Helicobacter and Neisseria species, it is assumed that life outside the host represents only a small part of their lifecycle.

S. enterica subsp. enterica serovars can also be described as host-generalist, host-adapted or host-restricted [14]. These categories have major implications for the transmission characteristics of each isolate, which will be described later. Host-adapted or -restricted serovars have evolved strategies for persisting inside the host and evading immune defenses. Salmonella ser. Typhi, for example, disseminates from the gastrointestinal tract to the reticuloendothelial system, where it can colonize the surface of gallstones [15]. Approximately 1-6% of patients who have been infected with Salmonella ser. Typhi become chronic, asymptomatic carriers [16][17][18]. In contrast, pathogenesis of host-generalist serovars usually leads to gastroenteritis, and infected patients shed Salmonella for a relatively short period of time. There have been instances where shedding occurs after recovery, but only at low levels [19]. The lifecycle of host-generalist, non-typhoidal Salmonella (NTS) strains has a greater dependency on survival in the environment, presumably due to their reduced long-term shedding capacity.

In vitro Evidence that Salmonella spp. Can Survive for Long Periods of Time under Harsh Conditions
Salmonella spp. are known to survive in non-host environments [20], but the mechanisms of persistence are not well understood. For example, the well-characterized acid tolerance response [21] is usually presumed to be a pathogenesis adaptation to ensure smooth passage of Salmonella through the mammalian stomach. From analysis of Salmonella persistence in poultry houses and other food processing environments, the idea took hold that vectors (e.g., rodents, insects) represent a main environmental reservoir of Salmonella spp. [22,23]. More recently, evidence has accumulated that biofilm formation, a multicellular behavior, may enable Salmonella spp. to survive long-term in the environment without requiring an animal reservoir.
Fimbriae (or pili) have long been thought to play a central role in the interactions between bacterial pathogens and their hosts. Genome sequencing revealed that S. enterica isolates can possess at least 15 different fimbrial types [24]. Since most fimbrial operons had a scattered distribution within the S. enterica serovars [24,25], it was assumed that different fimbrial types were required for colonization of different hosts. Curli (or thin aggregative fimbriae) were distinct in that their subunit genes were detected throughout S. enterica subsp. enterica (i.e., 603 of 604 isolates, representing 95 serovars) [26] and even E. coli [25]. Biochemical characterization of curli fibers showed they are resistant to boiling, bases, detergents and proteolytic digestion [27]. The presence of these incredibly resistant structures on the cell surface was hypothesized to be a potential survival advantage for Salmonella during passage through the mammalian stomach into the small intestine [27]. The conservation of curli throughout S. enterica indicated that these organelles have an important evolutionary role in the Salmonella lifecycle.
Curli production was associated with cell-cell aggregation and the formation of adhesive colonies by both S. enterica and E. coli isolates [28,29]. Ute Romling and colleagues [30] termed this phenotype the "rdar morphotype" for red, dry, and rough colonies formed by Salmonella ser. Typhimurium on nutrient-limited laboratory media containing the indicator dye Congo red. Romling et al. also demonstrated that the curli genes (csgDEFG and csgBAC) were functionally interchangeable between Salmonella and E. coli [31]. Further characterization of the rdar morphotype led to the discovery that cellulose was an integral part of the extracellular matrix, tightly linked to curli on the cell surface [32], and responsible for "long-range" interactions within rdar colonies [33]. This allows the entire colony to be lifted off the agar surface in one piece [30]. The chemical resistance and strength of cellulose and curli suggests that they may function as an inert matrix or scaffolding that holds cells together. Additional components of the rdar matrix have since been discovered: an O-antigen capsule [34] and additional polysaccharides [34,35]; as well as a large, cell surface protein termed BapA that contains repetitive stretches of amino acids that are presumed to be involved in aggregation [36]. The subsequent discovery that curli fibers represent a "functional" amyloid [37,38] and that amyloid fimbrial structures can be found in diverse natural biofilms [39] is suggestive that the rdar morphotype represents a biofilm-like state for Salmonella.
The resistance properties conferred by the rdar morphotype suggest that this physiology may have a role in long-term survival. Rdar morphotype cells have shown increased resistance to hydrogen peroxide and acid [40], sodium hypochlorite [41][42][43] and various disinfecting agents [44,45], as well as an increased ability to stick to abiotic surfaces [46,47]. We performed some of the first in vitro experiments to compare survival of rdar-forming (rdar+) Salmonella ser. Typhimurium to isogenic mutants lacking different extracellular components [43]. Rdar+ cells survived significantly better than mutants under desiccation and starvation conditions [43], and the O-antigen capsule appeared to be critically important for this survival [34]. After 14 months of storage on plastic, cell numbers in rdar colonies were at 2-5% of starting CFU levels [43], and cells remained at this level even after 30 months [48], suggesting that survival could continue indefinitely. These 30-month-old rdar+ cells were still able to cause infection in mice [48]. The ability of rdar+ cells to persist and remain pathogenic in this physiological state after such a long period of time was unexpected. This strongly suggested that S. enterica isolates do not need an animal reservoir to survive long-term in the environment. One consistent finding to date is that the rdar morphotype has not been associated with increased virulence [42,49]. Decreased invasion of epithelial cell lines was recently reported for another biofilm-like state of serovar Typhimurium [50], suggesting this may be a common theme. However, survival and persistence are just as important as pathogenic ability if S. enterica isolates have an extended phase of life outside their hosts.
The evidence accumulated so far indicates that the rdar morphotype represents a conserved survival strategy for S. enterica. Metabolomic and transcriptional analysis has shown that Salmonella ser. Typhimurium rdar+ cells up-regulate several well-known stress resistance pathways, such as reactive oxygen species defense, osmoprotection and nutrient acquisition [51]. The changes in metabolism required for extracellular matrix production were synchronized with up-regulation of the resistance adaptations, suggesting that there is a coordinated shift in physiology as cells enter this state [51]. Grantcharova et al. [52] analyzed three different Salmonella biofilm models related to the rdar morphotype and observed that there was always a balance between multicellular aggregates (i.e., rdar morphotype) and planktonic cells. The curli-associated transcriptional regulator CsgD was found to act as a bistable control switch between these two cell populations [52]. Interestingly, bistable switches often play key roles in high-investment processes, such as cellular differentiation, in which only the end-result of the process is functional [53]. In the case of S. enterica, this type of control strategy would maintain the developmental potential of cell populations and maximize the chances for survival in many natural environments.

Does Salmonella Enter a Viable but Non-culturable State?
The long-term survival of Salmonella ser. Typhimurium cells in rdar morphotype colonies suggested that the cells could be in a metabolically dormant state, perhaps similar to persister cells within biofilms [54]. After 30 months of storage on plastic, rdar morphotype cells were hypersensitive to bile salts [48], indicating that the cell membranes may be damaged [55]. However, while the total CFU count was <5% of the starting value after 30 months, greater than 50% of the cells were scored as alive after staining with a Live/Dead bacterial viability kit (Invitrogen, Carlsbad, CA, USA, cat. no. L-7012) (Figure 1). The physiological state of these cells was not further characterized, but it is possible that they represent viable but non-culturable (VBNC) cells. If so, this would imply that long-term survival of Salmonella was underestimated in our in vitro experiments [43,48].

Figure 1. Viability of rdar morphotype colonies compared to liquid cultures. Dots represent the total CFU from 1 mL aliquots of cells grown for 18 h in Luria broth at 37 °C (overnight) or rdar morphotype colonies grown on 1% tryptone agar for two days at 28 °C. To ensure consistency, overnight cultures were normalized to an optical density of 1.0 at 600 nm prior to removing aliquots. The "overnight lyophilized" samples represent 1 mL aliquots of cells that were frozen and lyophilized for 48 h prior to measurement. Rdar morphotype colonies were stored for two weeks or 30 months, one colony per well, in a plastic, 24-well tissue culture plate [43] prior to measurement. Each grey dot represents the average CFU value from at least four biological replicates. Cell samples were stained using a Live/Dead bacterial viability kit (Invitrogen, Carlsbad, CA, USA, cat. no. L-7012) and enumerated by manual scanning on a fluorescence microscope; n represents the total number of cells counted from two to three biological replicates of each sample type. The bars reflect the percentage of cells scored as live (green) or dead (magenta), with a combined total of 100%.
The pathological significance of the VBNC phenotype in the S. enterica lifestyle is uncertain. Colwell et al. [56] first proposed the idea of VBNC Salmonella in 1984 after monitoring the status of Salmonella ser. Enteritidis cells suspended in river water. These cells became non-culturable within as little as 48 hours and could be resuscitated by the addition of nutrients. It is thought that the VBNC state may represent either the crippling effect of extreme stress or a regulated Salmonella survival mechanism. Since Salmonella is a foodborne pathogen, there is considerable public health concern over whether VBNC cells retain growth or pathogenic potential once introduced into new surroundings. Salmonella VBNC cells can be resuscitated [57][58][59][60], suggesting that they could potentially grow and re-infect. However, in several experimental models, Salmonella VBNC cells were unable to cause infections in chickens or mice, indicating that the cells failed to resuscitate during passage through the gastrointestinal tract [61][62][63]. Nevertheless, since VBNC cells are proposed to play a role in transmission and survival of other enteric pathogens, such as Vibrio cholerae [64], they could play a similar role for S. enterica.

Salmonella in the Environment: Lessons from Outbreaks
Salmonella serovars cause an estimated 1.2 million illnesses annually in the U.S. and are the most common cause of hospitalization and death among foodborne pathogens tracked by the Foodborne Diseases Active Surveillance Network (FoodNet) [65]. Despite the efforts of highly developed regulatory bodies like the Food and Drug Administration (FDA) and the United States Department of Agriculture (USDA), the incidence of Salmonella was 3% higher in 2010 than in 1996-1998. This is in contrast to other foodborne pathogens such as EHEC, Campylobacter, Listeria, Shigella and Yersinia spp., whose occurrences have decreased by 44%, 27%, 38%, 57% and 52%, respectively [65]. The variety of foods from which Salmonella can be isolated is a testament to its widespread presence in the food supply chain. Salmonella outbreaks have been linked to contaminated meat, poultry, eggs, unpasteurized dairy products, tomatoes, sprouts, melons, lettuce, mangoes, chocolate, powdered infant formula, raw almonds, dry seasonings, cereals and peanut butter. Outbreaks of S. enterica associated with these food vehicles involve host-generalist, NTS serovars, as opposed to the human-adapted serovar Typhi. In 2009, of 6,371 isolates serotyped by the U.S. Centers for Disease Control and Prevention and state health departments, the five most common S. enterica serovars were Enteritidis (19.2%), Typhimurium (16.1%), Newport (12.1%), Javiana (8.5%), and Heidelberg (3.6%) [66].
A historical analysis of Salmonella outbreaks highlights the adaptability of this pathogen to a variety of different food processing environments. Environments such as pond water, the inside of a tomato fruit, stainless steel factory surfaces and the interior of low-moisture foods are so different that one may wonder how Salmonella is able to persist in such diverse settings. Table 1 describes selected outbreaks as far back as 1970 and was compiled with the objective of illustrating themes in Salmonella adaptability.

Tomato-related Salmonella Outbreaks
From 1990-2012, the total number of reported cases of Salmonella in the United States involving tomatoes as a food vehicle was 2,059. This statistic likely vastly underestimates the impact of tomato-related Salmonella infections, as it is suggested that only one of every 38 cases is reported to public health authorities [13]. Several epidemiological studies have tried to pinpoint the source of contamination for these outbreaks. Table 1 shows that investigations of multi-state outbreaks often lead investigators to packing houses or to the produce fields that supply them. However, direct isolation of Salmonella from the production environment rarely occurs. For example, of all the tomato-associated outbreaks listed in Table 1, in only one instance were investigators successful in obtaining the isolate responsible [69]. Nevertheless, investigators are increasingly able to use epidemiological evidence as a primary method for identifying the source(s) of contamination. In many tomato-associated Salmonella outbreaks, investigators concluded that Salmonella had been introduced to plants through contaminated water. Investigation of outbreaks in 1990 and 1993 (Table 1) suggested that improper monitoring of chlorination of a water bath at a packing plant led to the cross-contamination of tomatoes [67]. In an outbreak in 2005, Salmonella ser. Newport was isolated from an irrigation pond next to the produce field [69]. This same strain of serovar Newport was also responsible for a large outbreak in 2002, suggesting persistent contamination of the water source. In another study investigating tomato-producing farms in 2009-2010, local groundwater, irrigation pond water, pond sediment, irrigation ditch water, rhizosphere and irrigation ditch soil, leaves, tomatoes, harvest bins and sanitary facilities were tested for Salmonella [98]. Twenty-nine percent of farms were positive for Salmonella, with the conclusion being that irrigation water and soil led to pre-harvest contamination of tomatoes [98].
Salmonella has been isolated from streams and rivers where farmers obtain their irrigation water, usually in association with fecal contamination [99][100][101][102]. Santo Domingo et al. [103] inoculated four strains of Salmonella ser. Typhimurium into water samples collected from the Great Miami River, which is an irrigation source for farmers in this region. As measured by culturability, Salmonella levels dropped from 10⁸ CFU/mL to 10⁴ CFU/mL after 45 days, whereas by direct microscope counts, the cell levels remained relatively constant. This discrepancy suggested that a large percentage of cells in the population were dead; however, flow cytometry experiments using a live/dead stain demonstrated that most cells were viable [103]. It was suggested that these cells could represent a VBNC population. Salmonella may also persist in aqueous systems in the form of biofilms. Recently, Salmonella was isolated from natural biofilms in Spring Lake, San Marcos, Texas [104]. The authors of this study hypothesized that biofilms could facilitate the long-term persistence of Salmonella and allow for eventual transfer into the food chain when waters were tapped for irrigation.
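The river-water culturability data above can be translated into a rough die-off rate. The following is an illustrative sketch, not a calculation from the cited study, and it assumes a simple log-linear decline in culturable counts, which the accompanying viability measurements argue is only part of the picture:

```python
import math

# Illustrative calculation (not from the cited study): convert the reported
# drop in culturable Salmonella, 10^8 -> 10^4 CFU/mL over 45 days in river
# water, into a decimal reduction time, assuming log-linear die-off.
n0 = 1e8          # starting culturable count (CFU/mL)
nt = 1e4          # culturable count after 45 days (CFU/mL)
t_days = 45.0

log_reduction = math.log10(n0 / nt)   # total log10 decline in culturability
d_value = t_days / log_reduction      # days per 1-log10 decline ("D-value")
print(f"{log_reduction:.0f}-log reduction; D = {d_value:.2f} days")
```

By this measure, culturable cells declined by one log roughly every 11 days, even though microscopy and flow cytometry counts stayed nearly constant, which is precisely the discrepancy that motivates the VBNC interpretation.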
Several studies have focused on contamination of tomato plants from Salmonella-containing irrigation water. Hintz et al. [105] explored the potential routes of tomato plant contamination using a clinical isolate of Salmonella ser. Newport. From 92 tomato plants irrigated with contaminated water, 25 were confirmed positive for serovar Newport. Sixty-five percent of the positive samples were contaminated in the roots, while the remainder of samples had Salmonella in the stems, leaves and fruits. Tomato fruit contamination was present but only accounted for 6% of the total contamination. High levels of root contamination suggest that this represents an important entry route for Salmonella into tomato plants. Guo et al. [106] performed a root-based invasion assay using five different Salmonella serovars involved in produce-related outbreaks. Tomato plants were grown hydroponically and exposed to a nutrient solution containing 10⁴-10⁵ CFU Salmonella/mL. Within one day of exposure, there were ~10³ CFU/g in the stems and seed leaves of germinating tomato seedlings [106]. In another study, Salmonella ser. Montevideo was internalized into tomato plants from contaminated irrigation water, but the fruit did not show internalization [107]. Gu et al. [108] used confocal microscopy to monitor the spread of Salmonella in tomato plants after internalization. In parts of the plant that were directly inoculated, Salmonella was observed in both vascular components of the plant, the phloem and xylem. In adjacent parts of the plant that were not directly inoculated, Salmonella was only observed in the phloem, suggesting that colonization of other parts of the tomato plant occurs via the phloem. One caveat for many of these Salmonella internalization studies is that high inocula (i.e., 10⁷-10⁸ CFU/mL) were often used, which is likely unrealistic in nature.
The probability of Salmonella internalization into the tomato fruit is far greater at the harvest and post-harvest stages. In these stages, the tomato surface is more likely to be damaged, allowing Salmonella that has contaminated the surface access to the inside of the tomato. Packing plants often have communal water baths where tomatoes are washed and disinfected. If these baths contain cold water contaminated with Salmonella, the warmer fruit takes up water and can easily become contaminated [109]. To reduce the chances of internalization, current FDA guidelines recommend maintaining water temperature at least 10 °F warmer than pulp temperature (U.S. FDA). It has also been observed that Salmonella cell numbers within chopped tomatoes can increase by 1.5 to 2.5 log units after only 24 hours of storage [109].
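The reported 1.5-2.5 log-unit increase in chopped tomatoes can be expressed as a fold change and an implied mean generation time. This is a back-of-the-envelope sketch of ours, assuming constant exponential growth over the full 24 h, which is a simplification of real growth kinetics:

```python
import math

# Hypothetical conversion of the reported 1.5-2.5 log10 increase of
# Salmonella in chopped tomatoes over 24 h into fold change and the
# implied mean generation time (assumes constant exponential growth).
for log_increase in (1.5, 2.5):
    fold = 10 ** log_increase                   # fold increase in cell numbers
    doublings = log_increase / math.log10(2)    # log2 of the fold change
    generation_time_h = 24.0 / doublings        # mean time per doubling
    print(f"{log_increase} logs -> {fold:.0f}-fold increase, "
          f"~{generation_time_h:.1f} h per doubling")
```

Even the lower estimate corresponds to a roughly 30-fold increase, underscoring how quickly cut produce can amplify a small initial contamination.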
In summary, analysis of tomato-associated S. enterica outbreaks indicates that the bacterium can survive for long periods of time in the environment, whether in water or on the surfaces of plants. Furthermore, different NTS serovars may gain entry to the tomato fruit during processing and have the capacity to reach high cell numbers. In addition to the larger outbreaks we have described, presumably there are also many sporadic cases of Salmonella infection associated with ingestion of contaminated tomatoes.

Sprout-related Salmonella Outbreaks
During the past 15-20 years, over twenty outbreaks of Salmonella gastroenteritis have occurred due to the ingestion of contaminated alfalfa and clover sprouts (the smallest outbreak we make reference to in Table 1 had 18 confirmed cases of human infection). Epidemiological analysis has shown that contaminated seeds are the major source of contamination in most sprout-related outbreaks. In 1995, an outbreak of Salmonella ser. Stanley affecting people in the US and Finland was traced back to contaminated alfalfa grown by nine different sprout growers who were supplied by a single Dutch seed shipper [75]. In late 1995 and 1996, an outbreak of Salmonella ser. Newport affecting people in Oregon and British Columbia was traced back to contaminated seeds supplied by a different Dutch shipping company [76]. In a 1997 outbreak, alfalfa seeds that tested positive for Salmonella ser. Anatum were received from local farms in Kansas and Missouri [78]. In a 1998 outbreak of serovars Havana and Cubana, S. enterica strains isolated from seeds had pulsed-field gel electrophoresis patterns identical to those isolated from infected patients [110]. The fact that alfalfa seeds are often the source of the outbreak suggests that Salmonella is contaminating alfalfa seed in the farm fields or storage facilities. In 1999, the National Advisory Committee on Microbiological Criteria for Foods released recommendations to the sprouting industry to reduce the risk of contamination of alfalfa sprouts [111]. Recommendations for seed production suggested that growers should evaluate their sources of irrigation and monitor the presence of animal production facilities that could inadvertently expose alfalfa crops to contaminated manure. In addition, the committee concluded that good seed cleaning, storage and handling practices were necessary throughout seed production and sprouting in order to reduce cross-contamination [111].
Since almost all sprout outbreaks lead back to contaminated seed lots, research has centered on the ability of S. enterica to survive on seeds and resist disinfection. Successful decontamination must inactivate Salmonella but preserve seed viability. In a study designed to evaluate various chemical treatments for their effectiveness in killing Salmonella on alfalfa seeds, contaminated seeds were immersed in solutions containing 20,000 ppm chlorine, 5% trisodium phosphate, 8% hydrogen peroxide, 1% calcium hydroxide, 1% calcinated calcium, 5% lactic acid or 5% citric acid for ten minutes [112]. Several treatments caused reductions of Salmonella populations of up to 10³ CFU/g when analyzed by direct plating; however, no treatment was able to eliminate the pathogen [112]. During a 1999 Salmonella ser. Muenchen outbreak investigation, an implicated sprout grower signed an affidavit stating that seeds were disinfected with a 20,000-ppm chlorine solution for 15 minutes prior to germination. Despite this FDA-recommended disinfection step, there were at least 157 cases of Salmonella gastroenteritis resulting from eating these sprouts [80]. One possible explanation is that Salmonella may be protected from lethal concentrations of chlorine by lodging within the rough features on the seed surface [113].
The dynamics of Salmonella replication and growth during the commercial sprouting process are not clearly understood. In an attempt to understand how Salmonella survives and grows throughout the sprouting process, Jacquette et al. [114] inoculated alfalfa seeds with a Salmonella ser. Stanley strain that was isolated from the 1995 sprout outbreak. After a period of soaking, germination, sprouting and refrigeration, Salmonella levels increased from ~10³ CFU/g to 10⁷ CFU/g. Disinfection procedures were able to reduce CFU numbers, but elimination could not be reliably achieved [114]. This study demonstrated that S. enterica can multiply to high cell numbers on alfalfa seeds despite standard disinfection and handling precautions. In another study, Salmonella ser. Eimsbuettel and ser. Poona inoculated on alfalfa seeds increased by 3-4 logs during sprouting [115]. Dong et al. [116] found that three S. enterica strains isolated from previous alfalfa sprout outbreaks were able to survive on the seed surface and internalize into alfalfa sprouts during the germination process. It is possible that the attachment, resistance and survival of Salmonella on alfalfa sprouts may be due in part to the rdar morphotype. Barak et al. [117] created a transposon mutagenesis library in a Salmonella ser. Newport strain isolated from an outbreak associated with contaminated alfalfa seeds. The transposon library was screened to find mutants defective in attachment to alfalfa sprouts. Loss of expression of genes involved in the rdar morphotype (i.e., curli and cellulose production) caused a reduction in binding of the serovar Newport isolate to alfalfa sprouts [117].

In summary, alfalfa-related Salmonella outbreaks are primarily caused by NTS serovars present on contaminated seeds. These strains can survive on alfalfa seeds for protracted periods of time and resist chemical stresses. During the sprouting process, S. enterica can internalize into sprouts and multiply to reach high cell numbers.

Salmonella Outbreaks Associated with Processed Foods
Over the last 20 years, the food vehicle of many Salmonella gastroenteritis outbreaks has been low-moisture foods such as dry cereal, peanut butter, spray-dried milk, infant formula, nuts and dry seasonings. The low water activity of these foods does not typically support the growth of pathogens. However, Salmonella can survive for long periods of time in low-moisture products, and ingestion of fewer than 10³ S. enterica cells can still lead to illness [92]. Investigations of outbreaks related to low-moisture foods illustrate themes of Salmonella's ability to persist in the food-processing environment. For a comprehensive review of Salmonella survival in low-moisture foods, see [118]. For the purpose of this review, we wish only to highlight outbreaks where S. enterica strains are exceptional in their ability to cross-contaminate, resist, and survive both in this environment and in the foods themselves.
One of the pre-requisites for Salmonella to persist in food-processing environments and cause cross-contamination is the ability of cells to attach to surfaces and survive there. Chia et al. [119] found that S. enterica strains isolated from food-processing environments could attach to a variety of surfaces that are commonly used in food processing, such as stainless steel, Teflon, glass, rubber and polyurethane. Because Salmonella can attach to all of these surfaces, cells can easily be transferred from one surface to another. A study looking at cross-contamination of surfaces in oil meal plants found S. enterica isolates on the processing floor, in dust, on the gloves and boots of operators and on tools [120]. To obtain data on cross-contamination, investigators disinfected the boots of plant workers before they began their normal operations for the day. Within one day of disinfection, all workers' boots tested positive for Salmonella. Ultimately, investigators had to recommend restricting the movement of workers from one area of the building to others [120]. In 1998, an outbreak of 209 reported cases of salmonellosis was associated with ingestion of toasted oat cereal [88]. FDA officials tested potential areas of cross-contamination and found low levels of Salmonella throughout the entire processing plant. It was concluded that equipment, air-handling systems and traffic flow had cross-contaminated the plant [88].
Cross-contamination by S. enterica serovars is not limited to highly processed foods. Salmonella contamination in the poultry industry (not reviewed here) is also a well-known problem. Marin et al. [121] identified the potential risk factors for Salmonella contamination in 44 broiler and 51 layer farms and determined the biofilm-forming capacity of the strains that were isolated. Of the broiler houses tested, 41.3% were contaminated with Salmonella, and approximately 50% of strains isolated were able to produce a biofilm. The most important risk factors for Salmonella contamination were determined to be dust, environmental surfaces (i.e., wall crevices, floor joints) and chicken feces. Rodents, flies and beetles also played an important role in the recirculation of Salmonella in laying hen houses because they were able to taint the feed and house surfaces. Finally, the use of glutaraldehyde (50% vol/vol), formaldehyde (37% vol/vol) and hydrogen peroxide (35% vol/vol) at a concentration of 1.0% in field conditions was found to be inadequate for Salmonella elimination, irrespective of the serotype, the biofilm development capacity and the disinfectant contact time [121].
Since Salmonella spp. are known to form the rdar morphotype under low-moisture conditions, and since this morphotype confers the ability to attach, resist and survive, it may be an important adaptation for Salmonella in the factory environment. A correlation between biofilm capacity and persistence in factory environments has been reported. One hundred eleven strains of serovars Agona, Montevideo, Senftenberg and Typhimurium were isolated from feed and fish meal factories, and several of these strains had persisted for at least three years [122]. Vestby et al. [123] hypothesized that several of these clones would be relatively strong biofilm producers because of their persistence in factories. When compared to serovar Typhimurium, which is rarely isolated from factories but known to be endemic in local avian wildlife [124], serovars Agona and Montevideo produced 423% and 390% more biofilm [123].
In many studies, Salmonella strains isolated from the environment have been found to produce biofilms. Solomon et al. [125] analyzed a collection of 71 strains isolated from clinical, produce and meat sources. Curli fimbriae were produced by 100% of clinical and meat isolates, and 80% of produce isolates. Cellulose was expressed in clinical (73%), meat (84%) and produce (52%) isolates. In another study, a total of 122 Salmonella strains were isolated from humans, animals or food, and all strains were found to produce biofilm [126]. Patel and Sharma [127] tested biofilm formation by five S. enterica isolates that were associated with produce outbreaks: serovars Thompson 2051H, Tennessee 2053N, Negev 26 H, Braenderup and Newport. All formed biofilms, with ser. Tennessee and ser. Thompson forming the greatest amounts.
In addition to biofilm formation, S. enterica isolates have the ability to resist desiccation and heat stress. These traits are well-demonstrated in studies of Salmonella outbreaks resulting from contaminated chocolate. Outbreaks associated with chocolate first appeared in the 1970s; a Salmonella ser. Durham epidemic linked to contaminated cocoa caused infections in 110 people [91]. A few years later, an outbreak occurred in North America involving chocolate containing Salmonella ser. Eastbourne; 217 people were infected [92,93]. An outbreak in the UK in 1982 that infected 245 people was traced to chocolate bars contaminated with Salmonella ser. Napoli [94]. In Norway, more than 300 people were affected after consuming chocolate contaminated with Salmonella ser. Typhimurium [96]. In many of these outbreaks, cocoa beans or cocoa powder were suspected to be contaminated with Salmonella prior to their use in chocolate production. Under these conditions, Salmonella must survive desiccation, endure heat treatments during chocolate processing and persist in chocolate on store shelves for long periods of time. Most bacteria capable of causing foodborne illness do not grow below a water activity of 0.85 [128]; however, Salmonella can survive in chocolate, which has a water activity of 0.4-0.5 [129]. S. enterica cells surviving in low-water activity foods are more tolerant to heat processing. During processing, chocolate is heated to 70-80 °C for 8-24 hours, but Salmonella is not destroyed [130]. Salmonella also can survive for long periods of time in processed chocolate products. Tamminga et al. [129] reported recovery of S. enterica from chocolate after nine months of storage. In a later study, these authors were able to recover Salmonella from chocolate after 19 months of storage [131].
The number of NTS cells needed to cause gastroenteritis is generally high [132], but in chocolate-related outbreaks, low cell numbers have caused infection. An average of 2.5 Salmonella ser. Eastbourne organisms per gram of chocolate was found in infected persons' homes in the 1973 outbreak [92]. Investigators concluded that no more than 1000 cells could have caused the infection, which was the number estimated to be in a one-pound bag of chocolate. However, the authors noted that technical difficulties in recovering Salmonella from chocolate might have led to an underestimate of the actual number of cells per gram. In the Salmonella ser. Nima outbreak of 1989 (Table 1), concentrations as low as 0.043 cells per gram were found in chocolate [95]. In the most recent Salmonella ser. Oranienburg outbreak, there were an estimated one to three cells per gram of chocolate [97]. Although the number of S. enterica cells per gram of chocolate is low, the chocolate matrix may protect Salmonella from the acidic conditions of the stomach, thereby increasing the number of viable cells that reach the intestine [133].
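The dose estimates above follow from simple arithmetic on the reported contamination levels. A minimal sketch (the contamination figures come from the cited outbreak reports; the gram conversions and serving size are standard assumptions, not values from the original investigations):

```python
# Back-of-envelope infectious dose estimates from chocolate outbreak data.
# Contamination levels are those reported in the cited investigations.

GRAMS_PER_POUND = 453.6  # assumed unit conversion

def total_cells(cells_per_gram: float, grams: float) -> float:
    """Estimated total Salmonella cells ingested from a serving."""
    return cells_per_gram * grams

# 1973 ser. Eastbourne outbreak: ~2.5 cells/g in a one-pound bag of chocolate
eastbourne = total_cells(2.5, GRAMS_PER_POUND)
print(f"Eastbourne, 1 lb bag: ~{eastbourne:.0f} cells")  # ~1134, consistent with
                                                         # the <=1000-cell estimate

# 1989 ser. Nima outbreak: 0.043 cells/g; a hypothetical 100 g bar
nima = total_cells(0.043, 100)
print(f"Nima, 100 g bar: ~{nima:.1f} cells")  # ~4.3 cells
```

Even a few cells per serving were evidently sufficient, underscoring how the food matrix can lower the effective infectious dose.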
Tomatoes, sprouts, and chocolate are just three examples among the myriad of food products and sources that have been implicated in outbreaks of Salmonella gastroenteritis. These examples illustrate the diverse adaptations of S. enterica subsp. enterica for persistence and survival, as well as the inherent difficulty of eradication, all factors that contribute to human infection.

Genomic- and Population-based Studies of Salmonella and Related Enteric Pathogens
Based on the multitude of serovar types that have been implicated in Salmonella outbreaks and sporadic infections, there is seemingly a large diversity of S. enterica isolates. However, on the basis of sequence identity, we know that most serovars are closely related. As stated above, S. enterica subsp. enterica serovars can be loosely grouped as host-generalist, with the ability to colonize and infect multiple animal species (e.g., Typhimurium, Enteritidis, Heidelberg); host-adapted, such as Choleraesuis in swine and Dublin in cattle; or host-restricted, such as serovars Typhi and Paratyphi in humans and Gallinarum in poultry. With the advent of genome sequencing and genome-based microarrays 10-15 years ago, it was assumed that the host-specificities of S. enterica serovars would be explained by the presence or absence of specific genes [134]. However, the distinction between serovars has turned out to be more complicated than previously thought.
Several genomic-based studies have begun to reveal more information about the population structure of S. enterica subsp. enterica. Using resequencing array technology, Didelot et al. [135] demonstrated that there are at least five different lineages within S. enterica; these lineages have also been classified as part of two larger phylogenetic clades [136]. Recombination between isolates of the same lineage was significantly greater than between lineages, suggesting that barriers between lineages might exist, possibly due to physical separation as a consequence of host adaptation or to sequence divergence [135]. Serovars Enteritidis and Typhimurium were the central members of two separate lineages and appeared to be monophyletic in origin. This suggests that most Enteritidis and Typhimurium isolates are clonal, with the main differences between isolates due to mutations; Salmonella ser. Typhimurium ST313 is a notable exception to this rule. Other serovars, such as Newport and Paratyphi B, are polyphyletic in origin, consisting of several distinct groupings within the serovar [137]. Isolates within these serovars show evidence of extensive recombination and often appear to have different origins, despite sharing the surface antigens detected by serotyping. Most S. enterica serovars are assumed to lie somewhere between these polyphyletic groups and the most highly clonal group, Salmonella ser. Typhi [138].
In general, the evolution of S. enterica serovars towards becoming host-adapted, and ultimately host-restricted, is characterized by an accumulation of pseudogenes (loss of gene function) [139], as opposed to physical loss of genes from the chromosome. Recently, there have been efforts to identify pseudogenes that are common between host-adapted and host-restricted serovars. Betancor et al. [14] compared the genomes of four serovar Dublin isolates with 29 serovar Enteritidis isolates and matched the identified pseudogenes to the published serovar Gallinarum sequence (strain 287/91). As expected, Dublin and Gallinarum each had approximately three times more pseudogenes than Enteritidis; however, only 21 pseudogenes common to Dublin and Gallinarum represented active genes in Enteritidis. Nine of these pseudogenes were also present in serovar Choleraesuis, and two were in common with serovar Typhi and Paratyphi A isolates. One of the common pseudogenes, shdA, has been implicated in intestinal persistence and fecal shedding of Salmonella ser. Typhimurium in the mouse model of infection [140,141]. One of the hallmarks of host-adaptation or -restriction is the gain of a systemic mode of infection, presumably at the expense of intestinal persistence [142]. Further research into the role of pseudogenes in host adaptation should yield valuable information about the S. enterica lifecycle.
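The comparative approach of Betancor et al. amounts to nested set operations over per-serovar pseudogene lists. A minimal sketch (gene names other than shdA are invented placeholders; real analyses work from full annotated genome comparisons):

```python
# Sketch: finding pseudogenes shared by host-adapted/-restricted serovars
# but still active in a generalist serovar. All gene names except shdA
# are hypothetical placeholders for illustration only.

pseudogenes = {
    "Dublin":       {"shdA", "geneA", "geneB", "geneC"},
    "Gallinarum":   {"shdA", "geneA", "geneD"},
    "Choleraesuis": {"shdA", "geneD"},
    "Enteritidis":  {"geneC"},  # generalist: most such genes remain intact
}

# Pseudogenes common to the two host-adapted/-restricted serovars...
shared = pseudogenes["Dublin"] & pseudogenes["Gallinarum"]
# ...that are NOT pseudogenes (i.e., still active) in Enteritidis
candidates = shared - pseudogenes["Enteritidis"]
print(sorted(candidates))  # ['geneA', 'shdA']

# Narrowing further: which candidates are also pseudogenes in Choleraesuis?
also_in_choleraesuis = candidates & pseudogenes["Choleraesuis"]
print(sorted(also_in_choleraesuis))  # ['shdA']
```

The same logic, applied to real annotations, produced the 21 Dublin/Gallinarum pseudogenes and the smaller subsets shared with Choleraesuis and the typhoidal serovars described above.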
The recent explosion of genome sequencing of different S. enterica serovars [137,[142][143][144][145][146][147] is moving towards defining a core genome of S. enterica subsp. enterica [148,149]. From the 73 subsp. enterica genomes available at the time of analysis, the core genome was determined to consist of 2882 genes, whose order and sequence were highly conserved [149]. Most S. enterica subsp. enterica isolates also have hotspots of unique genes, which occur in relatively similar positions within the genome; across the same 73 genomes, over 7000 unique genes were identified [149]. The majority of differences between serovars were due to the presence or absence of bacteriophages, plasmids or other mobile elements [143,144], fimbrial operons [24,136] and the loss of metabolic functions [136,144]. Analysis of the unique or highly variable genes within subsp. enterica did not yield any obvious phylogenetic information [149] but was informative when analyzed on a lineage-by-lineage basis [136,144]. The observed variation resulting from recombination and mutation has made Salmonella remarkably adaptable, expanding to fill a spectrum of new niches and responding to environmental challenges.
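In pangenome terms, the core genome described above is the intersection of gene sets across all genomes, while unique genes are those found in exactly one genome. A toy sketch of these definitions (gene and genome names are invented; real analyses such as [149] operate on ortholog clusters across all 73 genomes):

```python
# Toy illustration of core vs. unique genes across a set of genomes.
# Real pipelines cluster orthologs first; names here stand in for clusters.
from collections import Counter

genomes = {
    "Typhimurium": {"g1", "g2", "g3", "phage1"},
    "Enteritidis": {"g1", "g2", "g3", "plasmid1"},
    "Typhi":       {"g1", "g2", "g3", "phage2"},
}

# Core genome: genes present in every genome
core = set.intersection(*genomes.values())
print(sorted(core))  # ['g1', 'g2', 'g3']

# Unique genes: genes found in exactly one genome
counts = Counter(g for genes in genomes.values() for g in genes)
unique = {g for g, n in counts.items() if n == 1}
print(sorted(unique))  # ['phage1', 'phage2', 'plasmid1']
```

The toy data also mirrors the biological finding: the core is shared housekeeping content, while the unique fraction is dominated by mobile elements such as phages and plasmids.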
The genomic variation between different groups or lineages of S. enterica subsp. enterica could mean that the ecology, lifecycle and transmission characteristics are also different [136]. For example, serovars Typhi and Paratyphi A have lost the function of several genes relating to intestinal persistence and pathogenesis [139], in addition to numerous fimbrial operons that are used for attachment to host cells [24]. It is hypothesized that Typhi and Paratyphi A are similar due to convergent evolution that has occurred under selection pressure within the human host [150]. It is well known that Salmonella ser. Typhi has unique factors relating to host persistence, generating a carrier state, possibly through the colonization of gallstones [15]. It has also been speculated that because of a restricted host range, a long-term reservoir would be necessary for survival of serovar Typhi [1]. Genomic analysis of 19 Typhi isolates showed that evolution was dominated by genetic drift, rather than recombination or gene acquisition, which indicated that carriers were the primary sources of typhoid infections [138]. Transmission of serovar Typhi, then, would be expected to occur primarily through human-human contact and the fecal-oral route, and this is what has been observed [151]. Surprisingly, analysis of the host-restricted serovar Gallinarum, which causes avian typhoid, showed that Gallinarum and Typhi had many of the same patterns of gene loss [142]. This suggests that host adaptation within S. enterica subsp. enterica involves loss of the intestinal lifestyle, coupled with an ability to cause systemic infection. This niche specialization may also reduce the ability of host-adapted serovars to survive in the external environment [136,142].
There is an increasing correlation between curli expression, formation of the rdar morphotype and a host-generalist lifestyle. For the past 10-20 years, Salmonella serovars Typhimurium and Enteritidis have been the most common causes of human gastroenteritis worldwide [8]. As well as being able to infect many animal hosts [8], these serovars have retained respiration under anaerobic conditions, as well as a broad substrate spectrum for metabolism [144]; presumably, this allows them to have a flexible niche space. Solano et al. [42] found that 93% of ~200 natural serovar Enteritidis isolates were rdar-positive. Romling et al. [152] analyzed ~800 Enteritidis and Typhimurium isolates from patients, food and animals, and over 90% of isolates were rdar-positive. Interestingly, all rdar-negative Typhimurium isolates were members of var. Copenhagen, which causes an invasive disease in pigeons. Isolates from host-adapted or -restricted serovars Choleraesuis, Gallinarum and Typhi were all rdar-negative, with the exception of one Gallinarum isolate, strongly suggesting that host-adaptation was associated with loss of the rdar morphotype [152]. Solomon et al. [125] and our group [43] analyzed a wider diversity of subsp. enterica isolates, as well as other S. enterica subspecies and S. bongori from the SARC collection [153], and showed that the rdar morphotype is widely conserved throughout the Salmonella genus. As expected, we found that serovars Choleraesuis and Paratyphi (A, B or C) were almost entirely rdar-negative. For serovar Typhi, the first two isolates we tested were erroneously reported to be rdar-positive [43]; however, in further testing, no rdar-positive isolates were identified from >100 Typhi isolates analyzed (AP White, SL Stocki and KE Sanderson, unpublished). Both curli operons (csgBAC and csgDEFG) are present in serovar Typhi, so it is assumed that the rdar morphotype is shut off via regulatory mutations. We have also reported that S. enterica subsp. arizonae isolates were negative for the rdar morphotype [153]. Less is known about subsp. arizonae, but it is possible that these isolates are host-adapted to reptiles [154]. S. enterica subsp. arizonae isolates are known to have pseudogenes in the curli operons [24], as does serovar Paratyphi A [139].
Recent studies in E. coli show a similar trend between a host-generalist lifestyle and conservation of the rdar morphotype. We analyzed 284 human, livestock and environmental E. coli isolates and generated a phylogenetic tree based on comparisons of three conserved intergenic regions [3]. Isolates that were nearly identical to each other (phylogroup B1), and thus predicted to be host-generalist, were ~90% rdar-positive, whereas isolates with longer branch lengths (phylogroup B2 and part of group D) that were predicted to be host-adapted were only ~30% rdar-positive [3]. Meric et al. [155] performed a similar comparative analysis of E. coli isolates that were obtained from plants or from animal species. The plant-adapted isolates (primarily phylogroup B1) had a significantly higher prevalence of rdar morphotype formation as compared to the animal isolates (primarily group B2). In addition, similar to Salmonella, E. coli isolates that cause invasive disease (i.e., enteroinvasive E. coli and Shigella) have lost the ability to form the rdar morphotype [156]. Since the operons for curli and cellulose production were likely present in the common ancestor of Salmonella and E. coli [25,135,157], it makes sense that trends relating to rdar morphotype expression may have evolved similarly. Nevertheless, despite these observed trends, correlation does not necessarily equal causation, and there is no published evidence that the rdar morphotype is involved in transmission.
One of the main problems for understanding NTS transmission is that there are no good models to test the relative importance or function of Salmonella genes in the transmission process. A chronic infection model was recently developed for serovar Typhimurium in mice, designed to mimic the human carrier state of serovar Typhi [158]. Although it is difficult to extrapolate from mice to humans, use of this model will undoubtedly lead to the identification of factors that aid in Salmonella ser. Typhi persistence and possible transmission from the carrier state. Many research groups around the world are designing genome-wide screens to study the infectious process for both NTS and typhoid. It is important to consider that there will always be a significant percentage of Salmonella genes whose functions are not involved in infection but, rather, are needed for survival in external environments; these genes will be missed in standard infection screens. Most new S. enterica subsp. enterica genomes still have 20-30% of genes with unknown function. For example, 76% of genes identified by Fricke et al. [144] that were unique to a given serovar were annotated as hypothetical proteins, as compared to only 10% of the genes that were absent. Although it is difficult to study NTS transmission, a murine transmission model has recently been proposed for serovar Typhimurium [159]. In addition to expanding on current transmission models, there is also a need to establish what the real-world conditions are like for human transmission.

Commentary: Invasive NTS Isolates in Africa
There is a prevailing thought in North America that NTS isolates are not life-threatening pathogens, simply causing a self-limiting gastroenteritis that usually does not require hospitalization. In Africa, however, NTS isolates cause an invasive type of disease with a mortality rate as high as 25%, especially in young children and HIV-positive individuals [10,160]. These invasive NTS (iNTS) isolates represent a new variant group of serovar Typhimurium (ST313) that has many of the hallmarks of host-adaptation, with accumulation of pseudogenes in pathways that are also inactivated in serovars Typhi or Paratyphi A [9]. Genetic characterization of the iNTS pathogens helps in the design of effective treatment strategies [146]; however, the authors of that study stated that the largest gap in knowledge has to do with transmission. While the public health situation in the industrialized world cannot be compared to that in Africa, we feel that there is also a large gap in understanding the transmission process for more "typical" NTS isolates.

Conclusions
In writing this review, we felt that for the reader to gain a better perspective on Salmonella transmission, we needed to describe laboratory models of Salmonella survival in addition to presenting real-world examples of S. enterica outbreaks. It is apparent that even though the vehicles for transmission are often well-defined, there is still much to learn about where and how Salmonella persists. Salmonella survival adaptations like the rdar morphotype have been studied in the laboratory, but it is unknown how relevant these adaptations are to the persistence of Salmonella in a real outbreak. After reviewing outbreaks in several food vehicles, a pattern emerged whereby persistence of Salmonella in the food processing environment was due to increased resistance, ability to cross-contaminate, and long-term survival, characteristics that have been ascribed to the rdar morphotype. It is increasingly clear that the form of Salmonella in the environment is different from Salmonella in the host. To intervene effectively in industrial processes and ensure safe food handling as we move into the future, we need to understand the strategies S. enterica has evolved that enable it to transmit so efficiently. This is increasingly relevant with the globalization of our food supply and the use of centralized processing and storage facilities. In our view, more research should be aimed at identifying the genes involved in the transmission of S. enterica, which requires that better transmission models be developed. In addition, there is a continued need for increased epidemiological surveillance to identify reservoirs in the environment. Salmonella has adapted remarkably well to diverse environments and there will not be a simple solution for reducing the prevalence of Salmonella infections. However, increasing our knowledge about transmission can only help to minimize its worldwide impact.
to space constraints. Our research is supported by a Discovery Grant from the Natural Sciences and Engineering Research Council (NSERC) to APW and The Jarislowsky Chair in Biotechnology from the Stephen Jarislowsky Trust Fund, University of Saskatchewan and the Province of Saskatchewan. KDM is supported through an Alexander Graham Bell Canada Graduate Scholarship from NSERC.