Article

Active Participatory Regional Surveillance for Notifiable Swine Pathogens

1 Department of Veterinary Diagnostic and Production Animal Medicine, College of Veterinary Medicine, Iowa State University, Patterson Hall, 1800 Christensen Drive, Ames, IA 50011-1134, USA
2 Department of Statistics, College of Liberal Arts and Sciences, Iowa State University, Snedecor Hall, 2438 Osborn Drive, Ames, IA 50011-4009, USA
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Animals 2024, 14(2), 233; https://doi.org/10.3390/ani14020233
Submission received: 18 November 2023 / Revised: 30 December 2023 / Accepted: 5 January 2024 / Published: 11 January 2024
(This article belongs to the Special Issue Biosecuring Animal Populations)

Simple Summary

Effective, sustainable regional surveillance for the early detection of notifiable swine pathogens has been difficult to achieve. Regional surveillance based on clinical signs (syndromic surveillance) is neither diagnostically sensitive nor specific. Surveillance based on farm-by-farm testing is burdensome and costly. Borrowing the strengths of each approach, we evaluated an active participatory surveillance design in which regional status was determined by targeted sampling of 10 poor-doing pigs in each participating farm followed by screening in credentialed laboratories. The analysis showed that at 0.1% prevalence (18 infected farms among 17,521 farms) and a farm-level detection probability of 30%, active participatory surveillance would detect ≥ 1 positive farms with 67%, 90%, and 97% probability when producer participation was 20%, 40%, and 60%, respectively. Depending on the specimen collected (serum or swab sample) and test format (nucleic acid or antibody detection), the cost per round of sampling ranged from EUR 0.016 to EUR 0.032 (USD 0.017 to USD 0.034) per pig in the region. The techniques and technologies required for active participatory surveillance are widely available and in common use. Implementation would require coordination among producers, industry groups, and animal health authorities.

Abstract

We evaluated an active participatory design for the regional surveillance of notifiable swine pathogens based on testing 10 samples collected by farm personnel in each participating farm. To evaluate the performance of the design, public domain software was used to simulate the introduction and spread of a pathogen among 17,521 farms in a geographic region of 1,615,246 km2. Using the simulated pathogen spread data, the probability of detecting ≥ 1 positive farms in the region was estimated as a function of the percent of participating farms (20%, 40%, 60%, 80%, 100%), farm-level detection probability (10%, 20%, 30%, 40%, 50%), and regional farm-level prevalence. At 0.1% prevalence (18 positive farms among 17,521 farms) and a farm-level detection probability of 30%, the participatory surveillance design achieved 67%, 90%, and 97% probability of detecting ≥ 1 positive farms in the region when producer participation was 20%, 40%, and 60%, respectively. The cost analysis assumed that 10 individual pig samples per farm would be pooled into 2 samples (5 pigs each) for testing. Depending on the specimen collected (serum or swab sample) and test format (nucleic acid or antibody detection), the cost per round of sampling ranged from EUR 0.017 to EUR 0.032 (USD 0.017 to USD 0.034) per pig in the region. Thus, the analysis suggested that an active regional participatory surveillance design could achieve detection at low prevalence and at a sustainable cost.

1. Introduction

In this study, a swine farm is defined as a specific geographic location where a population of pigs under one management system is raised; a region is defined as a contiguous geographical area within which the farms under surveillance are located. Swine farms are diverse in size and structure, production type, housing, and management, but the trend over the last few decades has been toward fewer and larger farms. As an example, the number of U.S. farms with pigs declined from 168,450 in 1995 to 68,300 in 2012 [1] while the average farm inventory increased from 302 pigs to 1044 [2]. This period also saw the emergence of specialized swine farms and widespread adoption of the practice of moving young pigs from breeding-specific farms to feeding-specific farms. Thus, in 2019, Denmark, France, Germany, and Spain cumulatively imported 15.7 million and exported 20.5 million live pigs [3] and, in the U.S., 63.4 million live pigs were transported from one state to finishing farms in other states [4]. Overall, these changes have contributed to improved production efficiency but complicated disease control. Logically, the greater movement of animals, personnel, and material facilitates the spread of infectious agents. For example, porcine epidemic diarrhea virus spread to at least 12 states within 8 weeks of its initial detection in the U.S. [5,6].
Under these circumstances, the early detection of notifiable swine pathogens is essential but difficult. In Brazil (1978), an unrecognized outbreak of African swine fever virus (ASFV) in the index farm was followed by its spread to 11 states. Eradication took 8 years and cost ~USD 20 million [7]. In a 1997–1998 outbreak in the Netherlands, a retrospective analysis determined that classical swine fever virus (CSFV) was spreading in the country 5 to 7 weeks prior to its recognition [8]. Eradication was ultimately accomplished at a cost of ~USD 2.3 billion [9]. In the United Kingdom (2001), foot-and-mouth disease virus (FMDV) infections went unnoted and the virus was disseminated widely via the movement of infected animals. Eradication took 6 months, led to the euthanasia of 4 million animals, and cost ~USD 4.0 billion [10]. At present, the ASFV pandemic initiated in 2007 continues to expand despite the recognition that “an early detection system for ASF could facilitate early reporting and response (and limit) the spread of the disease” [11].
The need for effective, on-going regional surveillance is obvious, but a workable design is not [12]. Surveillance based on “down-the-road” testing to prove farms free from infection is often performed in government-supported eradication programs, e.g., Aujeszky’s disease [13], but is costly and administratively burdensome. Syndromic surveillance [14], i.e., detection based on reports of clinical signs consistent with the pathogen(s) of interest, should meet the need, but Poppensiek and Budd, cited in [15], found that “The greatest single difficulty in a disease-reporting program proved to be the failure of vets to file reports”. Exploring this problem, Gates et al. [16] found that the reluctance to report arose from feelings of uncertainty, fear of the consequences of reporting, distrust of authorities, and unfamiliarity with the reporting process. Participatory surveillance, i.e., including members of the population at risk in the surveillance data collection process [17,18,19,20], has improved syndromic surveillance, but its effectiveness is limited by the participants’ clinical experience and the inherent diagnostic ambiguity of clinical signs. Thus, Elbers et al. [21] estimated that a diagnosis of CSFV based solely on clinical signs achieved a diagnostic sensitivity of 73% and a diagnostic specificity of 53%.
The objective of this study was to characterize the performance of a surveillance design best described as “collecting and testing a few targeted samples from each of many farms in the region”. In more formal terms, we analyzed the performance and cost of test-based, regional, active participatory surveillance based on the targeted sampling of 10 poor-doing pigs by farm personnel (producer, staff, and/or veterinarian) followed by screening for the pathogen of interest in credentialed laboratories. Because our objective was to explore the feasibility and performance of this general design, the analysis did not include a sampling and testing process for a specific pathogen. However, a key assumption in the cost analysis was that 10 pigs would be sampled on each participating farm and the individual pig samples combined into two pools (5 pigs per pool) for antibody or nucleic acid testing.

2. Study Design

The study was conducted in three phases. In Phase 1, the spread of a notifiable (but unspecified) pathogen was simulated in a population of 17,521 swine farms holding 51,515,699 pigs in a geographic region of 1,615,246 km2 for a period of 70 days. Based on the simulated farm status (negative or positive), Phase 2 estimated the probability of detecting ≥ 1 positive farms in the region as a function of farm-level sensitivity (10%, 20%, 30%, 40%, 50%), percent of farms participating in the surveillance program (20%, 40%, 60%, 80%, 100%), and farm-level prevalence in the region. Since the objective was to broadly evaluate the performance of the surveillance design, Phase 1 (pathogen spread) and Phase 2 (detection) simulations were performed over a range of parameters. In Phase 3, active participatory surveillance was analyzed in terms of the cost per farm and the cost per pig in inventory per round of sampling and testing.

2.1. Phase 1: Simulating Pathogen Spread—Animal Disease Spread Model (ADSM)

The Animal Disease Spread Model (ADSM) is public-domain software designed to simulate the spread of infectious agents in livestock populations [22]. ADSM uses a static, fixed population and defines the population of animals at a single geographic location, i.e., a farm, as the epidemiological unit. For Phase 1 simulations, a population of swine farms was created from publicly available concentrated animal feeding operation (CAFO) permit data provided by the appropriate authorities in the states of Colorado, Iowa, Kansas, Minnesota, Missouri, Nebraska, Oklahoma, and South Dakota. State datasets were collated into a single ADSM-compatible file. Farms determined to be inactive or with data quality issues were removed, resulting in a final data set consisting of 17,521 farms (Table 1). Swine packing plant locations and slaughter capacities were included in the population file to account for their role in indirect pathogen spread [23], with latitude and longitude generated from their addresses [24] using Google Maps (www.google.com/maps accessed 1 April 2021).
ADSM (version 3.510.0) software required the identification of each farm site by production type (breeder, feeder, or breeder/feeder), inventory (number of animals), and geolocation (latitude and longitude). For the majority of farms, production type was provided in the state datasets or derived from the site name, e.g., “Smith Sow Farm”. Using this approach, 13,041 of 17,521 farms in the population file were assigned to production type: 209 (1.6%) breeder/feeder, 702 (5.4%) breeder, and 12,130 (93.0%) feeder. The remaining 4481 farms were randomly assigned [25] to production type proportional to state-level production types or, if state data were not adequately reported, the overall proportions in the database. State-level permit data described the capacity (inventory) of each farm either as the number of pigs or as “animal units”. In the latter case, animal units were converted to the number of pigs on the basis of one animal unit per 2.5 pigs weighing ≥ 24.9 kg (≥55 pounds) or 10 pigs weighing < 24.9 kg (<55 pounds) [26]. With one exception, all states reported farm location by latitude and longitude, by ZIP Code (i.e., postal code), or by county (i.e., an administrative subdivision of a state). For farms without precise geolocation, the spsample function in the sp R package (version 1.4-5) [27] was used to randomly generate a latitude and longitude within the geographic unit associated with the record (i.e., ZIP Code, county, or state).
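For illustration, the random-point step for a farm known only to the county (or ZIP Code) level might be written as in the following minimal sketch; the polygon object county_poly and the variable names are placeholders rather than the study's script, and the polygon is assumed to use longitude/latitude coordinates.
    # Minimal sketch: generate one random location inside a county polygon when
    # only the county is known (county_poly is a hypothetical SpatialPolygons object).
    library(sp)
    set.seed(42)                                   # reproducibility for the illustration
    random_point <- spsample(county_poly, n = 1, type = "random")
    coords <- coordinates(random_point)            # matrix of x (longitude) and y (latitude)
    farm_longitude <- coords[1, 1]
    farm_latitude  <- coords[1, 2]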

2.1.1. ADSM Simulations

The ADSM software was designed to simulate the spread of a designated pathogen in a defined livestock population by setting parameter values representative of the pathogen’s transmission characteristics and industry production practices. In this study, ADSM simulations were performed over a range of parameter values (Table 2) to provide spread estimates generalizable to a variety of notifiable pathogens. Although disease control options are available in ADSM, e.g., movement restrictions, vaccination, and farm depopulation, they were not implemented so as to allow the unrestricted spread of the hypothetical agent within the region.
For simplicity, each spread scenario began with a single index farm (Table 2). A total of 30 index farms were identified by randomly selecting 10 farms from each of the 3 pig density categories (1.1–3.3 pigs per km2, 15.9–25.3 pigs per km2, 106.8–214.5 pigs per km2) using features built into R (version 4.1.0) software [25]. Each of these 30 individual index farms was successively categorized as a breeder, breeder–feeder, or feeder in simulations. This process ensured that the spread scenarios covered the range of possible outcomes that could arise due to differences in pig density in the region surrounding the index farm and in the direct and indirect contact rates among production types. A total of 2430 pathogen spread scenarios were simulated based on all combinations of county-level pig density (n = 3), index farm within county density (10 farms in each of the 3 county-level pig densities), index farm production type (n = 3, i.e., breeder, feeder, breeder–feeder), spread by direct contact (n = 3, i.e., probability levels 0.2, 0.4, 0.6), spread by indirect contact (n = 3, i.e., probability levels 0.05, 0.10, 0.15), and area spread (n = 3, i.e., probability levels 0.001, 0.010, 0.100). Each scenario was replicated 100 times to account for the stochastic nature of the ADSM simulations.
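For illustration, the full factorial set of 2430 scenarios can be enumerated in a few lines of R; the factor names in the sketch below are illustrative and not taken from the study's code.
    # Minimal sketch: enumerate all 2430 spread scenarios as the cross of the design factors.
    scenarios <- expand.grid(
      county_density  = c("low", "medium", "high"),      # 3 county-level pig density categories
      index_farm      = 1:10,                            # 10 index farms per density category
      production_type = c("breeder", "breeder-feeder", "feeder"),
      p_direct        = c(0.2, 0.4, 0.6),                # direct contact transmission probability
      p_indirect      = c(0.05, 0.10, 0.15),             # indirect contact transmission probability
      p_area          = c(0.001, 0.010, 0.100)           # local area spread probability
    )
    nrow(scenarios)                                      # 3 x 10 x 3 x 3 x 3 x 3 = 2430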

2.1.2. ADSM Automation

After constructing the initial pathogen spread scenario using the ADSM Scenario Creator, the creation of the subsequent 2429 scenarios was automated. In brief, the Scenario Creator wrote a directory containing an SQLite database (“ScenarioX.db”) to the ADSM workspace; this database housed the information ADSM used to run the scenario. An R script [25] was written to copy the database file and update the unique parameter values for each scenario (Table 2), thereby creating additional SQLite databases suitable to be imported and run on ADSM.
The procedure was performed as follows:
  • A new file directory was created in the ADSM workspace using the base R function dir.create, and the saved template scenario “ScenarioX.db” from the initial simulation was copied into it. The copy was renamed using the base R function file.rename, e.g., “new_scenario.db”.
  • A connection was created between R and the SQLite database using the dbConnect function from the RSQLite R package (version 2.2.4) [30] to access the “new_scenario.db” file for editing, i.e., con = dbConnect(SQLite(), dbname = “new_scenario.db”).
  • Once the connection was opened, the dbGetQuery function was used to bring the SQLite table to be edited into the R environment as a data frame. For example, the following code created a data frame named “Population” from the ScenarioCreator_unit table, which contained the entire population file input during the ADSM scenario creation process.
    a. Population <- dbGetQuery(con, “SELECT * FROM ScenarioCreator_unit”).
    b. Other tables altered using this procedure included those containing the direct spread parameters (ScenarioCreator_directspread), indirect spread parameters (ScenarioCreator_indirectspread), and local area spread parameters (ScenarioCreator_airbornespread).
  • R functions were then used to update the data frame to fit the new desired scenario (i.e., production types, initial infection statuses, or transmission probabilities).
  • The dbWriteTable function with the overwrite option specified as TRUE was used to replace the SQLite table in the database file with the newly edited data frame. For example, dbWriteTable(con, name = “ScenarioCreator_unit”, value = Population, overwrite = TRUE).
  • Rerunning the dbConnect line exactly as written in Step 2 saved the SQLite database file with the changes included, i.e., con = dbConnect(SQLite(), dbname = “new_scenario.db”).
At the conclusion of the scenario creation process, the spread scenarios were run using batch processing (see https://github.com/NAVADMC/ADSM/wiki/Batch-processing-of-scenarios-using-ADSM-Auto-Scenario-Runner accessed on 1 May 2021).
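Taken together, the steps above form a copy-edit-write loop over scenario databases. The following minimal sketch illustrates that loop; the output file names, the scenarios data frame (see the enumeration sketch in Section 2.1.1), and the column name infection_probability are placeholders and do not reflect the actual ADSM database schema.
    # Minimal sketch of the scenario-editing loop described above. Assumes the template
    # "ScenarioX.db" exists in the working directory and "scenarios" holds one row per scenario.
    library(DBI)
    library(RSQLite)
    template <- "ScenarioX.db"
    for (i in seq_len(nrow(scenarios))) {
      new_db <- sprintf("scenario_%04d.db", i)                       # illustrative file name
      file.copy(template, new_db, overwrite = TRUE)
      con <- dbConnect(SQLite(), dbname = new_db)
      # Pull a table into R, edit it, and write it back (shown for the direct spread
      # parameters; the other tables are handled the same way).
      direct <- dbGetQuery(con, "SELECT * FROM ScenarioCreator_directspread")
      direct$infection_probability <- scenarios$p_direct[i]          # hypothetical column name
      dbWriteTable(con, name = "ScenarioCreator_directspread", value = direct, overwrite = TRUE)
      dbDisconnect(con)                                              # release the connection
    }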

2.1.3. Phase 1: Spread Results

Phase 1 simulation results (100 iterations for each of the 2430 scenarios) are reported in Table 3 as the mean number of infected farms on simulation day 70 for all possible combinations of the spread parameter values listed in Table 2, i.e., index farm county pig density (n = 3 pig densities), index farm type (n = 3), probability of transmission by direct contact (n = 3 levels), indirect contact (n = 3 levels), and area spread (n = 3 levels). All parameters in the model affected the outcome, but holding all other parameters constant (ceteris paribus), it can be seen that the probability of transmission by direct contact, i.e., the movement of infectious animals among sites, was the most impactful in terms of the total number of infected farms on day 70.

2.2. Phase 2: Simulating Pathogen Detection

Among the 2430 spread scenarios simulated in Phase 1, the 360 scenarios indicated in Table 4 were selected for use in Phase 2. Each of the 12 groups of spread scenarios shown in Table 4 consisted of 30 scenarios, i.e., 10 index farms in each pig density category, with each index farm successively classified as one of 3 production types (breeder, breeder–feeder, and feeder). For each of these 360 spread scenarios, the detection of ≥ 1 positive farms in the region was simulated under 25 pathogen detection settings based on farm-level sensitivity (10%, 20%, 30%, 40%, or 50%) and farm participation in the surveillance program (20%, 40%, 60%, 80%, or 100% of farms in the region). For the 20%, 40%, 60%, and 80% participation levels, farm participation was allocated uniformly across the 3 pig inventory size categories in Table 1 through simple random sampling without replacement. For example, simulations at 20% participation included 20% of the farms in the ≤1000 category, 20% in the 1001–4999 category, and 20% in the ≥5000 category. For each participation level, 1000 farm groupings were randomly selected using R software [25] to match the 1000 surveillance simulations. By definition, 100% participation did not require participant selection.
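For illustration, allocating participation uniformly across the inventory size categories might be sketched as follows; the farms data frame and its columns are placeholders, not the study's code.
    # Minimal sketch: select 20% of farms within each inventory size category by simple
    # random sampling without replacement. Assumes "farms" has columns farm_id and
    # size_category (levels "<=1000", "1001-4999", ">=5000").
    set.seed(1)
    participation <- 0.20
    participants <- unlist(lapply(split(farms$farm_id, farms$size_category), function(ids) {
      sample(ids, size = round(participation * length(ids)), replace = FALSE)
    }))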
An R function [25] was written to perform the surveillance simulations for each combination of spread scenario replicate and detection setting, as described below. Because each spread scenario was replicated 100 times, there were a total of 100,000 iterations (1000 iterations for each of the 100 replicates) for each combination of spread scenario and detection setting.
For iteration i, where i = 1, …, 1000:
  • Assign as participants the ith set of participating farms from the list of pre-selected sets corresponding to the current setting’s participation level. For 100% participation, the entire population of farms were participants.
  • For each participating farm, the farm infection status (negative, positive) was identified for days 7, 14, 21, 28, 35, 42, 49, 63, and 70 of the ADSM spread simulation.
  • For each of the days listed in Step 2, participating farms classified as positive were “tested” (simulated) independently in R using the rbinom function with the probability of detection equaling the assigned farm-level sensitivity. Thus, for each positive farm and where p was the assigned farm-level sensitivity, rbinom (n = 1, size = 1, prob = p) randomly generated a 0 or 1, where 1 indicated that the infection was detected.
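A minimal sketch of Steps 2 and 3 is given below, assuming the participating farms and the farms positive on the simulation day of interest are available as identifier vectors; the object and function names are illustrative, not the study's function.
    # Minimal sketch of one detection iteration: each truly positive participating farm
    # is "tested" independently with probability equal to the farm-level sensitivity.
    simulate_detection <- function(participants, infected_ids, sensitivity = 0.30) {
      positives_sampled <- intersect(participants, infected_ids)
      test_results <- rbinom(n = length(positives_sampled), size = 1, prob = sensitivity)
      any(test_results == 1)            # TRUE if >= 1 positive farm detected in the region
    }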

2.2.1. Phase 2: Detection Results

For each of the 25 detection settings, results were reported as the probability of detection by regional farm-level prevalence, with the probability of detection calculated as the percentage of iterations in which ≥ 1 true positive farms “tested” positive in the simulations. Results are provided in Table 5 and Figure 1 and Figure 2 by regional prevalence, farm-level sensitivity, and producer participation. The analysis showed that detection was dependent on the interactions between producer participation and farm-level sensitivity, but high probabilities of detection were achieved at low prevalence over a wide range of participation and sensitivity values. For example, at 0.1% prevalence (18 positive farms among 17,521 farms) and a farm-level detection probability of 30%, the participatory surveillance design achieved 67%, 90%, and 97% probabilities of detecting ≥ 1 positive farms in the region when producer participation was 20%, 40%, and 60%, respectively.
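These values are consistent with a simple independence approximation (ours, for illustration only, not part of the reported analysis): if each of the k positive farms is enrolled with probability equal to the participation level and, once enrolled, detected with probability equal to the farm-level sensitivity, then the probability of detecting ≥ 1 positive farm is approximately 1 − (1 − participation × sensitivity)^k.
    # Approximate probability of detecting >= 1 of k positive farms under independence.
    p_detect <- function(k, participation, sensitivity) {
      1 - (1 - participation * sensitivity)^k
    }
    p_detect(k = 18, participation = 0.2, sensitivity = 0.3)   # ~0.67 (cf. Table 5: 0.671)
    p_detect(k = 18, participation = 0.4, sensitivity = 0.3)   # ~0.90 (cf. Table 5: 0.900)
    p_detect(k = 18, participation = 0.6, sensitivity = 0.3)   # ~0.97 (cf. Table 5: 0.972)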

2.3. Phase 3: Cost of Sampling and Testing

The cost analysis assumed that ante mortem specimens (blood, blood swabs, nasal swabs, oral swabs, or fecal swabs) would be collected from 10 poor-doing pigs in each participating farm, combined into 2 pooled samples (5 pigs per pool), shipped to a credentialed laboratory in an insulated shipping container with coolant, and tested by polymerase chain reaction (PCR) or antibody ELISA. Since the general surveillance design did not define a specific testing protocol, costs were estimated for 3 cases: serum samples tested by PCR, swab samples tested by PCR, and serum samples tested by ELISA. To further evaluate the impact of testing costs on overall program costs, 3 price levels for PCR (EUR 18.65, EUR 23.32, EUR 27.98/USD 20.00, USD 25.00, USD 30.00) and antibody ELISA (EUR 4.66, EUR 7.00, EUR 9.33/USD 5.00, USD 7.50, USD 10.00) were used in the estimates.
The estimated cost of a single round of sampling used the inputs and costs listed in Table 6 and assumed 100% producer participation. Costs listed in Table 6 are the mean of prices quoted by 3 companies for the distribution of products in the U.S. Supplies for collecting serum samples included single-use blood collection tubes and needles (n = 10), tubes in which to pool samples (n = 2), and disposable gloves (2 pairs). Supplies for swab samples included swabs (n = 10), transport medium, tubes in which to pool samples (n = 2), and disposable gloves (2 pairs). Package shipment costs reflect rates paid by clients of the Iowa State University Veterinary Diagnostic Laboratory (Ames, IA, USA) and may vary.
The analysis assumed that the labor and materials required to collect, process, and package samples for shipment would be provided by the farm and, therefore, were not included in the cost analysis. Likewise, it was assumed that blood samples would be centrifuged and the serum pooled (5 pigs per pool) prior to shipment in order to avoid processing charges at the laboratory. Some costs that would be expected in the normal course of sampling and testing were also not included. For example, no attempt was made to account for the added cost of duplicate sampling or testing, e.g., the cost of an additional tube and needle for a second attempt at blood collection from a pig or the cost of retesting a non-negative sample in the laboratory. Likewise, the cost analysis did not include the costs required to administer and coordinate the program.

2.3.1. Phase 3: Estimated Cost of Sampling and Testing

The results of the cost analysis are listed in Table 7 for 3 “specimen by test” combinations with 3 costs for each test. The estimates are given in terms of the average cost per farm in the region, the average cost per pig in the region, and the average cost per pig in inventory for the farm size categories given in Table 1, i.e., farms with ≤1000 pigs, 1001–4999 pigs, and ≥5000 pigs. Using a PCR cost of EUR 23.32 (USD 25.00) or ELISA cost of EUR 7.00 (USD 7.50) per sample, the cost of sampling and testing would be approximately EUR 0.03 (USD 0.03) or EUR 0.02 (USD 0.02) per pig in the region, respectively. On a farm basis, given that sample size and test costs are the same for all farms, the cost per pig increases as the farm pig inventory decreases, as shown in Table 7.
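As a worked illustration of how the per-pig figures follow from the per-farm costs, the sketch below uses the per-farm totals for serum samples reported in the Discussion and assumes 100% participation; it is an approximation, not the study's cost model.
    # Converting per-farm sampling/testing cost into cost per pig in the region (USD).
    n_farms <- 17521
    n_pigs  <- 51515700
    cost_per_farm_pcr   <- 89.94   # serum, 2 pooled PCR tests at USD 25.00 each plus supplies/shipping
    cost_per_farm_elisa <- 54.94   # serum, 2 pooled ELISAs at USD 7.50 each plus supplies/shipping
    round(n_farms * cost_per_farm_pcr   / n_pigs, 3)   # ~0.031 USD per pig in the region
    round(n_farms * cost_per_farm_elisa / n_pigs, 3)   # ~0.019 USD per pig in the region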

3. Discussion

Concerning surveillance systems, Thacker et al. [31] advised that “Simplicity should be a guiding principle …. Simple systems are easy to understand and implement, cost less than complex systems, and provide flexibility”. Consistent with the theme of “simplicity”, the active participatory regional surveillance design was based on targeted sampling of 10 live but poor-doing pigs on participating farms by farm personnel, followed by testing in credentialed laboratories. The design differed most from traditional surveillance in that it focused on the status of the region rather than the status of individual farms. The result was fewer samples per farm yet sensitive regional surveillance at a manageable cost (Table 5 and Table 7).
Targeted sampling, already recommended for the surveillance of CSFV [32] and ASFV [33], addressed the problem of detection in populations characterized by heterogeneity and low prevalence [34]. That is, commercial swine farms separate animals into barns and pens by age, stage, and function, i.e., conditions that are inconsistent with the independence and homogeneity assumptions underlying the traditional power formula based on simple random sampling. Thus, Crauwels et al. [35] reported that random sampling would be unlikely to include a CSFV-positive pig for several weeks following its introduction into a naïve farm, that is, until the virus had spread to a sufficient proportion of the population.
The surveillance design called for sampling live, poor-doing pigs because notifiable pathogens, including CSFV [36], ASFV [37], and FMDV [38], may not produce remarkable clinical signs or early mortalities. Kirkland et al. [36] cautioned that CSFV strains of low or moderate virulence could circulate without notable clinical signs for 4–8 weeks. Schulz et al. [39] concluded that, depending on the virulence of the isolate, it could take up to a month for ASFV-related mortalities to be noted. Thus, sampling poor-doing live pigs would facilitate early detection by eliminating the expectation of telltale clinical signs and/or conspicuous mortalities.
Sampling live pigs also anticipates the need to quickly resolve ambiguous (“non-negative”) test results. The typical response to a non-negative surveillance sample result is retesting the original sample using the original test or a confirmatory assay. If the retest result is conclusive, the question is resolved. If not, the fact that the samples originated from live pigs means that the original pigs and/or their penmates can likely be re-sampled and retested to quickly resolve the issue. On the other hand, if the samples originated from dead pigs or pigs no longer on the farm, the resolution will require extensive sampling of animals on the farm of origin and on other epidemiologically relevant farms.
In this “generic” surveillance scenario it was not necessary to designate specific specimens to be collected or tests to be performed. While blood and serum are traditional surveillance specimens, a variety of more easily collected antemortem specimens are increasingly used in diagnostics and surveillance, e.g., blood swabs, nasal swabs, oropharyngeal swabs, and rectal swabs [40]. Trevisan et al. [41] documented this trend for the period 2007 to 2018 in a study of 547,873 diagnostic cases submitted to 4 Midwestern U.S. veterinary diagnostic laboratories for porcine reproductive and respiratory syndrome virus (PRRSV) testing. In 2007, 51% of the diagnostic cases included serum samples; in 2018, 21% of cases included serum samples, 35% included oral fluid samples, and 11% included processing fluid samples.
Sample collection by farm personnel working under the supervision of the farm veterinarian is common practice in many parts of the world. The use of easily collected antemortem samples will facilitate producer participation and is consistent with sampling by lay personnel. Regardless of the specimen(s) selected for use, the ability of lay participants to collect diagnostic samples is supported by the literature. In human medicine, Branson [42] reported that 156,121 (94.5%) of 165,194 self-collected dried blood spot specimens were acceptable for human immunodeficiency virus testing. The remaining 5.5% were disqualified for insufficient quantity, contamination, or excessive time between sampling and submission. Similarly, Tsang et al. [43] found no loss in diagnostic accuracy with self-collected oronasal swabs or oral fluid samples in a systematic review of 23 refereed studies involving severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) testing. In the veterinary literature, formal examples of the use of producer-collected samples include field studies on CSFV [44], PRRSV [45], and antibiotic resistance in Escherichia coli [46].
The use of farm personnel in sample collection acknowledges the fact that those who work with the pigs are also those most aware of recent changes in pig health and are best qualified to identify the appropriate animals to sample. Relevant to program sustainability, the use of farm personnel reduces sampling costs by eliminating the need to employ program samplers and integrates both scalability and responsiveness into the design. That is, because farm personnel are already on the farms, sample size and/or frequency can be quickly adjusted in response to changing circumstances, e.g., increased after the initial detection of the target to improve case finding and decreased after the threat is contained to reduce costs.
The final point in the surveillance design is the testing of samples in credentialed diagnostic laboratories. “Credentialed”, in this case, refers to laboratories operating under national or international standards, e.g., ISO/IEC 17025 [47,48]. Such laboratories have operational quality management systems, proper equipment, and the technical expertise to reliably perform testing. Further, many of these laboratories are equipped with laboratory information management systems and the capacity to report test results electronically, thereby facilitating timely reporting to participants and, if needed, animal health authorities. Alternatively, testing in the field using point-of-care test devices is sometimes suggested as a means to expedite the discovery of notifiable pathogens. This may be possible in the future, but not at present. Hobbs et al. [49] reported the most fundamental problem: “Inadequate regulatory guidance and poor industry oversight has led to a proliferation of point-of-care tests of varying quality and fitness for purpose …”. A number of issues would need to be addressed if point-of-care tests are to be used for notifiable pathogens, but at a minimum, an accounting system for tracking kits and test results will need to be in place to avoid misuse.
The performance analyses of the active participatory regional surveillance design were based on simulations of the spread (Phase 1) and detection (Phase 2) of an unspecified pathogen in a population of naïve swine farms representative of the Midwest U.S. The swine farm dataset was created using concentrated animal feeding operation (CAFO) permit records from eight U.S. states. CAFO permitting requirements and data quality were not standardized across states, but the data required for the pathogen spread simulations (farm geolocation, inventory, and production type) were provided in most cases. After resolving inconsistencies and missing data (Section 2.1), the dataset consisted of 17,521 swine farms holding 51,515,699 pigs in a region of ~1,582,000 km2 [50]. As a point of reference, the geographic area of Belgium, France, Germany, Luxembourg, the Netherlands, Portugal, and Spain is ~1,584,000 km2 [51].
The purpose of Phase 1 was to create datasets of swine farms of known infection status (negative, positive) for use in the detection simulations (Phase 2). The Animal Disease Spread Model (ADSM) [22] software used in Phase 1 provided substantial modeling flexibility and has previously been used to simulate the spread of ASFV in Vietnam [49], Aujeszky’s disease virus in Thailand [52], CSFV in the Republic of Serbia [53], and PRRSV in both Uganda [54] and Canada [28]. A total of 2430 pathogen spread scenarios were simulated (Section 2.1.1) based on combinations of county-level pig densities, index herd production types, and the probabilities of transmission by direct contact, indirect contact, and area spread. Among these scenarios, 360 spread scenarios representing the range of outcomes were selected for use in the Phase 2 detection simulations.
The objective of Phase 2 was to estimate the probability of detecting ≥ 1 positive farms in the region as a function of farm-level sensitivity, percent of farms participating in the surveillance program, and regional farm-level prevalence. The results were expressed in terms of the probability of detection by regional herd prevalence (from Phase 1 simulation results) rather than time-to-detection because of the diversity of spread rates simulated in the ADSM software. A difficulty when applying targeted sampling to surveillance is the absence of agreed-upon methods for calculating sample size and associated farm-level sensitivities. Using a modeling approach, Nielsen et al. [32,33] reported that targeted sampling of 5 sick or dead pigs in a population of 1000 pigs would detect CSFV 4 to 37 days and ASFV 13 days post-introduction with 95% probability. In the present study, a sample size of 10 pigs was considered a practical number for on-farm collection. The present study was not pathogen-specific and, in the absence of citable estimates, a conservative range of farm-level detection sensitivities (10%, 20%, 30%, 40%, and 50%) was used in the Phase 2 detection analysis. Similarly, data to inform the level of farm participation in this voluntary regional surveillance program were lacking. Consequently, adopting an approach that would inform administrators if such a program were to be initiated, a range of participation levels (20%, 40%, 60%, 80%, 100%) were evaluated for their effect on detection by prevalence.
In Phase 3, the regional surveillance design was analyzed for the cost per farm (17,521 farms) and per pig in the region (51,515,700 pigs) for one round of sampling. Options evaluated in the analysis included specimen (serum vs. swab samples), assay format (PCR or ELISA), and 3 assay cost options. The analysis was based on the present costs of materials for collecting, shipping, and testing in credentialed laboratories (Table 6). The cost analysis assumed that the labor required to collect and package samples would be provided by participant swine producers and that administrative costs would be borne by existent animal health agencies. Using test costs of EUR 23.32 (USD 25.00) per PCR and EUR 7.00 (USD 7.50) per ELISA, the cost per farm in the region ranged from EUR 51.25 (USD 54.94) for serum tested by ELISA to EUR 83.89 (USD 89.94) for serum tested by PCR. The cost per pig in the region ranged from EUR 0.018 (USD 0.019) to EUR 0.029 (USD 0.031) for the same scenarios.
Lee et al. [55] reported that U.S. swine producers would be willing to pay USD 0.581 (EUR 0.542) per pig per year to reduce the risk of losses from notifiable pathogens. While Lee et al. [55] focused on biosecurity, the regional surveillance design described herein would facilitate early detection and elimination, provide evidence of freedom from disease, and support access to international markets. Thus, the net effect is the amelioration of the major economic losses expected after the introduction of a notifiable pathogen at a price close to the producers’ cost constraints.
The dataset of 17,521 farms used in this study was assembled from CAFO permits and, therefore, may be considered representative of the region. From Table 1, it can be seen that the 4422 farms (25.2%) in the smallest farm category (≤1000 pigs) held 2.67% of the pigs in the region. The 11,261 farms (64.3%) in the mid-size category (1001 to 4999 pigs) held an additional 60.7% of the pigs in the region. By definition, smaller farms have fewer pigs and, therefore, surveillance cost per pig is higher (Table 7). To be successful, participatory surveillance requires broad engagement. While it may be possible to further reduce costs associated with sampling, transport, and testing, it would be prudent to explore the means to incentivize small producer participation.

4. Conclusions

The regional active participatory surveillance design evaluated in this study is simple and adaptable to the surveillance of a variety of pathogens, farm animal species, or regions. Simplicity and clarity in sampling, testing, and reporting are central to the success of a participatory program because voluntary programs depend on the full confidence of the participants. In truth, there is little innovation in the proposed surveillance framework; the personnel, testing, and reporting systems are largely in place. The only possible novelty is the aggregation and interpretation of surveillance testing data at the regional level rather than the farm level. As was shown in the evaluation, this change in focus achieved highly sensitive regional detection at a low cost.

Author Contributions

Conceptualization, G.T., P.M., R.M., C.W. and J.Z.; methodology, G.T., P.M., G.S.S. and C.W.; writing, G.T., P.M., G.S.S. and J.Z.; visualization, P.N.; funding acquisition, R.M., G.S.S., G.T., C.W. and J.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Pork Board, Clive, IA, U.S. (#22-028).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Current concentrated animal feeding operation (CAFO) permit data are available from the appropriate state authorities through the U.S. Freedom of Information Act.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. United States Department of Agriculture. Changes in the U.S. Swine Industry, 1995–2012 (#678.0817); USDA:APHIS:VS: CEAH, National Animal Health Monitoring System: Fort Collins, CO, USA, 2017.
  2. Dunn, J.W. The evolution of the U.S. swine industry. In Changes in the Live Pig Market in Different Countries; Szymańska, E.J., Ed.; Warsaw University of Life Sciences Press: Warsaw, Poland, 2017; pp. 19–27. [Google Scholar]
  3. Food and Agriculture Organization of the United Nations. FAOSTAT Statistical Database. Available online: www.fao.org/faostat/en/#data (accessed on 12 July 2021).
  4. United States Department of Agriculture. Meat Animals Production, Disposition, and Income—2020 Summary; USDA National Agricultural Statistics Service: Washington, DC, USA, 2021.
  5. Stevenson, G.W.; Hoang, H.; Schwartz, K.J.; Burrough, E.R.; Sun, D.; Madson, D.; Cooper, V.L.; Pillatzki, A.; Gauger, P.; Schmitt, B.J.; et al. Emergence of porcine epidemic diarrhea virus in the United States: Clinical signs, lesions, and viral genomic sequences. J. Vet. Diagn. Investig. 2013, 25, 649–654. [Google Scholar] [CrossRef] [PubMed]
  6. Niederwerder, M.C.; Hesse, R.A. Swine enteric coronavirus disease: A review of 4 years with porcine epidemic diarrhoea virus and porcine deltacoronavirus in the United States and Canada. Transbound. Emerg. Dis. 2018, 65, 660–675. [Google Scholar] [CrossRef] [PubMed]
  7. Moura, J.A.; McManus, C.M.; Bernal, F.E.M.; de Melo, C.B. An analysis of the 1978 African swine fever outbreak in Brazil and its eradication. Rev. Sci. Tech. Off. Int. Epiz. 2010, 29, 549–563. [Google Scholar] [CrossRef] [PubMed]
  8. Stegeman, A.; Elbers, A.; de Smit, H.; Moser, H.; Smak, J.; Pluimers, F. The 1997–1998 epidemic of classical swine fever in the Netherlands. Vet. Microbiol. 2000, 73, 183–196. [Google Scholar] [CrossRef] [PubMed]
  9. Meuwissen, M.P.; Horst, S.H.; Huirne, R.B.; Dijkhuizen, A.A. A model to estimate the financial consequences of classical swine fever outbreaks: Principles and outcomes. Prev. Vet. Med. 1999, 42, 249–270. [Google Scholar] [CrossRef]
  10. Davies, G. The foot and mouth disease (FMD) epidemic in the United Kingdom 2001. Comp. Immunol. Microbiol. Infect. Dis. 2002, 25, 331–343. [Google Scholar] [CrossRef]
  11. World Organisation for Animal Health. African Swine Fever (ASF)—Situation Report 9 (7 April 2022). Available online: https://www.oie.int/en/what-we-do/animal-health-and-welfare/disease-data-collection/ (accessed on 21 April 2022).
  12. Doherr, M.G.; Audigé, L. Monitoring and surveillance for rare health-related events: A review from the veterinary perspective. Phil. Trans. R. Soc. Lond. B 2001, 356, 1097–1106. [Google Scholar] [CrossRef]
  13. Anderson, L.A.; Black, N.; Hagerty, T.J.; Kluge, J.P.; Sundberg, P.L. Pseudorabies (Aujeszky’s Disease) and Its Eradication. A Review of the U.S. Experience; Technical Bulletin No. 1923; United States Department of Agriculture, Animal and Plant Health Inspection Service: Riverdale, MD, USA, 2008.
  14. Hoinville, L.J.; Alban, L.; Dewe, J.A.; Gibbens, J.C.; Gustafson, L.; Häsler, B.; Saegerman, C.; Salman, M.; Stärk, K.D.C. Proposed terms and concepts for describing and evaluating animal-health surveillance systems. Prev. Vet. Med. 2013, 112, 1–12. [Google Scholar] [CrossRef]
  15. Bush, E.J.; Gardner, I.A. Animal health surveillance in the United States via the National Animal Health Monitoring System (NAHMS). Epidemiol. Sante Anim. 1995, 27, 113–126. [Google Scholar]
  16. Gates, M.C.; Earl, L.; Enticott, G. Factors influencing the performance of voluntary farmer disease reporting in passive surveillance systems: A scoping review. Prev. Vet. Med. 2021, 196, 105487. [Google Scholar] [CrossRef] [PubMed]
  17. Mariner, J.C.; Hendrickx, S.; Pfeiffer, D.U.; Costard, S.; Knopf, L.; Okuthe, S.; Chibeu, D.; Parmley, J.; Musenero, M.; Pisang, C.; et al. Integration of participatory approaches into surveillance systems. Rev. Sci. Tech. Off. Int. Epiz. 2011, 30, 653–659. [Google Scholar] [CrossRef] [PubMed]
  18. Calba, C.; Antoine-Moussiaux, N.; Charrier, F.; Hendrikx, P.; Saegerman, C.; Peyre, M.; Goutard, F.L. Applying participatory approaches in the evaluation of surveillance systems: A pilot study on African swine fever surveillance in Corsica. Prev. Vet. Med. 2015, 122, 389–398. [Google Scholar] [CrossRef]
  19. Wójcik, O.P.; Brownstein, J.S.; Chunara, R.; Johansson, M.A. Public health for the people: Participatory infectious disease surveillance in the digital age. Emerg. Themes Epidemiol. 2014, 11, 7. [Google Scholar] [CrossRef]
  20. Smolinski, M.S.; Crawley, A.W.; Olsen, J.M.; Jayaraman, T.; Libel, M. Participatory disease surveillance: Engaging communities directly in reporting, monitoring, and responding to health threats. JMIR Public Health Surveill. 2017, 3, e7540. [Google Scholar] [CrossRef] [PubMed]
  21. Elbers, A.R.W.; Bouma, A.; Stegeman, J.A. Quantitative assessment of clinical signs for the detection of classical swine fever outbreaks during an epidemic. Vet. Microbiol. 2002, 85, 323–332. [Google Scholar] [CrossRef]
  22. Harvey, N.; Reeves, A.; Schoenbaum, M.A.; Zagmutt-Vergara, F.J.; Dubé, C.; Hill, A.E.; Corso, B.A.; McNab, W.B.; Cartwright, C.I.; Salman, M.D. The North American Animal Disease Spread Model: A simulation model to assist decision making in evaluating animal disease incursions. Prev. Vet. Med. 2007, 82, 176–197. [Google Scholar] [CrossRef]
  23. Lowe, J.; Gauger, P.; Harmon, K.; Zhang, J.; Connor, J.; Yeske, P.; Loula, T.; Levis, I.; Dufresne, L.; Main, R. Role of transportation in the spread of porcine epidemic diarrhea virus infection, United States. Emerg. Infect. Dis. 2014, 20, 872–874. [Google Scholar] [CrossRef]
  24. Meyer, S. Pork Packing: Just What Is Capacity? National Hog Farmer, 6 October 2020; pp. 18–22. [Google Scholar]
  25. R Core Team. R: A Language and Environment for Statistical Computing; R Foundation for Statistical Computing: Vienna, Austria, 2020; Available online: https://www.R-project.org (accessed on 30 May 2021).
  26. Hagenstein, P.R.; Flocchini, R.G.; Bailar, J.C., III; Claiborn, C.; Dickerson, R.R.; Galloway, J.N.; Grossman, M.R.; Kasibhatla, P.; Kohn, R.A.; Lacy, M.P.; et al. The Scientific Basis for Estimating Air Emissions from Animal Feeding Operations: Interim Report; The National Academy Press: Washington, DC, USA, 2002; p. 90. [Google Scholar]
  27. Bivand, R.S.; Pebesma, E.; Gómez-Rubio, V. Applied Spatial Data Analysis with R, 2nd ed.; Springer: New York, NY, USA, 2013; pp. 146–149. [Google Scholar]
  28. Thakur, K.K.; Revie, C.W.; Hurnik, D.; Poljak, Z.; Sanchez, J. Simulation of between-farm transmission of porcine reproductive and respiratory syndrome virus in Ontario, Canada using the North American Animal Disease Spread Model. Prev. Vet. Med. 2015, 118, 413–426. [Google Scholar] [CrossRef] [PubMed]
  29. Machado, G.; Galvis, J.A.; Lopes, F.P.N.; Voges, J.; Medeiros, A.A.R.; Cárdenas, N.C. Quantifying the dynamics of pig movements improves targeted disease surveillance and control plans. Transbound. Emerg. Dis. 2021, 68, 1663–1675. [Google Scholar] [CrossRef]
  30. Müller, K.; Wickham, H.; James, D.A.; Falcon, S. RSQLite: SQLite Interface for R, Version 2.2.4. Available online: https://CRAN.R-project.org/package=RSQLite (accessed on 12 June 2022).
  31. Thacker, S.B.; Parrish, R.G.; Trowbridge, F.L. A method for evaluating systems of epidemiological surveillance. World Health Stat. Q. 1988, 41, 11–18. [Google Scholar]
  32. Nielsen, S.S.; Alvarez, J.; Bicout, D.J.; Calistri, P.; Canali, E.; Drewe, J.A.; Garin-Bastuji, B.; Gonzales Rojas, J.L.; Schmidt, C.G.; Herskin, M.; et al. Assessment of the control measures of the category A diseases of animal health law: Classical swine fever. EFSA J. 2021, 19, e06707. [Google Scholar] [CrossRef] [PubMed]
  33. Nielsen, S.S.; Alvarez, J.; Bicout, D.J.; Calistri, P.; Depner, K.; Drewe, J.A.; Garin-Bastuji, B.; Gonzales Rojas, J.L.; Schmidt, C.G.; Herskin, M.; et al. Scientific opinion on the assessment of the control measures of the category A diseases of the animal health law: African swine fever. EFSA J. 2021, 19, e06402. [Google Scholar] [CrossRef]
  34. Christensen, J.; Gardner, I.A. Herd-level interpretation of test results for epidemiologic studies of animal diseases. Prev. Vet. Med. 2000, 45, 83–106. [Google Scholar] [CrossRef] [PubMed]
  35. Crauwels, A.P.P.; Nielen, M.; Stegeman, J.A.; Elbers, A.R.W.; Dijkhuizen, A.A.; Tielen, M.J.M. The effectiveness of routine serological surveillance: Case study of the 1997 epidemic of classical swine fever in the Netherlands. Rev. Sci. Tech. Off. Int. Epiz. 1999, 18, 627–637. [Google Scholar] [CrossRef] [PubMed]
  36. Kirkland, P.D.; Le Potier, M.-F.; Finlaison, D. Pestiviruses. In Diseases of Swine, 11th ed.; Zimmerman, J.J., Karriker, L.A., Ramirez, A., Schwartz, K.J., Stevenson, G.W., Zhang, J., Eds.; John Wiley and Sons, Inc.: Hoboken, NJ, USA, 2019; pp. 622–640. [Google Scholar] [CrossRef]
  37. Gallardo, C.; Soler, A.; Rodze, I.; Nieto, R.; Cano-Gómez, C.; Fernandez-Pinero, J.; Arias, M. Attenuated and non-haemadsorbing (non-HAD) genotype II African swine fever virus (ASFV) isolated in Europe, Latvia 2017. Transbound. Emerg. Dis. 2019, 66, 1399–1404. [Google Scholar] [CrossRef] [PubMed]
  38. Bates, T.W.; Thurmond, M.C.; Hietala, S.K.; Venkateswaran, K.S.; Wilson, T.M.; Colston, B.W., Jr.; Trebes, J.E.; Milanovich, F.P. Surveillance for detection of foot-and-mouth disease. J. Am. Vet. Med. Assoc. 2003, 223, 609–614. [Google Scholar] [CrossRef]
  39. Schulz, K.; Conraths, F.J.; Blome, S.; Staubach, C.; Sauter-Louis, C. African swine fever: Fast and furious or slow and steady? Viruses 2019, 11, 866. [Google Scholar] [CrossRef] [PubMed]
  40. Munguía-Ramírez, B.; Armenta-Leyva, B.; Giménez-Lirola, L.; Wang, C.; Zimmerman, J. Surveillance on swine farms using antemortem specimens. In Optimising Pig Herd Health and Production; Maes, D., Segalés, J., Eds.; Burleigh Dodds Science Publishing: Cambridge, UK, 2023; pp. 97–138. [Google Scholar]
  41. Trevisan, G.; Linhares, L.C.M.; Crim, B.; Dubey, P.; Schwartz, K.J.; Burrough, E.R.; Main, R.G.; Sundberg, P.; Thurn, M.; Lages, P.T.F.; et al. Macroepidemiological aspects of porcine reproductive and respiratory syndrome virus detection by major United States veterinary diagnostic laboratories over time, age group, and specimen. PLoS ONE 2019, 14, e0223544. [Google Scholar] [CrossRef] [PubMed]
  42. Branson, B.M. Home sample collection tests for HIV infection. J. Am. Med. Assoc. 1998, 280, 1699–1701. [Google Scholar] [CrossRef]
  43. Tsang, N.N.Y.; So, H.C.; Ng, K.Y.; Cowling, B.J.; Leung, G.M.; Ip, D.K.M. Diagnostic performance of different sampling approaches for SARS-CoV-2 RT-PCR testing: A systematic review and meta-analysis. Lancet Infect. Dis. 2021, 21, 1233–1245. [Google Scholar] [CrossRef] [PubMed]
  44. Fatima, M.; Luo, Y.; Zhang, L.; Wang, P.-Y.; Song, H.; Fu, Y.; Li, Y.; Sun, Y.; Li, S.; Bao, Y.-J.; et al. Genotyping and molecular characterization of classical swine fever virus isolated in China during 2016–2018. Viruses 2021, 13, 664. [Google Scholar] [CrossRef] [PubMed]
  45. Eichhorn, G.; Frost, J.W. Study on the suitability of sow colostrum for the serological diagnosis of porcine reproductive and respiratory syndrome (PRRS). J. Vet. Med. B 1997, 44, 65–72. [Google Scholar] [CrossRef]
  46. Nijsten, R.; London, N.; van den Bogaard, A.; Stobberingh, E. Antibiotic resistance among Escherichia coli isolated from faecal samples of pig farmers and pigs. J. Antimicrob. Chemother. 1996, 37, 1131–1140. [Google Scholar] [CrossRef] [PubMed]
  47. Newberry, K.M.; Colling, A. Quality standards and guidelines for test validation for infectious diseases in veterinary laboratories. Rev. Sci. Tech. Off. Int. Epiz. 2021, 40, 227–237. [Google Scholar] [CrossRef] [PubMed]
  48. Ribeiro Miguel, A.L.; Lopes Moreira, R.P.; Fernando de Oliveira. ISO/IEC 17025: History and introduction of concepts. Química Nova 2021, 44, 792–796. [Google Scholar] [CrossRef]
  49. Hobbs, E.C.; Colling, A.; Gurung, R.B.; Allen, J. The potential of diagnostic point-of-care tests (POCTs) for infectious and zoonotic animal diseases in developing countries: Technical, regulatory and sociocultural considerations. Transbound. Emerg. Dis. 2021, 68, 1835–1849. [Google Scholar] [CrossRef] [PubMed]
  50. U.S. Census Bureau. State Area Measurements and Internal Point Coordinates. 2010. Available online: https://www.census.gov/geographies/reference-files/2010/geo/state-area.html (accessed on 25 June 2023).
  51. United Nations. Statistical Yearbook, 65th ed.; United Nations: New York, NY, USA, 2022; pp. 13–35. [Google Scholar]
  52. Ketusing, N.; Reeves, A.; Portacci, K.; Yano, T.; Olea-Popelka, F.; Keefe, T.; Salman, M. Evaluation of strategies for the eradication of pseudorabies virus (Aujeszky’s disease) in commercial swine farms in Chiang-Mai and Lampoon provinces, Thailand, using a simulation disease spread model. Transbound. Emerg. Dis. 2014, 61, 169–176. [Google Scholar] [CrossRef] [PubMed]
  53. Stanojevic, S.; Valcic, M.; Stanojevic, S.; Radojicic, S.; Avramov, S.; Tambur, Z. Simulation of a classical swine fever outbreak in rural areas of the Republic of Serbia. Vet. Med. 2015, 60, 553–566. [Google Scholar] [CrossRef]
  54. Hasahya, E.; Thakur, K.K.; Dione, M.M.; Wieland, B.; Oba, P.; Kungu, J.; Lee, H.S. Modeling the spread of porcine reproductive and respiratory syndrome among pig farms in Lira district of northern Uganda. Front. Vet. Sci. 2021, 8, 727895. [Google Scholar] [CrossRef] [PubMed]
  55. Lee, J.; Schulz, L.L.; Tonsor, G.T. Swine producer willingness to pay for Tier 1 disease risk mitigation under multifaceted ambiguity. Agribusiness 2021, 37, 858–875. [Google Scholar] [CrossRef]
Figure 1. Illustration of the interaction between producer participation, farm-level detection sensitivity, and number of positive farms on the probability of detecting ≥ 1 positive farms in the region.
Figure 2. As shown for 3 prevalence levels, various combinations of producer participation and farm-level sensitivity produced ≥ 95% probability of detecting ≥ 1 positive herds in the region. Farm-level sensitivity is the probability of a positive test on samples from an infected farm.
Table 1. Population of farm sites used in Phase 1 (pathogen spread) and Phase 2 (probability of detection) simulations by production type and pig inventory 1.
Total Pig Inventory   Breeder Sites (No. Pigs)   Breeder–Feeder Sites (No. Pigs)   Feeder Sites (No. Pigs)   Total Sites (No. Pigs)
≤1000                 476 (140,823)              650 (254,214)                     3296 (980,356)            4422 (1,375,393)
1001 to 4999          474 (1,244,232)            294 (641,930)                     10,493 (29,374,884)       11,261 (31,261,046)
≥5000                 132 (1,556,655)            120 (1,595,964)                   1586 (15,726,642)         1838 (18,879,261)
TOTAL                 1082 (2,941,710)           1064 (2,492,108)                  15,375 (46,081,882)       17,521 (51,515,700)
1 Swine farm data based on publicly available animal feeding operation permit data provided by the appropriate authorities in the U.S. states of Colorado, Iowa, Kansas, Minnesota, Missouri, Nebraska, Oklahoma, and South Dakota.
Table 2. Phase 1 (pathogen spread): parameters used in simulating the spread of a notifiable swine pathogen in a defined region 1.
Spread Parameters (with Parameter Definitions and/or Values)
1. Index farm: the first positive farm in each simulation.
   a. Location (pig density): county-level pig density of low (1.1–3.3 pigs per km2), medium (15.9–25.3 pigs per km2), or high (106.8–214.5 pigs per km2).
   b. Production type: breeder, breeder–feeder, or feeder.
2. Direct contact: transmission by moving infectious animals among sites.
   a. Distance for direct contact: BETAPert distribution, min 0.5 km, mode 100 km, max 1000 km.
   b. Daily movement rate: fixed rate as specified by source and destination farm type:
      Source farm        Destination: Breeder [28]   Feeder [28]   Packing Plant [29]
      Breeder–Feeder     NA                          0.0204        0.0310
      Breeder            0.0014                      0.0687        0.0310
      Feeder             NA                          0.0348        0.0310
   c. Probability of infecting a negative farm: probabilities tested 0.2, 0.4, and 0.6.
3. Indirect contact: transmission by movement of people, fomites, etc.
   a. Distance for indirect contact: BETAPert distribution, min 0.5 km, mode 100 km, max 1000 km.
   b. Daily indirect contact rate: fixed rate as specified by source and destination farm type:
      Source farm        Destination: Breeder        Feeder        Packing Plant
      Breeder–Feeder     NA                          0.0204        0.0310
      Breeder            0.0014                      0.0687        0.0310
      Feeder             NA                          0.0348        0.0310
   c. Probability of infecting a negative farm: probabilities tested 0.05, 0.1, and 0.15.
4. Local area spread: daily probability of spread to farms ≤ 1 km from an infected farm, with exponential drop-off.
   a. Probability of infecting a negative farm: probabilities tested 0.001, 0.01, and 0.1.
1 Pathogen spread simulations performed using public domain software [22] and a population of 17,521 farms (51,515,700 pigs) in a contiguous geographic region (1,615,246 km2).
Table 3. Phase 1 (pathogen spread): results of the simulated regional spread of a notifiable swine pathogen reported as the mean number of infected farms on simulation day 70 for specific spread scenarios 1.
| Index Farm County Density (pigs per km2) | Index Farm Type 2 | Area Spread | Direct Contact 0.2, Indirect 0.05 | Direct Contact 0.2, Indirect 0.10 | Direct Contact 0.2, Indirect 0.15 | Direct Contact 0.4, Indirect 0.05 | Direct Contact 0.4, Indirect 0.10 | Direct Contact 0.4, Indirect 0.15 | Direct Contact 0.6, Indirect 0.05 | Direct Contact 0.6, Indirect 0.10 | Direct Contact 0.6, Indirect 0.15 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Low-density (1.1–3.3) | BF | 0.001 | 4 | 4 | 5 | 16 | 16 | 21 | 63 | 67 | 81 |
| Low-density (1.1–3.3) | B | 0.001 | 10 | 11 | 12 | 46 | 54 | 60 | 191 | 218 | 242 |
| Low-density (1.1–3.3) | F | 0.001 | 6 | 7 | 8 | 25 | 31 | 36 | 99 | 113 | 138 |
| Low-density (1.1–3.3) | BF | 0.010 | 5 | 5 | 5 | 19 | 22 | 25 | 86 | 96 | 107 |
| Low-density (1.1–3.3) | B | 0.010 | 11 | 12 | 15 | 59 | 64 | 75 | 245 | 279 | 318 |
| Low-density (1.1–3.3) | F | 0.010 | 7 | 8 | 10 | 31 | 39 | 45 | 136 | 160 | 182 |
| Low-density (1.1–3.3) | BF | 0.100 | 13 | 14 | 15 | 85 | 97 | 119 | 430 | 427 | 534 |
| Low-density (1.1–3.3) | B | 0.100 | 34 | 42 | 48 | 247 | 298 | 311 | 1127 | 1223 | 1366 |
| Low-density (1.1–3.3) | F | 0.100 | 19 | 25 | 33 | 135 | 164 | 190 | 637 | 754 | 864 |
| Medium-density (15.9–25.3) | BF | 0.001 | 4 | 4 | 4 | 16 | 17 | 19 | 64 | 76 | 79 |
| Medium-density (15.9–25.3) | B | 0.001 | 10 | 11 | 12 | 47 | 55 | 62 | 195 | 218 | 250 |
| Medium-density (15.9–25.3) | F | 0.001 | 6 | 7 | 9 | 27 | 31 | 37 | 99 | 126 | 150 |
| Medium-density (15.9–25.3) | BF | 0.010 | 5 | 5 | 6 | 21 | 23 | 29 | 88 | 102 | 112 |
| Medium-density (15.9–25.3) | B | 0.010 | 12 | 13 | 15 | 64 | 67 | 80 | 260 | 304 | 336 |
| Medium-density (15.9–25.3) | F | 0.010 | 7 | 10 | 10 | 35 | 41 | 49 | 141 | 164 | 187 |
| Medium-density (15.9–25.3) | BF | 0.100 | 16 | 18 | 18 | 119 | 120 | 146 | 456 | 510 | 595 |
| Medium-density (15.9–25.3) | B | 0.100 | 42 | 49 | 58 | 292 | 317 | 393 | 1325 | 1396 | 1539 |
| Medium-density (15.9–25.3) | F | 0.100 | 23 | 30 | 40 | 155 | 200 | 209 | 742 | 859 | 956 |
| High-density (106.8–214.5) | BF | 0.001 | 5 | 5 | 5 | 19 | 20 | 21 | 66 | 74 | 85 |
| High-density (106.8–214.5) | B | 0.001 | 11 | 12 | 13 | 51 | 58 | 65 | 200 | 233 | 271 |
| High-density (106.8–214.5) | F | 0.001 | 6 | 8 | 9 | 26 | 34 | 41 | 108 | 132 | 151 |
| High-density (106.8–214.5) | BF | 0.010 | 6 | 7 | 8 | 26 | 31 | 33 | 109 | 123 | 141 |
| High-density (106.8–214.5) | B | 0.010 | 14 | 16 | 18 | 71 | 80 | 94 | 285 | 322 | 358 |
| High-density (106.8–214.5) | F | 0.010 | 10 | 11 | 13 | 42 | 51 | 55 | 164 | 182 | 210 |
| High-density (106.8–214.5) | BF | 0.100 | 47 | 56 | 67 | 244 | 286 | 330 | 982 | 1015 | 1288 |
| High-density (106.8–214.5) | B | 0.100 | 81 | 89 | 102 | 446 | 500 | 575 | 1661 | 1936 | 2093 |
| High-density (106.8–214.5) | F | 0.100 | 57 | 72 | 85 | 310 | 368 | 432 | 1239 | 1345 | 1553 |
1 Pathogen spread simulations (100 per scenario × 10 index farms) were performed using public domain software [22] in a population of 17,521 farms (51,515,700 pigs) in a contiguous geographic region (1,615,246 km2). 2 BF (breeder–feeder), B (breeder), F (feeder).
Table 4. Spread parameter values from Phase 1 (pathogen spread) selected for use in Phase 2 (probability of detection) simulations.
Index Farm Location 1Spread Probabilities
Area SpreadDirect Contact 0.2Direct Contact 0.4Direct Contact 0.6
Indirect ContactIndirect ContactIndirect Contact
0.050.100.150.050.100.150.050.100.15
Low-density county 0.001---------
0.010------
0.100---------
Medium-density county0.001---------
0.010------
0.100---------
High-density county0.001---------
0.010------
0.100------
1 Index farm located in low-density county (1.1–3.3 pigs per km2), medium-density county (15.9–25.3 pigs per km2), or high-density county (106.8–214.5 pigs per km2).
Table 5. Probability of detecting ≥ 1 positive farms as a function of regional prevalence, farm-level sensitivity (%), and producer participation (%).
| Regional Prevalence 1 | Farm-Level Sensitivity (%) 2 | Participation 20% | Participation 40% | Participation 60% | Participation 80% | Participation 100% |
|---|---|---|---|---|---|---|
| 0.1% (18 farms) | 10 | 0.304 | 0.520 | 0.672 | 0.777 | 0.850 |
| 0.1% (18 farms) | 20 | 0.519 | 0.777 | 0.900 | 0.957 | 0.982 |
| 0.1% (18 farms) | 30 | 0.671 | 0.900 | 0.972 | 0.993 | 0.998 |
| 0.1% (18 farms) | 40 | 0.776 | 0.956 | 0.993 | 0.999 | 1.000 |
| 0.1% (18 farms) | 50 | 0.849 | 0.982 | 0.998 | 1.000 | 1.000 |
| 0.2% (35 farms) | 10 | 0.506 | 0.760 | 0.886 | 0.946 | 0.975 |
| 0.2% (35 farms) | 20 | 0.760 | 0.945 | 0.989 | 0.998 | 1.000 |
| 0.2% (35 farms) | 30 | 0.885 | 0.989 | 0.999 | 1.000 | 1.000 |
| 0.2% (35 farms) | 40 | 0.945 | 0.998 | 1.000 | 1.000 | 1.000 |
| 0.2% (35 farms) | 50 | 0.975 | 1.000 | 1.000 | 1.000 | 1.000 |
| 0.3% (53 farms) | 10 | 0.657 | 0.885 | 0.963 | 0.988 | 0.996 |
| 0.3% (53 farms) | 20 | 0.885 | 0.988 | 0.999 | 1.000 | 1.000 |
| 0.3% (53 farms) | 30 | 0.962 | 0.999 | 1.000 | 1.000 | 1.000 |
| 0.3% (53 farms) | 40 | 0.988 | 1.000 | 1.000 | 1.000 | 1.000 |
| 0.3% (53 farms) | 50 | 0.996 | 1.000 | 1.000 | 1.000 | 1.000 |
1 Farm-level prevalence in a population of 17,521 farms in a defined region (1,615,246 km2). 2 Farm-level sensitivity is the probability of a positive test on samples from an infected farm.
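The structure of Table 5 can be approximated with a simple closed form. If each of the k infected farms is independently enrolled with probability equal to the producer participation p and, once enrolled, yields a positive result with probability equal to the farm-level sensitivity Se, then the probability of detecting ≥ 1 positive farms is roughly 1 - (1 - p·Se)^k. The short check below is our back-of-the-envelope approximation, not the authors' simulation code; it reproduces the tabulated values to within about 0.001.

```python
# Independence approximation for the Table 5 detection probabilities.
# Assumes each infected farm is enrolled with probability `participation`
# and, if enrolled, tests positive with probability `sensitivity`.
def p_detect_at_least_one(n_infected: int, participation: float, sensitivity: float) -> float:
    return 1.0 - (1.0 - participation * sensitivity) ** n_infected

# Cross-check: 0.1% regional prevalence (18 infected farms), 30% farm-level sensitivity.
for participation in (0.2, 0.4, 0.6):
    print(round(p_detect_at_least_one(18, participation, 0.30), 3))
# Prints 0.672, 0.9, 0.972, close to the tabulated 0.671, 0.900, and 0.972.
```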
Table 6. Cost per farm, per round of sampling, to collect samples and ship them to the laboratory 1.
| Category | Cost per Item 2 (EUR / USD) | No. Items | Cost per Sampling (EUR / USD) |
|---|---|---|---|
| A. Sample collection (assumes 10 pigs/farm/sampling) | | | |
| Option 1. Serum samples | | | |
| Blood collection tubes (single-use) | €0.525 / $0.563 | 10 | €5.26 / $5.64 |
| Blood collection needles (single-use) | €0.572 / $0.613 | 10 | €5.72 / $6.13 |
| Plastic tube for pooling 5 samples 3 | €0.171 / $0.183 | 2 | €0.35 / $0.37 |
| Disposable gloves | €0.064 / $0.069 | 4 (2 pairs) | €0.26 / $0.28 |
| Option 1 subtotal | | | €11.58 / $12.42 |
| Option 2. Swab samples (blood, nasal, oral, or fecal) | | | |
| Sample collection swabs | €0.532 / $0.570 | 10 | €5.32 / $5.70 |
| Transport medium, e.g., phosphate-buffered saline | €0.036 / $0.039 | 5 mL | €0.36 / $0.39 |
| Plastic tube for pooling 5 samples | €0.171 / $0.183 | 2 | €0.35 / $0.37 |
| Disposable gloves | €0.064 / $0.069 | 4 (2 pairs) | €0.26 / $0.28 |
| Option 2 subtotal | | | €6.29 / $6.74 |
| B. Shipment of samples to the laboratory | | | |
| Insulated shipping container | €5.167 / $5.540 | 1 | €5.17 / $5.54 |
| Cold packs to ship with samples | €6.529 / $7.000 | 1 | €6.53 / $7.00 |
| Parcel shipping charge | €13.991 / $15.000 | 1 | €13.99 / $15.00 |
| Section B subtotal | | | €25.69 / $27.54 |
1 EUR (€) 1.00 = USD ($) 1.0721 (https://www.federalreserve.gov/releases/h10/current/ accessed on 19 June 2023). 2 Means of prices provided by three distributors in the U.S. 3 Cost analysis assumed blood samples would be centrifuged and serum pooled (5 pigs per pool) prior to shipment to avoid sample processing charges at the laboratory.
Table 7. Cost of sampling and testing by specimen and test based on 3 test cost options.
| Denominator | Serum PCR: EUR 18.65 (USD 20.00)/test | Serum PCR: EUR 23.32 (USD 25.00)/test | Serum PCR: EUR 27.98 (USD 30.00)/test | Swab PCR: EUR 18.65 (USD 20.00)/test | Swab PCR: EUR 23.32 (USD 25.00)/test | Swab PCR: EUR 27.98 (USD 30.00)/test | Serum ELISA: EUR 4.66 (USD 5.00)/test | Serum ELISA: EUR 7.00 (USD 7.50)/test | Serum ELISA: EUR 9.33 (USD 10.00)/test |
|---|---|---|---|---|---|---|---|---|---|
| Per farm in region 1 | EUR 74.56 (USD 79.94) | EUR 83.89 (USD 89.94) | EUR 93.22 (USD 99.94) | EUR 69.28 (USD 74.27) | EUR 78.60 (USD 84.27) | EUR 87.93 (USD 94.27) | EUR 46.58 (USD 49.94) | EUR 51.25 (USD 54.94) | EUR 55.91 (USD 59.94) |
| Per pig in region 1 | EUR 0.025 (USD 0.027) | EUR 0.029 (USD 0.031) | EUR 0.032 (USD 0.034) | EUR 0.023 (USD 0.025) | EUR 0.027 (USD 0.029) | EUR 0.030 (USD 0.032) | EUR 0.016 (USD 0.017) | EUR 0.018 (USD 0.019) | EUR 0.019 (USD 0.020) |
| Per pig in inventory, farms of ≤ 1000 pigs 2 | EUR 0.240 (USD 0.257) | EUR 0.270 (USD 0.289) | EUR 0.299 (USD 0.321) | EUR 0.223 (USD 0.239) | EUR 0.253 (USD 0.271) | EUR 0.283 (USD 0.303) | EUR 0.150 (USD 0.161) | EUR 0.165 (USD 0.177) | EUR 0.180 (USD 0.193) |
| Per pig in inventory, farms of 1001–4999 pigs 3 | EUR 0.027 (USD 0.029) | EUR 0.030 (USD 0.032) | EUR 0.034 (USD 0.036) | EUR 0.025 (USD 0.027) | EUR 0.028 (USD 0.030) | EUR 0.032 (USD 0.034) | EUR 0.017 (USD 0.018) | EUR 0.019 (USD 0.020) | EUR 0.021 (USD 0.022) |
| Per pig in inventory, farms of ≥ 5000 pigs 4 | EUR 0.007 (USD 0.008) | EUR 0.008 (USD 0.009) | EUR 0.009 (USD 0.010) | EUR 0.007 (USD 0.007) | EUR 0.007 (USD 0.008) | EUR 0.008 (USD 0.009) | EUR 0.005 (USD 0.005) | EUR 0.006 (USD 0.005) | EUR 0.006 (USD 0.006) |
1 Estimates based on 17,521 farms in the region, holding 51,515,700 pigs. 2 Estimates based on 4422 farms with a mean inventory of 311 pigs. 3 Estimates based on 11,261 farms with a mean inventory of 2776 pigs. 4 Estimates based on 1838 farms with a mean inventory of 10,272 pigs.
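Reading Tables 6 and 7 together, each "per farm" entry in Table 7 appears to be the Table 6 sample-collection subtotal plus the shipping subtotal plus two pooled tests, and the "per pig" entries divide that per-farm cost by the relevant pig count. The snippet below is our reconstruction of one column (serum tested by PCR at EUR 23.32 per test) under that reading; the variable names are ours, and the small discrepancies reflect currency-conversion rounding in the published figures.

```python
# Reconstruct one Table 7 column (serum tested by PCR at EUR 23.32 per test)
# from the Table 6 subtotals; differences of a few cents reflect rounding.
collection_serum_eur = 11.58      # Table 6, Option 1 subtotal (serum samples)
shipping_eur = 25.69              # Table 6, Section B subtotal
test_eur = 23.32                  # mid-range PCR cost option
pools_per_farm = 2                # 10 pigs pooled 5:1 into 2 tested samples

per_farm = collection_serum_eur + shipping_eur + pools_per_farm * test_eur
per_pig_region = per_farm * 17_521 / 51_515_700   # all farms sampled, whole regional herd
per_pig_small_farm = per_farm / 311                # mean inventory, farms of <= 1000 pigs

print(round(per_farm, 2), round(per_pig_region, 3), round(per_pig_small_farm, 3))
# ~83.91, ~0.029, ~0.27 versus the tabulated EUR 83.89, EUR 0.029, and EUR 0.270.
```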