Strawberry Growth under Current and Future Rainfall Scenarios

Abstract: Globally, the changing and interacting effects of temperature and precipitation are anticipated to influence the fitness of specialty crops. Strawberry (Fragaria × ananassa) is an important crop in the Northeastern United States. In this study, four plausible precipitation scenarios were developed to be representative of current and future growing season precipitation patterns. Using a precipitation simulator, we tested these scenarios on potted day-neutral strawberries. This study generated four primary results. (1) Though some treatments received different amounts of precipitation, little difference was observed in soil volumetric water content or temperature. Treatments designed to simulate future conditions were more likely to have higher nitrate-in-leachate (N-leachate) concentrations than those designed to simulate current conditions. (2) Neither total precipitation nor seasonal distribution was associated with foliar or root disease pressure. (3) While there was a slightly higher chance that photosynthesis would be higher in drier conditions, little difference was observed in the effects on chlorophyll concentration and no water stress was detected in any treatment. (4) Leaf biomass was likely more affected by total precipitation than by its seasonal distribution, but the interaction between changing rainfall distribution and seasonal totals is likely to be an important driver of root biomass development in the future.


Introduction
Internationally, the United States ranks second in strawberry (Fragaria × ananassa) production, supplying 20% of the global market and generating ≈USD 2.3 B in gross sales annually. In the Northeast, strawberries are produced on ≈1538 hectares (3800 acres) for wholesale and direct markets. In this region, strawberry production is most commonly integrated with diversified farming operations, with an average production area per farm of 0.61 hectares (1.5 acres). Production areas are commonly located around high-population centers, suggesting the important role that strawberries play in meeting a fresh and regional market demand. While nearly all of the strawberries produced in the Northeast are consumed within the region, only 95% of strawberries purchased in Northeast states are grown there [1]. This means there are significant opportunities for growers and the agricultural advisors who support them to address current production challenges and increase production to better meet local consumer demand.
Temperature and solar radiation have been shown to be important limiting factors in strawberry yield efficiency [2]; however, crop water use is also important [3]. It is estimated that strawberries require 2.54 cm (1 inch) of water every 12-14 days during the growing season (April-October in the Northeastern United States), increasing to 3.81 cm (1.5 inches) of water every 7 days during the fruiting period [4]. Increasingly, growers in the Northeast are using protected cultivation (i.e., low tunnels, high tunnels and greenhouses) to produce strawberries in order to protect and enhance fruit marketability, reduce disease pressure, increase both total yield and the size of individual fruits, extend fruit production and control phenological development [5]. However, it is still common for strawberry growers to produce their crops in unprotected environments, in which case strawberries remain vulnerable to precipitation and temperature extremes and the associated challenges these extremes present.
Shifting precipitation patterns are widely accepted as among the most consequential effects of anthropogenic climate change [6]. Globally, it is expected that these shifting patterns will influence both ground and surface water resources and, by extension, agroecological systems. Surface water reliability in particular is expected to change due to increased variability in precipitation patterns [7]. Groundwater sources are less sensitive to precipitation variability than surface waters; however, they are still vulnerable. Groundwater reservoirs are often more spatially dispersed and serve as important reserves during periods when surface water bodies are unavailable or unreliable. They are also difficult to monitor and can be expensive to access through drilling and pumping [8]. Groundwater use is considered sustainable when the rate of withdrawal does not exceed the rate of recharge. However, climate change is anticipated to affect recharge mechanisms (i.e., precipitation, temperature and sea level), thereby calling into question the reliability of some regionally important groundwater reservoirs [9]. This is of considerable concern in regions where there is a high reliance on groundwater for municipal, industrial and agricultural purposes.
The Northeast region of the United States is one such region. The Northeast comprises 12 states (Connecticut, Delaware, Maine, Maryland, Massachusetts, New Hampshire, New Jersey, New York, Pennsylvania, Rhode Island, Vermont and West Virginia), as well as the District of Columbia. Since 2001, this region has experienced a notable increase in both total annual precipitation and extreme precipitation [10,11], outpacing changes observed in other parts of the United States [12]. Extreme precipitation has increased in both the number and intensity of events, a change associated with the intensification of the hydrologic cycle [13]. Along with an increase in seasonal extreme rainfall events, it is anticipated that increasingly common dry periods and droughts will affect water resources across the region [14] and that water use efficiency and access will become increasingly important considerations for agricultural sectors [15], including small fruit production systems.
It is well established that both water availability (i.e., soil-water potential) and the relationship between water absorption and transpiration rates have direct effects on crop physiological performance [16]. Precipitation is an important driver of soil water potential that influences water availability for agriculture and provides necessary moisture for primary crop physiological development. In addition, precipitation during the non-cropping season, in the form of rain or snowfall, contributes to ground- and surface-water reservoirs, which can later be utilized for irrigation, fertigation or frost protection. Small fruit specialty crops are particularly vulnerable to changes in precipitation (frequency and intensity) during critical growth periods. For example, strawberries are sensitive to yield loss and quality degradation as a result of disease (related to excess water) and poor fruit development (related to both limited and excess water) [17]. Even for fruit crops with high resilience to drought, such as wild blueberries in the Northeast USA, water deficits greatly impact crop vigor and yield [18].
To meaningfully assess the effects of changing precipitation patterns on specialty crops in general and strawberries in particular, we must look at both crop yield and a broader range of plant quality and agroecological outcomes [19]. Assessing environmental conditions (e.g., soil water potential) and plant physiological functions (e.g., photosynthesis indicated by chlorophyll fluorescence and chlorophyll concentration) generates important insights into crops' resistance to a changing climate. For example, high strawberry root mass density has been shown to increase nutrient uptake, with root mass density being associated with soil water potential [20]. Sub-seasonal root biomass shrinkage in this crop is associated with fruiting periods, a time when nutrient needs are also likely at their highest [21]. It is likely that changes in biomass are influenced by soil water potential and have subsequent effects on strawberries' ability to metabolize nutrients and perform core physiological functions. In another example, the chlorophyll concentration of leaves is considered to be a reliable indicator of photosynthetic capacity and has been found to be positively correlated with photosynthetic CO2 fixation [22], as well as yield [23]. The estimation of leaf chlorophyll concentration allows researchers to estimate not only plants' photosynthetic capacity, but also the effects of disease, nutritional or environmental stress [24,25]. Though chlorophyll content has been positively associated with yield in some crops [26], it is estimated that, under periods of water stress, photosynthetic activity decreases [27], likely leading to decreased yield and crop quality.
Soil water potential also influences other important agroecological dynamics. For example, it has been shown that nitrogen (N) leaching is influenced by precipitation, though much research on this topic has focused on arid or semi-arid regions. In these environments, N accumulates in soils during extended dry periods and is released during rewetting events [28,29]. Recent studies suggest that similar dynamics exist in temperate climates [28,30], though, in temperate contexts, the shorter periods between precipitation events likely lead to lower concentrations of N in leachate. The relationship between the rewetting of dry soils and N flushing in leachate is partially explained by the activation of microbial biomass [31]. Additionally, disease pressure has been shown to vary with changes in soil water potential in small fruit crops, for example, in cultivated grapes [32]. Specifically relevant to strawberries, water deficits have been found to reduce the severity of biotrophic pathogens [33], including those that affect plant foliage [34].
Considering the complexity of these dynamics and the likely sensitivity of strawberries to changes in precipitation and water applications, the objective of this study is to determine the influence of four precipitation scenarios on the production of strawberries. Specifically, we aim to understand how strawberries perform under different precipitation scenarios in terms of (a) total leachate amounts and loss of nitrate through leaching, (b) incidence of disease and insect damage and (c) plant physiological performance, including chlorophyll concentration and photosynthesis.

Experimental Design
A small-scale, portable rainfall simulator was constructed, using a design adapted from Humphry et al. [28]. A single, stationary nozzle was mounted on the steel frame at a height of 3.048 m (10 ft). The base area of the simulator was 1.5 × 2 m (4.9 × 6.7 ft). Simulated precipitation was calibrated by assessing the length of time the simulator needed to run to fill empty pots to 4921 mL (1.3 gallons) at a pressure between 5 and 8 PSI. The selected setting delivered precipitation at a rate of 289 mL/min (17 min to fill to 1.3 gallons). Deionized water was used for the experiment, pumped through the simulator using a small Shurflo pump at 3.5 gallons per minute and a maximum of 310.26 kPa (45 PSI).
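As a back-of-envelope check (not part of the original methods), the reported fill time follows directly from the calibrated flow rate:

```python
# Sanity check on the simulator calibration reported in the text:
# at 289 mL/min, filling a pot to 4921 mL takes roughly 17 minutes.
rate_ml_per_min = 289
target_ml = 4921

fill_minutes = target_ml / rate_ml_per_min
print(round(fill_minutes, 1))  # ≈ 17.0, matching the reported setting
```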
The treatment schedules in this study followed four plausible future rainfall scenarios (Table 1) that encompass aspects of a wetter future climate with an intensified hydrologic cycle [10-12,14], as well as an unexpectedly drier future climate. In order to maximize the realism of these scenarios, we utilized historical station observations local to the study domain rather than output from a downscaled general circulation model. To do this, we used daily precipitation values during the warm season (1 April-30 September) for the years 2001 and 2019 observed at the Bangor International Airport (44.8° N, 68.82° W; elevation, 58 m (190 ft)), a site located within the southern interior climate division. Precipitation data were sourced from a Global Historical Climatology Network (GHCN) station [29] (Figure 1). The first two scenarios represent plausible future precipitation regimes following the historically dry (RECDry) and wet (RECWet) conditions observed in 2001 and 2019, respectively (Figure S1). In 2001, modest intervals of rainfall were interspersed between dry intervals ranging from two to four weeks in duration. Only a single rainfall event, 4.70 cm (1.85 inches) on 25 September, exceeded the 2.54 cm (1 inch) threshold that season. This is in contrast to 2019, which saw frequent rainfall throughout the warm season and eight instances of daily rainfall >6.38 cm (2.51 inches). The third and fourth plausible future scenarios are based on 2001 precipitation, but with daily values multiplied by factors of 1.43 (AMPDry1.43) and 1.89 (AMPDry1.89) to align with a season total equivalent to that observed for the historically wet year 2019. Thus, these latter two scenarios represent plausible future climates where most of the season's precipitation accumulates during heavy rainfall events separated by extended intervals of dryness.
In AMPDry1.89, the 25 September rainfall event noted above was removed and all other daily rainfall values were increased to bring the season precipitation total to the same amount as in AMPDry1.43 (i.e., equivalent to the total observed in 2019).

Day-neutral strawberries (Jewel variety) were purchased bare root and planted into containers measuring 12.7 × 12.7 cm (5 × 5 inches) square and 30.48 cm (12 inches) in depth. The soil medium used for potting was Pro-Mix BX Original. Between precipitation simulations, plants were stored inside a greenhouse made of 8 mm clear twin-wall polycarbonate paneling (roof, sidewalls and end walls) measuring 14.78 m (48.5 feet) × 5.79 m (19 feet). The greenhouse was ventilated with horizontal air flow fans. Data collection occurred between 19 May and 30 September 2021. A variety of variables were measured in order to assess strawberry plant physiological responses to the four precipitation treatments.
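The scenario-scaling procedure described above can be sketched as follows. The daily rainfall values here are purely hypothetical stand-ins for the observed 2001 and 2019 GHCN series (with the real data, the derived factors were 1.43 and 1.89):

```python
# Illustrative sketch of the AMPDry scenario construction. Daily rainfall
# values (cm) below are hypothetical stand-ins for the observed 2001 (dry)
# and 2019 (wet) warm-season series.
dry_year = [0.0, 0.3, 0.0, 1.2, 0.0, 0.8, 4.7]   # sparse rainfall, one large event
wet_year = [1.0, 1.5, 0.9, 1.6, 1.1, 1.4, 2.5]   # frequent rainfall, larger total

# AMPDry1.43-style scenario: scale every dry-year value by a constant
# factor so the season total matches the wet year's total.
factor = sum(wet_year) / sum(dry_year)
amp_dry_a = [d * factor for d in dry_year]

# AMPDry1.89-style scenario: remove the single largest event first, then
# rescale the remaining days to reach the same wet-year season total.
largest = max(dry_year)
removed = False
trimmed = []
for d in dry_year:
    if not removed and d == largest:
        trimmed.append(0.0)   # drop the largest event once
        removed = True
    else:
        trimmed.append(d)
factor_b = sum(wet_year) / sum(trimmed)
amp_dry_b = [d * factor_b for d in trimmed]

# Both derived scenarios now share the wet year's seasonal total.
print(round(sum(amp_dry_a), 2), round(sum(amp_dry_b), 2))
```

With the synthetic series above, the first scaling factor happens to come out near 1.43; the second factor differs from 1.89 only because the stand-in values are invented.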

Volumetric Water Content, Soil Temperature, Leachate and Nitrate-Leachate Concentration
Soil volumetric water content and soil temperature in each pot were measured on a weekly basis using a TDR soil moisture meter (TDR 150; Spectrum Technologies Inc., Aurora, IL, USA). Average values were calculated for each treatment across all pots. Free-draining lysimeters were constructed under each treatment block (8 plants each), with 2 replications per treatment (16 plants per treatment). Leachate was collected after every simulated rainfall event. The total volume of leachate was recorded (mL) for each replicate and subsamples were stored at 4.44 °C (40 °F). Subsamples collected within a one-week period were combined and tested for nitrate concentration (colorimetric analyses) by the University of Maine Soil Testing and Analytical Lab. Not all strawberry plants survived the season, but pots containing dead plants were included in volumetric water content, soil temperature, leachate and N-leachate concentration measurements until the conclusion of the experiment.

Disease
Plants were inspected for visual evidence of symptoms at the end of the experiment. Specifically, the percentage of leaf area with leaf spots on each plant was recorded and averaged across all plants within a treatment. At the conclusion of the experiment, crown and root diseases were assessed through destructive sampling of the remaining living plants. During this process, two pieces of root and two pieces of crown from each plant were cut, sterilized with a 10% bleach solution for one min and then rinsed in sterile, deionized water. The plant material was then plated on water agar fungal media and incubated at 25 °C for one week. Agar plates were viewed under a compound microscope (Olympus BX5) and assessed for fungal diseases, which were morphologically identified if present. When a pathogen was identified from a plant piece, a percentage was generated, with 100% representing all four of the plated plant pieces having fungal growth. Foliage showing symptomatic leaf spots was selected and placed in a moist chamber that was also incubated at 25 °C for one week. After incubation, foliage was viewed under a dissecting microscope (Olympus SZX16) to determine the presence or absence of pathogenic fungal growth.

Leaf Chlorophyll Fluorescence and Chlorophyll Concentration
Leaf photosynthetic performance was assessed through leaf chlorophyll fluorescence measurements. The variables measured included the quantum yield of photosystem II (Y(II)), which equals Fv/Fm and is sensitive to stress. We also assessed the electron transport rate (ETR), which indicates actual photosynthetic performance under specific light conditions and is related to photosynthetic CO2 assimilation [35]. Both Y(II) and ETR were measured weekly from 12:00 to 4:00 p.m. using a FluorPen portable leaf fluorescence meter (Photon Systems Instruments, Drásov, Czech Republic). Additionally, we assessed leaf chlorophyll concentration, which has been shown to be strongly related to the N concentration of mature leaves. We expected a strong correlation between chlorophyll concentration and photosynthetic rate. Leaf chlorophyll concentration soil-plant analysis development (SPAD) measurements were made weekly using a hand-held chlorophyll meter (SPAD 502; Minolta Corp., Osaka, Japan), an evidence-based approach to assessing plant N status [36]. All measurements related to photosynthetic potential, performance and chlorophyll concentration were made on a weekly basis from individual plants (one leaf per plant).

Biomass
Above- and below-ground biomass samples were collected as strawberry plants died over the course of the experiment and at the end of the experiment. Above-ground biomass assessment methods established by the United States Department of Agriculture (USDA) Natural Resources Conservation Service [36,37] were used in this assessment. Specifically, all vegetation was clipped at the soil surface and stored prior to processing. Samples were weighed prior to being oven-dried at 60 °C and subsequently reweighed. To assess below-ground biomass, all soil was removed from the plant roots, which were then processed using the same approach as the above-ground biomass samples.

Analyses
To assess the overall seasonal effects of the precipitation treatments on our normally distributed dependent variables, we performed a series of one-way analysis of variance (ANOVA) tests with Tukey's HSD post hoc tests. For those variables that did not meet assumptions of normality, non-parametric Kruskal-Wallis rank-sum tests, followed by pairwise Wilcoxon rank-sum comparisons, were employed. To better assess the effects of the seasonal distribution of precipitation on relevant dependent variables, a series of generalized linear models (GLMs) with gamma distributions and inverse link functions was used. This approach was chosen because we assumed a fixed relationship between the mean of each model and the variance, as all readings were always greater than zero with non-normal, skewed distributions [38,39]. Gamma distributions are often used to describe precipitation data [40] and can be interpreted on both transformed and original scales [41]. All analyses included in this manuscript were performed using R Version 4.1.0 [42] and the plots were created using ggplot2 [43].

Volumetric Water Content, Soil Temperature, Leachate and Nitrate-Leachate Concentration
Simulated precipitation amounts (cm) per week, soil volumetric water content (%) and soil temperature (°C) for the different treatments throughout the experiment are depicted in Figure 2. As expected, the precipitation amounts applied through the rainfall simulator were associated with differences in volumetric water content, though the difference among treatments was not as great as expected (see Table 2 for model results). Likewise, there was little difference between treatments in their effect on soil temperature. The use of commercial potting media with excellent drainage may have influenced these results. This suggests that, unless extreme rainfall events occur, well-drained soils are unlikely to demonstrate highly variable responses in soil water potential (kPa) or temperature (°C).
It should be noted that not all plants survived the experiment. In RECDry, nine plants died between weeks 5 and 9 (56% of plants in this treatment); in RECWet, one plant died by week 10 (6%); in AMPDry1.43, one plant died by week 1 (6%); and, in AMPDry1.89, two plants died by week 10 (13%). However, not all plant death was attributable to the experimental design. Planting stock failure was the cause of early plant mortality.
To assess total water loss through leaching, we recorded the total volume of water collected in modified pan lysimeters from each treatment. An ANOVA showed statistically significant differences overall (F = 6.846, df = 3, p < 0.001) and post hoc (Tukey's HSD) tests suggested differences between all treatments except AMPDry1.43 and RECWet, which reflects the similar total precipitation amounts applied in these two scenarios. Nitrate-in-leachate (N-leachate) concentrations were low in all samples, between 0.014 mg/L and 0.628 mg/L (Figure 3). Differences in the actual rate of N-leachate between treatments were determined using a Kruskal-Wallis rank-sum test (H = 13.73, df = 3, p = 0.003). While there were statistical differences observed between RECDry and all other treatments, there was no statistical difference observed among the other treatments.
Model results (Table 3) show that the precipitation treatments influenced the N-leachate (mg/L) on a weekly basis. Notably, treatments designed to simulate future conditions had higher seasonal and, often, weekly precipitation totals with uneven seasonal distribution (AMPDry1.43 and AMPDry1.89). These treatments were more likely to have higher N-leachate concentrations than treatments designed to simulate current conditions, including the lowest precipitation treatment (RECDry) and the only treatment with even seasonal distribution (RECWet). The difference between the Kruskal-Wallis and GLM results suggests that more N was lost from soil media when precipitation increased on a weekly basis and was unevenly distributed across the growing season. It should also be noted that no additional N was provided to the plants beyond what was included in the soil media and additional applications may have led to more noticeable differences between treatments. This should be further explored in future research.
Regarding leachate amounts, our models diverged from the findings of the ANOVA and Tukey's HSD tests described above. Specifically, the leachate model suggested minimal difference between treatments in the amount of weekly leachate collected, and a poor model fit overall. This suggests that collecting weekly leachate amounts is a less useful measure than assessing total leachate over the course of a growing season. Our results were likely influenced by the practice of collecting leachate data after plants had died (in weeks 5-9), which may have led to superficially high rates of leachate drainage in pots without living tissue.

Disease
Visual inspections of leaf tissue were conducted in the last week of the experiment (on 28 September 2021). Leaf spots were typical of late-season strawberry plantings, meaning that all treatments had some evidence of leaf spot; RECWet averaged 3% leaf coverage, RECDry averaged 3%, AMPDry1.43 averaged 2% and AMPDry1.89 averaged 1%. Symptomatic foliage was incubated for one week, after which two pathogens were observed, Neopestalotiopsis sp. and Alternaria sp., in the RECDry and AMPDry1.89 treatments. Notably, neither pathogen was observed on foliage in the RECWet and AMPDry1.43 treatments.
Roots and the crown of each plant were sampled and tested for the presence of fungal pathogens by plating on water agar plates. The pathogens Pythium sp., Rhizoctonia sp. and Fusarium sp. were found in each treatment in similar amounts (no differences among treatments). When processing plants for disease assessment, crown discoloration was noted on the majority of the plants under the RECWet treatment (11 out of 15 plants) and on some under the AMPDry1.89 treatment (5 out of 14 plants), but not on those under the RECDry or AMPDry1.43 treatments.

Leaf Chlorophyll Fluorescence and Chlorophyll Concentration
Photosynthetic potential and performance were assessed through leaf chlorophyll fluorescence. A Kruskal-Wallis rank-sum test indicated no significant difference in Y(II) among treatments (H = 3.582, df = 3, p = 0.3103). A Kruskal-Wallis rank-sum test was also used to assess differences in the electron transport rate (ETR), or photosynthetic performance, across treatments (H = 14.163, df = 3, p = 0.002). Pairwise Wilcoxon rank-sum comparisons showed significant differences between AMPDry1.43 and all other treatments, but, otherwise, no differences among treatments (Figure 4). Chlorophyll concentration was assessed using SPAD readings, which are based on the difference in leaf transmittance between red and infrared light. An ANOVA showed statistically significant differences across treatments (F = 23.16, df = 3, p < 0.001) and post hoc (Tukey's HSD) tests suggested differences among all treatments (p < 0.01), with the exception of AMPDry1.43 and AMPDry1.89.
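The non-parametric testing sequence used here (an omnibus Kruskal-Wallis test followed by pairwise Wilcoxon rank-sum comparisons) can be sketched in Python with SciPy. The ETR-style readings below are synthetic and purely illustrative, with one treatment deliberately shifted so the omnibus test has something to detect:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic ETR-style readings for the four treatments (illustrative only).
groups = {
    "RECDry": rng.normal(60, 8, 20),
    "RECWet": rng.normal(61, 8, 20),
    "AMPDry1.43": rng.normal(75, 8, 20),  # shifted mean for illustration
    "AMPDry1.89": rng.normal(62, 8, 20),
}

# Omnibus non-parametric test across all treatments.
h_stat, p_omnibus = stats.kruskal(*groups.values())
print(f"Kruskal-Wallis: H = {h_stat:.3f}, p = {p_omnibus:.4f}")

# Pairwise Wilcoxon rank-sum (Mann-Whitney U) comparisons; in practice,
# these p-values should be adjusted for multiple comparisons.
names = list(groups)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        _, p_pair = stats.mannwhitneyu(groups[names[i]], groups[names[j]],
                                       alternative="two-sided")
        print(f"{names[i]} vs {names[j]}: p = {p_pair:.4f}")
```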
This suggests that the amount of rainfall in any given precipitation event was less important than the distribution of precipitation across time when examining the effects of rainfall on photosynthetic potential, performance and chlorophyll content. However, results from three GLMs (Table 4) showed little difference in Y(II) or ETR, with a slightly higher chance that Y(II) and ETR rates would have been higher in drier conditions. It should be noted that Y(II) remained fairly constant throughout the season across treatments, while ETR trended slightly down (Figure 5). Additionally, little difference was observed between treatments in terms of their effects on chlorophyll concentration (SPAD), which is related to N accumulation in leaves. Chlorophyll concentration trended slightly higher towards the end of the experimental period.

Biomass
Differences in biomass among treatments were assessed at the conclusion of the experiment, with analyses having been conducted on plant leaves, plant roots and total biomass. An ANOVA test demonstrated that significant differences in leaf biomass were evident (F = 3.006, df = 3, p = 0.04), though post hoc (Tukey's HSD) tests suggested that the only statistically significant differences were found between the RECDry and AMPDry1.43 treatments. This suggests that the total amount of water received throughout the growing season was a more important driver of leaf biomass than the seasonal distribution of water. Similarly, we performed an ANOVA test on root biomass, finding, again, that significant differences were evident among treatments (F = 11.35, df = 3, p < 0.001). Post hoc (Tukey's HSD) tests revealed significant differences (adjusted p < 0.01) among several treatments (Figure 6).
Differences existed primarily between the precipitation treatments designed to represent current conditions (the REC treatments) and those designed to represent future conditions (the AMP treatments), with root biomass being lower in the current condition scenarios. This suggests that the interaction between changing distribution and seasonal total amounts of precipitation is likely to be an important driver of increased root biomass development in the future.

Discussion
In this study, we identified four important findings. First, we found that leachate amounts did not differ significantly regardless of precipitation amount and distribution across the season. The soil water capacity of the potting media used in this experiment likely buffered the effects of the precipitation simulation treatments to some degree. Potting media typically have a standardized total porosity of ≥50% (target air-filled porosity ≥10% and target water-holding capacity ≥40%) [44]. It is possible that the treatments received simulated precipitation in amounts that did not exceed the field capacity of the potting mix, meaning that the drainable porosity of the soil was not notably different.
Of greater importance is the relationship between N-leachate and total leachate. Our results show that this relationship was not straightforward, though it aligns with the few field-based trials that address this topic. One such study assessed the effects of rainfall intensification on N-cycling in temperate agricultural soils. This work showed that, while percolation may change depending on rainfall intensity, N-transformations differ depending on moderating factors such as tillage practices [30]. In our study, higher precipitation amounts unevenly distributed across the growing season (the AMPDry treatments) were more likely to be associated with elevated N-leachate, despite the higher survival rate of experimental plants in the AMPDry treatments (and the assumed higher N use by those plants). This finding is clearly evident in our GLM results, though the relationship is obscured when the data are analyzed using Kruskal-Wallis rank-sum and post hoc tests. This supports past findings from experiments in arid and semi-arid environments, where N accumulated during extended dry periods and was then rapidly released by the microbial processes that followed subsequent precipitation events [28]. For example, large pulses of precipitation have been associated with higher rates of N loss through denitrification in Argentina [45]. While the degree to which these dynamics affect N release in temperate climates was previously uncertain, our study suggests similar dynamics are at play in temperate and arid climates, albeit to different degrees. Our findings are also corroborated by studies pointing towards altered soil moisture following intense precipitation events in temperate cropping systems (in the Midwest United States), specifically suggesting changes in N mobilization following heavy rainfall events [30,46], especially when those rainfall events follow significant droughts [47].
Second, neither the total precipitation amount nor its seasonal distribution was associated with foliar or root disease pressure. Similar degrees of disease damage were observed across all treatments at the end of the experimental period. Most studies of water stress have focused on water deficit, reflecting the effects of climate change in the region of study, rather than on an overabundance of water. Even under water deficit, a pathogen's effect on a plant is specific to the pathogen being studied. This was observed in inoculated grapes undergoing water stress: one pathogen showed no significant difference between well-watered vines and those under water deficit, while a different pathogen in the same study showed a significant increase in colonization under the well-watered treatment [32]. Biotrophic pathogens have been shown to be less severe in strawberries that have undergone a water deficit [33]. Since we evaluated all diseases present, it is possible the diseases we did observe were not affected, favorably or adversely, by our treatments. It is also possible that no difference emerged between our treatments because the plants were not inoculated, were potted (so soilborne diseases could not move from one plant to the next) and were protected in a greenhouse (limiting exposure to wind-dispersed spores). Additionally, we expected to observe endemic examples of red stele (Phytophthora fragariae), a common oomycete disease that affects strawberry roots. However, this relied on the plants already being diseased at the time of planting, which was not the case.
Third, this study demonstrated that changing precipitation amounts and distribution across the growing season may have had a small effect on leaf photosynthesis, with drier conditions associated with slightly higher Y(II) and ETR rates. This finding is supported by prior work showing that, while drought affected strawberry plants' leaf water potential, fresh and dry mass, leaf area and leaf number, it did not appear to affect Y(II) significantly; Y(II) may therefore not be the best indicator of drought resistance [48]. However, another study demonstrated that drought stress in fruit crops led to down-regulation of photosynthesis, with severe drought conditions decreasing the degree of correlation between fluorescence and photosynthetic performance [49]. Recovery from drought conditions can occur within 24 h, depending on the crop and the severity of the drought [50]. Our findings support this, and we suggest that the distribution of precipitation across time (which captures extended dry periods and drought conditions) has more influence on photosynthetic capacity and performance than the intensity of any single precipitation event. Additionally, while our study does not support the assumption that chlorophyll content in strawberries is affected by different precipitation treatments, prior research has suggested that severe drought reduces chlorophyll concentration and that chlorophyll concentration recovers more slowly from drought than photosynthetic capacity [51]. It is likely that our study does not align with this prior work because of the limited degree of drought introduced through our treatments; more severe drying would likely yield different results. Indeed, no long-term stress was detected in strawberry plants under any of the treatments, as demonstrated by the fact that Y(II) remained fairly constant throughout the treatment period (i.e., there was no treatment-induced reduction in the quantum yield of photosystem II).
Lastly, our results suggest that the total amount of water received throughout the growing season was a more important driver of leaf biomass than the seasonal distribution of water. The interaction between changing distribution and seasonal total amounts of precipitation is likely to be an important driver of increased root biomass development in the future. It has long been known that root development in strawberries follows periods of carbohydrate accumulation [52], which is facilitated by sufficient levels of crop-available water. More robust roots are associated with better crop water-use efficiency and uptake of soil-immobile nutrients [53]. However, strawberry cultivars can vary significantly in their tolerance to water deficits [3,54]. Cultivars exhibiting high water-use efficiency are more likely to have greater leaf biomass [54]. Additionally, prior research suggests that the canopy structure of strawberries is influenced by sustained drought conditions over time and that total aerial biomass may be less important for drought resistance than canopy structure, leaf orientation and osmotic adjustments [55].
Two notable limitations of the current study should be considered when conducting future investigations of this type. The first pertains to the inclusion of dead plants in the study sample. As stated above, our team decided to include pots containing dead plant material in our data. Doing so was deemed to simulate realistic field conditions, specifically with respect to leachate and nitrate in leachate. However, it is possible that excluding pots with dead plant material (i.e., replacing plants as they died) would have generated different results. This possibility should be explored in future iterations of this type of experiment. The second limitation pertains to the characterization of future precipitation scenarios. Specifically, in AMPDry1.89, a large precipitation event on September 25th was removed and all other daily rainfall values were then increased to bring the season total to that of AMPDry1.43 (and also to that observed in 2019). The observed rainfall for that day in 2001 was 4.7 cm (1.85 inches) and, in AMPDry1.43, that value was 6.7 cm (2.65 inches). Our expectation was that a measurable strawberry response would be produced by the AMPDry1.89 signal, wherein the September 25th rainfall was removed and the signal amplification factor was set to 1.89 (to match the season total precipitation observed in the wet year 2019). However, AMPDry1.89 produced only a small difference in strawberry growth. Future experiments could instead use a larger amplification factor, producing a scenario with seasonal rainfall totaling more than that observed in 2019.
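The scenario construction described above, removing the largest daily event and rescaling the remaining days so the season total matches a target, can be sketched as follows. The daily values and target total below are hypothetical, not the study's actual rainfall series:

```python
def amplify_without_peak(daily_mm, target_total_mm):
    """Remove the single largest daily event, then scale the remaining
    days so the season total matches target_total_mm. Returns the
    scaled series and the amplification factor that was applied."""
    peak_idx = max(range(len(daily_mm)), key=lambda i: daily_mm[i])
    remaining = [v for i, v in enumerate(daily_mm) if i != peak_idx]
    factor = target_total_mm / sum(remaining)
    return [v * factor for v in remaining], factor

# Hypothetical season: 60 mm total, including a 25 mm peak event
season = [5.0, 0.0, 10.0, 25.0, 8.0, 12.0]
scaled, factor = amplify_without_peak(season, target_total_mm=60.0)
print(round(factor, 2))       # amplification factor for the non-peak days
print(round(sum(scaled), 1))  # season total now matches the target
```

With the peak removed, the remaining 35 mm must be scaled by 60/35 ≈ 1.71 to recover the 60 mm target, which mirrors how the 1.89 factor in AMPDry1.89 arises from matching the 2019 season total after dropping the September 25th event.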

Conclusions
Understanding crop responses to current and future precipitation scenarios requires an agroecological systems approach. Through a simulated precipitation experiment in potted strawberries, we observed that probable future precipitation conditions in the Northeastern United States may lead to increased N-leachate concentrations, implying that producers may need to invest in new approaches that both increase the efficiency of fertilizer applications and minimize leaching of nitrate into ground and surface water. It is likely that precipitation variability across the growing season will stress unirrigated crops in future decades and that, without attentive irrigation, these crops will suffer reduced photosynthetic performance and, by extension, reduced yield and quality. Further, projected climate warming will also increase the risk of drought stress through increased crop and soil water loss (evapotranspiration) [56]. Lastly, we found that leaf biomass is likely more affected by total precipitation than by its seasonal distribution, but the interaction between changing distribution and seasonal total amounts of precipitation is likely to be an important driver of increased root biomass development. As strawberries and other small fruits require more water during specific growth periods (i.e., fruit set), the distribution of precipitation has implications for yield and fruit quality, especially in agroecological systems without supplemental irrigation. The interactions between water availability, changing precipitation patterns and plant physiology have implications for strawberry production specifically, but also point towards the need to better understand how changing precipitation will alter agroecosystems in temperate climates. Climate-adaptive management will likely require commercial producers to alter water, soil, crop health and fertility management if sustainable production is to be a priority in the future.

Data Availability Statement:
The data presented in this study are available from the authors.