The drought of the 1930s that afflicted the Great Plains of the USA motivated the USDA in 1935 to start a weather forecasting program at the Massachusetts Institute of Technology that became the foundation for the monthly weather forecasts distributed by the US Weather Bureau [1]. The drought also compelled agricultural producers to irrigate crops, raising the questions of how much water to apply and when, i.e., irrigation scheduling. Attempts to answer these questions using weather networks were first made at the University of Nebraska in the early 1980s [2]. Since then, the number of weather networks providing evapotranspiration (ET) estimates has increased across North America. A survey in 1991 revealed 831 weather stations in the USA and Canada [3], and by 1999 the number of weather stations had increased by 45% to 1200 [4]. Examples describing the technical aspects of the weather stations and of regional networks for different US states are given for California [5], Ohio [6], Georgia [8], Oklahoma [9], Texas [10], and Washington [11]. This approach has also been used in other countries, e.g., Kenya [12], Australia [13], and Spain [14]. This is by no means an exhaustive list; our purpose is to illustrate the proliferation and adoption of weather-generated information to estimate the water requirements of crops. Our attention will focus on the current situation in the Texas High Plains (THP) in terms of the supply of irrigation water and the adoption of information generated by weather stations and used to estimate crop ET.
In Texas, the history of irrigation is well documented [15]. Circa 1540, Native Americans had established irrigation systems near El Paso and Pecos, TX, USA, using water from the Rio Grande and the Pecos River. Over a 28-year span, from 1716 to 1744, Franciscans were the first Europeans to irrigate crops, using “acequias” (irrigation canals) in Catholic missions near San Antonio [17]. In Texas, corn (Zea mays L.) was the first field crop to be irrigated by both Native Americans and missionaries [16]. In the THP, large-scale irrigation started in 1920, and due to the scarcity of surface water, agricultural producers were forced to withdraw groundwater, first with windmills and thereafter with irrigation pumps [18]. Irrigation only became practical in the 1920s–1930s with the development of the internal combustion engine [19] and in response to the drought of the Dust Bowl [20]. The 1960s were characterized as a decade of alarm due to the rapid decline of the groundwater table [18].
The source of nearly all irrigation water in the THP is the Ogallala Aquifer [21], a large aquifer that underlies parts of eight US states. In the THP, the Ogallala Aquifer is mostly a closed system where withdrawals exceed recharge, which over the years has resulted in a decline of the water table [19]. In some areas of the Ogallala Aquifer, this decline has reduced well capacities to less than the daily water requirements of crops, leading to the implementation of so-called deficit irrigation, i.e., the practice of applying less water than the crop demands [24]. In response to this decline, and to make use of seasonal rain, the sprinkler irrigation concept known as Low Energy Precision Application (LEPA) was developed for the THP [26]. This system was adapted to the declining well capacities of the THP, which range from 1.3 to >7 mm·day−1 of daily water supply [29]. The well capacity often determines the amount of water that can be applied, regardless of environmental demand and crop needs. In some areas of the THP, the only irrigation option is to water the crop continuously throughout the growing season, from planting to harvest [29].
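The link between a well capacity expressed as a flow rate and the equivalent daily application depth can be illustrated with a short calculation. The sketch below assumes a hypothetical irrigated area of 50 ha (roughly a center-pivot quarter section); the flow rate and area are illustrative values, not data from the cited studies.

```python
# Convert a well capacity (L/min) to an equivalent application depth (mm/day)
# over a given irrigated area. 1 mm of water over 1 ha equals 10,000 L.

def capacity_mm_per_day(flow_l_min: float, area_ha: float) -> float:
    liters_per_day = flow_l_min * 60 * 24       # pump assumed to run continuously
    return liters_per_day / (area_ha * 10_000)  # depth in mm/day

# A 2270 L/min well irrigating a hypothetical 50 ha field:
depth = capacity_mm_per_day(2270, 50)
print(f"{depth:.2f} mm/day")  # about 6.54 mm/day, near the top of the 1.3 to >7 range
```

For a fixed flow rate, the achievable depth falls in direct proportion to the irrigated area, which is why declining well capacities constrain the area that can be fully irrigated.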
The development of irrigation systems such as LEPA [26] and subsurface drip [30] requires an estimate of the daily water requirement of the crop. This value may be calculated by multiplying a potential or reference ET value by a crop coefficient (Kc), a method referred to as the “engineering approach” by [29]. This method was first suggested by [31] and is now the standard and recommended procedure for irrigation of crops worldwide [32]. It provides a “standard” set of calculations, for convenience and reproducibility, for a hypothetical reference surface defined in both a short and a tall form. The standardized reference evapotranspiration (ETsz) for a short crop (ETos) corresponds to a clipped, cool-season grass with a height of 0.12 m, and for a tall crop (ETrs), to full-cover alfalfa with a height of 0.5 m [33]. The weather inputs required to calculate ETsz are the same for the short and tall crops and include short-wave irradiance, air temperature and humidity, and wind speed, normally measured at a screen height of 2.0 m. An additional input is the surface soil heat flux. The availability of commercial weather stations with data-loggers able to record weather variables every second provides a range of timescales, from minutes to days, for the calculated values of ETsz over the entire growing season.
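The crop-coefficient method described above amounts to scaling the standardized reference ET by a stage-dependent Kc. The sketch below uses illustrative, FAO-56-style stage coefficients for a generic cotton crop; these values are assumptions for the example and are not the coefficients used by the networks discussed here.

```python
# "Engineering approach": ETc = Kc * ETsz, i.e., crop water demand is the
# reference ET scaled by a crop coefficient. Kc values below are hypothetical
# growth-stage coefficients for cotton, for illustration only.

COTTON_KC = {
    "initial": 0.35,      # emergence to ~10% ground cover
    "mid-season": 1.15,   # effective full cover
    "late-season": 0.60,  # maturity toward harvest
}

def crop_et(et_ref_mm_day: float, stage: str) -> float:
    """Daily crop ET (mm/day) from short-crop reference ET (ETos)."""
    return COTTON_KC[stage] * et_ref_mm_day

print(round(crop_et(7.0, "mid-season"), 2))  # 8.05 mm/day
```

Note that a mid-season Kc above 1 means the crop demands more water than the reference grass surface, which is typical for a full-cover row crop on a high-demand day.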
As previously described, the availability of commercial weather stations and advances in cellular communication and the Internet led to the development of “weather networks” providing information on crop ETsz. In 1994, the Texas A&M University Research and Extension Service started the “South Plains Potential ET Network” with weather stations at Halfway, Lubbock and Lamesa, TX. The network provided daily values of ETsz and heat units from 1 April to 31 October, and the information was delivered daily via facsimile to subscribers who paid a nominal fee for the service. In 1995, a similar weather network, the “Texas North Plains PET Network”, was established in Amarillo, TX, with weather stations at Dalhart, Etter, Morse, Whitedeer, Bushland and Dimmit [35]. Despite this expansion in coverage across the THP, public access to the weather networks was discontinued in 2010 due to budget cuts and lack of support. This led us to question why irrigators and other users of the network allowed such a valuable tool to be eliminated from their operations in the THP. Perhaps it is reasonable to ask whether the usefulness of the system as an irrigation management tool was less than expected. Other relevant questions include: Did changes in irrigation and management practices over the life of the weather network reduce its value to the end users? To what extent do the irrigation recommendations of the ET network match the irrigators’ ability to act on those recommendations? While ET networks no doubt provide useful information on the magnitude and temporal pattern of ET over the course of a growing season, the extent to which such data can be considered “actionable” on the part of the irrigator is perhaps not fully understood or appreciated. Furthermore, our experience working with local producers, even those considered progressive irrigators, indicates that at least some remain unconvinced of the value and utility of the information delivered from ETsz networks.
We are not the first to question the value of weather networks to end users as a tool for water management. The value of climate information as a decision-making tool has been investigated [36], and the adoption of weather information has been reported [40]. In Texas, a survey sent to 900 agricultural producers indicated the importance of agricultural weather information, e.g., freeze warnings, precipitation probabilities, and soil temperature and water content, commonly broadcast over newspaper, TV and radio media; however, few producers were willing to pay for this information [43]. A similar conclusion was reached in Oklahoma, where farmers and ranchers did not want to pay significant fees for access to weather information beyond raw data [44]. Nevertheless, technology transfer of agricultural water management programs from weather networks via extension and outreach activities remains a high priority in Nebraska [45] and elsewhere. The occurrence of drought events remains a driver for the use and adoption of “smart” irrigation controllers based on ET [46] and for the adoption of information generated from weather networks. For example, in California, the number of users of the California Irrigation Management Information System increased over a 5-year span (1986 to 1991) from 500 to 2000 [47].
To provide insight and to answer the previous questions, we selected and used 30-year (1975–2004) weather data sets for four locations in the THP to investigate the relationship between ET-based irrigation recommendations for cotton (Gossypium hirsutum L.) and the ability of irrigators to manage irrigation with those recommendations. While our research approach is specific to our location, climate and crop, we believe that our method has general application to other regions and crops. Our goal was to identify options to improve the successful adoption of ET-based irrigation as a management tool [45]. Our purpose is not to evaluate specific advantages or disadvantages of any particular irrigation system for cotton production in the THP. Furthermore, we present a “first-order” solution to address the decline in irrigation water, and we do not consider the many agronomic options that are available, e.g., capture of rain, cotton varieties, crop rotations, minimum tillage, and other practices in use in the THP. The increased competition for water between agricultural and urban users [49] and the continued incidence of droughts give us renewed interest in improving tools that irrigators could use to manage water efficiently.
The results presented are best described in terms of an irrigation management approach based on a simple water balance, i.e., inputs and outputs. The primary input factor is the amount of irrigation water that can be applied over the growing season of the crop. This input is adjusted according to the rainfall received and the amount of stored soil water. Given these inputs (irrigation, rain and stored soil water), how much water does the crop need and when should this water be applied? In our example, we used a demand-response irrigation approach to answer these questions.
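The demand-response water balance just described can be sketched as daily bookkeeping of root-zone depletion: crop ET withdraws water, rain refills the root zone, and irrigation is triggered once depletion exceeds a threshold, with each application capped by the well capacity. All parameter values below (total available water, trigger fraction, capacity) are illustrative assumptions, not the values used in the study.

```python
# Minimal daily soil-water-balance sketch of a demand-response irrigation
# scheme. Parameter values are hypothetical and for illustration only.

def simulate_season(etc_mm, rain_mm, capacity_mm_day,
                    taw_mm=150.0, trigger_fraction=0.5):
    """Return (total irrigation, daily depletion history).

    etc_mm, rain_mm  : daily crop ET and rainfall depths (mm)
    capacity_mm_day  : maximum depth the well can deliver per day (mm)
    taw_mm           : total available water in the root zone (mm)
    trigger_fraction : irrigate once depletion exceeds this fraction of TAW
    """
    depletion = 0.0          # 0 = root zone at field capacity
    total_irrigation = 0.0
    history = []
    for etc, rain in zip(etc_mm, rain_mm):
        depletion = max(0.0, depletion + etc - rain)   # rain refills first
        if depletion > trigger_fraction * taw_mm:
            irr = min(capacity_mm_day, depletion)      # demand, capped by supply
            depletion -= irr
            total_irrigation += irr
        history.append(depletion)
    return total_irrigation, history
```

When the daily capacity is below crop ET, depletion in this sketch grows steadily even with irrigation every day, which is the deficit-irrigation regime discussed later for declining well capacities.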
The results showed that when irrigation resources are sufficient for over-irrigation, a crop-ET (ETc) approach provides the means to prevent it, and implementing such an approach will generally reduce the amount of irrigation water applied. Further, when irrigation water is not limiting, the differences in irrigation water applied using a “real-time” daily ETc versus a 30-year average ETc are minimal. This result suggests that it is reasonable to implement ETc-based irrigation using long-term values of ETc wherever “real-time” ETc is not available. In our example for the THP, using a 30-year average ETc to irrigate a cotton crop across a range of well capacities and irrigation schemes resulted in amounts of applied irrigation water similar to those obtained with daily ETc values from a weather network. Further, these differences are within the application “error” achievable with sprinkler and drip irrigation systems. This finding suggests that even in the absence of a real-time weather network to generate ETc values, the use of a long-term historic ETc table by an irrigator is useful to reduce over-irrigation and perhaps to generate interest in the establishment of a real-time ETc network.
An initial question posed in our study was “why does it appear that irrigators in the THP are not making full use of ETc-based irrigation management?” Weather-network ETc-based systems were introduced to the study area in the mid-1990s, when well capacities were in the 2270 L·min−1 range (Wagner, 2012). However, over the past two decades irrigation capacities have declined significantly, and irrigation goals have shifted from preventing over-irrigation to deficit irrigation. Our results showed that when well capacities decline to <2270 L·min−1, the usefulness of ETc-based irrigation management declines to the extent that the perceived value of the weather network becomes less apparent, and perhaps effectively non-existent, from the irrigator’s perspective.
One additional explanation for the decline in the use of public ET networks is that producers’ lack of participation may result from their use of private on-farm weather stations to calculate ET. Advances in weather monitoring make this a real possibility that may indeed contribute to the loss of interest in public networks. In this instance, loss of participation could be seen as an indicator of the producers’ perceived value of ET-based scheduling rather than a lack of interest. Such an analysis is beyond the scope of this paper but might be valuable in some instances.
What are the broader implications of these findings? The transition from irrigation management designed to prevent over-irrigation to a deficit-irrigation approach is occurring in many irrigated regions. In the THP, where this study was conducted, a combination of reduced well capacities and governmental regulation of water withdrawals has imposed limits on irrigation that may require the development of non-ETc-based irrigation management.
It is perhaps worth asking why producers did not simply reduce the irrigated area to accommodate the reduced water supply. Currently, in the THP, there is interest in approaches to reduce irrigated areas in an economically advantageous manner. During the early 2000s, the issue was possibly complicated by the introduction of cotton varieties with increased water productivity that, to some extent, offset lint yield losses due to declining irrigation. Finally, insurance issues associated with the shift from irrigated to rainfed production may have created obstacles to the abandonment of irrigated area that were not based solely on agronomic considerations. Again, this transition is now occurring in the THP, and efforts to reestablish the regional ET network are underway.
Current research in the ET irrigation community is focused on improving the coverage of weather networks and on improved methods to calculate crop ET [56]. In addition, there is great interest and a parallel effort directed toward the development of improved crop coefficients, e.g., [58]. Despite these efforts, it is probable that improvements in ET delivery and crop coefficients will not be sufficient to overcome the limits on ETc-based irrigation management that result from declining water resources. Our results suggest that an irrigation management analysis based on a demand-response concept may prove useful to assess the utility of irrigation management schemes in general, and with regard to research aimed at improving or refining a given approach. When ETc-based irrigation management is to be implemented or refined, it might be helpful to ask: How will this improvement or refinement affect how the irrigator responds to an irrigation demand? While increasingly local and accurate demand values may be obtainable, their value to the irrigator may ultimately be inconsequential.