Improving Irrigation Water Use Efficiency: A Review of Advances, Challenges and Opportunities in the Australian Context

The demand for fresh water is increasing, and the irrigation industry in Australia is looking to a future with less water. Irrigation consumes the bulk of the water extracted from various sources, and hence the efficiency of its use is of utmost importance. This paper reviews the advancements made towards improving irrigation water use efficiency (WUE), with a focus on irrigation in Australia but with some examples from other countries. The challenges encountered, as well as the opportunities available, are also discussed. The review showed that improvements in irrigation infrastructure through modernisation and automation have led to water savings. The concept of real-time control and optimisation in irrigation is in its developmental stages but has already demonstrated potential for water savings. The future is likely to see increased use of remote sensing techniques, wireless communication systems and more versatile sensors to improve WUE. In many cases, water saved as a result of using efficient technologies ends up being reused to expand the area of land under irrigation, sometimes resulting in a net increase in total water consumption at the basin scale. Hence, to achieve net water savings, water-efficient technologies and practices need to be used in combination with other measures such as incentives for conservation and appropriate regulations that limit water allocation and use. Factors that affect trends in irrigation WUE include engineering and technological innovations, advancements in plant and pasture science, environmental factors, and socio-economic considerations. Challenges that might be encountered include lack of public support, especially when the methods used are not cost-effective, and reluctance of irrigators to adopt new technologies.


Introduction
Irrigation is an essential agricultural practice for food, pasture and fibre production in semiarid and arid areas. In many countries including Australia, efficient water use and management are today's major concerns. The bulk of the irrigation water is sourced from rivers and dams and conveyed via open channels or pipelines to irrigated farms for storage before use or direct application to root zones. Irrigators who use groundwater often have storage tanks on their properties. At the farm level, irrigation systems or methods commonly in use may be broadly classified as sprinkler, surface, and drip or trickle systems. In sprinkler systems (e.g., solid sets, centre pivots and travelling irrigators), water is delivered in the form of sprays using overhead sprinklers. In drip or trickle systems, water is delivered in small amounts via small nozzles installed in pipes or tapes, which can be either above the ground or underground. The sprinkler and drip/trickle systems are also referred to as pressurised systems, as they operate under pressure, which often requires some form of pumping. In surface systems (e.g., furrow and basin/border), water is conveyed over the field surface by gravity. The furrow system is the most common method for the irrigation of row crops in Australia and in the world.
Globally, it is estimated that about 70% of fresh water abstracted is used to irrigate 25% of the world's croplands (399 million ha) which supply 45% of global food [1]. Water used for industrial and domestic purposes account for approximately 20% and 10% of the total global water usage, respectively. In Australia, for instance, in the year 2016-2017, 9.1 million mega litres were used to irrigate 2.2 million ha [2]. The demand for fresh water resources is on the increase, and the trend is likely to continue with the increasing population that comes with increased demand for food and fibre, and the predicted negative impacts of climate change. There is also increased awareness of the need to provide sufficient water to serve other ecological services. There appears to be consensus that irrigated agriculture in general is up against a future with less water.
This, therefore, calls for increased effectiveness in the utilisation of the scarce water resources, a concept that is technically called water use efficiency (WUE) or simply irrigation efficiency. From an engineering standpoint, WUE is often defined using a volumetric or hydrological approach, simply as the proportion of the water supplied through irrigation that is productively or beneficially used by the plant (Equations (1) and (2)). This definition is predominantly used when referring to field-scale irrigation water management. However, it should be noted that WUE may also be assessed at the catchment or basin scale [3].
The two most commonly used efficiency measures of an irrigation system are (i) application efficiency (AE) and (ii) requirement efficiency (RE), which can be written as:

AE = (volume of water stored in the root zone) / (total volume of water applied)    (1)

RE = (volume of water stored in the root zone) / (water deficit prior to irrigation)    (2)

The efficiency performance measures, AE and RE, are only applicable at the field scale. However, losses of water also occur in conveyance and distribution channels prior to delivery to the field. If the water is stored in dams prior to usage, then further losses may occur as a result of evaporation and seepage. Performance measures used in these cases include conveyance, distribution and storage efficiencies.
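To illustrate Equations (1) and (2), the following sketch computes AE and RE for a single irrigation event. All volumes are hypothetical and expressed here as equivalent depths in mm:

```python
def application_efficiency(stored_mm: float, applied_mm: float) -> float:
    """AE (Equation (1)): fraction of the applied water stored in the root zone."""
    return stored_mm / applied_mm

def requirement_efficiency(stored_mm: float, deficit_mm: float) -> float:
    """RE (Equation (2)): fraction of the pre-irrigation deficit that was refilled."""
    return stored_mm / deficit_mm

# Hypothetical event: 60 mm applied, 45 mm retained in the root zone,
# against a pre-irrigation soil water deficit of 50 mm.
ae = application_efficiency(stored_mm=45, applied_mm=60)
re = requirement_efficiency(stored_mm=45, deficit_mm=50)
print(f"AE = {ae:.0%}, RE = {re:.0%}")  # AE = 75%, RE = 90%
```

Note that a high AE with a low RE indicates under-irrigation (little loss, but the root zone was not refilled), whereas a low AE indicates losses to deep percolation or runoff.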
On the other hand, the efficiency of irrigation water use may also be seen in a plant physiological sense, and in particular as a comparison of the yield or economic return of an irrigated crop or pasture to the total amount of water transpired by the crop or pasture. In fact, in recent literature (e.g., [4]), this is commonly referred to as irrigation water productivity and not WUE. In the cotton industry in Australia, this is sometimes referred to as irrigation water use index (IWUI) and relates cotton production only to the amount of irrigation water used [5].
In Australia, surface irrigation is the main irrigation method used, and in 2013-2014, it accounted for 59% of the total irrigated land [2]. However, the system in general is associated with high labour requirement and low WUE. This explains why modernisation and automation projects (discussed later in this paper) have tended to focus on this irrigation system. Conversely, the pressurised irrigation methods (sprinkler and drip) are generally less labour-intensive and have significantly higher WUE.
With the advancement of technology, thanks largely to the many years of investment and research and development in agriculture, there are new and emerging opportunities for further improving the WUE in irrigated agriculture. Examples of these include use of remotely sensed data (from drones or satellites), communication networks and the availability of cheap sensors.
It is clear from the above discussion that in order to improve the irrigation WUE, losses that occur along the conveyance and distribution channels must be minimised, and the timing and the quantity of water applied (or irrigation scheduling) must be optimised. Improvement of the irrigation WUE may lead to water savings which may be used to irrigate more land, which is particularly relevant where water is the limiting factor of production. The purpose of this paper was to review the advancements that have been made to improve the irrigation WUE, document the challenges encountered and explore opportunities for further development. Although the bulk of the review is on Australia's irrigated agriculture, examples from other countries are also used, and it is anticipated that the findings will inform researchers and policy makers in general. The paper starts by looking at the nexus between irrigation modernisation and automation in Australia, particularly focusing on irrigation distribution channels and on-farm development. The review then discusses the role of irrigation scheduling in improving the WUE, and the concept of real-time control and optimisation that is still under development. The emerging and potential opportunities for improved WUE through remote sensing techniques, sensors and communication networks are then discussed. In the final section, we discuss the challenges to the achievement of higher WUE, with a focus on water consumption at the basin scale and factors affecting trends in WUE, which are broadly categorised as: engineering and technological; environmental; advancements in plant and pasture science; and socio-economic.

Irrigation Modernisation and Automation
The terms irrigation modernisation and automation have become common in the irrigation literature in recent decades, and in some cases have been used interchangeably. In the context of this paper, modernisation is the process of replacing ageing irrigation infrastructure and methods, often with new or "modern" equipment and technologies developed in the recent past. The aims of irrigation modernisation include water savings, improved water delivery, and reduced operating and labour costs, which lead to sustainable agricultural production and enhanced livelihoods for farmers [6]. On the other hand, automation of irrigation systems is the use of equipment that allows the irrigation process to proceed with minimum human involvement, except for periodic inspections and routine maintenance. In Australia, modernisation and automation have been undertaken in the water distribution networks as well as at the farm level. The nexus between automation and modernisation, and WUE, is explored in this paper.

Irrigation Water Distribution Systems
Irrigation has been practised for many decades in Australia, especially in the Murray Darling Basin (MDB) (Figure 1), which consumes about two-thirds of the water abstracted for irrigation of crops and pasture [2]. The bulk of the water in the basin is delivered to irrigators by private and farmer-owned companies. Murray Irrigation in the MDB, which supplies water to 2300 farms with a total area of 748,000 ha, is the largest in the country. Other large companies include Murrumbidgee Irrigation and Coleambally Irrigation Co-Operative Limited, which supply water to the Murrumbidgee Irrigation Area and the Coleambally Irrigation District, respectively.
Irrigation water in the MDB has largely been conveyed via open channels. In addition, a significant portion of the infrastructure supporting irrigation, some of it built in the early years of the last century, has aged and become inefficient. From the WUE perspective, irrigation water conveyed via open channels is of interest because of the associated losses. Hence, the Federal, State and Territory Governments developed the Murray Darling Basin Plan and embarked on a program of modernising and automating this critical infrastructure, as discussed below.
Seepage losses, particularly in earthen open channels, may consume up to approximately 14% of the total water supplied to an irrigation scheme. Evaporation losses, especially in large open channels, may also be considerable [7], especially in arid parts of Australia. Therefore, one of the priorities of the modernisation plan of the MDB was to reduce these losses using a variety of methods including lining the canals with clay or rubber, repair of earthen and concrete channels, installation of gravity pipelines in place of open channels, and upgrade of on-farm irrigation infrastructure (discussed in greater detail in the next subsection). One example of such a project undertaken in the MDB from 2011 to 2015 is the Trangie-Nevertire Irrigation Scheme, which returned about 40% of the original water entitlements, significantly reduced water losses, and improved the water delivery efficiency from about 65% to 93% [8].
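The effect of such a delivery-efficiency improvement can be illustrated with simple arithmetic using the Trangie-Nevertire figures (65% to 93%); the farm-gate delivery volume below is hypothetical:

```python
# Hypothetical target: deliver 65,000 ML to farm gates before and after
# modernisation. Diversion required = delivered volume / delivery efficiency.
target_ml = 65_000
before = target_ml / 0.65  # diversion needed at 65% delivery efficiency
after = target_ml / 0.93   # diversion needed at 93% delivery efficiency
print(f"diversion before: {before:,.0f} ML, after: {after:,.0f} ML, "
      f"saved: {before - after:,.0f} ML")
```

For this illustrative volume, the same farm-gate delivery requires roughly 30,000 ML less diversion from the river, which is the kind of saving the modernisation program targets.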
A critical step towards improving the WUE is the ability to accurately measure the amount of water supplied to irrigators. This is in line with the saying: "You cannot manage what you cannot measure." However, previous research undertaken in Australia showed that inaccurate flow measurement techniques, for instance using Dethridge wheels, led to the supply of irrigation water in excess of entitlement volumes. A study by Goulburn Water [9] showed that large Dethridge wheels operated with inaccuracies of between −18% and +3%. Hence, regulatory requirements were put in place to ensure that irrigation water meters operate at an acceptable level of performance [10]. The requirements include pattern approval by the National Measurement Institute (NMI) and the ability of the meters to perform within maximum limits of error of ±5% in field conditions. The irrigation modernisation program involved the replacement of Dethridge wheels with water meters that were compliant with these regulations.
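The compliance check implied by the ±5% maximum limit of error can be sketched as follows; the meter and reference volumes are hypothetical:

```python
def within_max_error(meter_ml: float, reference_ml: float, limit: float = 0.05) -> bool:
    """Check a field meter reading against a reference volume using the
    +/-5% maximum limit of error that approved meters must meet in the field."""
    error = (meter_ml - reference_ml) / reference_ml
    return abs(error) <= limit

# A Dethridge wheel under-reading by 18% fails; a modern meter at -2% passes.
print(within_max_error(82.0, 100.0))  # False (-18% error)
print(within_max_error(98.0, 100.0))  # True  (-2% error)
```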
The above examples, therefore, demonstrate the opportunities for improving the WUE before the water is delivered to the farm and provide an indication of the magnitude of savings that have occurred in specific projects. There are still many irrigation enterprises that rely on open and unlined channels for their water supply, although statistics on their proportion are not immediately available. Some of the strategies used to modernise and automate flow of water in irrigation canals are also used on-farm to control the flow of water to different portions of the field. These include automatic regulators or gates and telemetry systems. These are discussed in the next subsection.

On-Farm Irrigation Development
In the recent past, a number of on-farm research and development projects aimed at improving the WUE have been undertaken. The funding scheme discussed above mainly focused on large irrigation schemes, but the Australian Government also initiated the On-farm Irrigation Efficiency Program to help individual irrigators improve their irrigation infrastructure or change irrigation practices (e.g., convert to more efficient irrigation methods) in order to save water. The most significant developments, especially in the surface system, appear to be automatic gates or outlets, water metering and use of telemetry systems. Some of the on-farm modernisation and automation projects have also been funded by the irrigators themselves.
Gates are structures placed in irrigation channels or bay/basin outlets to control the flow of water, and may be controlled by a mechanical timer or an electric solenoid. Some gates also have the capacity to measure the flow rate. There has been an increase in the use of telemetry systems in irrigation to allow for remote measurement and control of various parameters. The modes of communication used by these telemetry systems include radio, telephone, infrared, satellite and internet. In Australia, there are commercially available on-farm irrigation telemetry systems manufactured by local companies, such as AWMA and Rubicon, that utilise the Supervisory Control and Data Acquisition (SCADA) platform.
These on-farm automation developments may appear to be focussed on reducing the irrigation labour requirement, especially in surface systems, but they also play a role in improving the WUE. This is because, with automated systems, there is less chance of human error that may lead to water loss. A good example is that in manual surface irrigation systems, an operator turns the inflow on and off, cutting off the flow at the completion of the irrigation. A delay in cutting off the flow will therefore lead to water losses. Improvement of application efficiencies of well-managed automated-bay-irrigated fields in Northern Victoria was demonstrated by Smith et al. [11].
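The magnitude of the loss caused by a delayed cut-off is simple to estimate; the inflow rate and delay below are hypothetical:

```python
def cutoff_delay_loss(inflow_l_per_s: float, delay_min: float) -> float:
    """Volume lost (in ML) when cut-off is delayed past the optimal time."""
    litres = inflow_l_per_s * delay_min * 60
    return litres / 1_000_000  # 1 ML = 1,000,000 L

# Hypothetical bay supplied at 120 L/s: a 30-minute delay by the operator
# sends an extra ~0.2 ML down the field or into the drain.
print(f"{cutoff_delay_loss(120, 30):.3f} ML")  # 0.216 ML
```

Over a season of many irrigation events, even small per-event delays of this kind accumulate, which is why automated cut-off is credited with WUE gains as well as labour savings.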

Irrigation Modernisation in Developed and Developing Countries
The above discussion on irrigation modernisation and automation has largely focused on Australia. To provide context, this subsection will briefly examine the scenario in the rest of the world, particularly in Spain and the United States, two countries which are also major irrigating economies in Europe and North America, respectively.
According to Plusquellec [6], developed and some emerging countries generally possess conditions favourable for automation and modernisation of especially large-scale irrigation systems, for instance:

•	Organised manner of supply of water to users;
•	Irrigation systems that are generally well-maintained, hence less costly to upgrade;
•	Strong policy and regulatory environment, and the willingness and capacity to enforce laws related to water use;
•	Availability of technical expertise and equipment; and
•	Well-developed infrastructure such as roads.
Background information about irrigation modernisation and automation in Spain can be obtained from a number of articles, for instance González-Cebollada [12] and Lecina et al. [13]. Significant Spanish Government reforms and modernisation to manage demand of water began in 2002, and as was the case in Australia, the projects were largely taxpayer-funded, and the objectives included revitalisation of the irrigation sector and water conservation. The programs included improvement of the irrigation water distribution network and promotion of water-efficient methods such as the drip system. Similarly, the modernisation of irrigation systems in the United States aimed to improve the water delivery system using methods such as modification of check structures, use of reticulation systems, improved measurement and control, and the use of SCADA systems [14]. However, in a survey conducted by the United States Department of Agriculture (USDA) a few years ago, it was estimated that at least half of the United States cropland was still irrigated with less efficient irrigation methods [15].

Irrigation Scheduling
Irrigation scheduling is the process of determining how much water to apply and when to irrigate, and thus has a direct effect on WUE: the application of more water than is necessary for optimal plant consumption reduces the irrigation WUE. Irrigation scheduling requires an understanding of the pattern of plant water use, which is affected by factors such as weather, growth stage and canopy wetness. The meteorologic component varies seasonally, daily and diurnally.
Irrigation may be scheduled based on the plant water status, which may be measured directly using a pressure bomb, or indirectly by monitoring the flow of the stem sap. Other indirect methods include the measurement of soil moisture content using probes and estimation of crop evapotranspiration (ET). A summary of the main irrigation scheduling approaches is presented in Jones [16]. In Australia, the most common tools used for irrigation scheduling are soil probes and tensiometers [17]. The major drawback of using these soil-moisture-based tools for scheduling is that they give point-based measurements, while the soil characteristics are known to vary spatially and temporally [18].
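A minimal sketch of soil-moisture-based scheduling, assuming a simple daily soil water balance in which irrigation is triggered at a fixed refill deficit. All values are hypothetical, and operational tools use far richer crop and soil models:

```python
def schedule_irrigation(etc_mm, rain_mm, refill_deficit_mm=40.0):
    """Run a daily soil water balance and flag the days on which the
    deficit reaches the refill point (irrigation refills the root zone)."""
    deficit = 0.0
    irrigation_days = []
    for day, (etc, rain) in enumerate(zip(etc_mm, rain_mm), start=1):
        deficit = max(0.0, deficit + etc - rain)  # ET grows the deficit, rain shrinks it
        if deficit >= refill_deficit_mm:
            irrigation_days.append(day)
            deficit = 0.0  # assume irrigation refills to field capacity
    return irrigation_days

# Ten hypothetical days of crop ET (mm/day) with one rain event on day 4.
etc = [6, 7, 8, 6, 7, 8, 7, 6, 7, 8]
rain = [0, 0, 0, 15, 0, 0, 0, 0, 0, 0]
print(schedule_irrigation(etc, rain))  # [8]
```

The refill point and the assumption of a full refill are the scheduling decisions; the point-based nature of the soil measurements feeding such a balance is the drawback noted above.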
Farmers who do not use any scheduling tool often rely on their experiences to schedule their irrigations. However, previous studies have shown that such farmers who rely on the "rule-of-thumb" may be losing water [19]. Emerging methods of measuring or estimating crop water status for irrigation scheduling purposes are discussed later in this paper.
With the advancement in technology, including the internet, a number of computer-based irrigation scheduling systems have been developed to help farmers in their decision-making process. Typical examples used in Australia include WaterSense, WaterTrack Rapid and IrriSatSMS [20]. However, despite the proven benefits of improved WUE using these technologies, their adoption is still limited for reasons ranging from complexity to cost [17]. In recent years, cheaper and more versatile sensors have become available (for instance, Figure 2). It is clear from the above discussion that scheduling of irrigation is easier if the irrigation system is automated, with features such as accurate metering and sensors.


Real-Time Control and Optimisation
Due to factors such as differences in soil composition and weather, the infiltration characteristics at the field scale vary both spatially and temporally. For most conventional irrigation systems that seek to apply water uniformly, this means that the on-farm WUE will be equally variable across the field [21]. The variability in WUE is more pronounced in surface irrigation systems (e.g., furrow), in which irrigation water is conveyed over the soil surface. Hence, in recent times, the concept of real-time control and optimisation, traditionally used in other branches of engineering, has gained prominence in irrigation water management.
In the context of irrigation, real-time control implies measurements taken during an irrigation event (e.g., advance of water in a furrow system) are processed for the modification of the same irrigation event. This is at variance with conventional management systems which typically rely on previous or historical measurements, which are affected by the temporal nature of infiltration characteristics. Real-time control is feasible when the control process is automated so that the feedback can be implemented rapidly. On the other hand, optimisation is the process of manipulating various variables of an irrigation system with the aim of achieving the best possible outcome. This has traditionally been achieved through trial and error or irrigator experience; however, owing to the advancement in computing technology in the recent past, the use of simulation models has been on the increase [22].
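The feedback step can be sketched as follows. This is a toy illustration, not any published system: it fits a power-law advance curve to mid-event advance measurements and re-predicts the advance completion time so that the cut-off can be revised within the same event (all measurements hypothetical):

```python
import math

def predicted_completion(advance_times, advance_dists, furrow_length_m):
    """Toy feedback step: fit a power-law advance curve x = p * t**r to the
    advance measurements taken so far, then predict when the water front
    will reach the end of the furrow so the cut-off time can be revised."""
    # Least-squares fit of log(x) = log(p) + r * log(t)
    logs = [(math.log(t), math.log(x)) for t, x in zip(advance_times, advance_dists)]
    n = len(logs)
    mt = sum(lt for lt, _ in logs) / n
    mx = sum(lx for _, lx in logs) / n
    r = sum((lt - mt) * (lx - mx) for lt, lx in logs) / sum((lt - mt) ** 2 for lt, _ in logs)
    p = math.exp(mx - r * mt)
    return (furrow_length_m / p) ** (1 / r)  # predicted completion time (min)

# Advance sensed at 100 m and 200 m after 40 and 110 minutes on a 400 m
# furrow: the front is moving slower than designed, so cut-off would be
# pushed out to roughly five hours rather than the planned three.
print(f"predicted completion: {predicted_completion([40, 110], [100, 200], 400):.0f} min")
```

A real system would couple such re-estimated infiltration parameters to a full simulation model rather than extrapolating the advance curve directly, but the loop of measure, re-estimate and revise is the essence of real-time control.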
Surface irrigation systems that can be controlled and optimised in real time are sometimes referred to as smart irrigation systems ( Figure 3). They are regarded as improvements from purely automated systems, which are mostly designed to reduce irrigation labour requirement through automation of some tasks. The traditional or conventional irrigation systems are associated with high labour requirement and low WUE (Figure 3).

In surface irrigation systems in particular, adaptive real-time control has been proposed for the management of temporal infiltration variability [23][24][25]. A real-time optimisation system for furrow irrigation, tested in a field of commercially grown cotton in Queensland, Australia, demonstrated potential for an improvement in WUE and a reduction in labour requirement [26]. The system involved measurement of the inflow rate, sensing of the advance of water along the furrow, a computing system with a simulation model, and a telemetry system to facilitate communications between the different components. A commercial prototype of this system was produced, and trials in a commercially irrigated field showed that it is able to control irrigation events by cut-off time to achieve the maximum application efficiency [27].
Hence, it is clear that real-time control and optimisation in the Australian irrigation industry is still in its infancy, particularly in surface irrigation. However, based on the amount of research and the progress made so far (for instance, the commissioning of prototype systems), it is conceivable that, in the future, it will play a bigger role in irrigation water management and the improvement of WUE.

Emerging and Potential Opportunities for WUE
Investments in research and development projects in the recent decades and advancement in technology, in general, have yielded new or emerging opportunities for increased WUE in irrigated agriculture. This has come in the form of new and advanced equipment and techniques, as well as cheaper and relatively accurate alternatives.

Remote Sensing
As discussed above, in irrigated agriculture, improvement of WUE is achieved by optimising the timing and quantity of irrigation applications. The scheduling methods described, whether plant-, soil- or meteorologically based (evapotranspiration), are normally used on the ground. These methods are generally expensive, time-consuming and cannot be easily automated [16]; they are also mostly location-specific and not suitable for use over large areas. The option of remote sensing, which is not at all a new concept in agriculture, has in recent years been an active area of irrigation water management research due to its advantages in systematic measurements across space and time, its ability to cover large areas, and its capability to be integrated into models and with Geographic Information Systems (GIS).
New approaches using remotely sensed data to estimate the crop or plant water status and hence schedule irrigations are emerging. The first is satellite imagery which has been applied in many agricultural applications, for example yield and disease monitoring. In the last few decades, methods using algorithms to derive vegetation indices from satellite imagery in combination with ground-based measurements to estimate evapotranspiration (ET) over large areas have emerged [21,28]. Use of Landsat thermal infrared (TIR, https://lta.cr.usgs.gov/L8) imagery to derive spatial variability information of ET at the field scale and uniformity of water consumption for the purposes of improving WUE [29,30] is a recent step towards improved irrigation water management.
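A minimal sketch of the vegetation-index approach, assuming an illustrative linear NDVI-to-crop-coefficient relation. The coefficients below are assumptions for illustration only; operational algorithms calibrate them per crop and region and combine the result with ground-based weather data:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalised Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def crop_et(ndvi_value: float, eto_mm: float) -> float:
    """Estimate crop ET by scaling a reference ET with an NDVI-derived crop
    coefficient. Kc = 1.25*NDVI + 0.1 is an illustrative relation only."""
    kc = 1.25 * ndvi_value + 0.1
    return kc * eto_mm

# Hypothetical satellite pixel: NIR = 0.45, red = 0.08, reference ET 7 mm/day.
v = ndvi(0.45, 0.08)
print(f"NDVI = {v:.2f}, ETc = {crop_et(v, 7.0):.1f} mm/day")  # NDVI = 0.70, ETc = 6.8 mm/day
```

Mapping such per-pixel ET estimates across a field is what yields the spatial variability information that point-based scheduling tools cannot provide.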
The use of remote sensing in irrigation water use monitoring, evaluation and management is underutilised due to issues of spatial and temporal resolution, quality of results and the one-time/one-place syndrome, among others. However, the current Landsat-8 satellite series comes with a 30 m spatial resolution and can be used to assess actual crop evapotranspiration and crop water use at the field and farm scale. A number of commercial satellites are now available that may be used for agricultural purposes, for instance Sentinel-2 (https://sentinel.esa.int/web/sentinel/missions/sentinel-2) and Planet (https://www.planet.com/markets/monitoring-for-precision-agriculture/).
Another remote sensing approach to determining the crop water status, which is still in the research phase, is the use of thermal and multispectral imagery collected using unmanned aerial vehicles (UAVs) or drones. Research has shown that the plant canopy temperature is correlated with the plant water status and hence can be used for irrigation water management [31]. Applications using reflectance in the near- and mid-infrared regions of the electromagnetic spectrum to assess water status in cereal crops, fruit trees, grapevine and pasture are described in Cozzolino [31].
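The canopy-temperature relationship underpins the widely used Crop Water Stress Index (CWSI), which normalises the measured canopy temperature between a well-watered (wet) and a non-transpiring (dry) baseline. A minimal sketch follows; the baseline temperatures used in the example are hypothetical.

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Crop Water Stress Index: 0 = well-watered, 1 = fully stressed.
    t_wet and t_dry are the non-water-stressed and non-transpiring
    baseline canopy temperatures for the prevailing conditions."""
    if t_dry <= t_wet:
        raise ValueError("dry baseline must exceed wet baseline")
    return (t_canopy - t_wet) / (t_dry - t_wet)

# A canopy at 29 deg C between baselines of 24 deg C (wet) and 34 deg C (dry)
stress = cwsi(29.0, 24.0, 34.0)   # 0.5, i.e., moderate stress
```

In practice the index would be computed per pixel from UAV thermal imagery, with the baselines derived from air temperature and vapour pressure deficit.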
The main advantage of remote sensing is the ability to estimate the crop water status over large spatial scales, which cannot possibly be realised with conventional methods such as soil probes or plant-based techniques. It is also expected that, with the increased uptake of drone technology, prices will decrease and drones will therefore become accessible to more farmers. However, increased effort is also needed to connect irrigators and remote sensing providers to maximise economies of scale. Key opportunities and advances to watch include the future collection of very high resolution (<10 m) data through hyperspectral sensors, such as the current commercial IKONOS and QuickBird satellites; rapid access to data from multiple sensors with a wide array of spatial, spectral and radiometric features; and remote sensing multi-data synthesis through streaming technology.

Sensor and Communication Networks
Sensors are devices used to collect a range of data, such as soil moisture and weather, in order to improve agricultural management. Typical examples of sensors used in irrigation water management include soil moisture probes and weather stations. Traditionally, equipment for monitoring crop or soil water status was connected using cables and often required manual reading, with the data used to schedule future irrigations. Apart from the inaccuracies that come with using historical data for future water management, such manual processes are time-consuming and often expensive. The use of wireless sensor technologies to improve WUE in irrigated agriculture is on the increase.
More often, a series of wireless sensors is used to monitor various parameters in the field, for example soil moisture and weather data. This is especially driven by the fact that recent advancements in technology, and competition, have led to the availability of cheap sensors. A wireless sensor network consists of a number of individual sensors (sensor nodes), a sink node or hub to receive and process data from the sensor nodes, and a communication technology [32]. The sensor networks may also include actuators that can be used to automate the irrigation system.
The wireless communication technologies used for agricultural purposes, including water management, are discussed in many texts, for instance Rehman et al. [32]. The current communication technologies used in agriculture are ZigBee, Bluetooth, WiFi, GPRS/3G/4G, Long Range Radio (LoRa) and SigFox [33]. The ZigBee technology is commonly preferred in irrigation water management because of its range, low cost, energy efficiency and reliability [32,34].
The use of wireless sensors that measure soil moisture, temperature and humidity and relay the data over the 3G internet network is described in Reference [35]. The automation of such crucial data collection means that the irrigation system can be controlled in real time, thereby achieving higher WUE.
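On the hub side, the real-time control logic can be as simple as averaging the latest probe readings and computing a refill depth. The sketch below is a minimal illustration of that idea; the refill and full points, and the moisture-to-depth conversion (a 500 mm root zone is assumed), are hypothetical and would be calibrated for each soil and crop.

```python
from statistics import mean

def irrigation_decision(soil_moisture_readings, refill_point, full_point):
    """Decide an irrigation depth (mm) from a set of wireless probe
    readings (% volumetric moisture). Irrigate only when the field
    average falls below the refill point."""
    avg = mean(soil_moisture_readings)
    if avg >= refill_point:
        return 0.0  # profile still wet enough; no irrigation needed
    # Depth needed to return the profile to the full point. Illustrative
    # conversion: 1% volumetric moisture over a 500 mm root zone = 5 mm.
    return (full_point - avg) * 5.0

# Three probe nodes report to the hub; refill point 22%, full point 30%
depth_mm = irrigation_decision([20.1, 19.5, 21.4], refill_point=22, full_point=30)
```

An actuator attached to the network could then open a valve until the computed depth has been applied, closing the real-time control loop described above.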
It appears from the literature reviewed that research into water loss through leakage has concentrated on urban water supply distribution networks. However, the techniques used to detect leakages in urban water distribution networks can also be applied in irrigation settings. Pressure sensors connected to a wireless sensor network can play a vital role in detecting leakages, and thus facilitate faster repair and the prevention of further losses [36]. There is also potential for using smart water technology to detect losses in water pipelines [37].
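A basic version of such pressure-based leak detection compares each node's reading against its expected baseline and flags abnormal drops. The threshold and the readings in the example are illustrative only; a deployed system would account for normal operational pressure variation.

```python
def detect_leak(pressures_kpa, baselines_kpa, drop_threshold=0.15):
    """Flag sensor nodes whose pressure has dropped by more than
    drop_threshold (as a fraction) below their baseline, suggesting a
    leak in the pipe section upstream of or between the flagged nodes."""
    flagged = []
    for node, (p, base) in enumerate(zip(pressures_kpa, baselines_kpa)):
        if p < base * (1 - drop_threshold):
            flagged.append(node)
    return flagged

# Node 2 reads 280 kPa against a 400 kPa baseline and is flagged
suspect_nodes = detect_leak([395, 390, 280, 385], [400, 400, 400, 400])
```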
There are substantial ongoing research activities involving sensors and communication networks which are likely to lead to improved products and services in the future. It is also likely that communication networks will be used in a more integrated manner to achieve multiple objectives, for example irrigation and urban water supply using smart water meters.

Irrigation Water Productivity
This paper has largely been written from an engineering perspective, which predominantly defines WUE as the ratio of the irrigation water beneficially used by the plant or pasture to the water supplied through irrigation. This section will briefly discuss irrigation water productivity, with a focus on plant genetics and agronomic practices used to achieve higher yields using less water.
Through plant breeding, scientists have managed to develop high-yielding crop varieties. This implies that, all other factors kept constant, with the same amount of available water, farmers can achieve a higher irrigation water productivity. Detailed information on how plant WUE can be improved through molecular genetics is provided in Ruggiero et al. [38]. That research provides an overview of the manipulation of genes that strongly impact on WUE, such as those that control root traits and stomatal development. Some genetically modified varieties are also resistant to pests and diseases, leading to higher yields. A study conducted within the cotton industry in Australia found that water use productivity had increased by 40% over a period of ten years as a result of yield increases achieved by developments in plant breeding, the use of genetically modified varieties, and improved crop and water management systems [5].
Deficit irrigation, the application of less water than is required by the plant or pasture, is a strategy often used when water is limited. In a trial conducted in a dairy region of Victoria, Australia, where pasture is often irrigated, Rogers et al. [39] demonstrated that lucerne under deficit irrigation can fully recover once full irrigation is restored, and thus that ideal forage can be grown under water-limiting conditions. Tejero et al. [40], in a trial undertaken in a citrus orchard in Spain, concluded that deficit irrigation strategies have the potential to improve WUE. Du [41] proposed the adoption of deficit irrigation strategies in areas of China where conventional irrigation is no longer sustainable because of water shortages.
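The rationale for deficit irrigation can be made concrete with a water productivity calculation (yield per unit of water applied). The figures below are hypothetical, not results from the trials cited: they simply show how a modest yield penalty under deficit irrigation can still raise productivity per megalitre.

```python
def water_productivity(yield_t_per_ha, water_ml_per_ha):
    """Irrigation water productivity: tonnes of yield per megalitre of
    irrigation water applied per hectare."""
    return yield_t_per_ha / water_ml_per_ha

# Hypothetical comparison: full irrigation vs a 70% deficit strategy
full_wp = water_productivity(yield_t_per_ha=12.0, water_ml_per_ha=8.0)     # 1.500 t/ML
deficit_wp = water_productivity(yield_t_per_ha=10.5, water_ml_per_ha=5.6)  # 1.875 t/ML
```

Here the deficit strategy sacrifices 12.5% of yield but saves 30% of the water, so each megalitre produces more yield, which is the trade-off growers weigh when water is the limiting input.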

WUE and Water Consumption at the Basin Scale
The need for water users to achieve greater WUE is often seen as a prerequisite for saving water for the benefit of other users as well as the environment. However, the literature reviewed suggests that a higher WUE does not necessarily translate into a net water saving, particularly at the basin scale.
When seen from the perspective of a water basin, what may be assessed as a loss from one viewpoint (e.g., deep drainage losses that may occur in surface systems) may be a gain from another (e.g., recharge of groundwater resources). Some research has shown that significant improvements in delivery and on-farm WUE may in fact lead to a decline in groundwater resources [21] or reduce the water available for the environment and downstream users [3]. Therefore, although improvement of on-farm irrigation WUE may lead to water savings on the farm, it will not necessarily be beneficial at the catchment or basin scale [15].
An overall increase in water consumption at the basin scale may occur if water savings ultimately lead to an expansion of the irrigated area [42]. This was demonstrated by research conducted in Morocco, which saw overall water consumption rise as a result of subsidised drip irrigation kits promoted as a means of increasing productivity and saving water [43]. In this example, although the drip system is generally regarded as water-efficient, farmers were found to shift to more water-intensive crops and generally used the "saved" water to expand the acreage under irrigation. This view has been corroborated by other studies, for instance an FAO-funded research project undertaken in North Africa and the Middle East region [44]. That study found that, at the field scale, water savings may appear to be substantial, but at the basin scale the total water consumption may actually increase, while the crop water productivity gains for the most important crops may be modest at best. In a study undertaken in India, the widespread adoption of water-efficient methods such as the sprinkler and drip systems was found to have the capacity to substantially reduce the overextraction of groundwater resources, but half of the water saved was reused to expand the area under irrigation [45]. Figure 4 is used as an example to demonstrate the simultaneous increasing uptake of water-efficient technologies (sprinkler and drip) and the increasing total area under irrigation (especially between 1994 and 2013) in the United States (data source: USDA, Table 4; USDA 2014, Table 28) [46]. This suggests the potential reuse of any water savings to expand the area under irrigation. The graph in Figure 4 also shows the corresponding decrease in surface irrigation systems, which are generally regarded as inefficient.
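The basin-scale arithmetic behind this "rebound" can be sketched as follows. The figures are purely illustrative: a shift to drip reduces the gross application per hectare, but a higher consumed (ET) fraction combined with area expansion can raise total consumption.

```python
def basin_consumption(area_ha, gross_ml_per_ha, consumed_fraction):
    """Water consumed (evapotranspired) at the basin scale, in ML.
    The non-consumed fraction is assumed to return to the basin as
    drainage or groundwater recharge rather than being truly lost."""
    return area_ha * gross_ml_per_ha * consumed_fraction

# Before: 1000 ha of furrow irrigation, 10 ML/ha applied, 60% consumed
before_ml = basin_consumption(1000, 10.0, 0.60)   # 6000 ML consumed

# After: drip "saves" water on-farm (6 ML/ha, 90% consumed), but the
# on-farm savings fund a 60% expansion of the irrigated area
after_ml = basin_consumption(1600, 6.0, 0.90)     # 8640 ML consumed
```

Despite the apparent field-scale saving of 4 ML/ha, basin-scale consumption rises in this scenario, which is the mechanism the Moroccan and Indian studies describe.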
An analysis of the MDB in Australia showed that the environment may become the unintended casualty (receive less water on average) of increases in WUE driven by the adoption of water-efficient technologies [47], with most of the saved water being reused. The reuse of the saved water appears to be corroborated by the trend in total irrigation water use in Australia between 2002 and 2017 (Figure 5). The graph shows that the total irrigation water use between 2002 and 2006 was above 10,000 gigalitres (GL) but fell to a low of just above 6000 GL in the four-year period from 2007-2008 to 2010-2011. The reduced irrigation water use in the period 2006-2011 was the result of a severe drought that drastically reduced the availability of water for irrigation. In the period 2012-2014, water use increased back to a level similar to the early part of the available data (approximately 11,000 GL), effectively signalling no net water saving. Irrigation water use decreased in 2014-2016 but increased slightly in 2016-2017 to just over 9000 GL. The trends appear to be largely dependent on weather patterns.
While Figure 5 shows the trend of irrigation water use in the whole of Australia, Figure 6 is specific to the MDB, which consumes the bulk of the water used for irrigation in the country, as previously discussed.
In addition, Figure 6 shows the trend of the irrigated land in the basin and the correlation of irrigation water use with the area irrigated: when farmers have access to more water, they irrigate more land (and vice versa). Therefore, it is likely that water saved as a result of the water-efficient technologies and practices adopted is reused, as suggested by the studies quoted above. However, there are strategies that can be used to attain a good balance between improved irrigation efficiency and environmental conservation, including groundwater recharge. A typical example is the water-saving initiatives funded by the Australian Government, with the understanding that the water saved is released for environmental use [6]. The regulatory return of the saved water to the environment therefore mitigates the "rebound effect" phenomenon, which suggests that an increase in the efficiency of use of a resource may lead to an increase in the rate of consumption of that resource [48].
Many other studies undertaken in different parts of the world have also linked the widespread adoption of water-efficient technologies to an overall increase in water consumption, mostly due to the expansion of land under irrigation, rather than the intended decrease. Nonetheless, as shown by a study undertaken in Spain, the water-efficient technologies have come with other side benefits, such as reduced use of fertilisers and better accounting of water use [49].
In the literature reviewed for this study, an almost unanimous view emerges that an overall reduction of water use at the basin scale cannot be attained simply through the promotion of, or subsidies for, water-efficient technologies. These technologies will thus need to be used in tandem with other measures, such as incentives for conservation [45] and regulations to limit water allocation [44], among others.
Another interesting dimension is the nexus between the irrigation methods deemed to be generally more water-efficient, energy consumption and greenhouse gas emissions. For instance, modelling has demonstrated that although pressurised irrigation systems such as the sprinkler and drip methods are generally more efficient and productive, they consume more energy (compared to conventional systems such as furrow irrigation), resulting in the production of additional greenhouse gas emissions [50]. Energy costs in many countries (in the case of irrigation, the cost of electricity for pumping water) have been rising steadily. This is likely to impact on the adoption of water-efficient but energy-intensive irrigation methods.
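The energy penalty of pressurised systems follows directly from the pump equation E = ρgVh/η. The sketch below assumes a 30 m equivalent head for a drip system versus a gravity-fed furrow supply, with an illustrative pump efficiency and grid emissions factor; the numbers are indicative only.

```python
def pumping_energy_kwh(volume_ml, head_m, pump_efficiency=0.7):
    """Electrical energy (kWh) to lift/pressurise water:
    E = rho * g * V * h / efficiency, with 1 ML = 1000 m^3."""
    joules = 1000.0 * 9.81 * (volume_ml * 1000.0) * head_m / pump_efficiency
    return joules / 3.6e6  # J -> kWh

def emissions_kg_co2(energy_kwh, grid_factor=0.8):
    """Indicative emissions using an assumed grid intensity (kg CO2-e/kWh)."""
    return energy_kwh * grid_factor

# Pressurising 100 ML to a 30 m equivalent head vs a gravity-fed supply
e_drip = pumping_energy_kwh(100, 30)    # roughly 11,700 kWh
e_furrow = pumping_energy_kwh(100, 0)   # 0 kWh (gravity-fed)
co2_drip = emissions_kg_co2(e_drip)
```

The order-of-magnitude difference illustrates why rising electricity prices weigh on the adoption of pressurised systems.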

Factors Affecting Trends in WUE
From the above discussion, it is clear that the trends in the WUE of irrigated agriculture are affected by a range of factors, which may be broadly categorised as shown in Figure 7. Engineering and technological factors include the improvement of water distribution networks and on-farm irrigation development, irrigation scheduling, real-time control and optimisation, remote sensing, and sensor and communication networks. These factors improve irrigation WUE mainly by reducing water losses. In the recent past, a variety of hardware and software tools has become commercially available and is used to enhance irrigation WUE. Advancements in plant genetics have led to the development of high-yielding and disease-resistant varieties with higher WUE. There has been greater environmental awareness, leading some governments around the world to fund water-saving initiatives with the understanding that the water saved is released as environmental flows. Socio-economic factors are also important drivers of WUE. These are covered in this section, with a focus on technology adoption and the decision-making processes of irrigation water users.
When farmers are faced with limited availability of irrigation water, they have to make challenging decisions on how best to operate. This is a common problem in Australia where, to a large extent, water is the limiting factor of production, but land is virtually unlimited. It is common to see farmers irrigate part of their land and cultivate the rest under rain-fed conditions. A study undertaken in Southern Spain found that the majority of farmers growing irrigated intensive olive groves used deficit irrigation in order to maximise the value of the limited available water [51].
Some researchers have observed that water saving initiatives have mainly focussed on engineering solutions such as reduction of seepage losses and suggested that further improvement in on-farm WUE could be achieved by the adoption of new irrigation technologies [52]. However, it must be noted that technology adoption is a complex sociological phenomenon, and its success will largely depend on the willingness of water users to change their attitudes. Irrigation water users, like other members of the community in general, are predisposed to continue with the farming practices they are most familiar with, for fear of the unknown. In most cases, water users have access to information on new technologies, but getting people to change their attitudes and adopt new practices or technologies is a slow process. Research undertaken in the United States suggested that the requirement to learn a new set of skills may act as a deterrent to irrigators investing in new technologies or adopting new practices [15].
In both Australia and the United States, the cost of changing to new technologies and practices has been cited as a significant factor causing non-adoption [15,52]. This includes the capital required as well as the associated on-going costs. A typical example is when farmers decide to change from surface irrigation methods to pressurised systems such as centre pivots and lateral move machines which are generally associated with higher WUE. These pressurised systems not only require substantially higher capital costs to install, but come with higher energy consumption and are therefore expensive to run [5].
A European-Union-funded research project undertaken in Italy and Portugal concluded that lack of adequate knowledge and incentives may prevent farmers from exploiting the full potential of available technologies to optimise WUE [53]. The study thus recommended that there should be a continuous knowledge exchange among scientific experts, farmers and other stakeholders, and appropriate support be provided to encourage environmental conservation.
As discussed earlier, one of the main approaches used in Australia to attain increased WUE is the upgrade of irrigation infrastructure and the provision of subsidies for on-farm improvements. However, some studies [3] have shown that these investments are sometimes not cost-effective, especially when compared to alternatives such as water trading, which is aimed at efficiently allocating water across competing uses. Therefore, when the economic benefits of these taxpayer-funded initiatives are not apparent, public support is not guaranteed.
It should be noted, however, that WUE cannot be improved infinitely. In the case of an irrigation system that has a lower irrigation performance to start with, it would be easier to notice an increase in WUE when improvements to the system are undertaken. However, for a system that is already operating at or near the optimum level, the WUE will increase (if at all) at a much slower rate. A study conducted in the Guadalquivir River Basin in Spain found that the impact on the WUE of technological innovations, such as deficit irrigation, new crop varieties and other water-saving technologies, had decreased considerably after a number of years [54]. As previously noted, plant breeding has been used for a number of decades now to develop plant varieties with higher WUE. However, this improvement is not expected to continue at the same rate as before [55]. This implies that the world cannot solely rely on the past and the present technologies to improve WUE, but must continue to undertake research to generate newer technologies that can be used to further improve the WUE.

Conclusions
The purpose of this paper was to review the steps that have been taken to improve the use of scarce water resources, with a focus on irrigation in Australia but with examples from other countries. We also looked at the challenges that have been encountered and explored opportunities that may lead to improved WUE in the future.
The Australian Federal, State and Territory Governments have facilitated the modernisation and automation of irrigation infrastructure, such as distribution channels, and improvements to on-farm irrigation hardware. This has led to improved WUE, and some of the water saved has been used for environmental purposes. Improved irrigation scheduling and real-time control and optimisation have also demonstrated potential for further water savings. The use of remote sensing and sensor and communication networks is emerging in the irrigation industry and is expected to contribute to improved WUE. However, challenges lie in the way of enhanced WUE. These include a lack of public support, especially when the methods used are not cost-effective, and the reluctance of irrigators to adopt new technologies.
The review has demonstrated that the adoption of water-efficient technologies has delivered water savings at the field scale, with some of the savings being released as environmental flows. However, a net water saving at the basin scale is not always achieved. In fact, some studies have demonstrated a net increase in water consumption, largely due to the reuse of the saved water to expand the area of land under irrigation. Hence, an overall reduction of water consumption at the basin scale is more likely to be achieved when water-efficient technologies are used in combination with other measures, such as the provision of incentives for water conservation and regulations to limit water allocation.
Author Contributions: R.K. conceptualised this review study and produced the first draft of the manuscript. P.L. reviewed the draft, rewrote some sections and formatted the paper.
Funding: This research received no external funding.