Article

Livestock Management on Grazing Field: A FANET Based Approach

by
Mohammed A. Alanezi
1,
Bashir O. Sadiq
2,3,
Yusuf A. Sha’aban
4 and
Houssem R. E. H. Bouchekara
4,*
1
Department of Computer Science and Engineering Technology, University of Hafr Al Batin, Hafr Al Batin 31991, Saudi Arabia
2
Department of Electrical and Computer Engineering, Kampala International University, Kampala 20000, Uganda
3
Department of Computer Engineering, Ahmadu Bello University, Zaria 610001, Nigeria
4
Department of Electrical Engineering, University of Hafr Al Batin, Hafr Al Batin 31991, Saudi Arabia
*
Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(13), 6654; https://doi.org/10.3390/app12136654
Submission received: 27 May 2022 / Revised: 27 June 2022 / Accepted: 28 June 2022 / Published: 30 June 2022
(This article belongs to the Section Robotics and Automation)

Abstract

In recent times, designated grazing areas, fields, or routes for livestock have usually been defined, so the success of herding activities relies on data extracted from aerial photographs. A direct and cost-effective way of monitoring livestock, both for perimeter coverage and in other natural situations, is therefore required. This paper presents a coverage solution involving multiple interacting unmanned aerial vehicles. The presented approach is built on a graph whose geographic coordinates are set such that several Unmanned Aerial Vehicles (UAVs) can successfully cover the area. The maximum flying time determines the number of UAVs employed for coverage. The proposed solution addresses practical problems encountered during the execution of the task with actual UAVs. It is suitable for long-term monitoring of animal behavior under various weather conditions and for observing the relationship between livestock distribution and available resources on a grazing field. The simulation was carried out using MATLAB and aerial images from Google Earth.

1. Introduction

Unmanned Aerial Vehicles (UAVs) can revolutionize livestock herding and management, and there is growing scientific interest in using them for this purpose. UAVs can be used to monitor livestock grazing areas and for remote sensing of the animals. In recent years, most governments have defined livestock grazing areas, fields, or paths, and the effectiveness of most herding operations relies on extracting information from aerial pictures [1]. In a farm context, UAVs are typically operated at human height or at low to medium altitude, giving them the same or wider angles of view than humans. Because this viewpoint is similar to a person's at eye level, grazing field mapping established by the regulatory body can be used for UAV operations. However, some computational and scientific issues are unique to UAV operations. One of these is the minimum-time coverage of ground regions using a group of UAVs, dubbed a Flying Ad hoc Network (FANET), outfitted with image sensors [2]. This research therefore proposes a way of using UAVs to cover and sense ground areas, concentrating on practical issues that arise only during vehicle deployment. The number of UAVs utilized in the task, for example, is determined by the area's size and layout, the UAV's maximum flight time and, more crucially, the time required to prepare and launch the UAV. Figure 1 depicts the conceptual diagram. The operation of multiple UAVs to achieve adequate coverage depends on the mode of communication among the UAVs, and it is critical to keep the communication efficiency among sensor-equipped devices at the highest level to accomplish flawless UAV-based livestock management [3]. This communication is typically affected by the speed of the UAVs within the grazing field; at the same time, higher UAV speeds yield a larger coverage area per unit time.
Some studies have used aerial pictures (satellite imagery) and UAV-based communication systems to examine farm animal tracking and monitoring for applications such as grazing distribution monitoring, pasture usage, domestic livestock management, and livestock behavior. Grazing distribution monitoring has attracted increased attention, with the intention of managing individual animals through continuous, real-time monitoring of their perimeter coverage during grazing as well as their health, welfare, output, and environmental effects. The data collected during livestock monitoring contribute to the long-term viability of the agroecosystem by providing herders with timely and reliable information about how their animals behave on the farm [4]. Satellites are applicable to animal tracking and monitoring through space-retrieved imagery that can be used to analyze cattle productivity on different pastures; however, the use of drones is justified when the viewing angle matters. Uses of drones for livestock management include spotting trespassing hunters, detecting illegal activities on farms, and controlling herder operations.
The contributions of this paper are as follows:
  • We designed and proposed a deployment scheme that considered the geographic coordinates of the grazing field for practical implementation. To our knowledge, this has not been adequately treated before.
  • The proposed deployment scheme focused on achieving an optimal coverage time and interconnections for efficient grazing field surveillance.
  • We also proposed a task-sharing protocol between the multiple UAVs for effective task sharing and cooperation.
Section 2 reviews the pertinent literature. Section 3 presents the methodology behind the proposed solution, showing the step-by-step approach and the conducted simulation. Section 4 presents the obtained results together with their analysis. Finally, Section 5 draws conclusions from the results.

2. Literature Review

Livestock management in extensive production systems can be challenging, particularly over large regions. Using UAVs to gather images of the region of concern is rapidly becoming a feasible alternative; nonetheless, proper processes for extracting pertinent data from the images are still rare. Conventionally, recognizing livestock with a UAV relies on a simple process: the pasture is video-recorded, and the spotted livestock are counted manually by a human viewer. This practice was found valuable for detecting and counting cattle in a quarantine environment [5]. However, the method is manual and always requires a human observer. The authors of [3,6,7] present the parameters, key limitations, current regulations, potential requirements, and challenges for operating UAVs in smart agriculture. To computerize livestock detection and counting, the number of livestock was counted in each video frame by separating the animals from the background and applying thresholding to each image frame of the video sequence [8,9,10]. The ability of Unmanned Aircraft System (UAS) overflights to support cattle surveillance was assessed in [11]: the information obtained from the UAS imagery was used to model the cattle distribution, and the outcomes were compared with bio-logged cattle. The prospect of using UAV video surveillance to predict the food intake of non-nursing beef cows was examined, with the cow feeding pattern determined from the processed video files; the results suggest that UAV surveillance could be vital in monitoring cow feeding behavior [12]. The work of [13] suggested a general smart video surveillance system and studied some glitches in cow performance analysis using an intelligent image-based monitoring framework with a hidden Markov process.
The authors of [14] suggested a new robotic animal herding system centered on a network of autonomous barking drones. Such a system aims to replace old herding approaches so that large numbers of farm animals can be quickly collected from a sparse distribution and then driven to a selected place. Vayssade et al. propose a method to process images taken by a commercial drone to automate the tracking of animal activities using a combination of thresholding and supervised classification methods [15]. Jung et al. use a Proportional Integral Derivative (PID) controller on four quadrotor UAVs to guide four animals into their pen in minimum time, creating predator noises modeled with an exponential function to solve the cattle roundup problem [10].
Geographical proximity, used to examine grazing behavior and social structure, is a critical indicator of performance in cow behavior. Mufford et al. therefore developed an efficient way to compute the spatial proximity of beef cattle, employing UAV-based image acquisition and photogrammetric analysis. Still-frames pulled from the UAV video were used to produce orthomosaics, revealing that correlated groups were nearer to each other than non-correlated ones [16]. Sun et al. offered a real-world method that used UAVs, verified at a typical household pasture, to examine the hourly spatial distribution of each yak [17]. Favier et al. explored the use of UAVs to detect and round up sheep by developing a prototype controlled from a laptop base station running LabVIEW [18]. Brown et al. investigated the relationship between object detector performance and spatial degradation for livestock: ground-truth data were established using in-focus drone images and then downsampled to various ground sample distances (GSDs).
Beyond simple livestock detection, advances in Artificial Intelligence (AI) and Machine Learning (ML) have allowed researchers to detect livestock using pre-trained Convolutional Neural Network (CNN)-based architectures. For instance, an adjusted version of R-CNN was employed to identify and count livestock on a grazing field [19]. In this method, a selective search algorithm generated region proposals, a CNN extracted the features in each region, and the features were then classified using a Support Vector Machine (SVM); the confidence value was obtained from a linear bounding-box regressor. Kate et al. scrutinized twelve sheep's habits and normal responses to a drone in order to fit mathematical models of shepherding to this new dimension; the model targets make it realistic for AI to advance the independence of farmers in aerial shepherding [20]. Barbedo et al. proposed a cattle counting scheme incorporating a deep learning model for rough animal positions and color-space manipulation to increase the contrast between animals and the background [21]. Barbedo et al. also investigated the prospect of utilizing an incline angle to increase the extent of the acquired image in cattle monitoring, creating a model for animal detection using Deep Convolutional Neural Networks; their findings show that oblique images can be successfully utilized in some circumstances, but certain practical restrictions must be solved for the method to be attractive [22]. Andrew et al. recommend a deep neural network method for livestock recognition employing UAVs with onboard deep learning inference [23]. Soares et al. suggested a technique for identifying and counting cattle in aerial images acquired by UAVs, based on CNNs and a graph-based optimization technique to remove duplicate detections [24]. Shao et al. propose a cattle detection and counting system based on CNNs using aerial images taken by a UAV [25].
Given that future routing protocols depend on the nature of the communication link, the works in [26,27] present the design of an interface protocol for an indoor Flying Ad hoc Network-specific routing protocol, using light fidelity as the communication link. The focus was to achieve high throughput; however, practical routing problems encountered during UAV operations were not considered. To solve such problems, ref. [28] proposed an optimized solution to the problem of minimum-time coverage of ground areas using multiple UAVs with image sensors. This is achieved by determining the geographic coordinates a single UAV would cover in minimum time and then formulating a mixed-integer linear program to route the UAVs over the geographic area; the UAVs required to cover a particular area can then be selected. However, this UAV routing strategy does not consider possible collisions among the UAVs. Hence, to avoid link breakage during information transfer, the communication path between multiple UAVs is optimized to improve communication links in Flying Ad hoc Networks using smell agent optimization and the Particle Swarm Optimization (PSO) algorithm [29].
The reviewed studies show that several works have been conducted in livestock management. However, only a few works considered the geographic coordinates of the grazing field for practical implementation before optimizing the UAV’s performance in the field in terms of communication and flight patterns. Hence, this paper presents a solution based on the practical geographic coordinates of the grazing field for livestock perimeter tracking and management.

3. Methodology

This section presents the methodology adopted in this research work. A Google Maps image of a grazing field was first collected using the online Google Earth software. This grazing field represents the area that is to be covered by the grazing cattle. The dimensions of the grazing field are employed in the simulation, and the parameters of the Tello Edu drone are used so as to allow future deployment in the field.
The problem formulation assumes that three UAVs cover the grazing field area; three is the minimum number of UAVs that can form an ad hoc network with one another. The area to be covered is A, the flight time for each UAV is 10 min (according to the Tello Edu drone datasheet), and each drone has an onboard camera for capturing and sensing. The aim is to use these drones to sense the entire area with the onboard sensors while intercommunicating. The drones are assumed to fly at the same altitude [26]. Using a larger number of UAVs would reduce the coverage time but increase the chance of collision. The problem herein is therefore to specify the path each UAV must follow to complete its mission. The collected Google Earth map of the grazing field is presented in Figure 2.
If the focal length of the onboard camera's lens is $F_L$, the height between the camera and the ground (which also denotes the flying height) is $H_G$, and the width of the fitted onboard sensor is $W_D$, the camera's coverage width $C_L$ can be computed as:

$C_L = \frac{H_G \, W_D}{F_L}$
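As a sanity check, the footprint relation above can be evaluated numerically; the altitude, sensor width, and focal length below are illustrative values, not parameters from the paper's Table 1:

```python
def coverage_width(flying_height_m, sensor_width_mm, focal_length_mm):
    """Ground coverage width C_L of a downward-facing camera.

    C_L = H_G * W_D / F_L: the sensor width is scaled by the ratio of
    flying height to focal length (pinhole camera model)."""
    return flying_height_m * sensor_width_mm / focal_length_mm

# Illustrative values (not from the paper): 30 m altitude,
# 6.17 mm sensor width, 4.4 mm focal length.
print(round(coverage_width(30.0, 6.17, 4.4), 2))  # 42.07 m footprint
```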
The coverage concept is depicted using the diagram in Figure 3.
To determine the best sweep direction and the number of rows that can be covered, the mapped area is treated as a polygon and rotated, and its different heights are measured from the ground. The minimum height, denoted H, is then used to compute the number of rows to be covered.
The number of rows to be covered is estimated as:

$C_R = \frac{H}{C_L (1 - O_l)}$
where $O_l$ is the fraction of overlap between the aerial images. Therefore, the distance between each pair of parallel rows can be computed as:

$d_R = \frac{H}{C_L}$
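A minimal sketch of the row-count expression above, with illustrative field height, footprint, and overlap values (not the paper's parameters), rounding up to a whole number of rows:

```python
import math

def number_of_rows(min_height_m, coverage_width_m, overlap_fraction):
    """Rows needed to sweep a field of minimum height H with camera
    footprint C_L and fractional image overlap O_l:
    C_R = H / (C_L * (1 - O_l)), rounded up to a whole row count."""
    return math.ceil(min_height_m / (coverage_width_m * (1.0 - overlap_fraction)))

# Illustrative values (not from the paper): 150 m field height,
# 42 m footprint, 20% overlap between adjacent rows.
print(number_of_rows(150.0, 42.0, 0.2))  # 5 rows
```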
The image width and length can be computed using Equations (4) and (5), respectively.
$IM_W = 2 F_A \tan\left(\frac{FOV_h}{2} \cdot \frac{\pi}{180}\right)$

$IM_L = 2 F_A \tan\left(\frac{FOV_V}{2} \cdot \frac{\pi}{180}\right)$
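Equations (4) and (5) can be checked with a short script; the altitude and field-of-view angles below are assumed sample values, not the Tello Edu's datasheet figures:

```python
import math

def image_dimensions(flight_altitude_m, fov_h_deg, fov_v_deg):
    """Ground image width and length from Equations (4) and (5):
    IM = 2 * F_A * tan(FOV / 2), with the FOV converted from degrees
    to radians (the pi/180 factor in the equations)."""
    im_w = 2 * flight_altitude_m * math.tan(math.radians(fov_h_deg / 2))
    im_l = 2 * flight_altitude_m * math.tan(math.radians(fov_v_deg / 2))
    return im_w, im_l

# Illustrative values (not from the paper): 30 m altitude,
# 82.6 deg horizontal FOV, 52 deg vertical FOV.
w, l = image_dimensions(30.0, 82.6, 52.0)
print(round(w, 1), round(l, 1))  # 52.7 29.3
```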
The variables $F_A$, $FOV_h$, and $FOV_V$ are the flight altitude, the horizontal field of view, and the vertical field of view, respectively. The UAVs move parallel to the x-axis. The coordinates of the take-off points of the UAVs form three of the nodes of the graph $G = (V, E)$. The movement scenario is depicted in Figure 4, where GL and GW represent the grazing field length and width, respectively.
The graph can be mathematically represented as an $M \times M$ matrix in which the distance $d_{ab}$ between two UAV nodes a and b is computed as the Euclidean distance. Therefore,

$d_{ab} = \| a - b \|, \quad \forall (a, b) \in E$
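The graph's distance matrix can be built directly from node coordinates; the three take-off points below are hypothetical:

```python
import math

def distance_matrix(coords):
    """M x M matrix of Euclidean distances d_ab between node
    coordinates, mirroring the graph representation G = (V, E)."""
    m = len(coords)
    d = [[0.0] * m for _ in range(m)]
    for a in range(m):
        for b in range(m):
            d[a][b] = math.dist(coords[a], coords[b])
    return d

# Three hypothetical take-off coordinates (metres, local frame).
nodes = [(0.0, 0.0), (30.0, 40.0), (60.0, 0.0)]
d = distance_matrix(nodes)
print(d[0][1], d[0][2])  # 50.0 60.0
```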
The flight time for each UAV to cover the grazing field can be computed as follows:

$F_T = \sum_{a}^{M} \sum_{b}^{M} \frac{d_{ab}}{V_{ab}} X_{ab}^{k} + d_{ab}$
where $V_{ab}$, $X_{ab}^{k}$, and $d_{ab}$ are the velocity, position, and distance between the UAVs, respectively. Therefore, the total flight time for the three UAVs is:

$F_{Total} = \sum_{i=1}^{3} \left( \sum_{a}^{M} \sum_{b}^{M} \frac{d_{ab}^{i}}{V_{ab}^{i}} X_{ab}^{k} + d_{k}^{i} \right)$
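A sketch of the per-UAV flight-time computation above, assuming X_ab^k acts as a binary indicator selecting the legs on UAV k's route; the distances, speed, and route below are hypothetical:

```python
def flight_time(d, v, x):
    """Flight-time sum: each leg contributes its travel time
    d_ab / V_ab whenever the decision variable X_ab marks the
    leg (a, b) as part of the route (assumed binary 0/1)."""
    m = len(d)
    return sum(d[a][b] / v[a][b] * x[a][b]
               for a in range(m) for b in range(m))

# Hypothetical 3-node tour at a constant 5 m/s: node 0 -> 1 -> 2.
d = [[0, 50, 60], [50, 0, 50], [60, 50, 0]]   # distances (m)
v = [[5.0] * 3 for _ in range(3)]             # velocities (m/s)
x = [[0, 1, 0], [0, 0, 1], [0, 0, 0]]         # selected legs
print(flight_time(d, v, x))  # (50 + 50) / 5 = 20.0 seconds
```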
The grazing field coverage algorithm for the implementation is presented in Algorithm 1.
Algorithm 1. Grazing Field Algorithm
Input: number of UAVs (N), Map
1:  Capture map area
2:  Initialize flying path P = 1 based on the obtained map coordinates
3:  Compute image width using Equation (4)
4:  Compute image length using Equation (5)
5:  Initialize UAV node deployment
6:  Get a neighbor set of UAVs
7:  for U_N ∈ C_R do
8:      d_R = H / C_L
9:      Establish a UAV flying path
10:     if at least one path exists then
11:         continue
12:     else
13:         path = 0, then stop
14:     end if
15: end for
Output: Coverage region, time, flight path
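Algorithm 1's control flow can be sketched as follows; `establish_path` is a stand-in for the actual flight-path planner, which the paper does not spell out:

```python
def grazing_field_coverage(num_rows, establish_path):
    """Sketch of Algorithm 1: iterate over the rows to be covered and
    try to establish a flying path for each; stop as soon as no path
    can be found (the `path = 0, then stop` branch)."""
    paths = []
    for row in range(num_rows):
        path = establish_path(row)
        if path:           # at least one path exists -> continue
            paths.append(path)
        else:              # no path -> stop
            break
    return paths

# Toy planner: every row except row 3 yields a path.
result = grazing_field_coverage(5, lambda r: [] if r == 3 else [r, r + 1])
print(len(result))  # rows 0-2 succeed before the loop stops: 3
```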
The simulation parameters for the grazing field coverage are presented in Table 1. Some of these are the parameters obtained from the Tello Edu drone datasheet.
The UAVs in a FANET are expected to communicate with each other; this is necessary for effective communication between multiple UAVs and requires a specific routing protocol [26]. Since grazing field monitoring is a process of surveying the field perimeter so that livestock will not move out of the field and violate governmental laws and policies, the UAVs need to communicate with each other as well as with the base station. Hence, this paper proposes a system whereby the captured images are dynamically shared between multiple UAVs over a wireless mesh network using the communication protocol presented in [26]. This data sharing makes task cooperation and sharing between the UAVs efficient. The sequence of operations for the UAV-based FANET communication is depicted in Figure 5.
As shown in Figure 5, UAV-based FANET communication is initiated once data is captured: the UAVs in the FANET operation communicate with one another for information exchange. Provided that the data is captured in the required format, it is stored on the UAV for relaying to other nodes or base stations.
The routing protocol used to share the captured information among the UAVs in the grazing field monitoring was a routing protocol specific to Flying Ad hoc Networks, derived from the standard Ad hoc On-Demand Distance Vector (AODV) protocol as detailed in [26,27]. Standard AODV is a loop-free routing system designed to self-start in a network of nodes and withstand a variety of network behaviors such as node connection failures, mobility, and packet losses. AODV maintains a routing table at each node. A destination's routing table entry requires a next-hop node, a sequence number, and a hop count: the next-hop node receives all packets headed for the destination, the sequence number is a time-stamping mechanism that measures the freshness of a route, and the hop count represents the current distance to the target node [30].
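The routing-table entry described above can be sketched as a small data structure; the freshness rule shown is the standard AODV preference (higher sequence number first, then lower hop count), not code from the paper:

```python
from dataclasses import dataclass

@dataclass
class RouteEntry:
    """One AODV routing-table entry for a destination: the next-hop
    node, a destination sequence number (route freshness), and the
    hop count (current distance to the target node)."""
    next_hop: int
    seq_num: int
    hop_count: int

def fresher(current, candidate):
    """Standard AODV preference rule: a higher sequence number wins;
    on a tie, the shorter route (lower hop count) wins."""
    if candidate.seq_num != current.seq_num:
        return candidate.seq_num > current.seq_num
    return candidate.hop_count < current.hop_count

old = RouteEntry(next_hop=2, seq_num=7, hop_count=4)
new = RouteEntry(next_hop=5, seq_num=8, hop_count=6)
print(fresher(old, new))  # True: newer sequence number wins
```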
For connectivity purposes, a routing path between the FANET nodes must be established, similar to a graph $G = (V, E)$ consisting of a set of nodes and edges $(a, b)$. Mathematically, the probability space over connectivity graphs is given as:

$G(N, \gamma_{ab}) = (\Lambda_N, F_N, \gamma_{N, \gamma_{ab}})$

where $\Lambda_N = \{ G = G(V, E) \}$, $F_N = 2^N$, $\gamma_{N, \gamma_{ab}}(G) = \prod_{(a,b) \in E} \gamma_{ab} \prod_{(a,b) \notin E} (1 - \gamma_{ab})$, $\gamma_{ab}$ is the probability of determining an edge between vertices a and b, and E is the random set of edges.
When the UAV nodes are static, the graph is also static. Therefore, $G \in \{ H_N, \gamma \}$, $H_N = (V, E_N)$, where:

$\gamma_{ab} = \begin{cases} \gamma, & (a, b) \in E_N \\ 0, & (a, b) \notin E_N \end{cases}$

$\gamma_{N, \gamma_{ab}}(G) = \gamma^{|E|} (1 - \gamma)^{|E_N| - |E|}$
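For the static case, the probability of observing a particular link configuration follows directly from the last expression; the link counts and per-link probability below are illustrative:

```python
def graph_probability(gamma, num_edges, num_possible_edges):
    """Probability of observing a particular graph G when each of the
    |E_N| possible links is present independently with probability
    gamma: gamma^|E| * (1 - gamma)^(|E_N| - |E|)."""
    return gamma ** num_edges * (1 - gamma) ** (num_possible_edges - num_edges)

# Three UAVs -> 3 possible links; suppose 2 links are up and each
# link exists with probability 0.9 (illustrative value).
p = graph_probability(0.9, 2, 3)
print(round(p, 4))  # 0.9^2 * 0.1^1 = 0.081
```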
The set $E_N$ contains the possible edges of the underlying graph $H_N$. The algorithm for the FANET node communication is presented in Algorithm 2. Under this algorithm, the UAVs in operation must be within the same transmission range for the nodes to communicate with each other. The transmission range depends on the type of communication medium used, and it is also essential to state the size of the data captured before initiating UAV connections [26]. On this basis, the UAVs can connect within the map coverage area for cooperation using the connectivity routing algorithm presented in Algorithm 2. Since the nodes move randomly within the coverage area, each UAV establishes a connection with its closest node; if the request to establish a connection is unsuccessful, the node requests again until a connection is established.
Algorithm 2. Connectivity Routing Algorithm
Input: UAV coverage area, transmission range
1:  Obtain coverage area from map
2:  Enter transmission range
3:  Enter packet size to be captured
4:  Enter UAV speed
5:  Get a neighbor set of UAVs (D(x, y))
6:  if D(x, y) < Tmin then
7:      Establish a UAV connection
8:      Exchange data
9:  else
10:     Transmission range not reachable
11:     Return to step 5
12: end if
Output: Connection established
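A minimal sketch of Algorithm 2, assuming each UAV greedily tries its closest neighbour and retries on failure; the positions are hypothetical, and the 100 m range matches the value used in the simulation:

```python
import math

def connectivity_routing(uav_positions, tx_range, max_retries=3):
    """Sketch of Algorithm 2: each UAV tries to connect to its closest
    neighbour; the link is established only if the neighbour distance
    D(x, y) is within the transmission range, otherwise it retries."""
    links = []
    for i, p in enumerate(uav_positions):
        others = [(math.dist(p, q), j)
                  for j, q in enumerate(uav_positions) if j != i]
        for _ in range(max_retries):
            dist, j = min(others)
            if dist < tx_range:        # D(x, y) < Tmin -> connect
                links.append((i, j))
                break                  # connection established
    return links

# Hypothetical positions (metres) with a 100 m transmission range.
uavs = [(0.0, 0.0), (60.0, 0.0), (60.0, 80.0)]
print(connectivity_routing(uavs, 100.0))  # [(0, 1), (1, 0), (2, 1)]
```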
Table 2 shows the simulation parameters for the connectivity routing algorithm.
The next section presents the results and discussion.

4. Results and Discussion

This section uses a series of simulations to demonstrate the proposed methodology. All simulations were run in MATLAB on an HP Pavilion 15 with an Intel Core i7 3.2 GHz processor and 8 GB of RAM. The YALMIP optimization toolbox was adopted to minimize the coverage time for the UAVs on the grazing fields.
The simulation started by exploring the coverage time equation presented in Equation (4) using three UAVs, with a setup time of 120 s and a total flight time of 10 min, corresponding to the parameters in Table 1. The collected map was used as the input to the algorithm, as shown in Figure 6. In Figure 7, the grazing field map was carved out and presented. The path extracted for the flight coverage during the mission is presented in Figure 8.
The carved-out region of the extracted map is depicted with a black spot, as shown in the diagram. The UAV flight path is computed based on the map's carved-out region, and the obtained flight path is presented in Figure 8.
The green, red, and blue paths are the paths to be followed by the UAVs during the grazing field surveillance; the paths are interwoven because the UAVs move randomly. The operation times of the UAVs, obtained without determining the optimal route, were 23.3 min, 25.1 min, and 25.2 min, respectively. This shows that the UAVs could perform their mission on the grazing field.
The longest distance from the lane to the grazing field was 149.14 m; this parameter was required to establish the UAV connectivity routing algorithm. However, the UAV paths can be further optimized to reduce the coverage time, allowing the UAVs to complete their mission faster and conserve energy. The initial coverage time of the three UAVs before the optimal route is determined is benchmarked against the coverage time when the optimal route is determined, as presented in Figure 9.
As shown in Figure 10, the UAV coverage time is presented with (1), (2), and (3) on the x-axis representing the first, second, and third UAVs, respectively, and the coverage time on the y-axis. After the route is optimized, the obtained coverage times are 21.3 min, 21.1 min, and 19.2 min for UAV1, UAV2, and UAV3, respectively, against the initial 23.3 min, 25.1 min, and 25.2 min. The optimized route path thus covers the mission area while lowering energy consumption. Long coverage durations have been a limiting problem for UAVs in grazing field observation: between trips, the setup time for landing operations, battery replacement, and take-off can take longer than the actual flight time, which directly limits the amount of survey area that can be flown in each operation.
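From the reported times, the per-UAV relative improvement can be computed directly:

```python
# Coverage times reported above (minutes): before vs. after route optimization.
before = [23.3, 25.1, 25.2]
after = [21.3, 21.1, 19.2]

# Percentage reduction in coverage time for UAV1, UAV2, UAV3.
savings = [round((b - a) / b * 100, 1) for b, a in zip(before, after)]
print(savings)  # [8.6, 15.9, 23.8]
```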
Experts have proposed that UAVs with longer battery life be used for extended grazing field surveillance operations. As such, the flight time was extended from the Tello Edu's 30 min to 40 min, 50 min, and 60 min to determine the effect of flight time on UAV flight operation. Table 3 shows the results of altering the flight time on the grazing field.
As presented in Table 3, the relationship between flight time and coverage time is linear: an increase in flight time leads to a corresponding increase in the time spent on coverage. The result shows that as the flight time increases, the UAVs can sense and cover the area better, increasing the coverage time. To see the effect of the setup time on the mission coverage time, the flight time was kept constant and the setup time was varied. Figure 11 and Figure 12 show how the setup time during a mission affects the coverage time.
It is observed that as the setup time increases, the coverage time also increases.
Looking at the effect of the setup time when the UAVs are tasked to determine the optimal route, Figure 11 shows that the presented algorithm can effectively reduce the coverage time even as the setup time increases. The land coverage results obtained using the map of the grazing field were then used to check the connectivity between the UAV nodes on the grazing field, in conjunction with the parameters presented in Table 2. The typical coverage distance of a UAV's embedded Wi-Fi is between 60 and 140 m [31]; if the distance between the UAVs exceeds the maximum range, communication cannot be established. As such, 100 m was chosen to simulate the connection between the UAVs in the grazing field. The result obtained for the connectivity routing of the UAVs is presented in Figure 12.
As shown in Figure 12, the UAVs could intercommunicate in the grazing field to share resources, thereby cooperating to achieve the mission faster and more effectively. The number of nodes in the simulation was doubled to depict the fast-moving nature of UAV nodes. A base station, representing the farmer/end-user controlling the entire grazing field operation, was also considered in the simulation. This process demonstrates the cooperative communication of the entire operation on the grazing field using a real-life mapping scenario. The following section presents the conclusions drawn from the work and further research.

5. Conclusions

This paper has presented a coverage solution using multiple unmanned aerial vehicles that communicate with each other while surveying a grazing field. This is in response to governmental policies in several countries that restrict the grazing areas, fields, or paths for livestock; such policies are intended to make herding operations effective and disaster-free. To this end, this paper presented a technique that relies on extracting information from aerial pictures to provide a coverage solution for herding operations, thereby offering a direct and cost-effective method of monitoring livestock for perimeter coverage as well as under other natural conditions.
The method's key contribution is that it directly and formally examines various practical issues that only arise during actual UAV deployment on a grazing field. The number of UAVs employed in this paper is the bare minimum for forming a FANET; in general, the number of UAVs should be determined by the size and layout of the grazing field and the vehicles' maximum flight time in a mission. It is worth noting that the technique might not be able to solve the coverage problem if the battery life is insufficient for the size of the region to be covered. Nonetheless, the coverage time for the UAVs was optimized for adequate coverage. Further work should consider real-world implementation (prototype development) by introducing an FPGA- or Arduino-based controller, in addition to the drones' existing controllers, to avoid collisions between the UAVs.

Author Contributions

Conceptualization, B.O.S. and Y.A.S.; methodology, B.O.S., Y.A.S. and H.R.E.H.B.; software, B.O.S.; validation, Y.A.S. and H.R.E.H.B.; formal analysis, Y.A.S.; investigation, B.O.S.; resources, B.O.S. and Y.A.S.; writing—original draft preparation, M.A.A., B.O.S., Y.A.S. and H.R.E.H.B.; writing—review and editing, M.A.A., B.O.S., Y.A.S. and H.R.E.H.B.; visualization, Y.A.S.; supervision, H.R.E.H.B. and M.A.A.; project administration, M.A.A.; funding acquisition, M.A.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Deputyship for Research and Innovation, Ministry of Education in Saudi Arabia, grant number IFP-A-201-2-1.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Acknowledgments

The authors extend their appreciation to the Deputyship for Research and Innovation, Ministry of Education in Saudi Arabia for funding this research work through the project No IFP-A-201-2-1.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Alanezi, M.A.; Shahriar, M.S.; Hasan, M.B.; Ahmed, S.; Sha’aban, Y.A.; Bouchekara, H.R. Livestock Management with Unmanned Aerial Vehicles: A Review. IEEE Access 2022, 10, 45001–45028.
  2. Nintanavongsa, P.; Pitimon, I. Impact of sensor mobility on UAV-based smart farm communications. In Proceedings of the International Electrical Engineering Congress (iEECON), Pattaya, Thailand, 8–10 March 2017.
  3. Maddikunta, P.K.R.; Hakak, S.; Alazab, M.; Bhattacharya, S.; Gadekallu, T.R.; Khan, W.Z.; Pham, Q.V. Unmanned aerial vehicles in smart agriculture: Applications, requirements, and challenges. IEEE Sens. J. 2021, 21, 17608–17619.
  4. Li, X.; Xing, L. Use of unmanned aerial vehicles for livestock monitoring based on streaming K-means clustering. IFAC Pap. Online 2019, 52, 324–329.
  5. Goolsby, J.; Jung, J.; Landivar, J.; McCutcheon, W.; Lacewell, R.; Duhaime, R.; Schwartz, A. Evaluation of unmanned aerial vehicles (UAVs) for detection of cattle in the cattle fever tick permanent quarantine zone. Subtrop. Agric. Environ. 2016, 67, 24–27.
  6. Hogan, S.D.; Kelly, M.; Stark, B.; Chen, Y. Unmanned aerial systems for agriculture and natural resources. Calif. Agric. 2017, 71, 5–14.
  7. Herlin, A.; Brunberg, E.; Hultgren, J.; Högberg, N.; Rydberg, A.; Skarin, A. Animal welfare implications of digital tools for monitoring and management of cattle and sheep on pasture. Animals 2021, 11, 829.
  8. Al-Thani, N.; Albuainain, A.; Alnaimi, F.; Zorba, N. Drones for sheep livestock monitoring. In Proceedings of the 2020 IEEE 20th Mediterranean Electrotechnical Conference (MELECON), Palermo, Italy, 6 June 2020; pp. 672–676.
  9. Sarwar, F.; Griffin, A.; Periasamy, P.; Portas, K.; Law, J. Detecting and counting sheep with a convolutional neural network. In Proceedings of the 2018 15th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS), Auckland, New Zealand, 27–30 November 2018; pp. 1–6.
  10. Vayssade, J.A.; Arquet, R.; Bonneau, M. Automatic activity tracking of goats using drone camera. Comput. Electron. Agric. 2019, 162, 767–772.
  11. Mulero-Pazmany, M.; Barasona, J.A.; Acevedo, P.; Vicente, J.; Negro, J.J. Unmanned Aircraft Systems complement biologging in spatial ecology studies. Ecol. Evol. 2015, 5, 4808–4818.
  12. Nyamuryekung’e, S.; Cibils, A.F.; Estell, R.E.; Gonzalez, A.L. Use of an unmanned aerial vehicle-mounted video camera to assess feeding behavior of Raramuri Criollo cows. Rangel. Ecol. Manag. 2016, 69, 386–389.
  13. Mufford, J.T.; Hill, D.J.; Flood, N.J.; Church, J.S. Use of unmanned aerial vehicles (UAVs) and photogrammetric image analysis to quantify spatial proximity in beef cattle. J. Unmanned Veh. Syst. 2019, 7, 194–206.
  14. Zin, T.T.; Kobayashi, I.; Tin, P.; Hama, H. A general video surveillance framework for animal behavior analysis. In Proceedings of the Third International Conference on Computing Measurement Control and Sensor Network (CMCSN), Matsue, Japan, 20–22 May 2016; pp. 130–133.
  15. Li, X.; Huang, H.; Savkin, A.V.; Zhang, J. Robotic Herding of Farm Animals Using a Network of Barking Aerial Drones. Drones 2022, 6, 29.
  16. Jung, S.; Ariyur, K.B. Strategic cattle roundup using multiple quadrotor UAVs. Int. J. Aeronaut. Space Sci. 2017, 18, 315–326.
  17. Sun, Y.; Yi, S.; Hou, F.; Luo, D.; Hu, J.; Zhou, Z. Quantifying the dynamics of livestock distribution by unmanned aerial vehicles (UAVs): A case study of yak grazing at the household scale. Rangel. Ecol. Manag. 2020, 73, 642–648.
  18. Favier, M.A.; Green, R.; Linz, A. The potential for UAV technology to assist in sheep management in the Scottish Highlands. Bornimer Agrartech. Ber. 2013, 81, 209–222.
  19. Brown, J.; Qiao, Y.; Clark, C.; Lomax, S.; Rafique, K.; Sukkarieh, S. Automated aerial animal detection when spatial resolution conditions are varied. Comput. Electron. Agric. 2022, 193, 106689.
  20. Yaxley, K.J.; Joiner, K.F.; Abbass, H. Drone approach parameters leading to lower stress sheep flocking and movement: Sky shepherding. Sci. Rep. 2021, 11, 7803.
  21. Barbedo, J.G.A.; Koenigkan, L.V.; Santos, P.M.; Ribeiro, A.R.B. Counting cattle in UAV images—dealing with clustered animals and animal/background contrast changes. Sensors 2020, 20, 2126.
  22. Barbedo, J.G.A.; Koenigkan, L.V.; Santos, P.M. Cattle detection using oblique UAV images. Drones 2020, 4, 75.
  23. Andrew, W.; Greatwood, C.; Burghardt, T. Aerial animal biometrics: Individual Friesian cattle recovery and visual identification via an autonomous UAV with onboard deep inference. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; pp. 237–243.
  24. Soares, V.H.A.; Ponti, M.A.; Gonçalves, R.A.; Campello, R.J. Cattle counting in the wild with geolocated aerial images in large pasture areas. Comput. Electron. Agric. 2021, 189, 106354.
  25. Shao, W.; Kawakami, R.; Yoshihashi, R.; You, S.; Kawase, H.; Naemura, T. Cattle detection and counting in UAV images based on convolutional neural networks. Int. J. Remote Sens. 2020, 41, 31–52.
  26. Sadiq, B.O.; Adedokun, A.E.; Mu’azu, M.B.; Sha’aban, Y.A. A Specific Routing Protocol for Flying Adhoc Network. Telkomnika 2018, 16, 606–617.
  27. Sadiq, B.O.; Salawudeen, A.T.; Sha’aban, Y.A.; Adedokun, E.A.; Mu’Azu, M.B. Interface protocol design: A communication guide for indoor FANET. Telecommun. Comput. Electron. Control. 2019, 17, 3175–3182.
  28. Avellar, G.S.; Pereira, G.A.; Pimenta, L.C.; Iscold, P. Multi-UAV routing for area coverage and remote sensing with minimum time. Sensors 2015, 15, 27783–27803.
  29. Sadiq, B.O.; Salawudeen, A.T. FANET optimization: A destination path flow model. Int. J. Electr. Comput. Eng. 2020, 10, 4381–4389.
  30. Leonov, A.V.; Litvinov, G.A.; Shcherba, E.V. Simulation and comparative analysis of packet delivery in flying ad hoc network (FANET) using AODV. In Proceedings of the 19th International Conference of Young Specialists on Micro/Nanotechnologies and Electron Devices (EDM), Altai Republic, Russia, 29 June–3 July 2018; pp. 71–78.
  31. Herlich, M.; Yamada, S. Optimal distance of multi-hop 802.11 WiFi relays. In Proceedings of the IEICE Society Conference, Tokyo, Japan, 8 September 2014.
Figure 1. Conceptual Framework.
Figure 2. The Grazing Field.
Figure 3. Coverage Concept Scenario.
Figure 4. Grazing Field Movement Scenario.
Figure 5. Operation Sequence for the UAV-based FANET communication.
Figure 6. Processing Map Region for Grazing Field.
Figure 7. Carved out Map Region for Grazing Field.
Figure 8. Flight Path for the UAVs.
Figure 9. UAV Coverage Time.
Figure 10. The Effect of Setup Time on Coverage.
Figure 11. The Effect of Setup Time on Coverage (optimal route).
Figure 12. UAV Connectivity Routing.
Table 1. Simulation Parameters.

Parameter                Value
FOV_h (horizontal)       84.6°
FOV_v (vertical)         73.7°
Resolution               2592 × 1944
Number of UAVs           3
UAV setup time           60 s
UAV flight time          30 min
Flight height            10 m
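As a rough illustration (not given in the paper), the camera parameters in Table 1 imply a ground footprint per image via the standard pinhole relation w = 2h·tan(FOV/2). The sketch below assumes the FOV values are in degrees and uses a hypothetical helper name:

```python
import math

def ground_footprint(fov_deg: float, height_m: float) -> float:
    """Ground distance (m) spanned along one camera axis at a given flight height,
    using the pinhole relation w = 2 * h * tan(FOV / 2)."""
    return 2 * height_m * math.tan(math.radians(fov_deg) / 2)

# Table 1 values: FOV_h = 84.6 deg, FOV_v = 73.7 deg, flight height = 10 m
swath_h = ground_footprint(84.6, 10)  # horizontal swath, about 18.2 m
swath_v = ground_footprint(73.7, 10)  # vertical swath, about 15.0 m
print(round(swath_h, 1), round(swath_v, 1))
```

A footprint of roughly 18 m × 15 m per frame, at the 2592 × 1944 resolution in Table 1, indicates a ground sampling distance under a centimeter per pixel, consistent with the low 10 m flight height used for livestock observation.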
Table 2. Connectivity Routing Algorithm Parameters.

Parameter        Value
Tmin             100 m
Packet size      10
UAV speed        20 m/s
Table 3. UAV in Flight Operation.

Flight Time (min)    Tin                  Topt
40                   UAV1 = 22.8387       UAV1 = 20.8387
                     UAV2 = 24.7665       UAV2 = 20.7665
                     UAV3 = 24.3536       UAV3 = 18.3536
50                   UAV1 = 23.6503       UAV1 = 21.6503
                     UAV2 = 25.3484       UAV2 = 21.3484
                     UAV3 = 25.2878       UAV3 = 19.2878
60                   UAV1 = 23.9816       UAV1 = 21.9816
                     UAV2 = 25.9041       UAV2 = 21.9041
                     UAV3 = 25.5101       UAV3 = 19.5101
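To make the Tin vs. Topt comparison in Table 3 concrete, the per-UAV reduction in coverage time under the optimal route can be computed directly from the tabulated values (a small sketch; the dictionary layout and variable names are ours, the numbers are copied from Table 3):

```python
# Coverage times (min) from Table 3: initial route (Tin) vs. optimal route (Topt),
# listed for flight times of 40, 50 and 60 min respectively.
tin = {"UAV1": [22.8387, 23.6503, 23.9816],
       "UAV2": [24.7665, 25.3484, 25.9041],
       "UAV3": [24.3536, 25.2878, 25.5101]}
topt = {"UAV1": [20.8387, 21.6503, 21.9816],
        "UAV2": [20.7665, 21.3484, 21.9041],
        "UAV3": [18.3536, 19.2878, 19.5101]}

# Time saved by the optimal route for each UAV at each flight-time setting.
savings = {uav: [round(a - b, 4) for a, b in zip(tin[uav], topt[uav])]
           for uav in tin}
print(savings)
```

Running this shows the saving is constant across flight-time settings: 2 min for UAV1, 4 min for UAV2, and 6 min for UAV3, i.e., the optimal route's benefit grows with the UAV's route length rather than with total flight time.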
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
