Cost-Minimizing System Design for Surveillance of Large, Inaccessible Agricultural Areas Using Drones of Limited Range
Abstract
1. Introduction
2. Materials and Methods
2.1. Model Assumptions
- It is assumed that minimizing system cost takes priority over all other factors. This will usually be the case in resource-poor settings. However, if the user wants to investigate alternative system configurations with shorter mission times, they may reduce the CS radius parameter (explained below), which will lead to higher-cost systems with reduced mission times.
- The area that is to be surveilled (henceforth referred to as “the field”) is polygonal and may thus be completely specified by the vertices of the bounding polygon.
- The drone is assumed to have a single flight mode, in which the drone’s camera runs continuously and video information is either stored onboard or is continuously transmitted back as the drone flies.
- The drone’s height, velocity, and power consumption are assumed to be constant throughout the drone’s mission (except when recharging).
- A start location is specified by the user, which is a point (presumably somewhere on the field’s edge) from which the drone is originally launched to conduct its mission. When launched from the start location, the drone is fully charged.
- Possible CS locations are scattered throughout the region and are specified to the algorithm by the user. This assumption is included to accommodate the situation where some locations within the field are more favorable than others for CS installation. If all locations are equally favorable, the user can input a regular or quasirandom lattice of points that fills the entire region.
2.2. Mission Path Planning Algorithm Justification and Outline
1. Initialize parameters for field specifications, potential CS locations, and drone range specifications;
2. Create a binary grid representation of the field;
3. Use iterative linear programming to find a minimal set of CSs that minimizes the maximum distance from field points to the nearest CS;
4. Decompose the field into Voronoi cells based on the selected CS locations;
5. Compute a closed spanning walk visiting all CSs using a modified traveling salesman algorithm;
6. Perform a triangular decomposition of each Voronoi cell;
7. Construct drone paths within each triangular region.
2.3. Initialization
- Field of view (FOV) radius: The camera’s field of view is assumed to be circular with fixed radius. As the drone flies, it captures continuous video, thus covering a strip whose width is twice the FOV radius.
- Drone range: This is the maximum distance that a drone can reliably travel on a full charge.
- CS radius: The CS radius is the maximum allowable distance between any point in the field and the nearest CS. In other words, a CS located at point C services a region that lies within a circle whose radius is equal to the CS radius with center at C. The CS radius must necessarily be less than or equal to half the drone’s range. However, the user has the option of choosing a smaller CS radius. A smaller CS radius will tend to produce solutions that have more CSs and shorter surveillance mission times.
- Mesh step: For the sake of numerical computations, the coverage region is represented as a rectangular grid, whose mesh size is given by the mesh step.
- Field vertices: The user specifies the polygonal field area by listing the field vertices in consecutive order. In our simulations we employed three basic shapes (rectangle, square, and octagon, as shown in Figure 1), and each shape had three different sizes corresponding to total areas of 25, 50, and 100 km².
- Start location: This is the drone launching point, as explained in Section 2.1.
- CS vector length: According to our assumptions, there will be a number of favorable locations within the field for placing the CSs. In our simulation, these favorable locations are chosen randomly. The CS vector length gives the number of randomly generated favorable locations within the field. In our simulation, we chose the CS vector length so that there was an average of 1 favorable location per square kilometer. In practice, the user would determine favorable locations by observation and supply these locations to the algorithm.
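The binary grid representation of the field (step 2 of the outline in Section 2.2) can be sketched with a standard ray-casting point-in-polygon test. This is an illustrative sketch, not the paper’s implementation; the function names are ours.

```python
def point_in_polygon(x, y, verts):
    """Ray-casting test: is (x, y) strictly inside the polygon verts?"""
    inside = False
    n = len(verts)
    for i in range(n):
        x1, y1 = verts[i]
        x2, y2 = verts[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def binary_grid(verts, mesh_step):
    """Binary grid representation of the field: 1 = inside, 0 = outside."""
    xs = [v[0] for v in verts]
    ys = [v[1] for v in verts]
    x0, y0 = min(xs), min(ys)
    nx = int((max(xs) - x0) / mesh_step) + 1
    ny = int((max(ys) - y0) / mesh_step) + 1
    return [[1 if point_in_polygon(x0 + i * mesh_step,
                                   y0 + j * mesh_step, verts) else 0
             for i in range(nx)]
            for j in range(ny)]
```

The grid points where the entry equals 1 form the discretized coverage region used in the CS-selection step.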
2.4. Coverage Field Specification
2.5. Charging Station Location Selection
- K and M are the number of candidate CSs and the number of points in the field discretization, respectively.
- 1 is a vector of all 1’s (of the appropriate dimension).
- A is an M × K incidence matrix. The entry A(m, k) is equal to 1 if the mth grid point in the field discretization is within a distance R of CS k, and 0 otherwise.
- x is a K × 1 solution vector, interpreted as a logical index vector: x(k) = 1 if CS k is in the solution, and x(k) = 0 otherwise.
These quantities define a set-cover integer program: minimize the number of selected CSs (the sum of the entries of x) subject to Ax ≥ 1 (every grid point covered), with x binary.
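The CS-selection step can be illustrated in miniature. The paper solves it via iterative linear programming; the sketch below substitutes a standard greedy set-cover heuristic for brevity, and the function names are ours.

```python
import math

def coverage_matrix(grid_pts, cs_sites, R):
    """A[m][k] = 1 if grid point m is within distance R of candidate CS k."""
    return [[1 if math.dist(p, c) <= R else 0 for c in cs_sites]
            for p in grid_pts]

def greedy_set_cover(A):
    """Pick candidate CSs until every grid point is covered.
    Greedy stand-in for the paper's iterative-LP set-cover step."""
    M, K = len(A), len(A[0])
    uncovered = set(range(M))
    chosen = []
    while uncovered:
        # candidate covering the most still-uncovered grid points
        best = max(range(K), key=lambda k: sum(A[m][k] for m in uncovered))
        gained = {m for m in uncovered if A[m][best] == 1}
        if not gained:
            raise ValueError("some grid points cannot be covered by any CS")
        chosen.append(best)
        uncovered -= gained
    return chosen
```

The greedy heuristic is not guaranteed optimal, but it respects the same feasibility constraint (Ax ≥ 1) as the exact formulation.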
2.6. Creation of Voronoi Regions
2.7. Charging Station Spanning Walk Construction
- (a) Create a 0–1 vertex–vertex adjacency matrix, where vertices represent CSs and adjacent vertices represent CSs with adjacent Voronoi cells.
- (b) Use Dijkstra’s algorithm to find the shortest path linking every pair of vertices in the graph described by this adjacency matrix.
- (c) Create a weight matrix for the complete graph using the path lengths computed in (b).
- (d) Solve for a weight-minimizing Hamiltonian cycle that includes all CSs.
- (e) Replace each edge in the solution from (d) whose weight is greater than 1 with the intermediate edges obtained by Dijkstra’s algorithm, yielding the complete walk on the original graph.
- N and E denote the number of vertices and edges in the graph, respectively;
- w is an E × 1 vector, where w(j) is the weight associated with the jth edge in the complete graph;
- M is an N × E vertex–edge incidence matrix. The entry M(i, j) is equal to 1 if node i is an endpoint of edge j, and 0 otherwise;
- 2 is a vector of all 2’s, with one entry per vertex (in a Hamiltonian cycle, each vertex has exactly two incident edges).
- x is an E × 1 solution vector, interpreted as a logical index vector: x(j) = 1 if edge j is in the solution, and x(j) = 0 otherwise.
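Steps (a)–(e) can be sketched in miniature. This illustrative implementation (function name ours) substitutes Floyd–Warshall for the repeated Dijkstra runs and a brute-force search for the Hamiltonian-cycle solver, which is adequate only for small CS counts; it assumes the CS adjacency graph is connected.

```python
from itertools import permutations

def spanning_walk(adj):
    """Closed walk visiting every CS, following steps (a)-(e).
    adj: 0-1 adjacency matrix of CSs with neighboring Voronoi cells."""
    n = len(adj)
    INF = float("inf")
    # all-pairs shortest paths (Floyd-Warshall) with path reconstruction
    dist = [[0 if i == j else (1 if adj[i][j] else INF) for j in range(n)]
            for i in range(n)]
    nxt = [[j if dist[i][j] < INF else None for j in range(n)]
           for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
                    nxt[i][j] = nxt[i][k]
    # minimum-weight Hamiltonian cycle on the complete weighted graph
    best_tour, best_cost = None, INF
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        cost = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if cost < best_cost:
            best_tour, best_cost = tour, cost
    # expand multi-hop edges into intermediate vertices (step (e))
    walk = [0]
    for a, b in zip(best_tour, best_tour[1:]):
        while a != b:
            a = nxt[a][b]
            walk.append(a)
    return walk
```

For a chain of three CSs (0–1–2), the resulting closed walk revisits the middle CS on the way back, as step (e) requires.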
2.8. Drone Trajectories within Individual Voronoi Cells
- Fly from the first vertex of the triangle to the second;
- Fly from the second vertex to the third;
- Go back and forth parallel to the first edge until the entire region is covered.
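The back-and-forth sweep within a triangle can be sketched as follows. This is an illustrative sketch under our own assumptions (the function name and parameterization are ours): strips run parallel to one edge of the triangle, spaced twice the FOV radius apart so that adjacent strips tile the region.

```python
import math

def triangle_sweep(A, B, C, fov):
    """Back-and-forth waypoints covering triangle ABC with strips
    parallel to edge AB, spaced 2 * fov apart (illustrative sketch)."""
    ax, ay = A; bx, by = B; cx, cy = C
    # perpendicular distance from C to line AB = triangle height
    area2 = abs((bx - ax) * (cy - ay) - (by - ay) * (cx - ax))
    h = area2 / math.dist(A, B)
    n_strips = max(1, math.ceil(h / (2 * fov)))
    waypoints = []
    for i in range(n_strips):
        t = (i + 0.5) / n_strips       # fraction of the way from C to AB
        p = (cx + t * (ax - cx), cy + t * (ay - cy))  # on edge CA
        q = (cx + t * (bx - cx), cy + t * (by - cy))  # on edge CB
        # alternate direction so consecutive strips connect end-to-end
        waypoints.extend([p, q] if i % 2 == 0 else [q, p])
    return waypoints
```

Because each strip’s endpoints lie on edges CA and CB, the segments are automatically clipped to the triangle.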
2.9. Performance Evaluation
- D(r) is the total distance that the drone flies while its distance to the nearest CS is greater than r, for 0 ≤ r ≤ R. Note that D(R) = 0, while D(0) is equal to the total distance that the drone flies during its surveillance mission (denoted by D).
- A(r) is the field area that lies farther than a distance r from the nearest CS, for 0 ≤ r ≤ R. Note that A(R) = 0, while A(0) is equal to the total field area.
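The distance metric above (the path length flown farther than r from the nearest CS) can be estimated numerically by sampling the planned path; the following is an illustrative sketch with a function name and sampling scheme of our own choosing.

```python
import math

def distance_beyond(path, stations, r, samples_per_seg=100):
    """Estimate the path length flown farther than r from the nearest
    charging station, by midpoint-sampling each path segment."""
    total = 0.0
    for (x1, y1), (x2, y2) in zip(path, path[1:]):
        seg_len = math.dist((x1, y1), (x2, y2))
        step = seg_len / samples_per_seg
        for i in range(samples_per_seg):
            t = (i + 0.5) / samples_per_seg
            pt = (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
            if min(math.dist(pt, s) for s in stations) > r:
                total += step
    return total
```

With r = 0 the estimate reduces to the total path length; accuracy at other r improves with samples_per_seg.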
2.10. Simulation Specifications
3. Results
4. Discussion
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
Appendix A. Data Tables
Shape | Area | CS Radius | nCS | | Total Distance Traveled | | Theoretical Best Distance | | Total Mission Time | | Theo. Best Mission Time |
---|---|---|---|---|---|---|---|---|---|---|---|---
- | - | - | Mean | std | Mean | std | Mean | std | Mean | std | Mean | std
Octagon | 25 | 2 | 5.6 | 0.49 | 577.68 | 3.15 | 514.03 | 1.52 | 69.32 | 0.38 | 61.68 | 0.18 |
Octagon | 25 | 2.5 | 4.03 | 0.17 | 585.3 | 5.17 | 520.83 | 1.75 | 70.24 | 0.62 | 62.5 | 0.21 |
Octagon | 25 | 3 | 2.97 | 0.17 | 604.5 | 15.88 | 531.42 | 4.85 | 72.54 | 1.91 | 63.77 | 0.58 |
Octagon | 25 | 3.5 | 2.27 | 0.45 | 664.41 | 42.99 | 549.09 | 11.59 | 79.73 | 5.16 | 65.89 | 1.39 |
Octagon | 25 | 3.9 | 2.01 | 0.1 | 696.21 | 31.01 | 555.98 | 3.03 | 83.54 | 3.72 | 66.72 | 0.36 |
Octagon | 50 | 2 | 9.29 | 0.46 | 1149.18 | 5.11 | 1032.16 | 1.75 | 137.9 | 0.61 | 123.86 | 0.21 |
Octagon | 50 | 2.5 | 6.36 | 0.48 | 1172.33 | 10.32 | 1049.87 | 4.42 | 140.68 | 1.24 | 125.98 | 0.53 |
Octagon | 50 | 3 | 5 | 0 | 1223.12 | 29.13 | 1066.9 | 2.77 | 146.77 | 3.5 | 128.03 | 0.33 |
Octagon | 50 | 3.5 | 4 | 0 | 1351.37 | 55.4 | 1091.9 | 6.63 | 162.16 | 6.65 | 131.03 | 0.8 |
Octagon | 50 | 3.9 | 3.09 | 0.29 | 1470.21 | 75.66 | 1138.25 | 19.82 | 176.43 | 9.08 | 136.59 | 2.38 |
Octagon | 100 | 2 | 16.36 | 0.63 | 2299.21 | 6.25 | 2070.7 | 2.9 | 275.91 | 0.75 | 248.48 | 0.35 |
Octagon | 100 | 2.5 | 10.5 | 0.52 | 2345.68 | 12.95 | 2116.3 | 6.96 | 281.48 | 1.55 | 253.96 | 0.83 |
Octagon | 100 | 3 | 7.98 | 0.2 | 2520.19 | 51.55 | 2165.86 | 8.36 | 302.42 | 6.19 | 259.9 | 1 |
Octagon | 100 | 3.5 | 6.01 | 0.1 | 2945.54 | 74.01 | 2244.15 | 14.97 | 353.46 | 8.88 | 269.3 | 1.8 |
Octagon | 100 | 3.9 | 5.02 | 0.14 | 3221.37 | 79.58 | 2317.22 | 22.71 | 386.56 | 9.55 | 278.07 | 2.73 |
Rectangle | 25 | 2 | 6.14 | 0.57 | 574.38 | 4.09 | 513.08 | 1.47 | 68.93 | 0.49 | 61.57 | 0.18 |
Rectangle | 25 | 2.5 | 4.19 | 0.39 | 578.6 | 6.41 | 520.6 | 2.1 | 69.43 | 0.77 | 62.47 | 0.25 |
Rectangle | 25 | 3 | 3.06 | 0.24 | 622.16 | 18.73 | 531.62 | 3.39 | 74.66 | 2.25 | 63.79 | 0.41 |
Rectangle | 25 | 3.5 | 3 | 0 | 625.99 | 17.08 | 532.38 | 2.22 | 75.12 | 2.05 | 63.89 | 0.27 |
Rectangle | 25 | 3.9 | 2.06 | 0.24 | 777.67 | 41.05 | 566.75 | 14.4 | 93.32 | 4.93 | 68.01 | 1.73 |
Rectangle | 50 | 2 | 10.07 | 0.57 | 1146.14 | 5.44 | 1030.17 | 2.13 | 137.54 | 0.65 | 123.62 | 0.26 |
Rectangle | 50 | 2.5 | 7.11 | 0.4 | 1172.43 | 10.29 | 1045.41 | 4.27 | 140.69 | 1.23 | 125.45 | 0.51 |
Rectangle | 50 | 3 | 5.08 | 0.31 | 1269.67 | 34.03 | 1068.52 | 5.6 | 152.36 | 4.08 | 128.22 | 0.67 |
Rectangle | 50 | 3.5 | 4.04 | 0.2 | 1406.68 | 46.76 | 1092.11 | 5.06 | 168.8 | 5.61 | 131.05 | 0.61 |
Rectangle | 50 | 3.9 | 3.67 | 0.47 | 1502.07 | 134.27 | 1110.79 | 25.23 | 180.25 | 16.11 | 133.3 | 3.03 |
Rectangle | 100 | 2 | 17.46 | 0.61 | 2295.22 | 6.94 | 2066.85 | 2.58 | 275.43 | 0.83 | 248.02 | 0.31 |
Rectangle | 100 | 2.5 | 11.04 | 0.49 | 2346.95 | 14.9 | 2111.61 | 6.47 | 281.63 | 1.79 | 253.39 | 0.78 |
Rectangle | 100 | 3 | 8.38 | 0.49 | 2566.02 | 66.03 | 2160.66 | 12.54 | 307.92 | 7.92 | 259.28 | 1.51 |
Rectangle | 100 | 3.5 | 7 | 0.14 | 2860.3 | 66.66 | 2208.43 | 13.91 | 343.24 | 8 | 265.01 | 1.67 |
Rectangle | 100 | 3.9 | 5.74 | 0.58 | 3255.78 | 172.34 | 2274.21 | 42.62 | 390.69 | 20.68 | 272.91 | 5.11 |
Square | 25 | 2 | 5.69 | 0.53 | 574.32 | 3.44 | 514.23 | 1.58 | 68.92 | 0.41 | 61.71 | 0.19 |
Square | 25 | 2.5 | 4.02 | 0.14 | 579.08 | 5.14 | 521.85 | 1.45 | 69.49 | 0.62 | 62.62 | 0.17 |
Square | 25 | 3 | 3.22 | 0.42 | 621.4 | 24.66 | 530.81 | 5.86 | 74.57 | 2.96 | 63.7 | 0.7 |
Square | 25 | 3.5 | 3 | 0 | 634.01 | 12.22 | 533.77 | 3.98 | 76.08 | 1.47 | 64.05 | 0.48 |
Square | 25 | 3.9 | 2.28 | 0.45 | 718.45 | 53.28 | 555.34 | 13.43 | 86.21 | 6.39 | 66.64 | 1.61 |
Square | 50 | 2 | 9.43 | 0.5 | 1146.63 | 5.22 | 1032.23 | 2.11 | 137.6 | 0.63 | 123.87 | 0.25 |
Square | 50 | 2.5 | 6.51 | 0.5 | 1170.59 | 12.08 | 1049.77 | 4.8 | 140.47 | 1.45 | 125.97 | 0.58 |
Square | 50 | 3 | 4.79 | 0.41 | 1244.96 | 34.02 | 1073.87 | 10.17 | 149.39 | 4.08 | 128.86 | 1.22 |
Square | 50 | 3.5 | 4 | 0 | 1323.3 | 40.51 | 1095.39 | 5.76 | 158.8 | 4.86 | 131.45 | 0.69 |
Square | 50 | 3.9 | 3.97 | 0.17 | 1336.64 | 82.51 | 1097.3 | 12.05 | 160.4 | 9.9 | 131.68 | 1.45 |
Square | 100 | 2 | 16.8 | 0.7 | 2295.17 | 6.83 | 2069.55 | 3.05 | 275.42 | 0.82 | 248.35 | 0.37 |
Square | 100 | 2.5 | 10.91 | 0.43 | 2351.34 | 15.09 | 2113.64 | 5.74 | 282.16 | 1.81 | 253.64 | 0.69 |
Square | 100 | 3 | 8.02 | 0.14 | 2547.13 | 52.99 | 2167.99 | 9.08 | 305.66 | 6.36 | 260.16 | 1.09 |
Square | 100 | 3.5 | 6.11 | 0.31 | 3020.36 | 98.3 | 2244.43 | 20.44 | 362.44 | 11.8 | 269.33 | 2.45 |
Square | 100 | 3.9 | 5.04 | 0.2 | 3354.2 | 101.54 | 2323.74 | 21.5 | 402.5 | 12.18 | 278.85 | 2.58 |
Appendix B. Algorithms
Algorithm A1: Triangular drone path algorithm.
Algorithm A2: Help Functions 1.
Algorithm A3: Help Functions 2.
Name | Symbol | Value(s) in Simulation |
---|---|---|
FOV radius | km | |
Drone range | d | 8 km |
Drone velocity | v | 25 km/h |
CS radius | R | 2.0, 2.5, 3.0, 3.5, 3.9 km |
Mesh step | s | km |
Field vertices | Various (see Figure 1) | |
Field area | | 25, 50, 100 km²
Start location | ||
CS vector length | 25, 50, 100 |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Vargas Tamayo, L.; Thron, C.; Fendji, J.L.K.E.; Thomas, S.-K.; Förster, A. Cost-Minimizing System Design for Surveillance of Large, Inaccessible Agricultural Areas Using Drones of Limited Range. Sustainability 2020, 12, 8878. https://doi.org/10.3390/su12218878