Citrus Tree Segmentation from UAV Images Based on Monocular Machine Vision in a Natural Orchard Environment
Abstract
1. Introduction
2. Materials
2.1. The Study Area and the UAV Equipment
2.2. The Dataset
3. The Citrus Tree Segmentation Method
3.1. Image Pre-Processing
3.2. Potential Fruit Tree Region Extraction
3.2.1. RG Chromatic Tree Extraction Method
3.2.2. Chromatic Mapping Extraction Method Enhanced with MSR
3.3. Fruit Tree SVM Segmentation Model
4. Results and Discussion
4.1. Evaluation of the Influence of the Image Brightness Condition on the Segmentation Results
4.2. Evaluation of the Influence of the Weed Coverage Condition on the Segmentation
4.3. Evaluation of Fruit Tree Segmentation Results
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
Abbreviations
Abbreviation | Definition |
---|---|
ANN | Artificial neural network |
BC | Brightness condition |
CNN | Convolutional neural network |
CTR | Correct segmentation rate of fruit trees |
EMSRCM | Extraction on multi-scale retinex chromatic mapping method |
ERGCM | RG chromatic extraction method |
GLCM | Gray-level co-occurrence matrix |
HE | Histogram equalization |
HSI | Hue/saturation/intensity color space |
IB | Insufficient brightness |
imp | Relative G–R chromatic value or mapping |
IoU | Intersection over union |
LBP | Local binary pattern |
LiDAR | Light detection and ranging |
LWCR | Large weed coverage rate |
MSR | Multi-scale retinex method |
MWCR | Medium weed coverage rate |
NDVI | Normalized difference vegetation index |
RGB | Red/green/blue color space |
RoIs | Regions of interest (potential area of fruit trees) |
SB | Sufficient brightness |
SfM | Structure from motion |
SRIHE | Selective regions intensity histogram equalization method |
SVM | Support vector machine |
SWCR | Small weed coverage rate |
UAV | Unmanned aerial vehicle |
UEJR | Under-extraction judgement rule |
WCC | Weed coverage condition |
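The MSR entry above (multi-scale retinex) underpins the EMSRCM extraction method: the image is compared, in the log domain, against Gaussian-blurred surrounds at several scales, and the weighted log-ratios are summed. The sketch below is a minimal single-channel illustration of the standard MSR formulation; the scales, equal weights, and min-max rescaling are illustrative assumptions, not the paper's tuned values.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def multi_scale_retinex(channel, sigmas=(15, 80, 250), weights=None):
    """Standard MSR: weighted sum of log(I) - log(Gaussian surround of I)."""
    channel = channel.astype(np.float64) + 1.0  # offset avoids log(0)
    if weights is None:
        weights = [1.0 / len(sigmas)] * len(sigmas)  # equal weights by default
    msr = np.zeros_like(channel)
    for w, sigma in zip(weights, sigmas):
        surround = gaussian_filter(channel, sigma=sigma)
        msr += w * (np.log(channel) - np.log(surround))
    # Rescale the log-domain result to an 8-bit range for display
    msr = (msr - msr.min()) / (msr.max() - msr.min() + 1e-12) * 255.0
    return msr.astype(np.uint8)
```

Applied per channel of an RGB image, this yields the enhanced M images that the EMSRCM pipeline feeds into the chromatic-mapping step.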
References
- Katsigiannis, P.; Misopolinos, L.; Liakopoulos, V.; Alexandridis, T.K.; Zalidis, G. An autonomous multi-sensor UAV system for reduced-input precision agriculture applications. In Proceedings of the 2016 24th Mediterranean Conference on Control and Automation (MED), Athens, Greece, 21–24 June 2016; IEEE: Piscataway, NJ, USA, 2016; Volume 24, pp. 60–64.
- Becerra, V.M. Autonomous control of unmanned aerial vehicles. Electronics 2019, 8, 452.
- Abu, J.M.; Hossain, S.; Al-Masud, M.A.; Hasan, K.M.; Newaz, S.H.S.; Ahsan, M.S. Design and development of an autonomous agricultural drone for sowing seeds. In Proceedings of the 7th Brunei International Conference on Engineering and Technology (BICET) 2018, Bandar Seri Begawan, Brunei, 12–14 November 2018; Volume 7, pp. 1–4.
- Philipp, L.; Raghav, K.; Johannes, P.; Roland, S.; Cyrill, S. UAV-Based Crop and Weed Classification for Smart Farming. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; IEEE: Piscataway, NJ, USA, 2018; Volume 6, pp. 1–8.
- Dilek, K.; Serdar, S.; Nagihan, A.; Bekir, T.S. Automatic citrus tree extraction from UAV images and digital surface models using circular Hough transform. Comput. Electron. Agric. 2018, 150, 289–301.
- Shouyang, L.; Fred, B.; Bruno, A.; Philippe, B.; Hemmerlé, M. Estimates of plant density of wheat crops at emergence from very low altitude UAV imagery. Remote Sens. Environ. 2017, 198, 105–114.
- Zou, X.; Zhang, X.; Shi, J.; Li, Z.; Shen, T. Detection of chlorophyll content and distribution in citrus orchards based on low-altitude remote sensing and bio-sensors. Int. J. Agric. Biol. Eng. 2018, 11, 164–169.
- Zhuang, J.; Hou, C.; Tang, Y.; He, Y.; Guo, Q.; Zhong, Z.; Luo, S. Computer vision-based localisation of picking points for automatic litchi harvesting applications towards natural scenarios. Biosyst. Eng. 2019, 187, 1–20.
- Shao, Z.; Nan, Y.N.; Xiao, X.; Zhang, L.; Peng, Z. A Multi-View Dense Point Cloud Generation Algorithm Based on Low-Altitude Remote Sensing Images. Remote Sens. 2016, 8, 381.
- Qin, W.C.; Qiu, B.J.; Xue, X.Y.; Chen, C.; Xu, Z.F.; Zhou, Q.Q. Droplet deposition and control effect of insecticides sprayed with an unmanned aerial vehicle against plant hoppers. Crop Prot. 2016, 85, 79–88.
- Näsi, R.; Viljanen, N.; Kaivosoja, J.; Alhonoja, K.; Hakala, T.; Markelin, L.; Honkavaara, E. Estimating Biomass and Nitrogen Amount of Barley and Grass Using UAV and Aircraft Based Spectral and Photogrammetric 3D Features. Remote Sens. 2018, 10, 1082.
- Csillik, O.; Cherbini, J.; Johnson, R.; Lyons, A.; Kelly, M. Identification of Citrus Trees from Unmanned Aerial Vehicle Imagery Using Convolutional Neural Networks. Drones 2018, 2, 39.
- Johansen, K.; Raharjo, T.; Mccabe, M. Using Multi-Spectral UAV Imagery to Extract Tree Crop Structural Properties and Assess Pruning Effects. Remote Sens. 2018, 10, 854.
- Srestasathiern, P.; Rakwatin, P. Oil Palm Tree Detection with High Resolution Multi-Spectral Satellite Imagery. Remote Sens. 2014, 6, 9749–9774.
- Malambo, L.; Popescu, S.C.; Murray, S.C.; Putmana, E.; Pugh, N.A.; Horne, D.W.; Richardson, G.; Sheridan, R.; Rooney, W.L.; Avant, R.; et al. Multitemporal field-based plant height estimation using 3D point clouds generated from small unmanned aerial systems high-resolution imagery. Int. J. Appl. Earth Obs. Geoinf. 2018, 64, 31–42.
- Torres-Sanchez, J.; De Castro, A.I.; Pena-Barragan, J.M.; Jimenez-Brenes, F.M.; Arquero, O.; Lovera, M.; Lopez-Granados, F. Mapping the 3D structure of almond trees using UAV acquired photogrammetric point clouds and object-based image analysis. Biosyst. Eng. 2018, 176, 172–184.
- Juan, G.H.; Eduardo, G.F.; Alexandre, S.; João, S.; Alexandra, N.; Alexandra, C.C.; Luis, F.; Margarida, T.; Ramón, D.V. Using high resolution UAV imagery to estimate tree variables in Pinus pinea plantation in Portugal. For. Syst. 2016, 25, 16.
- Guo, Q.; Su, Y.; Hu, T.; Zhao, X.; Wu, F.; Li, Y.; Liu, J.; Chen, L.; Xu, G.; Lin, G.; et al. An integrated UAV-borne LiDAR system for 3D habitat mapping in three forest ecosystems across China. Int. J. Remote Sens. 2017, 38, 2954–2972.
- Pedro, M.; Luis, P.; Telmo, A.; Jonas, H.; Emanuel, P.; Antonio, S.; Joaquim, J.S. UAV-based automatic detection and monitoring of chestnut trees. Remote Sens. 2019, 11, 855.
- Wu, X.; Shen, X.; Cao, L.; Wang, G.; Cao, F. Assessment of Individual Tree Detection and Canopy Cover Estimation using Unmanned Aerial Vehicle based Light Detection and Ranging (UAV-LiDAR) Data in Planted Forests. Remote Sens. 2019, 11, 908.
- Li, J.; Yang, B.; Cong, Y.; Cao, L.; Fu, X.; Dong, Z. 3D Forest Mapping Using a Low-Cost UAV Laser Scanning System: Investigation and Comparison. Remote Sens. 2019, 11, 717.
- Omair, H.; Nasir, A.K.; Roth, H.; Khan, M.F. Precision forestry: Trees counting in urban areas using visible imagery based on an unmanned aerial vehicle. IFAC-PapersOnLine 2016, 49, 16–21.
- Zortea, M.; Macedo, M.M.G.; Mattos, A.B.; Ruga, B.C.; Gemignani, B.H. Automatic citrus tree detection from UAV images based on convolutional neural networks. In Proceedings of the 2018 31st SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI), Foz do Iguaçu, Brazil, 29 October–1 November 2018; Volume 11, pp. 1–7.
- Ramesh, K.; Akanksha, A.; Bazila, B.; Omkar, S.N.; Gautham, A.; Meenavathi, M.B. Tree crown detection, delineation and counting in UAV remote sensed images: A neural network based spectral–spatial method. J. Indian Soc. Remote Sens. 2018, 46, 991–1004.
- Lin, Y.; Lin, J.; Jiang, M.; Yao, L.; Lin, J. Use of UAV oblique imaging for the detection of individual trees in residential environments. Urban For. Urban Green. 2015, 14, 404–412.
- Poblete-Echeverría, C.; Guillermo, O.; Ben, I.; Matthew, B. Detection and segmentation of vine canopy in ultra-high spatial resolution RGB imagery obtained from unmanned aerial vehicle (UAV): A case study in a commercial vineyard. Remote Sens. 2017, 9, 268.
- Zhao, T.; Yang, Y.; Niu, H.; Wang, D.; Chen, Y.Q. Comparing U-Net convolutional network with mask R-CNN in the performances of pomegranate tree canopy segmentation. Proc. SPIE 2018, 107801J, 1–9.
- Feduck, C.; McDermid, G.J.; Castilla, G. Detection of Coniferous Seedlings in UAV Imagery. Forests 2018, 9, 432.
- Sabzi, S.; Abbaspour-Gilandeh, Y.; García-Mateos, G.; Ruiz-Canales, A.; Molina-Martinez, J.M.; Ignacio Arribas, J. An Automatic Non-Destructive Method for the Classification of the Ripeness Stage of Red Delicious Apples in Orchards Using Aerial Video. Agronomy 2019, 9, 84.
- Li, X.; Dai, B.; Sun, H.; Li, W. Corn classification system based on computer vision. Symmetry 2019, 11, 591.
- Blaix, C.; Moonen, A.C.; Dostatny, D.F.; Izquierdo, J.; Le Corff, J.; Morrison, J.; Von Redwitz, C.; Schumacher, M.; Westerman, P.R. Quantification of regulating ecosystem services provided by weeds in annual cropping systems using a systematic map approach. Weed Res. 2018, 58, 151–164.
- Tang, Y.; Hou, C.; Luo, S.; Lin, J.; Yang, Z.; Huang, W. Effects of operation height and tree shape on droplet deposition in citrus trees using an unmanned aerial vehicle. Comput. Electron. Agric. 2018, 148, 1–7.
- Nam, M.; Rhee, P.K. An efficient face recognition for variant illumination condition. Intell. Signal Process. Commun. Syst. (ISPACS) 2004, 12, 111–115.
- Tan, S.F.; Isa, N.A.M. Exposure Based Multi-Histogram Equalization Contrast Enhancement for Non-Uniform Illumination Images. IEEE Access 2019, 7, 70842–70861.
- Kim, J.Y.; Kim, L.S.; Hwang, S.H. An advanced contrast enhancement using partially overlapped sub-block histogram equalization. IEEE Trans. Circuits Syst. Video Technol. 2002, 11, 475–484.
- Otsu, N. A Threshold Selection Method from Gray-Level Histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66.
- Pu, Y.F.; Siarry, P.; Chatterjee, A.; Wang, Z.N.; Zhang, Y.; Liu, Y.G.; Zhong, J.L.; Wang, Y. A Fractional-Order Variational Framework for Retinex: Fractional-Order Partial Differential Equation-Based Formulation for Multi-Scale Nonlocal Contrast Enhancement with Texture Preserving. IEEE Trans. Geosci. Remote Sens. 2018, 27, 1214–1229.
- Wu, Q.S.; Luo, X.L.; Li, H.; Liu, P.Z. An Improved Multi-Scale Retinex Algorithm for Vehicle Shadow Elimination Based on Variational Kimmel. In Proceedings of the 7th International Conference on Ubiquitous Intelligence and Computing and 7th International Conference on Autonomic and Trusted Computing (UIC/ATC), Xi’an, China, 26–29 October 2010; IEEE Computer Society: Washington, DC, USA, 2010; pp. 31–34.
- Yu, T.; Meng, X.; Zhu, M.; Han, M. An Improved Multi-scale Retinex Fog and Haze Image Enhancement Method. In Proceedings of the 2016 International Conference on Information System and Artificial Intelligence (ISAI), Hong Kong, China, 24–26 June 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 557–560.
- Ojala, T.; Pietikäinen, M.; Mäenpää, T. Multiresolution gray-scale and rotation invariant texture classification with local binary patterns. IEEE Trans. Pattern Anal. Mach. Intell. 2002, 24, 971–987.
- Guo, Q.; Chen, Y.; Tang, Y.; Zhuang, J.; He, Y.; Hou, C.; Chu, X.; Zhong, Z.; Luo, S. Lychee Fruit Detection Based on Monocular Machine Vision in Orchard Environment. Sensors 2019, 19, 4091.
- Kyung, W.; Kim, D.; Ha, Y. Real-time multi-scale Retinex to enhance night scene of vehicular camera. In Proceedings of the 17th Korea-Japan Joint Workshop on Frontiers of Computer Vision (FCV), Ulsan, Korea, 9–11 February 2011; Volume 17, pp. 1–4.
- Fawcett, T. An introduction to ROC analysis. Pattern Recognit. Lett. 2006, 27, 861–874.
System Parameter | Value |
---|---|
Vertical accuracy | ±0.1 m |
Camera sensor | 1/2.3-inch CMOS |
Image resolution | 4032 × 3024 |
FOV | 85° |
Focal length | 24 mm |
Diaphragm size | f/2.8 |
Time of exposure | 1/1000 s |
Camera model | FC2103 |
Format/type of images | JPEG |
Dataset | BC | SWCR | MWCR | LWCR | Total |
---|---|---|---|---|---|
Dataset 0 | IB | 64 (1344) * | 29 (589) | 89 (1869) | 182 (3802) |
Dataset 0 | SB | 57 (1351) | 32 (483) | 63 (1512) | 152 (3346) |
Dataset 0 | Total | 121 (2695) | 61 (1072) | 152 (3381) | 334 (7148) |
Training set | IB | 32 (701) | 14 (222) | 44 (1024) | 90 (1947) |
Training set | SB | 29 (641) | 16 (280) | 32 (780) | 77 (1701) |
Training set | Total | 61 (1342) | 30 (502) | 76 (1804) | 167 (3648) |
Test set | IB | 32 (643) | 15 (347) | 45 (845) | 92 (1855) |
Test set | SB | 28 (710) | 16 (203) | 31 (732) | 75 (1645) |
Test set | Total | 60 (1353) | 31 (570) | 76 (1577) | 167 (3500) |
Input | The under-extracted image Iu, judged with the UEJR |
---|---|
Step 1 | Separate the R, G, and B channels of Iu and calculate Ri by Equation (2) and the MSR images (Mu). |
Step 2 | Convert Mu into the 2R-G-B chromatic mapping Ic. |
Step 3 | Apply a closing filter with a 20-pixel radius to Ic to obtain Icc. |
Step 4 | Apply a top-hat filter with a 24-pixel radius to Icc to obtain Icct. |
Step 5 | Apply an opening filter with a 20-pixel radius to Icct to obtain Iccto. |
Step 6 | Binarize Iccto with the Otsu method to obtain the binary image Ibw. |
Step 7 | Apply a convex hull transform to convert Ibw into Icn-bw. |
Step 8 | Exclude small interference areas (below 0.05% of the image area) from Icn-bw to generate Iroi. |
Output | The extracted foreground areas of the image, Iroi. |
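As a rough illustration of Steps 2–8, the sketch below uses SciPy's grey-scale morphology. It is not the authors' implementation: the MSR-enhanced input of Step 1 is replaced by the raw RGB image, the 2R-G-B mapping is taken as a simple clipped integer combination, and the convex hull of Step 7 is omitted (e.g. `skimage.morphology.convex_hull_image` could fill that role); only the default radii mirror the table, and all function names are ours.

```python
import numpy as np
from scipy import ndimage

def disk(radius):
    """Boolean disk-shaped structuring element of the given pixel radius."""
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    return x * x + y * y <= radius * radius

def otsu_threshold(gray):
    """Otsu's method on an 8-bit image: pick the threshold maximizing the
    between-class variance of the grey-level histogram."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                   # cumulative class probability
    mu = np.cumsum(prob * np.arange(256))     # cumulative class mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.nanargmax(sigma_b))

def extract_roi(rgb, close_r=20, tophat_r=24, open_r=20, min_area_frac=0.0005):
    """Steps 2-8: chromatic mapping, closing, top-hat, opening, Otsu
    binarization, and small-area removal (convex hull step omitted)."""
    r, g, b = (rgb[..., i].astype(np.int32) for i in range(3))
    chroma = np.clip(2 * r - g - b, 0, 255).astype(np.uint8)  # 2R-G-B mapping
    closed = ndimage.grey_closing(chroma, footprint=disk(close_r))
    tophat = ndimage.white_tophat(closed, footprint=disk(tophat_r))
    opened = ndimage.grey_opening(tophat, footprint=disk(open_r))
    bw = opened > otsu_threshold(opened)
    labels, n = ndimage.label(bw)
    if n == 0:
        return bw
    # Step 8: drop connected components smaller than 0.05% of the image area
    sizes = ndimage.sum(bw, labels, index=np.arange(1, n + 1))
    keep_ids = 1 + np.flatnonzero(sizes >= min_area_frac * bw.size)
    return np.isin(labels, keep_ids)
```

On full 4032 × 3024 UAV frames, grey morphology with 20-pixel-plus radii is the dominant cost; a decomposed or downsampled structuring element would be the usual optimization.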
BC | WCC | A a (%) | At b (%) | N c (Trees) | Nt d (Trees) |
---|---|---|---|---|---|
IB | SWCR | 29 ± 8.5 | 30 ± 6.5 | 39.1 ± 10.2 | 37.7 ± 8.5 |
IB | MWCR | 45 ± 9.1 | 31 ± 8.5 | 12.1 ± 4.4 | 41.7 ± 14.4 |
IB | LWCR | 76 ± 19.1 | 32 ± 5.8 | 1.3 ± 0.6 | 32.4 ± 7.1 |
SB | SWCR | 32 ± 7.2 | 31 ± 5.2 | 27.5 ± 17.3 | 25.5 ± 12.1 |
SB | MWCR | 41 ± 10.9 | 36 ± 6.9 | 16.5 ± 2.7 | 30.1 ± 4.7 |
SB | LWCR | 81 ± 17.3 | 38 ± 9.2 | 1.5 ± 0.4 | 35.3 ± 10.6 |
Average | | 57 ± 21.1 | 34 ± 11.5 | 17 ± 10.5 | 27.8 ± 7.7 |
Method | BC | WCC | IoU a (%) | PI b (%) | RI c (%) | F1I d (%) | CTR e (%) | PC f (%) | RC g (%) | F1C h (%) |
---|---|---|---|---|---|---|---|---|---|---|
Without SVM model | IB | S | 86.02 | 93.28 | 91.71 | 92.49 | 90.97 | 97.47 | 93.17 | 95.27 |
Without SVM model | IB | M | 71.61 | 86.55 | 80.58 | 83.46 | 78.38 | 93.89 | 82.59 | 87.88 |
Without SVM model | IB | L | 64.89 | 86.52 | 72.19 | 78.71 | 75.92 | 94.22 | 79.63 | 86.31 |
Without SVM model | IB | Average | 72.52 ± 15.34 | 88.90 ± 5.51 | 79.74 ± 13.99 | 84.07 ± 9.96 | 81.16 ± 11.42 | 95.32 ± 2.80 | 84.53 ± 10.08 | 89.61 ± 6.77 |
Without SVM model | SB | S | 88.19 | 92.98 | 94.47 | 93.72 | 91.66 | 97.78 | 93.60 | 95.65 |
Without SVM model | SB | M | 73.78 | 87.50 | 82.47 | 84.91 | 79.88 | 91.93 | 85.91 | 88.82 |
Without SVM model | SB | L | 68.99 | 87.30 | 75.46 | 80.95 | 77.27 | 92.63 | 82.33 | 87.18 |
Without SVM model | SB | Average | 76.01 ± 14.14 | 89.47 ± 4.56 | 83.47 ± 13.69 | 86.37 ± 9.28 | 83.09 ± 10.84 | 94.61 ± 4.53 | 87.23 ± 8.18 | 90.77 ± 6.36 |
With SVM model | IB | S | 90.64 | 92.34 | 98.01 | 95.09 | 92.94 | 95.02 | 97.70 | 97.02 |
With SVM model | IB | M | 77.17 | 84.49 | 89.92 | 87.11 | 80.71 | 93.04 | 85.89 | 87.98 |
With SVM model | IB | L | 75.28 | 84.16 | 87.70 | 85.89 | 77.60 | 94.22 | 81.48 | 85.07 |
With SVM model | IB | Average | 80.68 ± 11.85 | 87.09 ± 6.55 | 91.64 ± 7.68 | 89.31 ± 7.07 | 82.97 ± 11.48 | 94.32 ± 1.46 | 87.30 ± 11.89 | 90.10 ± 8.83 |
With SVM model | SB | S | 90.78 | 91.58 | 99.05 | 95.17 | 93.60 | 96.37 | 96.34 | 96.70 |
With SVM model | SB | M | 78.32 | 84.69 | 91.25 | 87.84 | 81.02 | 91.09 | 89.32 | 89.51 |
With SVM model | SB | L | 76.39 | 87.25 | 85.94 | 86.61 | 79.19 | 91.96 | 87.39 | 88.38 |
With SVM model | SB | Average | 81.85 ± 11.06 | 88.36 ± 5.00 | 91.73 ± 9.41 | 90.01 ± 6.55 | 84.89 ± 9.43 | 93.62 ± 4.02 | 90.69 ± 6.67 | 91.83 ± 6.38 |
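The pixel-level scores in the table above (IoU, precision, recall, F1) follow the standard definitions on binary segmentation masks; the tree-level CTR columns require matching detected crowns to ground truth and are not reproduced here. A minimal sketch of the pixel-level metrics (the function name is ours, not from the paper, and zero-denominator cases are not guarded):

```python
import numpy as np

def segmentation_metrics(pred, truth):
    """Pixel-wise IoU, precision, recall, and F1 for two binary masks."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    tp = np.logical_and(pred, truth).sum()    # correctly segmented pixels
    fp = np.logical_and(pred, ~truth).sum()   # over-segmented pixels
    fn = np.logical_and(~pred, truth).sum()   # missed pixels
    iou = tp / (tp + fp + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return {"IoU": iou, "P": precision, "R": recall, "F1": f1}
```

Averaging these per-image scores over a test subset (e.g. the IB/SWCR images) yields the mean ± standard deviation entries reported in the table.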
Method | Metric | SWCR | MWCR | LWCR | Average |
---|---|---|---|---|---|
Method 1 | IoU (%) | 84.23 | 67.25 | 51.11 | 66.06 ± 8.61 |
Method 1 | CTR (%) | 85.66 | 72.25 | 55.52 | 69.49 ± 10.37 |
Method 2 | IoU (%) | 79.81 | 63.18 | 39.24 | 58.31 ± 13.72 |
Method 2 | CTR (%) | 80.22 | 70.33 | 48.64 | 64.04 ± 12.82 |
Proposed method | IoU (%) | 90.78 | 78.32 | 76.39 | 81.76 ± 11.06 |
Proposed method | CTR (%) | 93.60 | 81.02 | 79.19 | 85.27 ± 9.43 |
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Chen, Y.; Hou, C.; Tang, Y.; Zhuang, J.; Lin, J.; He, Y.; Guo, Q.; Zhong, Z.; Lei, H.; Luo, S. Citrus Tree Segmentation from UAV Images Based on Monocular Machine Vision in a Natural Orchard Environment. Sensors 2019, 19, 5558. https://doi.org/10.3390/s19245558