Hair Fescue and Sheep Sorrel Identification Using Deep Learning in Wild Blueberry Production
Abstract
1. Introduction
1.1. Background
1.2. Related Work
1.3. Convolutional Neural Networks Used In This Study
1.4. Outline
2. Materials and Methods
2.1. Image Datasets
2.2. Convolutional Neural Network Training, Validation, and Testing
3. Results
3.1. Hair Fescue and Sheep Sorrel Targeting with Object Detection Networks
3.2. Classification of Images Containing Hair Fescue and Sheep Sorrel
3.3. Effect of Training Dataset Size on Detection Accuracy
4. Discussion
4.1. Accuracy of Object-Detection Networks
4.2. Processing Speed Considerations for Sprayer Implementation
4.3. Accuracy of Image-Classification Networks
4.4. Identification Accuracy with Different Training Dataset Sizes
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
| Network | Resolution | Precision | Recall | F1-Score | Validation AP50 (%) | Testing AP50 (%) |
|---|---|---|---|---|---|---|
| YOLOv3 | 1280 × 736 | 0.94 | 0.46 | 0.62 | 75.83 | 72.84 |
| YOLOv3 | 1024 × 576 | 0.94 | 0.44 | 0.60 | 75.56 | 73.87 |
| YOLOv3 | 960 × 544 | 0.94 | 0.43 | 0.59 | 75.13 | 72.81 |
| YOLOv3 | 864 × 480 | 0.95 | 0.39 | 0.55 | 71.68 | 70.61 |
| YOLOv3-Tiny | 1280 × 736 | 0.91 | 0.45 | 0.60 | 74.83 | 75.18 |
| YOLOv3-Tiny | 1024 × 576 | 0.92 | 0.46 | 0.62 | 75.46 | 76.57 |
| YOLOv3-Tiny | 960 × 544 | 0.91 | 0.46 | 0.61 | 75.27 | 75.63 |
| YOLOv3-Tiny | 864 × 480 | 0.90 | 0.45 | 0.60 | 71.40 | 74.06 |
| YOLOv3-Tiny-PRN | 1280 × 736 | 0.92 | 0.44 | 0.59 | 71.40 | 73.26 |
| YOLOv3-Tiny-PRN | 1024 × 576 | 0.94 | 0.39 | 0.56 | 66.40 | 63.91 |
| YOLOv3-Tiny-PRN | 960 × 544 | 0.94 | 0.38 | 0.54 | 61.38 | 61.04 |
| YOLOv3-Tiny-PRN | 864 × 480 | 0.94 | 0.34 | 0.50 | 55.29 | 54.89 |
| Network | Resolution | Precision | Recall | F1-Score | Validation AP50 (%) | Testing AP50 (%) |
|---|---|---|---|---|---|---|
| YOLOv3 | 1280 × 736 | 0.79 | 0.37 | 0.51 | 60.18 | 62.98 |
| YOLOv3 | 1024 × 576 | 0.82 | 0.28 | 0.42 | 52.20 | 57.05 |
| YOLOv3 | 960 × 544 | 0.83 | 0.26 | 0.40 | 49.15 | 53.04 |
| YOLOv3 | 864 × 480 | 0.83 | 0.21 | 0.34 | 42.06 | 46.63 |
| YOLOv3-Tiny | 1280 × 736 | 0.79 | 0.29 | 0.42 | 54.02 | 57.00 |
| YOLOv3-Tiny | 1024 × 576 | 0.81 | 0.24 | 0.37 | 48.29 | 52.32 |
| YOLOv3-Tiny | 960 × 544 | 0.80 | 0.23 | 0.35 | 45.16 | 49.46 |
| YOLOv3-Tiny | 864 × 480 | 0.78 | 0.19 | 0.31 | 39.40 | 42.91 |
| YOLOv3-Tiny-PRN | 1280 × 736 | 0.80 | 0.13 | 0.22 | 26.80 | 29.41 |
| YOLOv3-Tiny-PRN | 1024 × 576 | 0.74 | 0.07 | 0.13 | 14.26 | 15.28 |
| YOLOv3-Tiny-PRN | 960 × 544 | 0.72 | 0.06 | 0.11 | 11.54 | 12.26 |
| YOLOv3-Tiny-PRN | 864 × 480 | 0.69 | 0.04 | 0.08 | 7.73 | 7.98 |
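The AP50 columns report average precision where a predicted bounding box counts as correct only if it overlaps a ground-truth box with intersection-over-union (IoU) of at least 0.50. A minimal sketch of the IoU test for axis-aligned boxes, with hypothetical coordinates:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Intersection rectangle (empty if the boxes do not overlap)
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# Two 10 × 10 boxes sharing half their width: IoU = 50 / 150
print(round(iou((0, 0, 10, 10), (5, 0, 15, 10)), 3))  # → 0.333
```

Under this criterion, the same half-overlapping prediction would be scored as a miss at the 0.5 threshold, which is why loose localization depresses AP50 even when the detector finds the right region.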
| Network | Resolution | Precision (0.15) | Recall (0.15) | F1-Score (0.15) | Precision (0.25) | Recall (0.25) | F1-Score (0.25) |
|---|---|---|---|---|---|---|---|
| YOLOv3 | 1280 × 736 | 0.97 | 0.93 | 0.95 | 0.98 | 0.88 | 0.93 |
| YOLOv3 | 1024 × 576 | 0.96 | 0.94 | 0.95 | 0.98 | 0.88 | 0.93 |
| YOLOv3 | 960 × 544 | 0.96 | 0.95 | 0.95 | 0.98 | 0.88 | 0.93 |
| YOLOv3 | 864 × 480 | 0.96 | 0.96 | 0.96 | 0.98 | 0.86 | 0.91 |
| YOLOv3-Tiny | 1280 × 736 | 0.97 | 0.97 | 0.97 | 0.97 | 0.91 | 0.94 |
| YOLOv3-Tiny | 1024 × 576 | 0.96 | 0.97 | 0.96 | 0.97 | 0.93 | 0.95 |
| YOLOv3-Tiny | 960 × 544 | 0.95 | 0.97 | 0.96 | 0.97 | 0.92 | 0.94 |
| YOLOv3-Tiny | 864 × 480 | 0.93 | 1.00 | 0.96 | 0.96 | 0.94 | 0.95 |
| YOLOv3-Tiny-PRN | 1280 × 736 | 0.97 | 0.99 | 0.98 | 0.98 | 0.88 | 0.93 |
| YOLOv3-Tiny-PRN | 1024 × 576 | 0.97 | 0.98 | 0.98 | 0.99 | 0.87 | 0.93 |
| YOLOv3-Tiny-PRN | 960 × 544 | 0.97 | 0.96 | 0.97 | 0.99 | 0.86 | 0.92 |
| YOLOv3-Tiny-PRN | 864 × 480 | 0.97 | 0.95 | 0.96 | 0.99 | 0.83 | 0.90 |
| Network | Resolution | Precision (0.15) | Recall (0.15) | F1-Score (0.15) | Precision (0.25) | Recall (0.25) | F1-Score (0.25) |
|---|---|---|---|---|---|---|---|
| YOLOv3 | 1280 × 736 | 0.83 | 0.96 | 0.89 | 0.88 | 0.94 | 0.91 |
| YOLOv3 | 1024 × 576 | 0.86 | 0.94 | 0.90 | 0.91 | 0.89 | 0.90 |
| YOLOv3 | 960 × 544 | 0.87 | 0.91 | 0.89 | 0.91 | 0.87 | 0.89 |
| YOLOv3 | 864 × 480 | 0.89 | 0.88 | 0.88 | 0.92 | 0.83 | 0.87 |
| YOLOv3-Tiny | 1280 × 736 | 0.81 | 0.97 | 0.88 | 0.87 | 0.93 | 0.90 |
| YOLOv3-Tiny | 1024 × 576 | 0.82 | 0.94 | 0.88 | 0.88 | 0.91 | 0.89 |
| YOLOv3-Tiny | 960 × 544 | 0.82 | 0.94 | 0.88 | 0.89 | 0.90 | 0.89 |
| YOLOv3-Tiny | 864 × 480 | 0.82 | 0.93 | 0.87 | 0.89 | 0.87 | 0.88 |
| YOLOv3-Tiny-PRN | 1280 × 736 | 0.84 | 0.89 | 0.86 | 0.88 | 0.78 | 0.82 |
| YOLOv3-Tiny-PRN | 1024 × 576 | 0.82 | 0.79 | 0.80 | 0.88 | 0.65 | 0.75 |
| YOLOv3-Tiny-PRN | 960 × 544 | 0.82 | 0.76 | 0.79 | 0.88 | 0.64 | 0.74 |
| YOLOv3-Tiny-PRN | 864 × 480 | 0.82 | 0.69 | 0.75 | 0.86 | 0.56 | 0.68 |
| Network | Resolution | Precision | Recall | F1-Score | Validation Top-1 (%) | Testing Top-1 (%) |
|---|---|---|---|---|---|---|
| Darknet Reference | 1280 × 736 | 0.96 | 0.97 | 0.96 | 96.25 | 96.29 |
| Darknet Reference | 1024 × 576 | 0.91 | 0.98 | 0.94 | 93.88 | 95.29 |
| Darknet Reference | 960 × 544 | 0.93 | 0.98 | 0.95 | 95.25 | 96.14 |
| Darknet Reference | 864 × 480 | 0.86 | 0.99 | 0.92 | 91.38 | 93.00 |
| EfficientNet-B0 | 1280 × 736 | 0.96 | 0.95 | 0.96 | 95.50 | 95.71 |
| EfficientNet-B0 | 1024 × 576 | 0.90 | 0.96 | 0.93 | 92.75 | 97.29 |
| EfficientNet-B0 | 960 × 544 | 0.86 | 0.96 | 0.91 | 90.25 | 96.14 |
| EfficientNet-B0 | 864 × 480 | 0.83 | 0.95 | 0.88 | 87.38 | 95.57 |
| MobileNetV2 | 1280 × 736 | 0.98 | 0.93 | 0.96 | 95.63 | 95.43 |
| MobileNetV2 | 1024 × 576 | 0.94 | 0.98 | 0.96 | 95.88 | 97.29 |
| MobileNetV2 | 960 × 544 | 0.95 | 0.97 | 0.96 | 95.75 | 98.14 |
| MobileNetV2 | 864 × 480 | 0.91 | 0.99 | 0.95 | 94.50 | 97.28 |
| Network | Resolution | Precision | Recall | F1-Score | Validation Top-1 (%) | Testing Top-1 (%) |
|---|---|---|---|---|---|---|
| Darknet Reference | 1280 × 736 | 0.96 | 0.93 | 0.95 | 94.75 | 94.14 |
| Darknet Reference | 1024 × 576 | 0.93 | 0.98 | 0.95 | 95.25 | 92.42 |
| Darknet Reference | 960 × 544 | 0.93 | 0.97 | 0.95 | 95.00 | 92.42 |
| Darknet Reference | 864 × 480 | 0.94 | 0.96 | 0.95 | 95.25 | 94.14 |
| EfficientNet-B0 | 1280 × 736 | 0.97 | 0.87 | 0.92 | 92.13 | 96.57 |
| EfficientNet-B0 | 1024 × 576 | 0.98 | 0.74 | 0.84 | 86.25 | 98.00 |
| EfficientNet-B0 | 960 × 544 | 0.97 | 0.76 | 0.85 | 87.38 | 96.57 |
| EfficientNet-B0 | 864 × 480 | 0.97 | 0.63 | 0.76 | 80.88 | 96.86 |
| MobileNetV2 | 1280 × 736 | 1.00 | 0.75 | 0.86 | 87.63 | 82.29 |
| MobileNetV2 | 1024 × 576 | 0.98 | 0.86 | 0.91 | 92.13 | 89.71 |
| MobileNetV2 | 960 × 544 | 0.96 | 0.91 | 0.94 | 93.88 | 92.71 |
| MobileNetV2 | 864 × 480 | 0.97 | 0.93 | 0.95 | 95.25 | 92.43 |
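The Top-1 (%) columns in the classification tables report the share of images whose highest-scoring class matches the true label. A minimal sketch, using made-up two-class (weed present / weed absent) scores rather than outputs from the study's networks:

```python
def top1_accuracy(scores, labels):
    """Percentage of samples whose argmax class equals the true label.

    scores: per-sample lists of class scores; labels: true class indices.
    """
    correct = sum(
        1 for s, y in zip(scores, labels)
        if max(range(len(s)), key=s.__getitem__) == y
    )
    return 100.0 * correct / len(labels)

# Hypothetical softmax outputs for four images; one is misclassified.
scores = [[0.9, 0.1], [0.2, 0.8], [0.6, 0.4], [0.3, 0.7]]
labels = [0, 1, 1, 1]
print(top1_accuracy(scores, labels))  # → 75.0
```

Unlike the per-box detection metrics, Top-1 is computed per image, so a single weed anywhere in the frame is enough for a correct "present" classification.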
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Hennessy, P.J.; Esau, T.J.; Farooque, A.A.; Schumann, A.W.; Zaman, Q.U.; Corscadden, K.W. Hair Fescue and Sheep Sorrel Identification Using Deep Learning in Wild Blueberry Production. Remote Sens. 2021, 13, 943. https://doi.org/10.3390/rs13050943