Editorial

Novel Applications of Optical Sensors and Machine Learning in Agricultural Monitoring

Jibo Yue 1,*, Chengquan Zhou 2, Haikuan Feng 3, Yanjun Yang 4 and Ning Zhang 5
1 College of Information and Management Science, Henan Agricultural University, Zhengzhou 450002, China
2 Institute of Agricultural Equipment, Zhejiang Academy of Agricultural Sciences, Hangzhou 310021, China
3 Key Laboratory of Quantitative Remote Sensing in Agriculture, Ministry of Agriculture, Beijing Research Center for Information Technology in Agriculture, Beijing 100097, China
4 Department of Plant and Soil Sciences, College of Agriculture, Food and Environment, University of Kentucky, Lexington, KY 40546, USA
5 Agricultural Information Institute, Chinese Academy of Agricultural Sciences, Beijing 100081, China
* Author to whom correspondence should be addressed.
Agriculture 2023, 13(10), 1970; https://doi.org/10.3390/agriculture13101970
Submission received: 25 September 2023 / Revised: 5 October 2023 / Accepted: 9 October 2023 / Published: 10 October 2023
The rapid development of intelligence and automated technologies has provided new management opportunities for agricultural production. In particular, the progress of remote sensing equipment has allowed for vast improvements in the spatial, temporal, and spectral resolutions of optical sensors. Such sensors are key in current agricultural production management practices, with applications in areas that were previously explored using field observations, including the monitoring of plant health, growth conditions, and pest infestations.
The papers published in this Special Issue, “Novel Applications of Optical Sensors and Machine Learning in Agricultural Monitoring”, present some of the most recent and novel results of investigations into the applications of optical sensors and machine learning in agriculture. Table 1 summarizes the 16 peer-reviewed articles included in this Special Issue. We found guest editing this collection highly inspiring; its contents include:
(1) The application of machine learning techniques to examine the key physiological development and production variables of crops.
(2) The use of datasets obtained from multiple sources and sensors to enhance crop mapping.
(3) Advanced target recognition algorithms for weed and disease identification.
The optical sensors used in the presented research include digital RGB cameras, spectrometers, a 3D time-of-flight (TOF) sensor, UAV-based multispectral imaging sensors, and satellite-based multispectral sensors. The machine learning methods range from conventional techniques such as KNN, RF, SVM, and ANN to deep learning techniques such as LSTM, VGG, YOLO, and SSD.
The contributions to this Special Issue are summarized as follows.
Wang et al. [1] employed LAI as the input to four machine learning models (RF, SVR, PLSR, and XGBoost) and one deep learning model (LSTM) for winter wheat yield estimation in Henan Province, China, in 2016. The results indicated that the LSTM outperformed the four traditional machine learning models, exhibiting the best R2 and RMSE values. Kumar et al. [9] investigated the canopy cover of sugarcane and its relationship with dry matter and yield, and analyzed the relationships of canopy temperature, chlorophyll fluorescence, and the SPAD index with yield. Luo et al. [13] fused vegetation indices determined using a UAV with brightness, greenness, and moisture indices estimated using the tasseled cap transformation (TCT). The proposed approach was observed to enhance the accuracy of rice yield predictions while avoiding the saturation phenomenon.
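As a minimal illustration of this kind of model comparison (not the authors' code), the sketch below fits RF, SVR, and PLSR regressors to a synthetic LAI feature matrix and reports cross-validated R2 and RMSE; the LSTM and XGBoost models are omitted to keep the example dependency-light, and all data and settings are hypothetical.

```python
# Illustrative sketch: yield regression from a synthetic LAI time series.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(0.5, 6.0, size=(200, 10))            # synthetic LAI at 10 dates per plot
y = 1.2 * X.mean(axis=1) + rng.normal(0, 0.3, 200)   # synthetic yield (t/ha)

models = {
    "RF": RandomForestRegressor(n_estimators=200, random_state=0),
    "SVR": SVR(kernel="rbf", C=10.0),
    "PLSR": PLSRegression(n_components=4),
}
for name, model in models.items():
    pred = np.ravel(cross_val_predict(model, X, y, cv=5))  # PLSR returns 2-D output
    rmse = mean_squared_error(y, pred) ** 0.5
    print(f"{name}: R2={r2_score(y, pred):.3f}, RMSE={rmse:.3f}")
```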
In order to enhance the estimation accuracy of land use/land cover (LULC) models, Ibrahim [2] performed RF-based feature selection using data obtained from Sentinel-1, Sentinel-2, and the Shuttle Radar Topography Mission. The author showed that integrating optical, radar, and elevation information is key to increasing the precision of LULC models for agriculturally dominated landscapes. Wang et al. [4] developed an information extraction method for the accurate determination of the spatial distribution of crops by integrating spatiotemporal image information using a fractal model. The authors demonstrated the ability of their approach to determine key cropland variables for the effective monitoring, conservation, and development of black soil. Li et al. [7] developed 3D-CNN- and ConvLSTM2D-based methods for multi-temporal crop classification. Five deep learning models were tested, namely 1D-CNNs, LSTM, 2D-CNNs, 3D-CNNs, and ConvLSTM2D; the 3D-CNN and ConvLSTM2D models, which combine temporal, spectral, and spatial information, outperformed the others in crop classification using time-series images.
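A minimal sketch of RF-based feature selection in the spirit of [2] follows; the feature names, data, and threshold are hypothetical stand-ins, not the author's configuration.

```python
# Illustrative sketch: random-forest feature ranking over stacked
# optical/radar/elevation features for a toy land-cover label.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel

rng = np.random.default_rng(1)
names = ["S2_red", "S2_nir", "S2_swir", "S1_vv", "S1_vh", "elevation"]
X = rng.normal(size=(500, len(names)))               # stacked per-pixel features
y = (X[:, 1] - X[:, 0] + X[:, 5] > 0).astype(int)    # toy land-cover label

rf = RandomForestClassifier(n_estimators=300, random_state=1).fit(X, y)
for n, imp in sorted(zip(names, rf.feature_importances_), key=lambda t: -t[1]):
    print(f"{n}: {imp:.3f}")

selector = SelectFromModel(rf, prefit=True, threshold="median")
X_selected = selector.transform(X)                   # keep features above median importance
```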
Gao et al. [3] developed an approach based on UAV multispectral imagery that integrated the spectral and textural features of images to examine wheat Fusarium head blight (FHB), estimating several vegetation indices (VIs) and texture indices (TIs). The VIs, TIs, and combined VIs and TIs were adopted as the inputs to KNN, PSO-SVM, and XGBoost to develop wheat FHB monitoring models. The proposed approach was shown to have potential for fast and nonintrusive monitoring of wheat FHB. Guo et al. [10] proposed a spectral detection method for peanut southern blight severity by combining hyperspectral data, the continuous wavelet transform, and machine learning; the SVM, DT, and KNN methods were tested and compared. Fan et al. [11] developed VGNet, with VGG16 as its backbone, to improve the recognition of corn diseases in fields. In particular, the proposed VGNet achieved a 3.5% accuracy improvement over its predecessor, VGG16.
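The following toy sketch illustrates the general VI-plus-TI fusion idea, not the implementation in [3]: synthetic spectral and textural features are concatenated and fed to a KNN classifier, which stands in for the KNN/PSO-SVM/XGBoost models.

```python
# Illustrative sketch: fusing spectral (VI) and textural (TI) features
# for a toy healthy/infected classifier.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
vi = rng.normal(size=(300, 5))     # e.g., NDVI-like vegetation indices per plot
ti = rng.normal(size=(300, 8))     # e.g., GLCM texture statistics per plot
X = np.hstack([vi, ti])            # combined VI+TI feature vector
y = (vi[:, 0] + 0.5 * ti[:, 0] > 0).astype(int)   # toy disease label

clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=7))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```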
Hu et al. [5] developed a soybean maturity recognition approach that combined UAV-based LCC and FVC maps with an anomaly detection method, exhibiting total monitoring accuracies greater than 98%. Zhang et al. [16] designed a novel CNN, DS-SoybeanNet, to enhance UAV-based soybean maturity monitoring, with the ability to extract and employ both shallow and deep image features. The authors compared it with the widely used AlexNet, InceptionResNetV2, MobileNetV2, ResNet50, SVM, and RF models, revealing the high accuracy of DS-SoybeanNet in soybean maturity classification.
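As a loose, hypothetical analogue of the anomaly detection step in [5] (not the authors' method or settings), the sketch below flags candidate mature plots as outliers in a synthetic LCC-FVC feature space using an isolation forest.

```python
# Illustrative sketch: mature plots appear as anomalies (low LCC, low FVC)
# against a background of still-green plots.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(3)
green = np.column_stack([rng.normal(45, 5, 180), rng.normal(0.8, 0.05, 180)])
mature = np.column_stack([rng.normal(20, 5, 20), rng.normal(0.3, 0.05, 20)])
X = np.vstack([green, mature])     # columns: LCC (SPAD-like), FVC (0-1)

iso = IsolationForest(contamination=0.1, random_state=3).fit(X)
is_mature = iso.predict(X) == -1   # -1 marks anomalies (candidate mature plots)
print("flagged plots:", is_mature.sum())
```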
Yurochka et al. [8] developed an approach for the automatic evaluation of dairy herd fatness using a 3D TOF sensor and the body condition score (BCS). The proposed approach was able to perform nonintrusive BCS evaluations of dairy herds throughout the lifetime of the herd while meeting the requirements of the farm. The overall accuracy of the system was estimated at 93.4%.
Jiang et al. [12] proposed a soil moisture content (SMC) estimation approach for mixed soil types based on PCA and machine learning, with hyperspectral data as the input. The R2 and RMSE of the optimal model were 0.932 and <2%, respectively. This approach is valuable for characterizing field soil conditions prior to sowing and provides a basis for the use of hyperspectral imagery to estimate SMC.
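A minimal sketch of a PCA-plus-regressor pipeline of the kind used in [12] follows; the spectra, moisture values, and an MLP standing in for the BP-ANN are all illustrative assumptions.

```python
# Illustrative sketch: PCA-compressed hyperspectral reflectance feeding a
# small neural-network regressor for soil moisture content.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score

rng = np.random.default_rng(4)
spectra = rng.uniform(0.05, 0.6, size=(250, 200))      # 200 synthetic bands
smc = 40 * spectra[:, 150] + rng.normal(0, 1.0, 250)   # toy moisture (%)

model = make_pipeline(
    PCA(n_components=10),                              # compress collinear bands
    MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=4),
)
pred = cross_val_predict(model, spectra, smc, cv=5)
print("R2:", round(r2_score(smc, pred), 3))
```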
Geostationary satellites can capture the daily variation in crop canopy reflectance from high-temporal-resolution imagery. Lin et al. [15] proposed the synthetic angle normalization model (SANM), which uses vegetation canopy reflectance as its input. The SANM exploits the advantages of geostationary satellite imaging and enables quantitative comparisons of spatiotemporal remote sensing data.
Advanced target recognition algorithms, such as YOLO-, Swin-Transformer-, and Faster-RCNN-based models, have also been developed to identify weeds and diseases for farmland management.
For example, Zhang et al. [14] introduced EM-YOLOv4-Tiny for weed identification and compared it with other deep learning detectors, namely YOLOv4-Tiny, YOLOv5s, Swin-Transformer, Faster-RCNN, and YOLOv6-Tiny. The proposed approach was observed to outperform the majority of these models, with an mAP of 94.54%.
Li et al. [6] developed BTC-YOLOv5s based on YOLOv5s for the detection of apple leaf disease. In particular, the inclusion of the transformer and convolutional block attention modules decreased the background noise.
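BTC-YOLOv5s itself is not publicly packaged, but a minimal inference sketch with the stock YOLOv5s model from torch.hub conveys the general detection workflow; the image path is a placeholder, and the first run downloads the model weights over the network.

```python
# Illustrative sketch: generic YOLOv5 inference as a stand-in for the
# custom BTC-YOLOv5s detector described in [6].
import torch

model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)
results = model("apple_leaf.jpg")   # image path is a hypothetical placeholder
results.print()                     # per-class detections and confidences
df = results.pandas().xyxy[0]       # bounding boxes as a pandas DataFrame
print(df[["name", "confidence"]])
```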
Intelligent agriculture can achieve information perception, quantitative decision-making, and intelligent control throughout agricultural production by integrating information technologies such as the Internet of Things, big data, artificial intelligence, and intelligent equipment with agriculture. Interdisciplinary cooperation is therefore necessary to deepen the application of deep learning in intelligent agriculture, including expert-assisted data annotation, machine learning method development, the design of agriculture-specific sensors, intelligent drones, intelligent robots, and more. Optical sensors and deep learning are fundamental to data collection, information perception, and decision analysis, and research on their combination is crucial for promoting the development of intelligent agriculture. We therefore hope this work attracts the attention of the agricultural, electronic, and computer communities and promotes further research on optical sensors and machine learning. The research published in this Special Issue focuses on a variety of machine learning methods, optical sensors, and platforms for agricultural monitoring; we hope the novel results and progress reported in these papers stimulate further research in these areas.

Funding

This study was supported by the Henan Province Science and Technology Research Project (232102111123) and the National Natural Science Foundation of China (grant number 42101362).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Wang, J.; Si, H.; Gao, Z.; Shi, L. Winter Wheat Yield Prediction Using an LSTM Model from MODIS LAI Products. Agriculture 2022, 12, 1707.
  2. Ibrahim, S. Improving Land Use/Cover Classification Accuracy from Random Forest Feature Importance Selection Based on Synergistic Use of Sentinel Data and Digital Elevation Model in Agriculturally Dominated Landscape. Agriculture 2023, 13, 98.
  3. Gao, C.; Ji, X.; He, Q.; Gong, Z.; Sun, H.; Wen, T.; Guo, W. Monitoring of Wheat Fusarium Head Blight on Spectral and Textural Analysis of UAV Multispectral Imagery. Agriculture 2023, 13, 293.
  4. Wang, Q.; Guo, P.; Dong, S.; Liu, Y.; Pan, Y.; Li, C. Extraction of Cropland Spatial Distribution Information Using Multi-Seasonal Fractal Features: A Case Study of Black Soil in Lishu County, China. Agriculture 2023, 13, 486.
  5. Hu, J.; Yue, J.; Xu, X.; Han, S.; Sun, T.; Liu, Y.; Feng, H.; Qiao, H. UAV-Based Remote Sensing for Soybean FVC, LCC, and Maturity Monitoring. Agriculture 2023, 13, 692.
  6. Li, H.; Shi, L.; Fang, S.; Yin, F. Real-Time Detection of Apple Leaf Diseases in Natural Scenes Based on YOLOv5. Agriculture 2023, 13, 878.
  7. Li, Q.; Tian, J.; Tian, Q. Deep Learning Application for Crop Classification via Multi-Temporal Remote Sensing Images. Agriculture 2023, 13, 906.
  8. Yurochka, S.S.; Dovlatov, I.M.; Pavkin, D.Y.; Panchenko, V.A.; Smirnov, A.A.; Proshkin, Y.A.; Yudaev, I. Technology of Automatic Evaluation of Dairy Herd Fatness. Agriculture 2023, 13, 1363.
  9. Kumar, R.A.; Vasantha, S.; Gomathi, R.; Hemaprabha, G.; Alarmelu, S.; Srinivasa, V.; Vengavasi, K.; Alagupalamuthirsolai, M.; Hari, K.; Palaniswami, C.; et al. Rapid and Non-Destructive Methodology for Measuring Canopy Coverage at an Early Stage and Its Correlation with Physiological and Morphological Traits and Yield in Sugarcane. Agriculture 2023, 13, 1481.
  10. Guo, W.; Sun, H.; Qiao, H.; Zhang, H.; Zhou, L.; Dong, P.; Song, X. Spectral Detection of Peanut Southern Blight Severity Based on Continuous Wavelet Transform and Machine Learning. Agriculture 2023, 13, 1504.
  11. Fan, X.; Guan, Z. VGNet: A Lightweight Intelligent Learning Method for Corn Diseases Recognition. Agriculture 2023, 13, 1606.
  12. Jiang, X.; Luo, S.; Ye, Q.; Li, X.; Jiao, W. Hyperspectral Estimates of Soil Moisture Content Incorporating Harmonic Indicators and Machine Learning. Agriculture 2022, 12, 1188.
  13. Luo, S.; Jiang, X.; Jiao, W.; Yang, K.; Li, Y.; Fang, S. Remotely Sensed Prediction of Rice Yield at Different Growth Durations Using UAV Multispectral Imagery. Agriculture 2022, 12, 1447.
  14. Zhang, H.; Wang, Z.; Guo, Y.; Ma, Y.; Cao, W.; Chen, D.; Yang, S.; Gao, R. Weed Detection in Peanut Fields Based on Machine Vision. Agriculture 2022, 12, 1541.
  15. Lin, Y.; Tian, Q.; Qiao, B.; Wu, Y.; Zuo, X.; Xie, Y.; Lian, Y. A Synthetic Angle Normalization Model of Vegetation Canopy Reflectance for Geostationary Satellite Remote Sensing Data. Agriculture 2022, 12, 1658.
  16. Zhang, S.; Feng, H.; Han, S.; Shi, Z.; Xu, H.; Liu, Y.; Feng, H.; Zhou, C.; Yue, J. Monitoring of Soybean Maturity Using UAV Remote Sensing and Deep Learning. Agriculture 2023, 13, 110.
Table 1. Summary of publications featured in this Special Issue.
Article | Agricultural Activities/Variables | Optical Sensors | Platforms | Machine Learning Methods
[1] | Winter wheat yield prediction | MODIS | Satellite | LSTM, RF, SVR, PLSR, and XGBoost
[2] | Land use/cover classification | Sentinel-2 MSI | Satellite | RF
[3] | Wheat Fusarium head blight | Multispectral imaging sensor | UAV | KNN, SVM, and XGBoost
[4] | Cropland spatial distribution | Landsat 8 OLI | Satellite | Blanket covering method
[5] | Soybean FVC, LCC, and maturity | SONY DSC-QX100 | UAV | RF, PLSR, GPR, and MSR
[6] | Apple leaf diseases | Canon Rebel T5i DSLR | Field | BTC-YOLOv5s, YOLOv5, SSD, R-CNN, Faster R-CNN, YOLOv4-tiny, YOLOx, and YOLOx-s
[7] | Crop classification | Sentinel-2 | Satellite | 1D-CNNs, LSTM, 2D-CNNs, 3D-CNNs, and ConvLSTM2D
[8] | Dairy herd fatness | 3D TOF sensor | Field | BCS
[9] | Sugarcane dry matter and cane yield | Mobile phone camera | Field | Two-way clustering
[10] | Peanut southern blight severity | ASD FieldSpec 3 VNIR-SWIR sensor | Field | SVM, DT, and KNN
[11] | Corn diseases | Digital camera | Field | VGNet and VGG16
[12] | Soil moisture content | ASD FieldSpec 3 VNIR-SWIR sensor | Field | PCA, PCR, PLSR, and BP-ANN
[13] | Rice yield | Mini-MCA 1000 | UAV | TCT
[14] | Weed detection in peanut fields | Fuji FinePix S4500 | Field | YOLOv4-Tiny, YOLOv5s, Swin-Transformer, Faster-RCNN, YOLOv6-Tiny, and EM-YOLOv4-Tiny
[15] | Vegetation canopy reflectance angle normalization | GOCI | Satellite | SANM
[16] | Soybean maturity | SONY DSC-QX100 | UAV | SVM, RF, InceptionResNetV2, MobileNetV2, AlexNet, ResNet50, and DS-SoybeanNet
Note: UAV, unmanned aerial vehicle; RF, random forest; TCT, tasseled cap transformation; SANM, synthetic angle normalization model; PCA, principal component analysis; LSTM, long short-term memory; SVR, support vector regression; PLSR, partial least squares regression; XGBoost, eXtreme gradient boosting; DT, decision tree; KNN, K-nearest neighbor; SVM, support vector machine; GPR, Gaussian process regression; MSR, stepwise multiple linear regression; YOLO, you only look once; SSD, single shot multi-box detector; CNN, convolutional neural network; R-CNN, region-based convolutional neural network; BCS, body condition scoring; TOF, time of flight; PCR, principal component regression; and BP-ANN, back-propagation artificial neural network.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
