Article

Predicting the Degree of Fresh Tea Leaves Withering Using Image Classification Confidence

1 Tea Research Institute of Shandong Academy of Agricultural Sciences, Jinan 250100, China
2 College of Mechanical and Electrical Engineering, Shihezi University, Shihezi 832003, China
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Foods 2025, 14(7), 1125; https://doi.org/10.3390/foods14071125
Submission received: 11 February 2025 / Revised: 19 March 2025 / Accepted: 21 March 2025 / Published: 25 March 2025

Abstract

Rapid and non-destructive detection methods for the withering degree of fresh tea leaves are crucial for ensuring high-quality tea production. Therefore, this study proposes a fresh tea withering degree detection model based on image classification confidence. The moisture percentage of fresh tea leaves is calculated with a weighted method that combines confidence levels and moisture labels, and the degree of withering is then determined against the standard for withering moisture content. To enhance the feature extraction ability and classification accuracy of the model, we introduce the Receptive-Field Attention Convolution (RFAConv) and Cross-Stage Feature Fusion Coordinate Attention (C2f_CA) modules. The experimental results demonstrate that the proposed model achieves a classification accuracy of 92.7%, an improvement of 0.156 over the initial model. In evaluating the predictive performance of the model for moisture content, the correlation coefficient (Rp), root mean square error of prediction (RMSEP), and relative standard deviation (RPD) of category 1 in the test set were 0.9983, 0.006278, and 39.2513, respectively, all significantly better than those of the PLS and CNN methods. This method enables accurate and rapid detection of tea leaf withering, providing crucial technical support for online determination during processing.

1. Introduction

In black tea production, withering is the first vital step and the key to producing good-quality black tea [1,2]. Withering refers to a series of characteristic changes in the physical parameters of tea leaves, such as moisture content and color, under specific temperature and humidity conditions [3]. As withering progresses, the water content of fresh tea leaves gradually decreases, the leaves soften and shrink, their color darkens, and the grassy flavor gradually disappears, giving way to a distinctive fragrance [4,5,6]. Moisture content is an important indicator of the degree of withering of fresh tea leaves; the industry standard for withering moisture content in fresh tea leaves is 58–66%. In actual production, the degree of withering is mainly judged through a combination of manual assessment and leaf moisture content measurement. However, manual judgment is easily influenced by factors such as lighting, personal experience, and subjectivity, so it cannot accurately and objectively determine the degree of withering of fresh tea leaves. In addition, such judgments of color and texture are qualitative rather than standardized [7], making it difficult to distinguish subtle differences between colors and textures, which can easily lead to significant errors and inaccurate judgments. As such, traditional detection methods suffer from low accuracy, lack of objectivity, and slow speed [8].
In recent years, machine vision [9] combined with machine learning has been used effectively to extract image color and texture for tasks such as image classification, detection, and segmentation, achieving efficient image processing and analysis in various fields of agriculture. Wijaya et al. (2023) [10] used electronic noses and optimized machine learning algorithms to effectively evaluate and predict the quality of non-infused tea aroma, achieving identification of the quality of non-infused Gambung green tea. Li et al. (2023) [11] used least squares support vector machine qualitative model analysis to rapidly detect the fermentation degree of Pu'er tea online, achieving a prediction accuracy of 99.3% on the fermentation prediction set. In terms of moisture content detection, Zhang et al. (2024) [12] collected near-infrared spectral data across seven stages of green tea processing and established a water content prediction model for green tea using partial least squares regression and a backpropagation neural network, achieving high detection performance. Wang et al. (2023) [13] extracted nine color and five texture features from tea images and combined two preprocessing methods with two optimization algorithms to predict the moisture content of fresh tea leaves. However, machine learning detection methods can only take fixed, hand-crafted tea leaf features (such as color and texture) as input, which limits the model's ability to learn features.
Deep learning [14], a branch of machine learning, is based on the core idea of extracting high-level image features through multi-level neural networks, thereby achieving tasks such as target classification, recognition, and segmentation. In recent years, deep learning has developed rapidly and been applied in many aspects of the tea field. Chen et al. (2024) [15] proposed an improved YOLOv7 detection network for tea buds of multiple sizes and targets in the field of view; by introducing an attention mechanism and a multi-feature fusion module, they achieved an average accuracy of 94.43%. Based on the region-based convolutional neural network Mask-RCNN, Wang et al. (2023) [16] proposed a harvesting-point localization method that classifies, regresses, and segments buds by training classification, regression, and preselected frames and eliminating quantization error through regional feature aggregation. An et al. (2020) [17] proposed a withering moisture detection method based on convolutional neural network confidence, achieving effective detection of moisture content in tea leaves.
The YOLOv8 image classification model, known for its balanced trade-off between speed and accuracy, is utilized as the core algorithm in this study to predict the moisture content of withered fresh tea leaves. A novel weighted method is developed, which combines image classification confidence scores with moisture labels to calculate the percentage of moisture in the tea leaves. This is then correlated with the established standard for wilting moisture content in tea leaves, enabling the determination of the degree of wilting. This approach facilitates rapid, non-destructive detection of the withering level of fresh tea leaves. Furthermore, to enhance both the efficiency and accuracy of the detection process, the model was optimized by sacrificing a small portion of inference speed, significantly improving detection precision. This advancement provides a reliable tool for assessing the quality of tea in subsequent stages of processing.

2. Materials and Methods

2.1. Image Acquisition and Preprocessing

The withered fresh tea leaf images used in this study come from the Tea Research Institute of the Chinese Academy of Agricultural Sciences. The tea variety is Jin Guan Yin, collected in March 2024. In this study, we developed a data collection methodology that involved the synchronized acquisition of both high-resolution images and corresponding moisture measurements across 13 consecutive temporal intervals. Continuous time interval sample data can reflect the impact of changes in moisture content on image color, texture, and other features, helping the model better learn subtle changes in tea phenotype. The data collection and experimental process is shown in Figure 1.
The acquisition system consists of an artificial climate chamber, an industrial camera (Intel RealSense D405, Intel Corporation, Santa Clara, CA, USA), a uniform light source, and a sample container. During the collection process, the camera is placed 28 cm away from the sample. Samples are collected once every hour, and images of the samples are taken at 13 consecutive time points. A moisture analyzer (XY-110MW) measures the moisture value at each collection time point; three measurements are recorded at each time point, and their average is taken as the withering moisture label for that time point. This study obtained a total of 709 withered fresh tea images and 13 moisture category labels. The data with category labels 1, 5, and 9 were used as an external test set (153 images). The remaining 10 categories (556 images) were divided into a training set (501 images) and a validation set (55 images) at a ratio of 9:1 to ensure the generalization ability of the model. To prevent overfitting during training, a total of 2255 image samples were obtained by applying cropping, rotation, flipping, and brightness adjustment to the training-set images as data augmentation.
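To make the split and augmentation step concrete, the following Python sketch reproduces the 9:1 training/validation split and the four augmentation operations (crop, rotation, flip, brightness change); the folder layout, random seed, and augmentation magnitudes are assumptions, since they are not specified in the text.

```python
import random
from pathlib import Path

import torchvision.transforms as T
from PIL import Image

# Hypothetical folder layout: data/<category_label>/<image>.jpg for the 10 training categories.
DATA_DIR = Path("data")

def split_dataset(image_paths, val_ratio=0.1, seed=0):
    """Random 9:1 split of the non-held-out images into training and validation sets."""
    paths = sorted(image_paths)
    random.Random(seed).shuffle(paths)
    n_val = int(len(paths) * val_ratio)
    return paths[n_val:], paths[:n_val]          # (train, val)

# The paper augments the training set by cropping, rotating, flipping, and changing
# brightness; the exact magnitudes below are assumptions.
augment = T.Compose([
    T.RandomResizedCrop(224, scale=(0.8, 1.0)),  # crop
    T.RandomRotation(degrees=15),                # rotate
    T.RandomHorizontalFlip(p=0.5),               # flip
    T.ColorJitter(brightness=0.3),               # brightness change
])

if __name__ == "__main__":
    all_images = list(DATA_DIR.rglob("*.jpg"))
    train_paths, val_paths = split_dataset(all_images)
    sample = augment(Image.open(train_paths[0]).convert("RGB")) if train_paths else None
```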
The experimental data for the PLS [18] and CNN [19] baselines were extracted from the 709 images using tea visual image acquisition and analysis software (Software Copyright No. 2014 SR 149549) developed with the Matlab GUI module, which extracted nine color metrics and six texture features. The nine color metrics are the average red component, average green component, average blue component, color angle, average saturation, average luminance component, super-green transformation, ratio of the average red component to the average green component, and average hue. The six texture features are the average grayscale value, standard deviation, grayscale histogram smoothness, third-order moment bias, grayscale histogram consistency, and grayscale histogram entropy.
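For readers who want to reproduce the hand-crafted inputs of the PLS and CNN baselines, the sketch below computes rough Python equivalents of the nine color metrics and six grayscale-histogram texture features with NumPy and Pillow; the original features were extracted with Matlab-based software, so individual definitions (for example, of the color angle) may differ from this sketch.

```python
import numpy as np
from PIL import Image

def color_texture_features(path):
    """Rough Python analogue of the color/texture features described in Section 2.1."""
    img = Image.open(path)
    rgb = np.asarray(img.convert("RGB"), dtype=np.float64) / 255.0
    hsv = np.asarray(img.convert("HSV"), dtype=np.float64) / 255.0
    gray = np.asarray(img.convert("L"))
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

    color = {
        "mean_r": r.mean(), "mean_g": g.mean(), "mean_b": b.mean(),
        # One common definition of the hue/color angle; the paper's may differ.
        "color_angle": np.degrees(np.arctan2(np.sqrt(3) * (g - b), 2 * r - g - b)).mean(),
        "mean_saturation": hsv[..., 1].mean(),
        "mean_luminance": (0.299 * r + 0.587 * g + 0.114 * b).mean(),
        "super_green": (2 * g - r - b).mean(),
        "rg_ratio": r.mean() / (g.mean() + 1e-12),
        "mean_hue": hsv[..., 0].mean(),
    }

    hist = np.bincount(gray.ravel(), minlength=256) / gray.size
    levels = np.arange(256) / 255.0
    mean = (levels * hist).sum()
    var = ((levels - mean) ** 2 * hist).sum()
    texture = {
        "gray_mean": mean,
        "gray_std": np.sqrt(var),
        "smoothness": 1 - 1 / (1 + var),                      # histogram smoothness
        "third_moment": ((levels - mean) ** 3 * hist).sum(),  # third-order moment bias
        "uniformity": (hist ** 2).sum(),                      # histogram consistency
        "entropy": -(hist[hist > 0] * np.log2(hist[hist > 0])).sum(),
    }
    return {**color, **texture}
```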
Figure 2 shows the moisture percentage over continuous time and the criteria for determining the degree of withering. In Figure 2, the blue shaded area indicates moderate withering (moisture content between 58% and 66%), the area above it indicates insufficient withering (moisture content greater than 66%), and the area below it indicates excessive withering (moisture content less than 58%). The coordinates of the intersection point between the moisture curve and the blue shaded area indicate that, at time t, the fresh tea leaves have reached a moderate withering value.

2.2. Model Selection and Optimization for Judging the Degree of Withering

2.2.1. Model Selection

YOLOv8, released in 2023 as a major update of YOLOv5, greatly improves detection performance [20]. The core of YOLOv8 image classification is to extract surface features from different feature layers of the input image through convolutional layers and then gradually extract abstract features through multiple convolutions, achieving accurate image classification. However, selecting the right variant for a given application scenario is the key to maximizing the advantages of YOLOv8. The image classification model is available in five versions of increasing network depth and width: YOLOv8n, YOLOv8s, YOLOv8m, YOLOv8l, and YOLOv8x. The greater the depth, the higher the accuracy but the slower the inference. In this study, we selected YOLOv8s as the prediction model for moisture content and improved its basic structure to enhance the prediction accuracy. The results of the models trained with different versions are presented in Section 3.1.
The structure of YOLOv8s differs from that of mainstream image classification models in that it consists of a backbone and a head module. The model structure is shown on the left side of Table 1. To enhance the model's prediction accuracy for the withering degree of tea leaves, this study replaces the standard convolution with the Receptive-Field Attention Convolution (RFAConv) to address the issue of parameter sharing within the model. Additionally, the Coordinate Attention (CA) mechanism is incorporated into the C2f module to mitigate the problems of information loss and inaccurate feature fusion during the model's feature integration process. The improved model structure is shown on the right side of Table 1.

Receptive-Field Attention Convolution (RFAConv)

RFAConv combines the RFA attention mechanism with convolution and is designed to address the issue of parameter sharing in convolutional kernels [21]. Unlike traditional convolutions, RFAConv dynamically adjusts the receptive field for each input image, enabling the model to learn local features more accurately. The RFA mechanism uses average pooling to capture global information within each receptive field, then applies a 1 × 1 convolution to facilitate feature interaction, and finally uses the Softmax function to weigh the importance of each receptive field. This approach effectively captures spatial feature differences and overcomes the limitations of parameter sharing in convolutional neural networks. The calculation method is detailed in Equation (1).
$F = \mathrm{Softmax}\big(g^{1\times 1}(\mathrm{AvgPool}(X))\big) \times \mathrm{ReLU}\big(\mathrm{Norm}(g^{k\times k}(X))\big) = A_{rf} \times F_{rf}$ (1)
Here, $g^{1\times 1}$ and $g^{k\times k}$ denote grouped convolutions with 1 × 1 and k × k kernels, respectively, where k is the size of the convolution kernel; X is the input feature map; Norm denotes normalization; and F represents the product of the attention feature map $A_{rf}$ and the receptive-field spatial feature $F_{rf}$.
In RFAConv implementation, the RFA-generated feature maps are reshaped to eliminate spatial overlap, aggregating all receptive field block features. This allows the model to assign specific convolutional kernels to each receptive field, enhancing its ability to adapt to varying input complexities. The method’s structure is detailed in Figure 3.
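A minimal PyTorch sketch of RFAConv following the description above (a grouped k × k convolution for receptive-field features, average pooling plus a 1 × 1 grouped convolution and Softmax for the attention weights, spatial rearrangement, and a final k × k convolution with stride k) is given below; it is an illustration based on Equation (1) and reference [21], not the authors' exact implementation.

```python
import torch
import torch.nn as nn

class RFAConv(nn.Module):
    """Sketch of Receptive-Field Attention Convolution; details such as the
    normalization placement and stride handling are assumptions."""
    def __init__(self, in_ch, out_ch, kernel_size=3, stride=1):
        super().__init__()
        self.k = kernel_size
        # Attention branch: AvgPool -> 1x1 grouped conv -> Softmax over the k*k positions.
        self.get_weight = nn.Sequential(
            nn.AvgPool2d(kernel_size, stride=stride, padding=kernel_size // 2),
            nn.Conv2d(in_ch, in_ch * kernel_size**2, 1, groups=in_ch, bias=False),
        )
        # Feature branch: grouped k x k conv producing one feature per receptive-field position.
        self.generate_feature = nn.Sequential(
            nn.Conv2d(in_ch, in_ch * kernel_size**2, kernel_size,
                      stride=stride, padding=kernel_size // 2, groups=in_ch, bias=False),
            nn.BatchNorm2d(in_ch * kernel_size**2),
            nn.ReLU(inplace=True),
        )
        # Final convolution over the spatially re-arranged, attention-weighted features.
        self.conv = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size, stride=kernel_size),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        k = self.k
        weight = self.get_weight(x)                              # (b, c*k*k, h, w)
        h, w = weight.shape[2:]
        attn = torch.softmax(weight.view(b, c, k * k, h, w), dim=2)
        feat = self.generate_feature(x).view(b, c, k * k, h, w)
        weighted = (attn * feat).view(b, c, k, k, h, w)
        # Lay the k x k receptive-field values out spatially: (b, c, h*k, w*k).
        weighted = weighted.permute(0, 1, 4, 2, 5, 3).reshape(b, c, h * k, w * k)
        return self.conv(weighted)

# Example: replace a standard 3x3, stride-2 convolution with RFAConv.
# y = RFAConv(64, 128, kernel_size=3, stride=2)(torch.randn(1, 64, 56, 56))
```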

Cross-Stage Feature Fusion Coordinate Attention

The feature fusion process in the C2f module of the YOLOv8s classification model suffers from minor information loss and inaccuracies. To address this issue, the Coordinate Attention (CA) mechanism [22] is introduced into the C2f module, resulting in the Cross-Stage Feature Fusion Coordinate Attention (C2f_CA) module. By exploiting the CA block's attention along the width and height dimensions, the model's sensitivity to subtle variations in the image is enhanced, leading to more accurate recognition and classification. The C2f_CA module enables the model to extract higher-level information from the feature map, retain more useful feature details, and reduce information loss during feature transmission, thereby improving the model's detection accuracy. The structural diagrams of the module before and after optimization are shown in Figure 4.
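The following PyTorch sketch shows a minimal Coordinate Attention block of the kind inserted into C2f to form C2f_CA; the reduction ratio and the exact insertion point inside C2f are assumptions, as they are not specified in the text.

```python
import torch
import torch.nn as nn

class CoordinateAttention(nn.Module):
    """Sketch of the Coordinate Attention block [22]; the reduction ratio is assumed."""
    def __init__(self, channels, reduction=32):
        super().__init__()
        mid = max(8, channels // reduction)
        self.conv1 = nn.Conv2d(channels, mid, 1)
        self.bn1 = nn.BatchNorm2d(mid)
        self.act = nn.Hardswish(inplace=True)
        self.conv_h = nn.Conv2d(mid, channels, 1)
        self.conv_w = nn.Conv2d(mid, channels, 1)

    def forward(self, x):
        b, c, h, w = x.shape
        # Encode spatial information separately along the height and width directions.
        x_h = x.mean(dim=3, keepdim=True)                       # (b, c, h, 1)
        x_w = x.mean(dim=2, keepdim=True).permute(0, 1, 3, 2)   # (b, c, w, 1)
        y = self.act(self.bn1(self.conv1(torch.cat([x_h, x_w], dim=2))))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                      # (b, c, h, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))  # (b, c, 1, w)
        return x * a_h * a_w

# In C2f_CA, such a block could be applied to the fused feature map inside C2f;
# the exact insertion point is an assumption here.
# y = CoordinateAttention(128)(torch.randn(1, 128, 40, 40))
```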

2.3. Quantitative Prediction of Moisture Content and Determination of Wilting Degree Based on Classification Confidence

In this study, a model is developed by using tea leaf images from the training set as input and moisture content and withering degree as output, as shown in Equation (2). The model outputs predictions for the moisture content and withering degree of tea leaves by performing weighted calculations of image classification confidence and moisture content labels, in conjunction with the standard for tea leaf withering moisture content. The confidence is calculated by measuring the similarity between the input image features and the predicted values. The higher the confidence, the more reliable the image classification results, which enhances the accuracy of moisture prediction based on classification confidence.
$Y = \sum_{i=0}^{n} Y_i N_i$ (2)
Here, Y represents the predicted moisture content of the sample, $Y_i$ represents the moisture label of training category i, $N_i$ represents the predicted confidence for category i, and n is the index of the last training category; in this study, n = 9, corresponding to the ten training categories labeled 0–9.
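A short Python sketch of Equation (2) and the subsequent withering-degree decision (using the 58–66% standard from Section 1) is given below; the confidence values in the usage example are hypothetical, while the moisture labels are taken from Table 4.

```python
def predict_moisture(confidences, moisture_labels):
    """Equation (2): moisture = sum_i (label_i * confidence_i) over the training categories."""
    assert len(confidences) == len(moisture_labels)
    return sum(y * n for y, n in zip(moisture_labels, confidences))

def withering_degree(moisture):
    """Map predicted moisture to a withering degree using the 58-66% industry standard."""
    if moisture > 0.66:
        return "insufficient withering"
    if moisture >= 0.58:
        return "moderate withering"
    return "excessive withering"

# Hypothetical example: three non-zero confidences for the first three training categories.
labels = [0.7843, 0.7714, 0.7630]   # training-category moisture labels (from Table 4)
confs = [0.90, 0.08, 0.02]          # hypothetical classifier confidences
m = predict_moisture(confs, labels)  # ~0.7828
degree = withering_degree(m)         # "insufficient withering"
```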

2.4. Evaluation

To better evaluate the predictive performance of the model on the moisture content of fresh tea leaves, this study used the correlation coefficient (Rp), relative standard deviation (RPD), and root mean square error of prediction (RMSEP) as evaluation indicators. The larger the Rp and RPD values and the smaller the RMSEP value, the better the predictive performance of the model [23]. Among these, RPD, the ratio of the standard deviation to the root mean square error of prediction, better reflects the performance of the model: when its value is greater than 2, the model can reliably judge the degree of withering; when it is between 1 and 2, the model has only average predictive ability for this parameter; and when it is less than 1, the model cannot be used for this task. The calculation formulas for Rp, RMSEP, and RPD are shown in Equations (3)–(6).
$R_p = \sqrt{1 - \dfrac{\sum_{i=1}^{n}(\hat{y}_i - y_i)^2}{\sum_{i=1}^{n}(\hat{y}_i - \bar{y})^2}}$ (3)

$\mathrm{RMSEP} = \sqrt{\dfrac{\sum_{i=1}^{n}(y_i - \hat{y}_i)^2}{n}}$ (4)

$\mathrm{RPD} = \dfrac{SD}{\mathrm{RMSEP}}$ (5)

$SD = \sqrt{\dfrac{1}{N-1}\sum_{i=1}^{N}(y_i - \bar{y})^2}$ (6)
Here, $\hat{y}_i$ represents the predicted value at time i, $y_i$ represents the true value at time i, $\bar{y}$ represents the average value of the 10 categories of label data, and SD is the standard deviation.
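The metrics in Equations (3)–(6) can be computed with a few lines of NumPy, as sketched below; following the definition above, the mean and standard deviation are taken over the ten training-category moisture labels, although the authors' exact computation may differ.

```python
import numpy as np

def evaluation_metrics(y_true, y_pred, label_values):
    """Rp, RMSEP, and RPD as defined in Equations (3)-(6).
    `label_values` are the ten training-category moisture labels, whose mean and
    sample standard deviation give y_bar and SD."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    y_bar = np.mean(label_values)
    rp = np.sqrt(1 - np.sum((y_pred - y_true) ** 2) / np.sum((y_pred - y_bar) ** 2))
    rmsep = np.sqrt(np.mean((y_true - y_pred) ** 2))
    sd = np.std(label_values, ddof=1)   # Equation (6), with N - 1 in the denominator
    rpd = sd / rmsep
    return rp, rmsep, rpd
```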

2.5. Implementation Details

During the model training process, this study adopted the PyTorch deep learning framework and conducted experiments on Windows 11 using the improved YOLOv8s as the benchmark model. The CPU was an AMD Ryzen 5 7500F (3.70 GHz, 6 cores/12 threads, 32 MB cache) with 32 GB of RAM, and the GPU was an NVIDIA GeForce RTX 4060 Ti with 16 GB of memory. The training hyperparameters were a learning rate of 0.01, a batch size of 16, 200 epochs, an input image resolution of 224 × 224, a momentum of 0.937, and a weight decay of 0.0005, using the Adam optimizer.
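For orientation, the snippet below shows how the listed hyperparameters map onto the Ultralytics training interface for a baseline YOLOv8s classifier; the dataset path is hypothetical, and the improved model would additionally require a custom model definition registering the RFAConv and C2f_CA modules, which is not reproduced here.

```python
from ultralytics import YOLO

# Baseline YOLOv8s classifier trained with the hyperparameters listed in Section 2.5.
model = YOLO("yolov8s-cls.pt")
model.train(
    data="withering_dataset",   # hypothetical folder with train/ and val/ subfolders per category
    epochs=200,
    imgsz=224,
    batch=16,
    lr0=0.01,                   # initial learning rate
    momentum=0.937,
    weight_decay=0.0005,
    optimizer="Adam",
    device=0,                   # single NVIDIA GPU
)
```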

3. Results and Discussion

3.1. Comparison Between Different Versions of YOLOv8

In this study, we conducted experiments on all five versions of the model, and the results are shown in Table 2. Among these, YOLOv8s exhibited a superior balance of performance characteristics. Specifically, YOLOv8s achieved an accuracy of 0.771, matching YOLOv8m and surpassed only by YOLOv8l and YOLOv8x. Crucially, YOLOv8s achieves this performance with a model size of only 4.856 MB, significantly smaller than those larger versions. This reduced parameter count facilitates deployment in resource-constrained environments, such as mobile and embedded systems, where computational resources are limited.
Furthermore, YOLOv8s demonstrated impressive computational efficiency. While its inference time of 1.4 ms was slightly longer than the lightweight YOLOv8n (1.3 ms), YOLOv8s offered substantially improved accuracy. In addition, YOLOv8s achieved the fastest preprocessing time of 0.1 ms, further enhancing its real-time applicability. Based on these findings, YOLOv8s was selected as the foundational model for moisture content prediction, and its architecture was subsequently modified to improve predictive accuracy.

3.2. Comparative Experiments Before and After Model Optimization

To evaluate the superiority of the improved model, this study compared YOLOv8s with the improved model, and the results are shown in Table 3. The improved model reached a Top-1 accuracy of 0.927, an increase of 0.156 over YOLOv8s. However, the model parameters (Params) increased by 0.224 M and the floating-point operations (GFLOPs) by 1.2 G, while the preprocessing time increased by 0.2 ms and the inference time by 3.6 ms. These results indicate that the model significantly improves accuracy at the cost of a slight increase in computation and detection time, which is highly beneficial for confidence-based moisture content prediction; trading a small increase in computation and detection time for a large gain in accuracy is therefore the better choice. Figure 5 illustrates the differences between the performance of the improved and initial models in identifying the degree of tea leaf withering, further highlighting the superior performance of the improved model in real-time monitoring.

3.3. Improve the Model to Predict Moisture Content and Wilting Degree

Table 4 shows an example from the external test set: the image category labels and moisture values in the training set, the prediction confidences of the three category samples in the external test set, and the moisture content of fresh tea leaves in the external test set predicted according to Equation (2). As Table 4 shows, the errors between the predicted and true moisture values of the three external test categories are all less than 0.01, indicating that the model also achieves good detection results for untrained categories. In addition, the sum of the confidence levels for each of the three categories in Table 4 is not exactly equal to 1, because when predicting the external test set, the confidence of each image and the moisture value were retained to only eight decimal places, resulting in rounding errors. Figure 6 shows the entire prediction process of the improved model.
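As a concrete check of Equation (2), the category-1 column of Table 4 gives (non-zero terms only):

$Y = 0.7843 \times 0.91888142 + 0.7714 \times 0.08109235 + 0.7630 \times 0.00002619 + 0.7501 \times 0.00000001 \approx 0.78325333,$

which matches the predictive value reported in Table 4 against a true value of 0.7799.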
Table 5 shows the evaluation metrics for the relationship between the predicted results and the true values on the external test set. These data reveal that the RPD values predicted by the model in this study are all greater than 2, which reflects the good predictive performance of our model. However, the evaluation metrics for category 9 of the external test set lag noticeably behind those of the other two categories. There may be two reasons for this: first, the color and texture of fresh tea leaves change very little over consecutive time points, resulting in a larger prediction error; second, a large error exists between the labeled moisture values at the time points immediately before and after category 9, biasing the weighted computation toward the side with higher confidence and thus producing a larger prediction error.

3.4. Comparison of Moisture Content Prediction Using Different Models

Figure 7 shows the prediction results of PLS, CNN, and YOLOv8s on the external test set. Figure 7a presents the distribution of predicted values versus true values. It is evident that the predictions from our model are closest to the true values, demonstrating that our model has better generalization and robustness compared to PLS and CNN models. Figure 7b shows the prediction errors of the three models. From the error distribution, we can see that the error fluctuations between predicted and true values for the PLS and CNN models are much larger than those of our model. Although our model has notable outliers in Class 5 and Class 9, overall it achieves the best accuracy and is more suitable for online collection of wilted moisture content data from fresh tea leaf images. Figure 7c displays the normal distribution of prediction errors for the three models, helping us further explore the error intervals. Here, the horizontal axis represents the error range and the vertical axis represents the frequency of results within that range. From the normal distribution, it is clear that, compared to PLS and CNN, our model’s errors are more concentrated, mainly within the range of [−0.02, 0.02]. Additionally, the comparison of the three models’ Rp, RMSEP, and RPD values in Table 5 further confirms that our model performs well in predicting untrained categories.

4. Conclusions

This study addresses the need for detecting moisture content and determining wilting levels in fresh tea leaves. To improve the model’s prediction accuracy and reduce moisture content prediction errors, we introduced the RFAConv and C2f_CA modules. These additions significantly enhanced the model’s ability to extract and process features, resulting in a 0.156 improvement in detection accuracy. While this method increased the model’s parameters (Params) by 0.224 M and floating-point operations (GFLOPs) by 1.2 G, the average prediction time remained efficient at 5.3 ms. This balance ensures high detection accuracy while maintaining fast prediction speed.
In terms of experimental evaluation, this study achieved the best performance on the external test set 1, with Rp, RMSEP, and RPD values of 0.9983, 0.006278, and 39.2513, respectively. Further analysis revealed that the prediction errors were primarily within the range of [−0.02, 0.02], demonstrating high precision and stability. The results indicate that this model offers significant advantages in real-time detection of moisture content in fresh tea leaves and in determining their wilting levels. It effectively improves the accuracy of wilting degree judgment, providing reliable technical support for tea leaf quality assessment.
The proposed tea leaf withering degree detection model, though satisfactory, has limitations in accuracy, single-variety focus, and single-modal analysis. In future research, we will expand the diversity of research subjects and construct a multimodal perception framework. By deeply integrating and analyzing multidimensional data, the prediction accuracy and stability of the model can be further optimized.

Author Contributions

M.W.: validation, writing—original draft. Y.S.: data curation, methodology. Y.L.: software, investigation. H.M.: formal analysis. Z.D.: formal analysis. Z.T.: software. C.D.: software, investigation. Z.C.: writing—review and editing, conceptualization, funding acquisition, supervision. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Agricultural Improved Variety Project of Shandong Province (2023LZGCQY015), the Key R&D Projects in Zhejiang Province (2023C02043), the Major Project of Shandong Academy of Agricultural Sciences Innovation Project (R&D of specialty crop picking robots), the Agricultural Science and Technology Research Project of Jinan City (GG202401), and the Technology System of Modern Agricultural Industry in Shandong Province (SDAIT19).

Data Availability Statement

The original contributions presented in this study are included in the article; further inquiries can be directed to the corresponding author.

Acknowledgments

We would like to thank Chunwang Dong and Zhiwei Chen for providing valuable revision suggestions during the discussion stage of revising the paper.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

  1. Huang, W.; Fang, S.; Su, Y.; Xia, D.; Wu, Y.; Liu, Q.; Zou, L.; Lu, C.; Deng, W.-W.; Ning, J. Insights into the mechanism of different withering methods on flavor formation of black tea based on target metabolomics and transcriptomics. LWT 2023, 189, 115537. [Google Scholar] [CrossRef]
  2. Deb, S.; Jolvis Pou, K.R. A review of withering in the processing of black tea. J. Biosyst. Eng. 2016, 41, 365–372. [Google Scholar] [CrossRef]
  3. Wu, H.; Chen, Y.; Feng, W.; Shen, S.; Wei, Y.; Jia, H.; Wang, Y.; Deng, W.; Ning, J. Effects of three different withering treatments on the aroma of white tea. Foods 2022, 11, 2502. [Google Scholar] [CrossRef] [PubMed]
  4. Huang, W.; Fang, S.; Wang, J.; Zhuo, C.; Luo, Y.; Yu, Y.; Li, L.; Wang, Y.; Deng, W.-W.; Ning, J. Sensomics analysis of the effect of the withering method on the aroma components of Keemun black tea. Food Chem. 2022, 395, 133549. [Google Scholar] [CrossRef] [PubMed]
  5. Liang, G.; Dong, C.; Hu, B.; Zhu, H.; Yuan, H.; Jiang, Y.; Hao, G. Prediction of moisture content for congou black tea withering leaves using image features and nonlinear method. Sci. Rep. 2018, 8, 7854. [Google Scholar] [CrossRef]
  6. Jabeen, S.; Alam, S.; Saleem, M.; Ahmad, W.; Bibi, R.; Hamid, F.S.; Shah, H.U. Withering timings affect the total free amino acids and mineral contents of tea leaves during black tea manufacturing. Arab. J. Chem. 2019, 12, 2411–2417. [Google Scholar] [CrossRef]
  7. Madhavi, K.V.; Tamilkodi, R.; Dinakar, R.B.; JayaSudha, K. An innovative technique for content based image retrieval using color and texture features. Int. J. Innov. Res. Comput. Commun. Eng. 2013, 1, 1257–1263. [Google Scholar]
  8. Chen, A.; Chen, H.Y.; Chen, C. Use of temperature and humidity sensors to determine moisture content of oolong tea. Sensors 2014, 14, 15593–15609. [Google Scholar] [CrossRef]
  9. Steger, C.; Ulrich, M.; Wiedemann, C. Machine Vision Algorithms and Applications; John Wiley & Sons: Hoboken, NJ, USA, 2018. [Google Scholar]
  10. Wijaya, D.R.; Handayani, R.; Fahrudin, T.; Kusuma, G.P.; Afianti, F. Electronic Nose and Optimized Machine Learning Algorithms for Non-infused Aroma-based Quality Identification of Gambung Green Tea. IEEE Sens. J. 2023, 24, 1880–1893. [Google Scholar] [CrossRef]
  11. Li, T.; Lu, C.; Huang, J.; Chen, Y.; Zhang, J.; Wei, Y.; Wang, Y.; Ning, J. Qualitative and quantitative analysis of the pile fermentation degree of Pu-erh tea. LWT 2023, 173, 114327. [Google Scholar] [CrossRef]
  12. Zhang, Z.; Deng, L. An Adaptive Genetic Algorithm Optimizes Double-Hidden Layer BPNN for Rapid Detection of Moisture Content of Green Tea in Processing. J. Food Process. Preserv. 2024, 2024, 5876051. [Google Scholar]
  13. Wang, F.; Xie, B.; Lü, E.; Zeng, Z.; Mei, S.; Ma, C.; Guo, J. Design of a Moisture content detection system for yinghong no. 9 tea leaves based on machine vision. Appl. Sci. 2023, 13, 1806. [Google Scholar] [CrossRef]
  14. LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar]
  15. Chen, T.; Li, H.; Chen, J.; Zeng, Z.; Han, C.; Wu, W. Detection network for multi-size and multi-target tea bud leaves in the field of view via improved YOLOv7. Comput. Electron. Agric. 2024, 218, 108700. [Google Scholar]
  16. Wang, T.; Zhang, K.; Zhang, W.; Wang, R.; Wan, S.; Rao, Y.; Jiang, Z.; Gu, L. Tea picking point detection and location based on Mask-RCNN. Inf. Process. Agric. 2023, 10, 267–275. [Google Scholar]
  17. An, T.; Yu, H.; Yang, C.; Liang, G.; Chen, J.; Hu, Z.; Hu, B.; Dong, C. Black tea withering moisture detection method based on convolution neural network confidence. J. Food Process Eng. 2020, 43, e13428. [Google Scholar]
  18. Bastien, P.; Vinzi, V.E.; Tenenhaus, M. PLS generalised linear regression. Comput. Stat. Data Anal. 2005, 48, 17–46. [Google Scholar]
  19. Chen, L.; Li, S.; Bai, Q.; Yang, J.; Jiang, S.; Miao, Y. Review of image classification algorithms based on convolutional neural networks. Remote Sens. 2021, 13, 4712. [Google Scholar] [CrossRef]
  20. Terven, J.; Córdova-Esparza, D.M.; Romero-González, J.A. A comprehensive review of yolo architectures in computer vision: From yolov1 to yolov8 and yolo-nas. Mach. Learn. Knowl. Extr. 2023, 5, 1680–1716. [Google Scholar] [CrossRef]
  21. Zhang, X.; Liu, C.; Yang, D.; Song, T.; Ye, Y.; Li, K.; Song, Y. RFAConv: Innovating spatial attention and standard convolutional operation. arXiv 2023, arXiv:2304.03198. [Google Scholar]
  22. Hou, Q.; Zhou, D.; Feng, J. Coordinate attention for efficient mobile network design. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA, 20–25 June 2021; pp. 13713–13722. [Google Scholar]
  23. Chen, Q.; Zhang, D.; Pan, W.; Ouyang, Q.; Li, H.; Urmila, K.; Zhao, J. Recent developments of green analytical techniques in analysis of tea’s quality and nutrition. Trends Food Sci. Technol. 2015, 43, 63–82. [Google Scholar]
Figure 1. Data collection and algorithm process.
Figure 2. Moisture values of fresh tea leaves under continuous time and the standard for determining the degree of withering.
Figure 3. Structure diagram of RFAConv.
Figure 4. Structural diagram of C2f before and after optimization.
Figure 5. Comparison of model detection results before and after improvement.
Figure 6. The entire process of improved model prediction. Model input (a). The network structure of the improved model and the visualization of feature maps at each layer (b). Identification results of the model (c).
Figure 7. Prediction results of PLS, CNN, and YOLOv8 external test sets. (a) The distribution of predicted values and true values. (b) The distribution of errors. (c) Normal distribution.
Table 1. YOLOv8s and improved YOLOv8s network layers.

Network Structure | Layer Number | Network Layer (Before) | Repeats | Kernels Size/Output | Network Layer (Improved) | Repeats | Kernels Size/Output
Backbone | (1) | Conv | 1 | 64 | RFAConv | 1 | 64
Backbone | (2) | Conv | 1 | 128 | RFAConv | 1 | 128
Backbone | (3) | C2f | 3 | 128 | C2f_CA | 3 | 128
Backbone | (4) | Conv | 1 | 256 | RFAConv | 1 | 256
Backbone | (5) | C2f | 6 | 256 | C2f_CA | 6 | 256
Backbone | (6) | Conv | 1 | 512 | RFAConv | 1 | 512
Backbone | (7) | C2f | 6 | 512 | C2f_CA | 6 | 512
Backbone | (8) | Conv | 1 | 1024 | RFAConv | 1 | 1024
Backbone | (9) | C2f | 3 | 1024 | C2f_CA | 3 | 1024
Head | (1) | Classify | 1 | 10 | Classify | 1 | 10
Table 2. Detection results of the model.

Model | Acc | Para (MB) | GFLOPs (G) | Preprocessing Speed (ms) | Inference Speed (ms)
YOLOv8n | 0.729 | 1.384 | 3.3 | 0.2 | 1.3
YOLOv8s | 0.771 | 4.856 | 12.5 | 0.1 | 1.4
YOLOv8m | 0.771 | 15.048 | 41.6 | 0.2 | 1.7
YOLOv8l | 0.786 | 34.524 | 98.7 | 0.2 | 1.9
YOLOv8x | 0.786 | 53.539 | 153.8 | 0.3 | 2.5
Table 3. Comparative experiments before and after model optimization.

Model | Top-1 Acc | Top-5 Acc | Params (M) | GFLOPs (G) | Preprocessing Speed (ms) | Inference Speed (ms)
YOLOv8s | 0.771 | 1 | 4.856 | 12.5 | 0.1 | 1.4
Ours | 0.927 | 1 | 5.08 | 13.7 | 0.3 | 5.0
Table 4. Test set confidence and predicted value labels.

Training Set Category | Moisture Label | Confidence (Test Category 1) | Confidence (Test Category 5) | Confidence (Test Category 9)
0 | 0.7843 | 0.91888142 | 0 | 0
1 | 0.7714 | 0.08109235 | 0.00000007 | 0
2 | 0.763 | 0.00002619 | 0.00961427 | 0
3 | 0.7501 | 0.00000001 | 0.41952419 | 0
4 | 0.6990 | 0 | 0.57085168 | 0.00000004
5 | 0.6654 | 0 | 0.00000980 | 0.00098720
6 | 0.6372 | 0 | 0 | 0.53591472
7 | 0.5671 | 0 | 0 | 0.45765832
8 | 0.5177 | 0 | 0 | 0.00543968
9 | 0.4863 | 0 | 0 | 0.00000003
Sum of Confidence | | 0.99999997 | 1.00000001 | 0.99999999
Predictive value | | 0.78325333 | 0.72105269 | 0.60449594
True value | | 0.7799 | 0.7231 | 0.597
Table 5. Evaluation indicators of different models for the test set prediction results.

Metric | Ours (1) | Ours (5) | Ours (9) | PLS (1) | PLS (5) | PLS (9) | CNN (1) | CNN (5) | CNN (9)
Rp | 0.9983 | 0.993 | 0.9474 | 0.982 | 0.966 | 0.9488 | 0.9726 | 0.9277 | 0.905
RMSEP | 0.006278 | 0.00694 | 0.018411 | 0.018615 | 0.014287 | 0.024974 | 0.022124 | 0.018567 | 0.03413
RPD | 39.2513 | 20.1693 | 7.363 | 12.0667 | 7.974 | 6.4651 | 9.8045 | 5.5216 | 4.7987

