Article

A Spectral Device for Rice Chlorophyll Content Detection with Background Classification Capability

1 College of Mechanical and Electrical Engineering, Taizhou University, Taizhou 225300, China
2 College of Engineering, Nanjing Agricultural University, Nanjing 210031, China
3 Jiangsu Province Engineering Research Center for Optical Detection Technology and Equipment of Food and Pharmaceutical Industry, Taizhou 225300, China
4 College of Mechanical and Electrical Engineering, Shandong Agricultural University, Taian 271018, China
5 CAAS East Center (Suzhou) for Agricultural Science and Technology, Suzhou 215000, China
* Authors to whom correspondence should be addressed.
Agronomy 2026, 16(2), 192; https://doi.org/10.3390/agronomy16020192
Submission received: 19 November 2025 / Revised: 14 December 2025 / Accepted: 11 January 2026 / Published: 13 January 2026
(This article belongs to the Section Precision and Digital Agriculture)

Abstract

The aim of this study is to enhance the detection accuracy of rice chlorophyll content under complex backgrounds, optimize rice production management models, and improve the efficiency and quality of grain production. To achieve this, a spectral device for rice chlorophyll content detection with integrated background classification capability was developed. The device employed a MobileNetV4-Conv-Small-based model for rice background classification, enabling the categorization of clear, muddy, and green algae-covered backgrounds. The classification results showed that the model performed best under the green algae-covered background, with all performance metrics exceeding 97%. The muddy background followed, with metrics surpassing 94%, while the clear background proved more challenging, though the metrics still exceeded 93%. By combining preprocessing techniques with convolutional neural networks (CNNs), distinct rice chlorophyll content detection models were developed for each background type. For clear backgrounds, the optimal model was FD + CNN, with R2, RMSE, and RPD values of 0.975, 5.191, and 6.318, respectively. For muddy backgrounds, the optimal model was SS + CNN, with R2, RMSE, and RPD values of 0.627, 18.249, and 1.638, respectively. For green algae-covered backgrounds, the SS + CNN model also achieved the best results, with R2, RMSE, and RPD values of 0.719, 16.417, and 1.885, respectively. Field experiments confirmed that the device achieved a rice background classification accuracy of 94.67% and a relative error compliance rate of 84.00% for chlorophyll content detection. These results demonstrate the feasibility of integrating background classification and chlorophyll content detection for rice cultivation.

1. Introduction

Rice (Oryza sativa L.) is a strategic staple crop worldwide, serving as the primary food source for more than half of the global population and playing a critical role in food security and socio-economic development [1]. Chlorophyll, as the key pigment in photosynthesis, is also an essential indicator of crop nutritional status and physiological condition. Variations in chlorophyll content are closely associated with yield formation and stress tolerance [2]. Consequently, the rapid, accurate, and nondestructive detection of chlorophyll content remains an urgent requirement for precision cultivation and yield enhancement.
Traditional chemical methods for chlorophyll detection offer high accuracy but are limited by complex procedures, high cost, time consumption, and destructiveness to plant samples, rendering them unsuitable for large-scale field applications [3]. In recent years, advances in spectroscopic, remote sensing, and sensor technologies have driven rapid progress in nondestructive chlorophyll detection, leading to the commercialization of various portable chlorophyll meters.
Representative products are summarized in Table 1. These devices have contributed substantially to crop growth monitoring and precision agricultural management; however, most are based on point-to-point measurements, resulting in low detection efficiency and limited spatial coverage [4]. In response to the practical demands of domestic agricultural production, Chinese researchers have, in recent years, actively developed low-cost, high-performance portable chlorophyll detection devices, achieving notable progress in both technological innovation and product development.
The research team at China Agricultural University has made remarkable contributions to the development of chlorophyll detection technologies. Zhang et al. proposed a chlorophyll content detection model based on PLSR and designed a contact-type device for measuring chlorophyll in maize and wheat [16]. Building on this approach, Li et al. integrated a PLSR model with the Raspberry Pi 4B platform to establish a portable device, providing an effective tool for rapid field measurements of maize chlorophyll content [17]. Further advancements were achieved by Wang et al., who incorporated multispectral imaging technology to construct an intelligent field-based system capable of automated maize plant recognition and chlorophyll index measurement [18].
In addition, several studies focused on microcontroller-based solutions. Zhang et al. developed an online chlorophyll detection system using the Arduino UNO platform, enabling dynamic monitoring of maize chlorophyll content [19]. Tang et al. extended this line of research by employing the Arduino Mega 2560 to create a portable device equipped with light-environment correction for vegetation index calculation and chlorophyll estimation [20]. Sun et al. introduced a detection device based on the STM32F103 microcontroller and an active light source, which was successfully applied to maize chlorophyll measurement [21].
Beyond China Agricultural University, other universities and research institutes have also made noteworthy contributions in this field. At Jiangsu University, Chen developed a reflection-type chlorophyll detection device based on an active light source, enabling measurements in spinach, bok choy, and romaine lettuce [22]. At the Henan Academy of Agricultural Sciences, Yang employed a sealed optical path design with three light sources and vertically incident beams to establish a chlorophyll detection system based on optical diffuse reflection, which was successfully applied to wheat, maize, and flowers [23]. At Northwest A&F University, Yang introduced a portable chlorophyll detection device suitable for spinach, brassica, and romaine lettuce [24]. At Jilin University, Liu and Li jointly developed a portable chlorophyll diagnostic device, which demonstrated favorable performance in detecting chlorophyll content in maize, holly, and soybean [25].
The rapid development of portable, non-destructive chlorophyll detection technology has provided a variety of technological solutions for monitoring the nutritional status of field crops and has facilitated the implementation of precision agricultural management models. However, the existing technological system still faces several challenges in practical applications. Firstly, from the perspective of detection mode, mainstream devices generally adopt point-to-point contact detection, requiring multiple repetitions for a single measurement point. This not only restricts the rapid large-area scanning of crop canopies but also leads to low detection efficiency, making it inadequate for large-scale paddy field operations. Secondly, from the perspective of target crops, current technological research and applications primarily focus on dryland crops such as corn and wheat, with a notable lack of detection tools specifically designed for aquatic crops like rice, resulting in significant gaps in the coverage of crop varieties and application scenarios.
Compared to dryland crops, the complexity of the rice growth environment poses greater challenges to the stability and adaptability of detection technology. Various factors of interference, including fluctuations in water turbidity, algae coverage on the water surface, and the reflection of suspended soil particles, significantly reduce the signal-to-noise ratio of spectral detection, resulting in increased errors in chlorophyll detection using traditional equipment. The existing technological system fails to provide effective methods for identifying the complex background in rice fields and has not yet established targeted detection models that adapt to varying environmental conditions, hindering the ability of current technologies to meet the real-time and precise monitoring requirements for chlorophyll content in precision rice cultivation.
Therefore, developing a rice-specific non-destructive chlorophyll detection technology that integrates background adaptability and efficient detection performance has become a critical necessity to overcome the limitations of existing technologies and support intelligent paddy field management. Building on this need and the foundations of our prior research [26,27], this study proposes an integrated device that combines image recognition and spectral analysis to achieve both background classification and chlorophyll content detection. This study broadens the application of portable non-destructive detection technology to rice and provides new technical support for precision management and efficient cultivation within the framework of smart agriculture.

2. Materials and Methods

2.1. Hardware System

The rice chlorophyll detection device comprises a C11708MA micro spectrometer (640–1050 nm, Hamamatsu, Japan), micro camera, Raspberry Pi 4B processor, spectrometer driver board, touch screen, fill light, battery, and aluminum waterproof housing. The key specifications are summarized in Table 2. By simultaneously acquiring background images and spectral data of rice, the device enables both background classification and chlorophyll content detection, thereby providing technical support for rice precision management and the advancement of smart agriculture.
During operation, reference and dark-field spectra are first collected. Subsequently, images and spectral data of the rice canopy are collected. The Raspberry Pi processor then processes the spectral data and performs background classification and chlorophyll content detection based on the predefined model. Finally, the detection results are output to a touchscreen, computer, or mobile device, providing users with real-time feedback. The logical structure and interaction flow of the functional modules are illustrated in Figure 1.

2.2. Software System

To enable automated and integrated analysis of rice background classification and chlorophyll content detection, a software system was developed based on Python 3.8 and PyQt5. The system provides multiple functions, including the collection and storage of rice background images, collection and storage of spectral data, computation and visualization of the background classification model, and computation and visualization of the chlorophyll detection model. The detailed software design workflow is illustrated in Figure 2.
In addition, to enable real-time monitoring and recording on both mobile and computer terminals, VNC Connect remote monitoring software was configured on the Raspberry Pi. This configuration allows users to remotely access and monitor the system via the Internet, thereby enhancing its flexibility and practicality.

2.3. Experimental Design

The experiment was conducted at the Wulie Modern Agricultural Industrial Park in Dongtai City, Jiangsu Province, China, from July to August 2024 (120.232495° E, 32.891592° N). The experimental materials were uniformly growing and healthy Nanjing 9108 rice plants, cultivated in a rectangular plot measuring 25 m × 100 m (Figure 3a). To facilitate data collection and field partitioning, marker flags were placed at 5 m intervals along both horizontal and vertical boundaries (Figure 3b). A circular sampling area with a radius of 1 m was established at each intersection of the horizontal and vertical markers (Figure 3d). Within each sampling area, background images, spectral data, and chlorophyll content measurements were simultaneously collected (Figure 3c).

2.4. Data Acquisition and Processing

2.4.1. Spectral Data

Prior to data collection, the device was preheated for 30 min, and the collection parameters were set [26]. After preheating and parameter adjustment, reference spectra and dark-field spectra were collected using a standard whiteboard and a dark chamber, respectively. Subsequently, rice canopy spectra were sequentially collected from each designated sampling area. For each area, five datasets were collected, with each dataset representing the average of three consecutive measurements. A total of 1000 spectral datasets were collected for each of the three backgrounds.
According to Equation (1), the rice, reference, and dark-field spectral data were calibrated and converted to obtain the rice spectral reflectance. To remove irrelevant variations such as baseline drift, multiplicative effects, and peak shifts, twelve commonly used preprocessing methods were applied to the rice spectral reflectance data under the three background conditions [27]: derivative and baseline techniques, namely the first derivative (FD), second derivative (SD), detrend correction (DC), and continuous wavelet transform (CWT); scatter correction techniques, namely multiplicative scatter correction (MSC) and standard normal variate (SNV); noise reduction techniques, namely moving average smoothing (MAS) and Savitzky–Golay convolution smoothing (SGCS); and scaling techniques, namely mean centering (MC), standardization scaling (SS), max–min scaling (MMS), and vector normalization (VN).
R_{\mathrm{ref}} = \frac{R_{\mathrm{raw}} - R_{\mathrm{dc}}}{R_{\mathrm{white}} - R_{\mathrm{dc}}},  (1)
where Rref denotes the rice spectral reflectance, Rraw represents the raw rice spectrum, Rwhite is the reference spectrum, and Rdc corresponds to the dark-field spectrum.
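The calibration in Equation (1), together with two of the preprocessing methods listed above (FD and SNV), can be sketched in plain Python. The function names are illustrative, and the simple adjacent-band difference used for FD is one common choice rather than the authors' exact implementation (a Savitzky–Golay derivative is another frequent option):

```python
import math

def calibrate(raw, white, dark):
    """Equation (1): reflectance = (raw - dark) / (white - dark), per band."""
    return [(r - d) / (w - d) for r, w, d in zip(raw, white, dark)]

def first_derivative(spectrum):
    """FD preprocessing sketch: first difference between adjacent bands."""
    return [b - a for a, b in zip(spectrum, spectrum[1:])]

def snv(spectrum):
    """SNV scatter correction: center and scale each spectrum by its own
    mean and standard deviation."""
    n = len(spectrum)
    mean = sum(spectrum) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in spectrum) / (n - 1))
    return [(x - mean) / sd for x in spectrum]
```

Each function operates on a single spectrum (a list of per-band intensities); in the device, the same transforms would be applied band-wise across the 640–1050 nm range of the C11708MA.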

2.4.2. Background Image Data

During the collection of rice spectral data, background images and water quality parameters were simultaneously collected for each sampling area, with five datasets collected per area. Based on turbidity, background images were classified into three categories: clear, muddy, and green algae-covered. Water bodies with turbidity below 10 NTU were defined as clear backgrounds, those above 10 NTU as muddy backgrounds, and water bodies with green algae coverage exceeding 80% were defined as green algae-covered backgrounds. For each background type, 1000 images were collected, and the images from all three background types were randomly partitioned into training, validation, and test sets in an 8:1:1 ratio (Figure 4a).
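The random 8:1:1 partition described above can be sketched as follows; the fixed seed is an illustrative choice for reproducibility, not a detail reported in the study:

```python
import random

def split_811(items, seed=42):
    """Randomly partition a dataset into training/validation/test sets
    in an 8:1:1 ratio (seed value is illustrative)."""
    items = list(items)
    random.Random(seed).shuffle(items)
    n = len(items)
    n_train = int(0.8 * n)
    n_val = int(0.1 * n)
    return items[:n_train], items[n_train:n_train + n_val], items[n_train + n_val:]
```

For the 1000 images collected per background type, this yields 800 training, 100 validation, and 100 test images per class.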

2.4.3. Chlorophyll Content Data

After collecting spectral data and background images for each sampling area, the SPAD-502 Plus chlorophyll meter (Konica-Minolta, Tokyo, Japan) was used to measure the SPAD values of rice leaves at the corresponding positions. Each position was measured five times, with the maximum and minimum values discarded, and the mean of the remaining three measurements calculated as the final value. The corresponding rice chlorophyll content was subsequently derived according to Equation (2) [28].
LCC_{\mathrm{rice}} = 1.034 + 0.308 \times SPAD_{\mathrm{rice}} + 0.110 \times SPAD_{\mathrm{rice}}^{2},  (2)
where LCCrice is the rice leaf chlorophyll content (mg·m−2), and SPADrice is the relative chlorophyll content of rice obtained from the SPAD-502 Plus, i.e., the SPAD value.
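The measurement protocol (five readings, discard the extremes, average the rest) and the conversion in Equation (2) can be sketched as:

```python
def trimmed_mean_spad(readings):
    """Drop the maximum and minimum of the five SPAD readings and
    average the remaining three, as in the sampling protocol."""
    vals = sorted(readings)
    middle = vals[1:-1]
    return sum(middle) / len(middle)

def spad_to_lcc(spad):
    """Equation (2): convert a SPAD value to leaf chlorophyll
    content (mg per square meter)."""
    return 1.034 + 0.308 * spad + 0.110 * spad ** 2
```

For example, readings of [40, 42, 41, 39, 43] give a trimmed mean SPAD of 41.0, which is then mapped to chlorophyll content via Equation (2).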
Based on turbidity levels and sampling areas, the spectral and chlorophyll content data corresponding to clear, muddy, and green algae-covered backgrounds were co-registered and randomly partitioned into training and test sets at a 7:3 ratio (Figure 4b).

2.5. Establishment and Evaluation of Rice Background Classification Model

2.5.1. Model Structure

MobileNet is an efficient deep learning architecture particularly suitable for deployment on resource-constrained platforms. By employing depthwise separable convolutions and optimization strategies, it substantially reduces computational and storage demands while maintaining high predictive accuracy [29,30].
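The saving from depthwise separable convolutions can be made concrete by counting multiply–accumulate (MAC) operations. The feature-map and channel sizes in the usage example are arbitrary illustrative values, not layer dimensions taken from MobileNetV4:

```python
def standard_conv_macs(h, w, c_in, c_out, k):
    """MACs for a standard k x k convolution over an h x w feature map."""
    return h * w * c_in * c_out * k * k

def depthwise_separable_macs(h, w, c_in, c_out, k):
    """MACs for a depthwise k x k convolution followed by a 1 x 1
    pointwise convolution over the same feature map."""
    depthwise = h * w * c_in * k * k
    pointwise = h * w * c_in * c_out
    return depthwise + pointwise
```

For a 56 × 56 map with 64 input and 128 output channels and a 3 × 3 kernel, the ratio of the two costs is 1/c_out + 1/k² ≈ 0.119, i.e., roughly an eightfold reduction, which is the source of MobileNet's suitability for resource-constrained platforms such as the Raspberry Pi.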
With the improvement of computational capabilities on mobile platforms, the MobileNet family has continued to evolve. The latest version, MobileNetV4, introduces the universal inverted bottleneck (UIB), which integrates extra depthwise convolution (ExtraDW), inverted bottleneck (IB), convolutional neural network new architecture (ConvNext), and feedforward neural networks (FFNs) (Figure 5). These innovations not only enhance the representational capacity of the model but also enable MobileNetV4 to address a broader range of visual tasks with improved efficiency and accuracy [31].
The MobileNetV4 series includes convolutional models of different scales, such as MobileNetV4-Conv-Large, MobileNetV4-Conv-Medium, and MobileNetV4-Conv-Small. For efficient deployment on the Raspberry Pi processor, MobileNetV4-Conv-Small was adopted in this study to construct the rice background classification model. Its detailed architecture is presented in Table 3. See the details of the model parameters in Table S1 of the Supplementary Materials.

2.5.2. Loss Function

The soft-label cross-entropy loss function extends the standard cross-entropy loss by using a probability distribution as the target label instead of a single binary value (0 or 1). This approach increases model flexibility, enabling it to capture class ambiguities and smooth the training process, which is particularly advantageous for enhancing model performance and generalization in complex tasks [32]. Accordingly, the soft-label cross-entropy loss function employed in this study is defined as follows:
L_{\mathrm{soft}} = -\sum_{i=1}^{C} p_i \log(q_i),
where Lsoft denotes the soft-label cross-entropy loss; C represents the number of rice background classes (C = 3 in this study); pi is the soft label for class i, with p_i \in [0, 1] and \sum_{i=1}^{C} p_i = 1, so that the soft labels form a probability distribution (generated in this study using the Mixup data augmentation method); and qi is the predicted probability of class i output by the model.
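A minimal sketch of the loss and of Mixup-generated soft labels in plain Python; the function names are illustrative, and in practice the mixing coefficient lam is drawn from a Beta distribution rather than fixed:

```python
import math

def soft_cross_entropy(soft_labels, predicted_probs):
    """Soft-label cross-entropy: L = -sum_i p_i * log(q_i).
    Terms with zero soft-label weight contribute nothing."""
    return -sum(p * math.log(q)
                for p, q in zip(soft_labels, predicted_probs) if p > 0)

def mixup_labels(one_hot_a, one_hot_b, lam):
    """Mixup label blending: lam * a + (1 - lam) * b, producing a soft
    label that sums to 1 when both inputs are valid distributions."""
    return [lam * a + (1 - lam) * b for a, b in zip(one_hot_a, one_hot_b)]
```

With hard one-hot labels the expression reduces to the standard cross-entropy; with Mixup-blended labels the loss rewards the model for spreading probability mass across the two mixed classes.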

2.5.3. Evaluation Index

To evaluate the performance of the rice background classification model based on MobileNetV4-Conv-Small in classifying clear, muddy, and green algae-covered backgrounds, accuracy, precision, recall, and F1-score were employed as evaluation metrics [33], which are defined as follows:
\mathrm{Accuracy} = \frac{TP + TN}{TP + FP + TN + FN},
\mathrm{Precision} = \frac{TP}{TP + FP},
\mathrm{Recall} = \frac{TP}{TP + FN},
F1\text{-}\mathrm{Score} = \frac{2 \times \mathrm{Recall} \times \mathrm{Precision}}{\mathrm{Recall} + \mathrm{Precision}},
where TP and TN denote positive and negative samples with accurate classifications, respectively; FP and FN denote negative and positive samples with incorrect classifications, respectively.
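The four metrics follow directly from the one-vs-rest confusion counts; a sketch (the counts in the usage example are made up for illustration):

```python
def classification_metrics(tp, fp, tn, fn):
    """Accuracy, precision, recall, and F1-score from confusion counts
    for one class treated as positive against the rest."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * recall * precision / (recall + precision)
    return accuracy, precision, recall, f1
```

For example, 90 true positives, 10 false positives, 190 true negatives, and 10 false negatives yield precision = recall = F1 = 0.90 and accuracy ≈ 0.933.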

2.6. Establishment and Evaluation of Rice Chlorophyll Content Detection Model

2.6.1. Model Structure

Compared to traditional detection methods, Convolutional Neural Network (CNN) models have demonstrated superior performance in the task of rice chlorophyll content detection, making them an indispensable core tool in the field of spectral data analysis and modeling [34]. Based on this, this study combines previous research [26,27,34,35] and takes into account the data collection characteristics of the C11708MA spectrometer and the actual field data collection conditions. Ultimately, the CNN model was selected for predicting rice chlorophyll content.
The CNN comprises three convolutional layers (Conv1, Conv2, and Conv3), a flatten layer, and a fully connected (FC) layer. To enhance model nonlinearity and training stability, the leaky rectified linear unit (LeakyReLU) [36] and batch normalization (BN) [37] were applied to each convolutional and FC layer, while the Dropout technique was incorporated into the Conv2 and FC layers to prevent overfitting (Figure 6). For detailed model parameter settings, please refer to Supplementary Materials Table S2.
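As a minimal illustration of the building blocks named above, a single 1-D convolution over a spectral vector followed by LeakyReLU can be written as below; the actual kernel sizes, channel counts, and trained weights of the model differ and are given in Table S2:

```python
def conv1d(signal, kernel, bias=0.0):
    """Valid-mode 1-D convolution (cross-correlation, as in deep
    learning frameworks) of a spectral vector with one filter."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k)) + bias
            for i in range(len(signal) - k + 1)]

def leaky_relu(values, slope=0.01):
    """LeakyReLU: pass positive activations through unchanged and
    scale negative ones by a small slope."""
    return [v if v > 0 else slope * v for v in values]
```

Stacking three such convolutional stages, flattening, and applying a fully connected layer reproduces the overall Conv1–Conv3 → Flatten → FC data flow of the detection model.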

2.6.2. Evaluation Index

To evaluate the performance of different preprocessing methods and the CNN model in detecting rice chlorophyll content under clear, muddy, and green algae-covered backgrounds, this study employed the coefficient of determination (R2), root mean square error (RMSE), and residual predictive deviation (RPD) as evaluation metrics to assess the model’s effectiveness in chlorophyll content detection. The evaluation indices were calculated as follows:
R^2 = 1 - \frac{\sum_{i=1}^{n} (y_i - \hat{y}_i)^2}{\sum_{i=1}^{n} (y_i - \bar{y})^2},
\mathrm{RMSE} = \sqrt{\frac{1}{n} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2},
\mathrm{RPD} = \frac{1}{\sqrt{1 - R^2}},
where n is the sample size, yi is the actual chlorophyll content of sample i, ŷi is the predicted chlorophyll content of sample i, and ȳ is the mean of the actual chlorophyll contents over all samples.
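The three evaluation indices can be computed jointly from the actual and predicted values; a sketch (the sample values in the test are made up for illustration):

```python
import math

def regression_metrics(actual, predicted):
    """R^2, RMSE, and RPD = 1 / sqrt(1 - R^2) for one set of predictions."""
    n = len(actual)
    mean_y = sum(actual) / n
    ss_res = sum((y - yh) ** 2 for y, yh in zip(actual, predicted))
    ss_tot = sum((y - mean_y) ** 2 for y in actual)
    r2 = 1 - ss_res / ss_tot
    rmse = math.sqrt(ss_res / n)
    rpd = 1 / math.sqrt(1 - r2)
    return r2, rmse, rpd
```

Note that under this formulation RPD grows rapidly as R² approaches 1, which is why the clear-background model (R² = 0.975) reports an RPD above 6 while the muddy-background model (R² = 0.627) stays below 2.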

2.7. Field Experiment

Beyond the designated experimental regions, 30 random data collection sites were selected to evaluate the performance of the device in background classification and chlorophyll content detection. Five test points were examined within each site, and the corresponding measurement results were recorded. Simultaneously, water quality parameters and chlorophyll content data for each point were collected as reference standards (Figure 7).
Background classification performance was quantified by classification accuracy, defined as follows:
R_{\mathrm{correct}} = \frac{c_{\mathrm{bg}}}{n_{\mathrm{total}}} \times 100\%,
where Rcorrect denotes the accuracy of rice background classification; cbg is the number of correctly classified backgrounds; and ntotal is the total number of samples tested.
The performance of chlorophyll content detection was evaluated using the relative error, with values below 10% considered acceptable and those above 10% considered unacceptable. The calculation formula is given as follows:
E_{\mathrm{relative}} = \frac{|lcc_{\mathrm{est}} - lcc_{\mathrm{act}}|}{lcc_{\mathrm{act}}} \times 100\%,
where Erelative denotes the relative error of rice chlorophyll content; lccest is the chlorophyll content measured by the device; and lccact is the actual (reference) chlorophyll content.
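Both field-test indices can be sketched together; the sample values in the usage example are made up for illustration and are not the study's field data:

```python
def classification_accuracy(correct, total):
    """R_correct: percentage of correctly classified backgrounds."""
    return correct / total * 100.0

def relative_error_compliance(estimated, actual, threshold=10.0):
    """Percentage of samples whose relative error
    |est - act| / act * 100 falls below the acceptance threshold
    (10% in this study)."""
    ok = sum(1 for e, a in zip(estimated, actual)
             if abs(e - a) / a * 100.0 < threshold)
    return ok / len(actual) * 100.0
```

The reported 84.00% compliance rate corresponds to the fraction of field test points whose relative error stayed below the 10% threshold.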

3. Results and Analysis

3.1. Integral Structure

The integral structure of the rice chlorophyll content detection device, which is based on background classification, is illustrated in Figure 8.

3.2. Software Interface

The software interface of the rice chlorophyll content detection device based on background classification is shown in Figure 9.
The software allows configuration of rice spectral data collection parameters, collection of spectral and background image data, classification of background images, detection of chlorophyll content, and storage of experimental data and results. This system enhances the efficiency of rice field management and provides decision-support for agricultural production.

3.3. Statistical Analysis of Data

3.3.1. Spectral Data

Figure 10 presents the spectral characteristic curves of rice under clear, muddy, and green algae-covered backgrounds, encompassing both the original and averaged spectral signal intensity, as well as the averaged spectral reflectance curves of rice leaves. Given the substantial spatial heterogeneity and temporal variability in the field environment, the spectral signal intensity and reflectance values across the three backgrounds exhibit notable differences. Nevertheless, these curves reflect the fundamental spectral characteristics of green plants to a certain degree, thereby rendering them suitable for assessing chlorophyll content in rice.

3.3.2. Background Image Data

Significant differences exist among the three background images captured by the miniature camera (Figure 11), which reduces the complexity of background classification and enhances classification performance. To optimize classification efficiency and minimize computational resource consumption, the miniature camera selected for this study has a relatively low resolution and pixel count, yielding slightly blurred background images. However, because the features of the three backgrounds are distinct and easily distinguishable, this blur has a negligible effect on the classification results and can be disregarded here.

3.3.3. Chlorophyll Content Data

Utilizing statistical principles, a comprehensive analysis was performed on the rice chlorophyll content data under clear, muddy, and green algae-covered backgrounds, with the corresponding statistical figure (Figure 12) and table (Table 4) generated.
In contrast to the relatively stable glass tank environment, the field environment demonstrates substantial spatial heterogeneity and temporal variability. Within the field environment, the rice chlorophyll content is relatively low, exhibiting a large coefficient of variation; however, the overall data dispersion remains minimal, thus satisfying the fundamental data quality requirements for chlorophyll content detection.

3.4. Evaluation Results and Analysis of Rice Background Classification Model

The training and validation results for the rice background classification model based on MobileNetV4-Conv-Small are presented in Figure 13. Due to the complexity of the field environment and the low resolution of the miniature camera, the acquired rice background images may include noise and outliers, causing a temporary increase in the loss value during the initial stages of training. However, as training progresses, the model’s loss value progressively decreases and stabilizes. When the number of iterations reaches 600, the loss values of both the training and validation sets drop below 0.5, and the accuracy exceeds 90%. This indicates that the model performs well during the training process and gradually reaches a convergent state.
Additionally, the rice background classification test results are presented in Figure 14 and Table 5. Owing to the distinct and distinguishable characteristics among the clear, muddy, and green algae-covered backgrounds, the classification test results are relatively satisfactory. Specifically, the classification performance for the green algae-covered background is the most outstanding, with accuracy, precision, recall, and F1 score all exceeding 97%. The classification performance for the muddy background ranks second, with all evaluation metrics exceeding 94%. In contrast, classification of the clear background is more challenging, with test results slightly lower than those of the other two backgrounds; however, its accuracy, precision, recall, and F1 score all exceed 93%.
Based on the aforementioned classification results, MobileNetV4-Conv-Small not only provides the advantages of low computational resource consumption and ease of deployment on mobile devices, but also exhibits strong performance in the rice background classification task. The model meets the requirements for rice background classification in field environments, demonstrating practical value and promising application potential.

3.5. Evaluation Results and Analysis of Rice Chlorophyll Content Detection Model

3.5.1. Clear Background

Under the clear background, the FD + CNN model performed the best (Table 6 and Figure 15a), with R2, RMSE, and RPD values of 0.990, 3.187, and 10.231 for the training set, and 0.975, 5.191, and 6.318 for the test set, respectively. In the clear background condition, background interference was minimal, and the acquired spectral data exhibited higher quality.
Additionally, FD preprocessing effectively mitigates spectral baseline drift and background noise, while amplifying subtle differences in chlorophyll absorption features in the red band and reflection features in the near-infrared band. This significantly enhances the correlation between spectral signals and chlorophyll content, providing a solid data foundation for the CNN model to achieve high-precision inversion. As a result, the FD + CNN model achieves optimal performance in chlorophyll detection for rice under a clear background.
According to relevant studies, the spectral bands of the selected miniature spectrometer predominantly cover the red (620–750 nm) and near-infrared (750–1000 nm) regions, which closely align with the optimal spectral bands under the clear background—orange (590–620 nm) and near-infrared (750–1000 nm). Furthermore, the R + NIR method yielded superior model evaluation results under the clear background [35]. This finding further corroborates the significance of the red and near-infrared bands in rice chlorophyll content detection.

3.5.2. Muddy Background

Under the muddy background, the SS + CNN model performed the best (Table 7 and Figure 15b), with R2, RMSE, and RPD values of 0.854, 11.613, and 2.614 for the training set, and 0.627, 18.249, and 1.638 for the test set, respectively.
SS preprocessing normalizes spectral data of varying magnitudes and fluctuations to a unified scale, reducing amplitude differences caused by background interference while preserving the shape of the spectral curves. This enables the CNN model to focus on extracting morphological features related to chlorophyll, thereby meeting the detection requirements for a muddy background. However, compared to the clear background, the muddy background exhibits more significant background interference, resulting in lower-quality spectral data and poorer performance in rice chlorophyll content detection.
Relevant studies indicate that the R + NIR method yields inferior model evaluation results under the muddy background. The optimal spectral color characteristics bands under the muddy background include violet (400–450 nm), cyan (475–495 nm), yellow (570–590 nm), red (620–750 nm), and near-infrared (750–1000 nm) [35]. These bands differ markedly from the red (620–750 nm) and near-infrared (750–1000 nm) bands of the spectrometer. Consequently, compared to the rice chlorophyll content detection performance under the clear background, the model’s accuracy and robustness are reduced under the turbid background.

3.5.3. Green Algae-Covered Background

Under the green algae-covered background, the SS + CNN model performed the best (Table 8 and Figure 15c), with R2, RMSE, and RPD values of 0.952, 7.267, and 4.543 for the training set, and 0.719, 16.417, and 1.885 for the test set, respectively.
Similarly, SS preprocessing can mitigate the interference caused by algae coverage, standardize the data distribution, and assist the CNN model in distinguishing the chlorophyll features of rice and algae, thereby meeting the detection requirements under an algae-covered background.
Relative to the clear background, the green algae-covered background exhibits increased background interference, leading to reduced performance in rice chlorophyll content detection. However, the interference is weaker than that of the muddy background, so chlorophyll content detection performance is correspondingly better than under the muddy background.
According to relevant studies, the R + NIR method yields better model evaluation results under the clear background compared to both the green algae-covered and muddy backgrounds, with the model performance under the green algae-covered background surpassing that under the muddy background. Under the green algae-covered background, the optimal spectral color characteristics bands include blue (450–475 nm), cyan (475–495 nm), green (495–570 nm), and near-infrared (750–1000 nm) [35]. These bands differ markedly from the red (620–750 nm) and near-infrared (750–1000 nm) bands of the spectrometer. Consequently, in comparison to rice chlorophyll content detection under the clear background, the model’s accuracy and robustness are reduced under the green algae-covered background.

3.6. Field Experiment Results and Analysis

3.6.1. Rice Background Classification Experiment

The results of the rice background classification field test are shown in Table 9 (see details in Table S3 in the Supplementary Materials). Under the clear background, the rice background classification accuracy is 86.44%, while both the muddy and green algae-covered backgrounds achieve a classification accuracy of 100.00%. Overall, the rice background classification model attained a classification accuracy of 94.67% in the field test, which satisfactorily meets the background classification requirements.
However, it is worth noting that the model's classification performance under the clear background is markedly lower than under the muddy and green algae-covered backgrounds. This discrepancy is likely associated with the spatial heterogeneity and temporal dynamics of the field environment. In the field, the soil surface often carries impurities such as weeds and stubble even when the water itself is clear and transparent; the turbidity meter then labels the background as clear although the scene more closely resembles a muddy background, compromising classification accuracy.
Furthermore, the muddy background (with turbidity exceeding 10 NTU) and the green algae-covered background (with algae coverage exceeding 80%) exhibit relatively broad classification ranges and greater adaptability to environmental changes. In contrast, the classification range for the clear background (with turbidity below 10 NTU) is more stringent and exhibits greater sensitivity to field environmental variations, which may be the main reason for its lower classification accuracy. Future studies may consider further refining background classification and optimizing the classification performance for the clear background to enhance the overall accuracy and robustness of the model.
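The thresholds above (10 NTU for turbidity, 80% for algae coverage) define a simple decision rule; in the sketch below, giving algae coverage priority over turbidity is an illustrative assumption rather than the device's documented logic:

```python
def classify_background(turbidity_ntu: float, algae_coverage: float) -> str:
    """Rule-based gate matching the thresholds quoted in the text.

    algae coverage > 80%  -> "green algae-covered"
    turbidity > 10 NTU    -> "muddy"
    otherwise             -> "clear"
    Checking algae first is an illustrative choice, since algae mats can
    also occur over turbid water.
    """
    if algae_coverage > 0.80:
        return "green algae-covered"
    if turbidity_ntu > 10.0:
        return "muddy"
    return "clear"
```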

3.6.2. Rice Chlorophyll Content Detection Experiment

The results of the rice chlorophyll content detection field test are shown in Table 10 (see details in Table S4 in the Supplementary Materials). Under the clear background, the rice chlorophyll content detection performance is superior, followed by the green algae-covered background, with the poorest performance observed under the muddy background.
Specifically, 59 samples were tested under the clear background, yielding a detection compliance rate of 94.92%. Of these, 52 samples exhibited a relative error of less than 5%, accounting for 88.14%; 4 samples exhibited a relative error between 5% and 10%, accounting for 6.78%; and 3 samples had a relative error greater than 10%, accounting for 5.08%.
Under the muddy background, 48 samples were tested, resulting in a detection compliance rate of 70.83%. Of these, 21 samples exhibited a relative error of less than 5%, accounting for 43.75%; 13 samples exhibited a relative error between 5% and 10%, accounting for 27.08%; and 14 samples had a relative error greater than 10%, accounting for 29.17%.
Under the green algae-covered background, 43 samples were tested, resulting in a detection compliance rate of 83.72%. Of these, 20 samples exhibited a relative error of less than 5%, accounting for 46.51%; 16 samples exhibited a relative error between 5% and 10%, accounting for 37.21%; and 7 samples had a relative error greater than 10%, accounting for 16.28%.
Overall, 150 samples were tested across the three backgrounds. Among them, 93 samples had a relative error of less than 5%, accounting for 62.00%; 33 samples had a relative error between 5% and 10%, accounting for 22.00%; and 24 samples had a relative error greater than 10%, accounting for 16.00%. The rice chlorophyll content detection results met the requirements for 126 samples, while 24 samples did not meet the requirements, resulting in a compliance rate of 84.00%. This indicates that the method and device proposed in this study have preliminarily achieved rice chlorophyll content detection and show potential for practical application.
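The compliance rate quoted here treats a sample as compliant when its relative error does not exceed 10%; the bucketing can be sketched as:

```python
def error_buckets(rel_errors):
    """Bucket relative errors (as fractions) and compute the compliance rate.

    A sample counts as compliant when its relative error is at most 10%,
    matching how the compliance rate is reported in the field test.
    """
    lt5 = sum(1 for e in rel_errors if e < 0.05)
    mid = sum(1 for e in rel_errors if 0.05 <= e <= 0.10)
    gt10 = sum(1 for e in rel_errors if e > 0.10)
    compliance = (lt5 + mid) / len(rel_errors)
    return lt5, mid, gt10, compliance
```

A synthetic error list with 93 values below 5%, 33 between 5% and 10%, and 24 above 10% reproduces the 84.00% overall compliance rate reported above.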

4. Discussion

4.1. Advantages and Challenges of Rice Background Classification Model

Background classification is essential for rice chlorophyll content detection, and its accuracy directly influences the subsequent chlorophyll content detection process. In this study, a rice background classification model based on MobileNetV4-Conv-Small was developed, yielding favorable field test results with an overall classification accuracy of 94.67%. Specifically, the classification accuracy for the clear background was 86.44%, whereas both the muddy and green algae-covered backgrounds achieved a classification accuracy of 100.00%. These results demonstrate that MobileNetV4-Conv-Small not only offers advantages in terms of low computational resource consumption and ease of deployment on mobile devices but also performs well in background classification tasks.
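The accuracy, precision, recall, and F1 scores quoted for the classifier follow the usual one-vs-rest definitions over a 3 × 3 confusion matrix. A minimal sketch; the matrix in the usage comment is hypothetical, chosen only to be consistent with the laboratory metrics in Table 5 under the assumption of 100 test images per class:

```python
def per_class_metrics(cm, cls):
    """One-vs-rest precision, recall, and F1 for class index `cls`.

    `cm` is a square confusion matrix with cm[i][j] = number of samples of
    true class i predicted as class j.
    """
    n = len(cm)
    tp = cm[cls][cls]
    fp = sum(cm[i][cls] for i in range(n) if i != cls)
    fn = sum(cm[cls][j] for j in range(n) if j != cls)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Hypothetical matrix (rows: clear, muddy, algae) consistent with Table 5:
# per_class_metrics([[93, 6, 1], [1, 99, 0], [0, 0, 100]], 0)
# gives precision 93/94 (about 0.9894) and recall 0.93 for the clear class.
```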
However, under the clear background, factors such as spatial heterogeneity, temporal dynamics of the field environment, and the low resolution of the data collection device resulted in a background classification accuracy of only 86.44%. These results suggest that while MobileNetV4-Conv-Small can adapt to background variations to some extent, there remains potential for improving its accuracy and robustness in complex field environments.
Therefore, future research could consider optimizing the image acquisition device, improving the model architecture, and integrating multimodal data. Improving camera resolution, optimizing image preprocessing techniques, and incorporating data augmentation methods could enhance both image quality and the model’s generalization capabilities. Additionally, integrating data from water quality sensors, meteorological sensors, and other sources to develop a unified model could further improve the model’s adaptability to complex environments. Furthermore, establishing a dynamic model update mechanism and implementing online learning algorithms would allow the model to adapt to real-time changes in background conditions, thereby enhancing the stability and accuracy of background classification. With these improvements, the background classification model is expected to offer more reliable technical support for precision rice farming.

4.2. Advantages and Limitations of Rice Chlorophyll Content Detection Model

First, the data collection and experimental testing for the chlorophyll content detection model were conducted based on the spectral data acquisition parameters [35], resulting in high-quality spectral data that provided a solid foundation for model development. Second, the chlorophyll content detection model was developed based on background classification, demonstrating strong adaptability for chlorophyll content detection under clear, muddy, and green algae-covered backgrounds. Finally, this study employed the CNN method, which has been validated in multiple studies [27,34,35], to develop the rice chlorophyll content detection model, integrating it with appropriate data preprocessing techniques to ensure accuracy and robustness across various background conditions.
Notably, the clear background exhibits minimal interference, resulting in higher-quality spectral data and improved rice chlorophyll content detection performance. In contrast, the muddy and green algae-covered backgrounds are characterized by higher interference and lower-quality spectral data, which adversely impacts the accuracy of chlorophyll content detection.
Furthermore, the spectral range of the spectrometer employed in this study (640–1050 nm) is better suited for rice chlorophyll content detection under clear backgrounds; however, its performance is suboptimal under muddy and green algae-covered backgrounds. To mitigate this limitation, future research could explore the use of spectrometers with a broader spectral range or the customization of spectrometers with specific bands tailored to different backgrounds, thereby enhancing the detection of rice chlorophyll content under muddy and green algae-covered conditions.
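One way to make the band mismatch concrete is to intersect each background's optimal bands [35] with the spectrometer's 640–1050 nm window; the coverage fraction below is an illustrative metric, not one used in the study:

```python
SPECTROMETER_NM = (640, 1050)  # wavelength window of the device in this study

# Optimal spectral color-characteristic bands per background, from [35] (nm)
OPTIMAL_BANDS_NM = {
    "muddy": [(400, 450), (475, 495), (570, 590), (620, 750), (750, 1000)],
    "green algae-covered": [(450, 475), (475, 495), (495, 570), (750, 1000)],
}

def band_coverage(bands, window=SPECTROMETER_NM):
    """Fraction of the optimal bands' total width that the window covers."""
    lo, hi = window
    covered = sum(max(0, min(b, hi) - max(a, lo)) for a, b in bands)
    total = sum(b - a for a, b in bands)
    return covered / total
```

Under this metric the device covers roughly 77% of the optimal-band width for the muddy background and 68% for the algae-covered one, leaving the violet-to-green region entirely unmeasured.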

4.3. Feasibility of System Integration and Its Prospects for Practical Application

The detection system developed in this study integrates two core functions: rice background classification and chlorophyll content detection. The system uses a Raspberry Pi as the main control module and relies on the collaborative interaction of multiple sensors to achieve efficient collection and integrated analysis of spectral data and background image data. It offers the advantages of low cost and high portability, laying a solid technological foundation for large-scale promotion and application in agricultural production scenarios.
Although the system has shown good detection performance in small-scale field trials, it still faces significant challenges for large-scale deployment in agricultural production. From a hardware perspective, the micro camera has limited resolution and is easily affected by field environmental noise, and the spectral range adaptability of the spectrometer still needs improvement. Regarding real-time performance, the system's computational speed currently cannot meet the rapid monitoring needs of large-scale rice fields, and in complex field scenarios the model inference efficiency is insufficient to support large-scale field operations. Moreover, the inherent spatial heterogeneity and temporal dynamics of the field environment have a significant negative impact on both detection accuracy and background classification accuracy.
To address the above challenges, future research can focus on system optimization in the following three aspects: first, improving hardware components by selecting high-resolution cameras and custom multi-band adaptable spectrometers to enhance the system’s ability to adapt to complex field backgrounds; second, optimizing the algorithmic framework by introducing lightweight network architectures and multi-modal data fusion technologies to improve the real-time response rate and generalization performance of the model, while enhancing inference efficiency in complex scenarios; third, establishing a dynamic update mechanism and embedding online learning algorithms to adapt to the dynamic evolution of field backgrounds and enhance the system’s self-adaptive ability to spatial heterogeneity and temporal variations.
Furthermore, with the continued iteration of related technologies, the system has the potential to be extended to other aquatic crops and different growth stages of rice. In the future, leveraging transfer learning and multi-source data fusion technologies will enable rapid adaptation to various crops and growth stages, further expanding the application boundaries of the technology and providing more efficient and accurate technical solutions for the precise cultivation management of multiple crops throughout their entire growth stages under smart agriculture models.

5. Conclusions

This study, by integrating image recognition and spectral analysis techniques, developed a device that combines rice background classification and chlorophyll content detection functionalities. The key research findings are summarized as follows: (1) A rice background classification model was developed based on MobileNetV4-Conv-Small, achieving classification of clear, muddy, and green algae-covered backgrounds. The results indicate that for the clear background, the classification accuracy, precision, recall, and F1 score are 97.33%, 98.94%, 93.00%, and 95.88%, respectively; for the muddy background, they are 97.33%, 94.29%, 99.00%, and 96.59%; and for the green algae-covered background, they are 97.33%, 99.01%, 100.00%, and 99.50%. The green algae-covered background yields the best classification performance, with all evaluation metrics surpassing 97%. The muddy background follows, with all metrics exceeding 94%. Although the clear background presents the greatest classification challenge, all metrics still exceed 93%. (2) Rice chlorophyll content detection models were developed for clear, muddy, and green algae-covered backgrounds by integrating preprocessing with CNN methods. For the clear background, the optimal detection model was FD + CNN, with R2, RMSE, and RPD values of 0.990, 3.187, and 10.231 in the training set, and 0.975, 5.191, and 6.318 in the test set, respectively. For the muddy background, the optimal detection model was SS + CNN, with R2, RMSE, and RPD values of 0.854, 11.613, and 2.614 in the training set, and 0.627, 18.249, and 1.638 in the test set, respectively. For the green algae-covered background, the optimal detection model was SS + CNN, with R2, RMSE, and RPD values of 0.952, 7.267, and 4.543 in the training set, and 0.719, 16.417, and 1.885 in the test set, respectively. 
(3) The field test results indicate that the device achieved a rice background classification accuracy of 94.67% and a chlorophyll content relative error compliance rate of 84.00%, thus preliminarily accomplishing both rice background classification and chlorophyll content detection. Specifically, the classification accuracy for the clear background was 86.44%, with a chlorophyll content relative error compliance rate of 94.92%; for the muddy background, the classification accuracy was 100.00%, with a chlorophyll content relative error compliance rate of 70.83%; and for the green algae-covered background, the classification accuracy was also 100.00%, with a chlorophyll content relative error compliance rate of 83.72%.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/agronomy16020192/s1, Table S1: Parameter setting of rice background classification model; Table S2: Parameter setting of rice chlorophyll content detection model; Table S3: Field experiment specific results of rice background classification; Table S4: Field experiment specific results of rice chlorophyll content detection.

Author Contributions

Conceptualization, Y.C.; methodology, Y.C., X.W., X.X. (Xiaoyang Xing), S.X., X.Z. and R.X.; software, Y.C., S.C., Y.Y., D.W. and X.X. (Xin Xu); validation, Y.C., X.X. (Xiaoyang Xing), S.X., X.Z., S.C., Y.Y., D.W., R.X. and X.X. (Xin Xu); formal analysis, Y.C.; investigation, Y.C.; data curation, Y.C., X.X. (Xiaoyang Xing), S.X., X.Z., S.C., Y.Y., D.W., R.X. and X.X. (Xin Xu); writing—original draft preparation, Y.C.; writing—review and editing, Y.C., X.W., X.X. (Xiaoyang Xing), X.Z. and R.X.; visualization, Y.C.; supervision, X.W.; funding acquisition, X.W., X.X. (Xiaoyang Xing), S.C. and R.X. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Shandong Provincial Natural Science Foundation (ZR2024QC037), the National Key R&D Program of China (2019YFE0125201), the Jiangsu Provincial Young Scientific and Technological Talents Support Program (JSTJ2024632), the Taizhou Science and Technology Support Program (Social Development) (TSL202518), the Taizhou Science and Technology Support Program (Modern Agriculture) (TN202502), and the Taizhou Natural Science Foundation Project (TZ202510).

Data Availability Statement

The original data supporting the conclusions of this article will be made available by the authors upon reasonable request.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

  1. Asma, J.; Subrahmanyam, D.; Krishnaveni, D. The global lifeline: A staple crop sustaining two thirds of the world’s population. Agric. Arch. 2023, 2, 15–18.
  2. Zheng, J.; Song, X.; Yang, G.; Du, X.; Mei, X.; Yang, X. Remote sensing monitoring of rice and wheat canopy nitrogen: A review. Remote Sens. 2022, 14, 5712.
  3. Li, X.; Zhu, B.; Li, S.; Liu, L.; Song, K.; Liu, J. A Comprehensive Review of Crop Chlorophyll Mapping Using Remote Sensing Approaches: Achievements, Limitations, and Future Perspectives. Sensors 2025, 25, 2345.
  4. Zaman, Q.U. Chapter 1—Precision Agriculture Technology: A Pathway toward Sustainable Agriculture. In Precision Agriculture; Zaman, Q., Ed.; Academic Press: Cambridge, MA, USA, 2023; pp. 1–17. ISBN 978-0-443-18953-1.
  5. Donnelly, A.; Yu, R.; Rehberg, C.; Meyer, G.; Young, E.B. Leaf chlorophyll estimates of temperate deciduous shrubs during autumn senescence using a SPAD-502 meter and calibration with extracted chlorophyll. Ann. For. Sci. 2020, 77, 30.
  6. Parry, C.; Blonquist, J.M., Jr.; Bugbee, B. In situ measurement of leaf chlorophyll concentration: Analysis of the optical/absolute relationship. Plant Cell Environ. 2014, 37, 2508–2520.
  7. Kuhlgert, S.; Austic, G.; Zegarac, R.; Osei-Bonsu, I.; Hoh, D.; Chilvers, M.I.; Roth, M.G.; Bi, K.; TerAvest, D.; Weebadde, P.; et al. MultispeQ Beta: A tool for large-scale plant phenotyping connected to the open PhotosynQ network. R. Soc. Open Sci. 2016, 3, 160592.
  8. Nur Cahyo, A.; Murti, R.H.; TS Putra, E.; Nuringtyas, T.R.; Fabre, D.; Montoro, P. SPAD-502 and atLEAF CHL PLUS values provide good estimation of the chlorophyll content for Hevea brasiliensis Müll. Arg. leaves. Menara Perkeb. 2020, 88, 1–8.
  9. Hura, K.; Dziurka, M.; Hura, T.; Wójcik-Jagła, M.; Zieliński, A. Physiological and molecular responses of the flag leaf under l-phenylalanine ammonia-lyase inhibition. Cells 2025, 14, 1368.
  10. Moreira, L.A.; Tränkner, M.; Mariano, E.; Otto, R. Molybdenum supply increases 15N-nitrate uptake by maize. Front. Plant Sci. 2025, 16, 1546132.
  11. Zhang, H.; Zhang, L.; Wu, H.; Wang, D.; Ma, X.; Shao, Y.; Jiang, M.; Chen, X. Unmanned-Aerial-Vehicle-Based Multispectral Monitoring of Nitrogen Content in Canopy Leaves of Processed Tomatoes. Agriculture 2025, 15, 309.
  12. Jiang, L.; Dai, J.; Wang, L.; Chen, L.; Zeng, G.; Liu, E.; Zhou, X.; Yao, H.; Xiao, Y.; Fang, J. Ca(H2PO4)2 and MgSO4 activated nitrogen-related bacteria and genes in thermophilic stage of compost. Appl. Microbiol. Biotechnol. 2024, 108, 331.
  13. Wang, W.; Liao, H.; Zhou, L.; Zheng, Y.; Li, L.; Zhang, Q.; Chen, Q.; Mu, K. Effects of organic waste liquids on Chinese cabbage growth and cadmium accumulation in cadmium-contaminated soils. Trans. Chin. Soc. Agric. Eng. 2023, 39, 235–244.
  14. Nurhermawati, R.; Lubis, I.; Junaedi, A. Respon Karakter Pengisian Biji dan Hasil terhadap Pemberian Pupuk Urea pada Empat Varietas Padi [Response of Grain-Filling Characters and Yield to Urea Fertilizer Application in Four Rice Varieties]. J. Agron. Indones. 2021, 49, 235–241.
  15. Nie, R.; Wu, C.; Ji, X.; Li, A.; Zheng, X.; Tang, J.; Sun, L.; Su, Y.; Zhang, J. Methyl Jasmonate Orchestrates Multi-Pathway Antioxidant Defense to Enhance Salt Stress Tolerance in Walnut (Juglans regia L.). Antioxidants 2025, 14, 974.
  16. Zhang, C.; Liu, M.; Chao, J.; Tang, B.; Li, M.; Sun, H. Contact-based Crop Chlorophyll Detection System Based on Feature Wavelengths. Trans. Chin. Soc. Agric. Mach. 2024, 55, 255–262.
  17. Li, J.; Wang, N.; Li, Z.; Liu, M.; Sun, H.; Li, M. Development of Handheld Chlorophyll Detector Based on Characteristic Wavelengths Optimization. Trans. Chin. Soc. Agric. Mach. 2023, 54, 270–277.
  18. Wang, N.; Li, Z.; Li, J.; Zhang, Y.; Sun, H.; Li, M. Fusing Multispectral Imaging and Deep Learning in Plant Chlorophyll Index Detection System. Trans. Chin. Soc. Agric. Mach. 2023, 54, 260–269.
  19. Zhang, Z.; Ma, X.; Long, Y.; Li, S.; Sun, H.; Li, M. Design and Development of Crop Chlorophyll Dynamic Monitoring System Based on Internet of Things. Trans. Chin. Soc. Agric. Mach. 2019, 50, 115–121, 166.
  20. Tang, W.; Wang, N.; Liu, G.; Zhao, R.; Li, M.; Sun, H. Design of Portable Crop Chlorophyll Detector with Ambient Light Correction. Trans. Chin. Soc. Agric. Mach. 2022, 53, 249–256.
  21. Sun, H.; Xing, Z.; Zhang, Z.; Long, Y.; Li, M.; Zhang, Q. Design and Experiment of Chlorophyll Content Detection Device for Active Light Source Based on RED-NIR. Trans. Chin. Soc. Agric. Mach. 2019, 50, 175–181, 296.
  22. Chen, J. Design and Experiment of Reflective Leaf Chlorophyll Content Detector Based on Active Light Source. Master’s Thesis, Jiangsu University, Zhenjiang, China, 2020.
  23. Yang, Z.; Li, G.; Zhang, Y.; Su, W.; Wu, J.; Duan, T.; Qi, L. Plant Chlorophyll a and Chlorophyll b Nondestructive Testing System: Research and Development. Chin. Agric. Sci. Bull. 2021, 37, 147–152.
  24. Yang, B.; Du, R.; Yang, Y.; Zhu, D.; Guo, W.; Zhu, X. Design of Portable Nondestructive Detector for Chlorophyll Content of Plant Leaves. Trans. Chin. Soc. Agric. Mach. 2019, 50, 180–186.
  25. Liu, M.; Li, Z. Principle and design of chlorophyll spectrometry meter for plants. J. Chin. Agric. Mech. 2017, 38, 74–79.
  26. Chen, Y.; Wang, X.; Zhang, X.; Wang, D.; Xu, X.; Huang, X. Parameter optimization for spectral data collection in dark environments for rice leaf chlorophyll content estimation. Comput. Electron. Agric. 2025, 230, 109828.
  27. Chen, Y.; Wang, X.; Zhang, X.; Xu, X.; Huang, X.; Wang, D.; Amin, A. Spectral-based estimation of chlorophyll content and determination of background interference mechanisms in low-coverage rice. Comput. Electron. Agric. 2024, 226, 109442.
  28. Monje, O.A.; Bugbee, B. Inherent Limitations of Nondestructive Chlorophyll Meters: A Comparison of Two Types of Meters. HortScience 1992, 27, 69–71.
  29. Wang, D.; Wang, X.; Shi, Y.; Zhang, X.; Chen, Y.; Zheng, J.; Liu, N. A lightweight keypoint detection model-based method for strawberry recognition and picking point localization in multi-occlusion scenes. Artif. Intell. Agric. 2025, 16, 316–341.
  30. Liu, Y.; Wang, Z.; Wang, R.; Chen, J.; Gao, H. Flooding-based MobileNet to identify cucumber diseases from leaf images in natural scenes. Comput. Electron. Agric. 2023, 213, 108166.
  31. Qin, D.; Leichner, C.; Delakis, M.; Fornoni, M.; Luo, S.; Yang, F.; Wang, W.; Banbury, C.; Ye, C.; Akin, B.; et al. MobileNetV4: Universal Models for the Mobile Ecosystem. In Proceedings of the Computer Vision–ECCV 2024, Milan, Italy, 29 September–4 October 2024; Leonardis, A., Ricci, E., Roth, S., Russakovsky, O., Sattler, T., Varol, G., Eds.; Springer Nature: Cham, Switzerland, 2025; pp. 78–96.
  32. Liu, B.; Jia, R.; Zhu, X.; Yu, C.; Yao, Z.; Zhang, H.; He, D. Lightweight identification model for apple leaf diseases and pests based on mobile terminals. Trans. Chin. Soc. Agric. Eng. 2022, 38, 130–139.
  33. Wang, D.; Wang, X.; Chen, Y.; Wu, Y.; Zhang, X. Strawberry ripeness classification method in facility environment based on red color ratio of fruit rind. Comput. Electron. Agric. 2023, 214, 108313.
  34. Xu, X.; Chen, Y.; Yin, H.; Wang, X.; Zhang, X. Nondestructive detection of SSC in multiple pear (Pyrus pyrifolia Nakai) cultivars using Vis-NIR spectroscopy coupled with the Grad-CAM method. Food Chem. 2024, 450, 139283.
  35. Chen, Y.; Wang, X.; Zhang, X.; Wang, D.; Xu, X. A band selection method combining spectral color characteristics for estimating chlorophyll content of rice in different backgrounds. Spectrochim. Acta A Mol. Biomol. Spectrosc. 2025, 330, 125681.
  36. Zhang, X.; Zou, Y.; Shi, W. Dilated convolution neural network with LeakyReLU for environmental sound classification. In Proceedings of the 2017 22nd International Conference on Digital Signal Processing (DSP), London, UK, 23–25 August 2017; pp. 1–5.
  37. Ioffe, S.; Szegedy, C. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. In Proceedings of the 32nd International Conference on Machine Learning, Lille, France, 7–9 July 2015.
Figure 1. Working principle and hardware system.
Figure 2. Software design flowchart.
Figure 3. Experimental design. (a) Selection of experimental area; (b) Setting of sampling interval; (c) Testing of data collection; (d) Division of data collection area.
Figure 4. Data division. (a) Background image data; (b) Spectral and chlorophyll content data.
Figure 5. UIB structure diagram.
Figure 6. Rice chlorophyll content detection flowchart.
Figure 7. Field experiment.
Figure 8. Device overall structure.
Figure 9. Software interface. (a) Classification test of the clear background; (b) Chlorophyll content detection under the clear background; (c) Classification test of the muddy background; (d) Chlorophyll content detection under the muddy background; (e) Classification test of the green algae-covered background; (f) Chlorophyll content detection under the green algae-covered background.
Figure 10. Original and average spectral signal intensity, as well as average spectral reflectance curves of rice leaves, under three backgrounds. (a) Original and average signal intensity curves under the clear background; (b) Original and average signal intensity curves under the muddy background; (c) Original and average signal intensity curves under the green algae-covered background; (d) Average reflectance curves under three backgrounds.
Figure 11. Background image data.
Figure 12. Statistical analysis of leaf chlorophyll content data under three backgrounds.
Figure 13. Accuracy and loss value curves.
Figure 14. Confusion matrix.
Figure 15. Scatter plot of chlorophyll content detection results using the optimal model under three backgrounds. (a) Clear background; (b) Muddy background; (c) Green algae-covered background.
Table 1. Representative portable chlorophyll meter.
| Type | Manufacturer | Ref. |
|---|---|---|
| Konica Minolta SPAD-502 Plus | Japan | [5] |
| Opti-Sciences CCM-200 plus | America | [6] |
| PhotosynQ MultispeQ V2.0 | America | [7] |
| atLEAF CHL PLUS | America | [8] |
| Hansatech CL-01 | Britain | [9] |
| Pessl Instruments Dualex | Austria | [10] |
| ZKWH TYS-4N | China | [11] |
| JC-YLS01 | China | [12] |
| LD-YB | China | [13] |
| YLS-A | China | [14] |
| Yaxin-1162 | China | [15] |
Table 2. Main parameters of rice chlorophyll content detection device.
| Parameter | Value |
|---|---|
| Overall size (length × width × height) | 275.0 mm × 154.0 mm × 72.0 mm |
| Spectral resolution | ≤20 nm |
| Spectral wavelength range | 640–1050 nm |
| Spectral wavelength repeatability | −0.5 to 0.5 nm |
| Image resolution | 2592 × 1944 |
| Number of image pixels | 5 million (5 MP) |
| Power supply requirements | 5 V, 3 A |
| Working temperature | 35.0–50.0 °C |
Table 3. Model architecture of MobileNetV4-Conv-Small.
| Input | Block | DW K1 | DW K2 | Extended Dim | Output Dim | Stride |
|---|---|---|---|---|---|---|
| 224² × 3 | Conv2D | 3 × 3 | | | 32 | 2 |
| 112² × 32 | FusedIB | 3 × 3 | | 32 | 32 | 2 |
| 56² × 32 | FusedIB | 3 × 3 | | 96 | 64 | 2 |
| 28² × 64 | ExtraDW | 5 × 5 | 5 × 5 | 192 | 96 | 2 |
| 14² × 96 | IB | 3 × 3 | | 192 | 96 | 1 |
| 14² × 96 | IB | 3 × 3 | | 192 | 96 | 1 |
| 14² × 96 | IB | 3 × 3 | | 192 | 96 | 1 |
| 14² × 96 | IB | 3 × 3 | | 192 | 96 | 1 |
| 14² × 96 | ConvNext | 3 × 3 | | 384 | 96 | 1 |
| 14² × 96 | ExtraDW | 3 × 3 | 3 × 3 | 576 | 128 | 2 |
| 7² × 128 | ExtraDW | 5 × 5 | 5 × 5 | 512 | 128 | 1 |
| 7² × 128 | IB | 5 × 5 | | 512 | 128 | 1 |
| 7² × 128 | IB | 5 × 5 | | 384 | 128 | 1 |
| 7² × 128 | IB | 3 × 3 | | 512 | 128 | 1 |
| 7² × 128 | IB | 3 × 3 | | 512 | 128 | 1 |
| 7² × 128 | Conv2D | 1 × 1 | | | 960 | 1 |
| 7² × 960 | AvgPool | 7 × 7 | | | 960 | 1 |
| 1² × 960 | Conv2D | 1 × 1 | | | 1280 | 1 |
| 1² × 1280 | Conv2D | 1 × 1 | | | 1000 | 1 |
Table 4. Statistical analysis of leaf chlorophyll content data under three backgrounds.
| Statistical Parameter | Clear Background | Muddy Background | Green Algae-Covered Background |
|---|---|---|---|
| Number of data | 1000 | 1000 | 1000 |
| Upper quartile | 165.99 | 156.86 | 168.58 |
| Median | 145.76 | 142.59 | 146.33 |
| Lower quartile | 120.61 | 114.87 | 121.78 |
| Mean | 142.92 | 135.56 | 145.87 |
| Coefficient of variation | 0.23 | 0.22 | 0.22 |
Table 5. Test results of rice background classification.
| Background | Accuracy/% | Precision/% | Recall/% | F1 Score/% |
|---|---|---|---|---|
| Clear | 97.33 | 98.94 | 93.00 | 95.88 |
| Muddy | 97.33 | 94.29 | 99.00 | 96.59 |
| Green algae-covered | 97.33 | 99.01 | 100.00 | 99.50 |
Table 6. Detection results of rice leaf chlorophyll content under the clear background.
| Model | R² (Training) | RMSE (Training) | RPD (Training) | R² (Test) | RMSE (Test) | RPD (Test) |
|---|---|---|---|---|---|---|
| FD + CNN | 0.990 | 3.187 | 10.231 | 0.975 | 5.191 | 6.318 |
| DC + CNN | 0.983 | 4.269 | 7.637 | 0.970 | 5.710 | 5.743 |
| MC + CNN | 0.979 | 4.705 | 6.930 | 0.967 | 5.993 | 5.473 |
| SD + CNN | 0.990 | 3.269 | 9.972 | 0.964 | 6.183 | 5.304 |
| SNV + CNN | 0.978 | 4.826 | 6.756 | 0.964 | 6.228 | 5.265 |
| VN + CNN | 0.979 | 4.720 | 6.907 | 0.961 | 6.440 | 5.093 |
| MMS + CNN | 0.984 | 4.091 | 7.970 | 0.961 | 6.440 | 5.092 |
| MSC + CNN | 0.966 | 6.011 | 5.425 | 0.957 | 6.830 | 4.802 |
| CWT + CNN | 0.974 | 5.278 | 6.177 | 0.955 | 6.948 | 4.720 |
| SS + CNN | 0.982 | 4.424 | 7.370 | 0.948 | 7.477 | 4.386 |
| MAS + CNN | 0.830 | 13.452 | 2.424 | 0.792 | 14.955 | 2.193 |
| SGCS + CNN | 0.831 | 13.420 | 2.430 | 0.791 | 14.992 | 2.188 |
Table 7. Detection results of rice leaf chlorophyll content under the muddy background.
| Model | R² (Training) | RMSE (Training) | RPD (Training) | R² (Test) | RMSE (Test) | RPD (Test) |
|---|---|---|---|---|---|---|
| SS + CNN | 0.854 | 11.613 | 2.614 | 0.627 | 18.249 | 1.638 |
| MMS + CNN | 0.770 | 14.543 | 2.087 | 0.569 | 19.618 | 1.524 |
| FD + CNN | 0.836 | 12.286 | 2.471 | 0.544 | 20.190 | 1.481 |
| SD + CNN | 0.833 | 12.400 | 2.448 | 0.492 | 21.298 | 1.404 |
| MC + CNN | 0.534 | 20.730 | 1.464 | 0.436 | 22.446 | 1.332 |
| CWT + CNN | 0.553 | 20.296 | 1.496 | 0.419 | 22.786 | 1.312 |
| SGCS + CNN | 0.482 | 21.847 | 1.389 | 0.409 | 22.982 | 1.301 |
| DC + CNN | 0.566 | 19.988 | 1.519 | 0.388 | 23.388 | 1.278 |
| MAS + CNN | 0.443 | 22.663 | 1.339 | 0.384 | 23.458 | 1.274 |
| VN + CNN | 0.393 | 23.641 | 1.284 | 0.344 | 24.216 | 1.234 |
| SNV + CNN | 0.417 | 23.173 | 1.310 | 0.342 | 24.250 | 1.233 |
| MSC + CNN | 0.241 | 26.440 | 1.148 | 0.269 | 25.556 | 1.170 |
Table 8. Detection results of rice leaf chlorophyll content under the green algae-covered background.
| Model | R² (Train) | RMSE (Train) | RPD (Train) | R² (Test) | RMSE (Test) | RPD (Test) |
|---|---|---|---|---|---|---|
| SS + CNN | 0.952 | 7.267 | 4.543 | 0.719 | 16.417 | 1.885 |
| SD + CNN | 0.939 | 8.127 | 4.062 | 0.696 | 17.061 | 1.814 |
| MMS + CNN | 0.896 | 10.633 | 3.105 | 0.653 | 18.222 | 1.698 |
| VN + CNN | 0.748 | 16.566 | 1.993 | 0.601 | 19.534 | 1.584 |
| FD + CNN | 0.875 | 11.666 | 2.830 | 0.547 | 20.827 | 1.486 |
| MC + CNN | 0.589 | 21.154 | 1.560 | 0.499 | 21.905 | 1.413 |
| SNV + CNN | 0.515 | 22.986 | 1.436 | 0.448 | 22.995 | 1.346 |
| DC + CNN | 0.574 | 21.537 | 1.533 | 0.430 | 23.353 | 1.325 |
| CWT + CNN | 0.485 | 23.698 | 1.393 | 0.393 | 24.116 | 1.283 |
| MSC + CNN | 0.407 | 25.423 | 1.298 | 0.374 | 24.485 | 1.264 |
| SGCS + CNN | 0.439 | 24.732 | 1.335 | 0.357 | 24.814 | 1.247 |
| MAS + CNN | 0.233 | 28.917 | 1.142 | 0.220 | 27.332 | 1.132 |
Table 9. Field experiment results of rice background classification.
| Item | Sample Size | Number of Correct Classifications | Correct Rate/% |
|---|---|---|---|
| Clear | 59 | 51 | 86.44 |
| Muddy | 48 | 48 | 100.00 |
| Green algae-covered | 43 | 43 | 100.00 |
| Overall results | 150 | 142 | 94.67 |
Table 10. Field experiment results of rice chlorophyll content detection.
| Item | Sample Size | Relative Error < 5% | 5% ≤ Relative Error ≤ 10% | Relative Error > 10% |
|---|---|---|---|---|
| Clear | 59 | 52 (88.14%) | 4 (6.78%) | 3 (5.08%) |
| Muddy | 48 | 21 (43.75%) | 13 (27.08%) | 14 (29.17%) |
| Green algae-covered | 43 | 20 (46.51%) | 16 (37.21%) | 7 (16.28%) |
| Overall results | 150 | 93 (62.00%) | 33 (22.00%) | 24 (16.00%) |
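The abstract's 84.00% compliance rate follows from Table 10 by counting samples whose relative error does not exceed 10%: (93 + 33) / 150 = 0.84. A minimal sketch of the binning, assuming relative error is defined as |measured − reference| / reference (the exact formula is not restated here):

```python
import numpy as np

def compliance_bins(measured, reference):
    """Bin samples by relative error into the <5%, 5-10%, and >10%
    categories used in Table 10."""
    measured = np.asarray(measured, dtype=float)
    reference = np.asarray(reference, dtype=float)
    rel = np.abs(measured - reference) / reference
    lt5 = int(np.sum(rel < 0.05))
    mid = int(np.sum((rel >= 0.05) & (rel <= 0.10)))
    gt10 = int(np.sum(rel > 0.10))
    return lt5, mid, gt10

# Compliance rate reported in the abstract: errors within 10% of reference
compliance_rate = (93 + 33) / 150
```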

Chen, Y.; Wang, X.; Xing, X.; Xu, S.; Zhang, X.; Chen, S.; Yin, Y.; Wang, D.; Xi, R.; Xu, X. A Spectral Device for Rice Chlorophyll Content Detection with Background Classification Capability. Agronomy 2026, 16, 192. https://doi.org/10.3390/agronomy16020192
