Review

A Comprehensive Review of Optical and AI-Based Approaches for Plant Growth Assessment

by Juan Zapata-Londoño 1, Juan Botero-Valencia 1,*, Vanessa García-Pineda 1, Erick Reyes-Vera 1 and Ruber Hernández-García 2,3
1 Grupo Automática, Electrónica y Ciencias Computacionales, Faculty of Engineering, Instituto Tecnológico Metropolitano—ITM, Medellin 050034, Colombia
2 Laboratory of Technological Research in Pattern Recognition–LITRP, Facultad de Ciencias de la Ingeniería, Universidad Católica del Maule, Talca 3480112, Chile
3 Department of Computing and Industries, Facultad de Ciencias de la Ingeniería, Universidad Católica del Maule, Talca 3480112, Chile
* Author to whom correspondence should be addressed.
Agronomy 2025, 15(8), 1781; https://doi.org/10.3390/agronomy15081781
Submission received: 12 June 2025 / Revised: 20 July 2025 / Accepted: 22 July 2025 / Published: 24 July 2025
(This article belongs to the Special Issue Advances in Agricultural Engineering for a Sustainable Tomorrow)

Abstract

Plant growth monitoring is a complex and challenging task, which depends on a variety of environmental variables, such as temperature, humidity, nutrient availability, and solar radiation. Advances in optical sensors have significantly enhanced data collection on plant growth. These developments enable the optimization of agricultural practices and crop management through the integration of artificial vision techniques. Despite advances in the application of these technologies, limitations and challenges persist. This review aims to analyze state-of-the-art methodologies for using artificial vision and optical sensors in plant growth assessment. The systematic review was conducted following the guidelines for Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). Relevant studies were analyzed from the Scopus and Web of Science databases. The main findings indicate that data collection in agricultural environments is challenging due to the variability of climatic conditions, the heterogeneity of crops, and the difficulty of obtaining accurately and homogeneously labeled datasets. Additionally, the integration of artificial vision models and advanced sensors would enable the assessment of plant responses to these environmental factors. The advantages and limitations of these approaches were examined, and research areas were proposed to further contribute to the improvement and expansion of these emerging technologies for plant growth assessment. Finally, a relevant research line focuses on evaluating AI-based models on low-power embedded platforms to develop accessible and efficient decision-making solutions in both agricultural and urban environments. This systematic review was registered in the Open Science Framework (OSF).

1. Introduction

Plant growth is a complex process that depends on a variety of environmental and genetic factors, which has generated the need to develop advanced and non-invasive measurement techniques. Also, monitoring plant growth is a challenging task due to the variability of environmental factors, such as temperature, humidity, nutrient availability, and solar radiation [1,2]. Traditional methods, such as manual measurement and the use of contact sensors, can be inefficient and lack scalability in large-scale agricultural systems. In addition, these methods can be affected by observer subjectivity and lack of standardization in measurements [3].
In recent decades, advances in optical sensors have significantly enhanced the collection of accurate plant growth data. Optical-based data collection optimizes agricultural practices and crop management by integrating artificial vision techniques [4]. The development of new methodologies based on artificial vision and optical sensors has overcome many of these limitations through automated solutions, allowing for continuous, real-time monitoring of plant morphology [4]. In particular, these technologies facilitate the accurate and systematic quantification of various morphological and structural attributes, such as canopy architecture [5], leaf area [5,6,7], height [8,9], and stem diameter [10], among other key parameters. State-of-the-art works allow the identification, segmentation, and analysis of plant structures with high precision, optimizing crop monitoring and decision-making [4,11,12].
Recent studies have shown that drones equipped with multispectral cameras enable high-accuracy crop health assessment and biomass estimation [3]. Depth cameras also allow the capture of three-dimensional morphological features as indicators to assess yield, growth, and overall plant development [13,14]. Multispectral and thermal infrared images of corn crops captured from an unmanned aerial vehicle (UAV) were employed to estimate above-ground biomass (AGB) using machine learning models [15]. Additionally, a DJI M210 UAV equipped with a multispectral camera and thermal sensor was used to capture multispectral images for non-destructive monitoring of the leaf area index (LAI) in corn crops [16].
Therefore, the integration of artificial vision systems and optical sensors has allowed the recent development of the so-called smart greenhouses. Using these technologies, environmental and structural variables of plants are monitored in real time, allowing for more efficient management of agricultural resources [17]. On the other hand, the processing of data captured using artificial intelligence algorithms has improved decision-making in crop management and early disease detection. These technologies have become key tools for evaluating plant growth and health. Some of the most widely used approaches include image segmentation for the identification of leaf areas and biomass calculation through contour extraction [18]; deep learning models for phenological stage classification and disease detection [19]; and the use of spectral sensors, including hyperspectral and fluorescence imaging, to evaluate photosynthetic activity and plant nutritional status [4]. Despite advances in the application of these technologies, limitations in their use have been identified. Therefore, it is essential to conduct a systematic review of the literature to consolidate existing knowledge and evaluate future trends.
The present review aims to analyze the state of the art in the use of artificial vision and optical sensors for plant growth assessment. It addresses their advantages and limitations to further contribute to the improvement and expansion of these emerging technologies in this research area. Thus, by identifying current applications, methodologies, and future research areas, this review is expected to have an impact on the optimization of processes and solutions for measuring plant growth. To achieve this objective, the following questions guide this research:
1. What characteristics have been achieved in the development of methods for measuring plant growth?
2. What are the main data and image processing techniques for plant growth measurement?
3. How do artificial vision and optics contribute to the identification of plant growth measurements?
4. Which sectors show the most significant advances or potential in the use of artificial vision and optics in plant growth?
5. What are the main challenges to be addressed in plant biology and horticulture through the future use of artificial vision and optical techniques?
Based on the performed systematic review, the main contributions of this paper are twofold: (i) analyzing current research trends and emerging research areas on the use of artificial vision and optical sensors for plant growth assessment and (ii) identifying the most relevant limitations, persistent challenges, and future research directions to impact the optimization and creation of new solutions for measuring plant growth by integrating these technologies. To the best of our knowledge, the presented review is a pioneer in the study of integrating optical sensors and artificial vision-based approaches for assessing plant growth.
The rest of the article is structured as follows. Section 2 presents key theoretical concepts and background knowledge related to the research topic. Section 3 describes the review methodology to conduct the systematic literature review. Next, the obtained results are presented and discussed in Section 4 and Section 5, analyzing the findings and their relevance in depth. Finally, Section 6 gives conclusions with a synthesis of implications and recommendations for future research in the field of artificial vision and optics applied to plant growth.

2. Background Knowledge and Key Concepts

The optical techniques used in plant phenotyping are based on physical principles that enable the detection and characterization of plants’ structural, spectral, and physiological properties. Technologies such as hyperspectral imaging, stereo vision, LiDAR (Light Detection and Ranging) scanning, and fluorescence spectroscopy operate by capturing information based on the reflection, emission, or scattering of electromagnetic radiation. Understanding the rationale behind each of these techniques is crucial for accurately interpreting the acquired data and selecting the most suitable system in relation to the experimental objective.

2.1. RGB Optical Sensors

Optical RGB sensors capture images by analyzing the light reflected by objects in the visible spectrum. The general flow of this process is illustrated in Figure 1. The light is initially directed to the lens system, which serves as a focusing and concentrating device, directing the light rays onto the surface of the image sensor. Before reaching the sensor, the light passes through a color filter, commonly a Bayer matrix, which divides the spectrum into three main bands: red (R), green (G), and blue (B) [20]. This filter is composed of a specific arrangement of microfilters placed over the photodetectors, allowing each one to record only one of the three spectral components. Each color component is detected by a particular subset of photodetectors that measure the luminous intensity at its respective wavelength. This partial distribution of information is then compensated for by an interpolation process known as demosaicing, which enables the reconstruction of a complete color image. The result is a digital image where each pixel contains numerical values representing the intensity of each color channel.
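To make the demosaicing step concrete, the sketch below reconstructs per-pixel RGB values from an RGGB Bayer mosaic by averaging same-color neighbors. This is a simplified, illustrative scheme with made-up values; the function names are our own, and real camera pipelines use more sophisticated interpolation.

```python
# Illustrative sketch: bilinear-style demosaicing of an RGGB Bayer mosaic.
def bayer_channel(row, col):
    """Channel recorded at (row, col) in an RGGB pattern."""
    if row % 2 == 0 and col % 2 == 0:
        return "R"
    if row % 2 == 1 and col % 2 == 1:
        return "B"
    return "G"

def demosaic(mosaic):
    """Reconstruct [R, G, B] per pixel: keep the measured channel,
    interpolate the two missing ones from the 3x3 neighborhood."""
    h, w = len(mosaic), len(mosaic[0])
    out = []
    for r in range(h):
        row_out = []
        for c in range(w):
            own = bayer_channel(r, c)
            acc = {"R": [0.0, 0], "G": [0.0, 0], "B": [0.0, 0]}
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < h and 0 <= cc < w:
                        ch = bayer_channel(rr, cc)
                        acc[ch][0] += mosaic[rr][cc]
                        acc[ch][1] += 1
            pixel = []
            for ch in "RGB":
                if ch == own:
                    pixel.append(float(mosaic[r][c]))      # measured directly
                else:
                    pixel.append(acc[ch][0] / acc[ch][1])  # interpolated
            row_out.append(pixel)
        out.append(row_out)
    return out

# A uniform gray scene should demosaic to equal R, G, B everywhere.
rgb = demosaic([[100] * 4 for _ in range(4)])
```

Averaging a uniform mosaic recovers the same value in all three channels, which is a quick sanity check for any demosaicing implementation.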

2.2. Hyperspectral Sensors

A hyperspectral sensor is an optical device capable of capturing very detailed spectral information of a scene, covering a wide range of wavelengths of the electromagnetic spectrum. Its operation is based on the decomposition of light reflected by objects into multiple narrow spectral bands. This process generates an image in which each pixel contains a characteristic spectrum of the observed point. The result of the acquisition with a hyperspectral sensor is a three-dimensional set of data, commonly referred to as a “hyperspectral cube” or “hypercube”. This cube is made up of two spatial dimensions (X and Y), corresponding to the width and height of the image, and a spectral dimension (λ), which represents the information obtained at different wavelengths [21]. The hypercube can be analyzed from two perspectives: as a series of spatial images at different wavelengths or as a set of spectra obtained for each pixel of the image. In other words, one can observe how light behaves at a specific location along the spectrum or how the entire scene looks under different spectral bands.
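The two perspectives on the hypercube can be sketched with a small array example; the dimensions and band count below are arbitrary assumptions, not tied to any particular sensor.

```python
import numpy as np

# A synthetic hypercube: two spatial dimensions (Y, X) and one spectral axis.
rng = np.random.default_rng(0)
cube = rng.random((100, 120, 224))  # (height Y, width X, spectral bands)

# Perspective 1: the full spectrum recorded at a single pixel.
spectrum = cube[40, 60, :]

# Perspective 2: the spatial image of the whole scene at a single band.
band_image = cube[:, :, 100]
```

Slicing along the spectral axis yields one grayscale image per wavelength band, while slicing at a fixed (Y, X) position yields the spectral signature of that point.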
Hyperspectral sensors are composed of several fundamental elements, as shown in Figure 2. These include the light source, which can be natural (such as sunlight) or artificial; the optical systems, such as lenses and mirrors, that focus the light; the dispersive elements that separate the light into different wavelengths, such as prisms or diffraction gratings; the detector (CCD or CMOS), which captures the intensity of the light in each band; and the acquisition and processing system, which digitizes and stores the data for later analysis.
Hyperspectral sensors can be mounted on various airborne platforms, including satellites, aircraft, and unmanned aerial vehicles (UAVs). However, UAVs offer significant advantages, as they are capable of acquiring images with high spatial resolution at a much lower cost [22]. Moreover, they have great flexibility in terms of flight mission planning and execution, making them an ideal choice for precision monitoring applications and repetitive tasks. In recent years, the development of new hyperspectral sensors and their implementation on various platforms (satellites, aircraft, and UAVs) has allowed for the expansion of their coverage and improvement of both spatial and spectral resolution. Hyperspectral imaging is a data-rich technology that effectively solves agricultural problems, including disease detection, weed detection, stress detection, crop monitoring, nutrient application, soil mineralogy, yield estimation, and sorting applications [23].
Hyperspectral data is initially a large volume of raw information that must be processed to be transformed into valuable data. To achieve this, it is essential to face the challenges associated with this type of data, such as noise and high dimensionality, to optimize the processing speed. To solve these problems, various processing techniques, such as atmospheric, radiometric, and spectral corrections, are applied to retrieve accurate spectral information [22].
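One generic way to address the high dimensionality mentioned above is principal component analysis of the spectral axis. The review does not prescribe a specific technique, so the SVD-based sketch below (with made-up array sizes) is only one common option.

```python
import numpy as np

# Reduce the spectral dimension of a hypercube with PCA computed via SVD.
cube = np.random.default_rng(1).random((50, 60, 128))  # (Y, X, bands)
pixels = cube.reshape(-1, 128)            # flatten to pixels x bands
centered = pixels - pixels.mean(axis=0)   # center each band

# Rows of vt are the principal spectral components (directions of max variance).
_, _, vt = np.linalg.svd(centered, full_matrices=False)
k = 10
scores = centered @ vt[:k].T              # project onto the first k components
reduced = scores.reshape(50, 60, k)       # restore the spatial layout
```

Keeping only the first few components discards much of the band-to-band redundancy (and some noise) while preserving the dominant spectral variation per pixel.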

2.3. 3D Sensors

Today, the use of 3D imaging technologies has become widespread in several application areas. This is due to several factors, such as the continuous increase in the processing power of computers, the reduction in the cost and size of electronics, the improvement in the efficiency of solid-state lighting, and the unique properties of artificial vision as a non-contact and non-destructive technology [6,24]. Among the most widely used techniques for 3D reconstruction are Time-of-Flight (ToF) cameras and LiDAR sensors. Both are based on the principle of measuring distances by calculating the time it takes for light to travel to an object and return to the sensor [14,25]. Another widely employed technique is stereo vision, which uses triangulation of pairs of images taken from different angles to reconstruct three-dimensional models. These tools enable advanced phenotype characterization, facilitating research in plant physiology and the optimization of agricultural management. Furthermore, studies have been conducted that implement these technologies in both open-field conditions and controlled environments [5,8,26].

2.4. Time-of-Flight Cameras

Time-of-Flight (ToF) cameras measure the time it takes for light to travel from an emitter to an object and back to the sensor. This principle allows the generation of three-dimensional images of the environment. This type of camera emits a beam of infrared light, which reflects off the surfaces in the environment and is subsequently detected again by the sensor. As shown in Figure 3, the transmitted signal travels to the object and, after reflection, returns to the sensor, allowing the distance to be calculated using the formula:
Z = (c · t) / 2
where c is the speed of light and t is the round-trip time of the signal.
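The distance computation amounts to a single line of code; the 20 ns round trip below is an illustrative value, not a measurement.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s):
    """Distance to the target from a ToF measurement: Z = c * t / 2."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A 20 ns round trip corresponds to roughly 3 m.
z = tof_distance(20e-9)
```

Because the light travels to the object and back, the factor of 2 converts the round-trip path into the one-way distance.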
The light reaching the optical sensor contains two components: ambient light and reflected light. Only the latter is used to calculate depth. Sensors such as LiDAR, flash LiDAR, 360° 3D LiDAR, and ToF cameras use this same measurement principle based on the time of flight of light.

2.5. Stereo Vision and Triangulation

The operating principle of a stereo vision system is based on triangulation. This principle utilizes the disparity between images captured by two cameras positioned at different locations to calculate the distance of an object in three-dimensional space. Each camera projects the object onto its respective image plane, generating two reference points (d1 and d2). The displacements of these points depend on the location of the object with respect to the cameras. As shown in Figure 4, the difference between these offsets, known as the disparity (d2 − d1), is inversely related to the depth of the object (Z). Using the triangulation formula Z = (f · x) / (d2 − d1), which incorporates the distance between the cameras (x), the focal length of the lenses (f), and the disparity, the distance of the object from the plane of the cameras can be accurately calculated. This method leverages the distinct perspectives captured by the cameras, simulating the way human eyes perceive depth in space.
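Under the usual pinhole model, the triangulation described above reduces to one division; the camera parameters in the sketch below are hypothetical.

```python
def stereo_depth(focal_px, baseline_m, d1_px, d2_px):
    """Depth from disparity: Z = f * x / (d2 - d1).

    focal_px: focal length f in pixels; baseline_m: camera separation x in
    meters; d1_px, d2_px: image-plane offsets of the object in each view.
    """
    disparity = d2_px - d1_px
    if disparity <= 0:
        raise ValueError("object must produce a positive disparity")
    return focal_px * baseline_m / disparity

# f = 800 px, baseline = 0.10 m, disparity = 16 px  ->  Z = 5.0 m
z = stereo_depth(800.0, 0.10, 0.0, 16.0)
```

The inverse relation is visible here: halving the disparity doubles the estimated depth, which is why stereo depth accuracy degrades for distant objects.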

2.6. Fluorescence Spectroscopy

Fluorescence spectroscopy takes advantage of the ability of specific molecules (referred to as “fluorophores”) to absorb photons in the ultraviolet-visible region. These molecules subsequently re-emit the absorbed energy at longer wavelengths, typically in the visible or near-infrared range. This phenomenon, known as the Stokes shift, occurs because part of the absorbed energy is dissipated non-radiatively before being emitted. Both the energy loss and the subsequent radiative transition are depicted in a Jablonski diagram under the Franck-Condon principle (see Figure 5a) [27]. When a ground-state molecule (S0) absorbs a photon of energy hν_ex, an electron is promoted vertically to an excited singlet state (S1 or S2) in approximately 10^−15 s. This transition occurs without any instantaneous change in nuclear geometry. In less than 10^−12 s, the fluorophore dissipates excess vibrational energy to its surroundings through vibrational relaxation. As a result, it settles into the lowest vibrational level of S1. From this equilibrated state, the electron may return to S0 by emitting a photon in the process known as fluorescence [28]. Two key parameters used to quantify fluorescence are the quantum yield,
Φ = k_r / (k_r + k_nr)
where k_r and k_nr are the radiative and non-radiative decay rate constants, and the fluorescence lifetime, given as
τ = 1 / (k_r + k_nr)
which is typically 1–10 ns and can be quantified using techniques like Time-Correlated Single-Photon Counting (TCSPC) or phase modulation. These approaches precisely measure the arrival times of individual photons after a pulsed stimulation, resulting in a decay histogram that shows how rapidly the fluorophore recovers to its ground state.
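Both parameters follow directly from the two decay rate constants; the rate values used below are illustrative, not measured.

```python
def quantum_yield(k_r, k_nr):
    """Fraction of absorbed photons re-emitted: Phi = k_r / (k_r + k_nr)."""
    return k_r / (k_r + k_nr)

def fluorescence_lifetime(k_r, k_nr):
    """Mean excited-state lifetime: tau = 1 / (k_r + k_nr)."""
    return 1.0 / (k_r + k_nr)

# Hypothetical rate constants in 1/ns: k_r = 0.2, k_nr = 0.3
phi = quantum_yield(0.2, 0.3)          # -> 0.4
tau = fluorescence_lifetime(0.2, 0.3)  # -> 2.0 ns, within the typical 1-10 ns range
```

Note that both quantities depend on the same denominator k_r + k_nr: a fluorophore with faster non-radiative decay has both a lower yield and a shorter lifetime.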
Experimentally, plant samples (leaf extracts, leaf discs in quartz cuvettes with a 1–10 mm path length, or cell suspensions) are diluted until the absorbance at the excitation wavelength falls below 0.1 absorbance units, minimizing inner-filter effects. A “blank” spectrum (solvent or buffer only) is always subtracted to account for background fluorescence as well as Rayleigh and Raman scattering. Excitation sources may include xenon arc lamps (250–700 nm), mercury/Hg-Xe lamps, wavelength-specific LEDs (365–630 nm), or diode/Nd:YAG lasers (532 nm, 355 nm). Each source is coupled with monochromators or interference filters (5–15 nm bandwidth) to isolate the desired excitation wavelength. After excitation, the emitted light is collected either at 90° to the excitation beam or at a 22° front-face angle for turbid or opaque samples. Detection is performed using high-sensitivity detectors in time-correlated single-photon counting (TCSPC) setups or CCD cameras that capture the entire emission spectrum in a single acquisition. Finally, the absorption and emission spectra often appear as near mirror images, as illustrated in Figure 5b [28]. This symmetry simplifies electronic-state assignments and the direct comparison of absorption and emission transitions.
In the context of plant phenotyping, fluorescence spectroscopy is invaluable for quantifying chlorophyll fluorescence and calculating the variable-to-maximum fluorescence ratio Fv/Fm (where Fv = Fm − F0, and F0 and Fm are the minimum and maximum fluorescence under dark-adapted conditions). Fv/Fm serves as an early indicator of drought or nutrient stress [29,30]. Likewise, Pulse Amplitude Modulation (PAM) fluorometry extends this approach by generating spatial stress maps across leaves and canopies, while full-spectrum analysis discriminates between primary and secondary pigments, such as flavonoids. When integrated with artificial vision and multispectral sensing platforms, fluorescence spectroscopy delivers a non-invasive, quantitative, high-resolution phenotyping toolkit for monitoring plant growth and health [31].
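The ratio itself is straightforward to compute from the two dark-adapted readings; the fluorescence values below are illustrative only.

```python
def fv_fm(f0, fm):
    """Maximum PSII quantum efficiency: Fv/Fm = (Fm - F0) / Fm."""
    return (fm - f0) / fm

# Illustrative dark-adapted readings; unstressed leaves typically
# give values near 0.8, with lower ratios indicating stress.
ratio = fv_fm(300.0, 1500.0)  # -> 0.8
```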

3. Methodology

This study uses the PRISMA-2020 methodology (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) to guarantee transparency, repeatability, and rigor in the selection and analysis of studies on the use of artificial vision and optics in monitoring plant growth. The adopted methodology optimizes the documentation of each stage of the process, from identifying studies to their final inclusion, and establishes guidelines to minimize bias and errors, as explained by Page and collaborators [32]. This systematic review was registered in the Open Science Framework (OSF) under the registration number OSF.IO/VB79Y [33].

3.1. Eligibility Criteria

To ensure the relevance of the selected studies, strict inclusion and exclusion criteria were established. Only peer-reviewed academic publications addressing the use of artificial vision and optics in measuring plant growth were considered. Likewise, only articles published in English or Spanish and available in indexed journals with full-text access were included. Exclusion criteria comprised studies for which full-text access was unavailable, opinion articles, and reviews lacking methodological analysis. Furthermore, works that were not relevant to the use of artificial vision or optical methods were excluded. The exclusion procedure was divided into three stages: (i) removal of duplicate research; (ii) selection of articles based on titles and abstracts; and (iii) detailed review of the full texts.

3.2. Information Sources

The databases used for the search were Scopus and Web of Science, given their extensive coverage of scientific and technological research. Scopus indexes a large number of publications in science and technology, while Web of Science focuses on journals with high impact and quality standards. These platforms provided a robust corpus for the systematic review, ensuring the inclusion of relevant and high-quality studies [34].

3.3. Search Strategy

Specific search equations were formulated for each database, based on the established inclusion criteria. This approach ensured the identification of relevant studies on the use of optical methods combined with artificial vision in plant growth research. The equations employed Boolean operators such as AND and OR, along with related keywords. These combinations were designed to prioritize publications focused on the application of optical techniques within the fields of machine learning and artificial intelligence. The search process was refined through iterative reviews of the initial results, eliminating ambiguous terms and fine-tuning operators to enhance the precision and relevance of the selected studies.
  • In Scopus, the search equation applied was (TITLE (“grow*” OR “develop*”)) AND (TITLE (“artificial* vision*” OR “optic*”)) AND (TITLE (“flor*” OR “plant*”)) AND NOT (TITLE (thermometric)), focusing on article titles to prioritize studies directly relevant to the topic.
  • In Web of Science, the equation was adapted to its specific syntax as TI = (“grow*” OR “develop*”) AND (TI = (“artificial* vision*” OR “optic*”)) AND (TI = (“flor*” OR “plant*”)) NOT (TS = (thermometric)), aligning with the required search fields for this platform. These adaptations ensured consistency across both databases and alignment with the inclusion criteria.
Although the initial search strategy was designed with general terms to avoid introducing bias toward specific sensor types, crop species, or spectral indices, it is acknowledged that this approach may have limited the capture of highly specialized studies. To address this limitation, the review was complemented by the inclusion of high-impact and thematically aligned articles not retrieved through the original search equation. Furthermore, additional relevant studies were identified through a forward citation search, which involved exploring the reference lists of all eligible reports to screen for further eligible studies.
For example, studies evaluating plant responses using SPAD meters and spectroradiometers in maize under different organic treatments [35] offered valuable insights into spectral reflectance and chlorophyll content analysis for growth monitoring. Other contributions, although not focused on a specific crop, present advances in materials that enhance optical sensitivity in low-energy detection platforms [36]. These articles were manually selected for discussion to enrich the review with content that highlights recent progress in AI-compatible sensors and plant trait quantification techniques. This supplemental approach ensures broader coverage of multispectral and hyperspectral sensor applications, various crop types (e.g., maize, grape, and rice), and key physiological variables like biomass and vegetation indices.
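Vegetation indices such as those mentioned above reduce to simple band arithmetic on multispectral reflectance. As a generic, hedged example (the review does not single out one index), a minimal NDVI computation with made-up reflectance values is:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

# A dense green canopy reflects strongly in the NIR and weakly in red,
# pushing the index toward +1; bare soil or stressed vegetation scores lower.
value = ndvi(0.50, 0.10)  # -> about 0.67
```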

3.4. Selection Process

The selection process was conducted in three distinct stages. First, duplicate studies were removed. Secondly, the titles and abstracts were evaluated to assess their relevance. Third, a comprehensive full-text review was performed to ensure the inclusion of studies that met the established criteria.
Each phase was meticulously documented following the PRISMA-2020 guidelines. This approach ensured transparency, consistency, and reproducibility throughout the process. The PRISMA flow diagram, presented in Figure 6, provides a visual summary of the decisions made at each stage of the selection process. It illustrates the number of studies excluded and included at each step.
The initial search yielded a total of 141 records, of which 81 were retrieved from Scopus and 107 from Web of Science. After removing 35 duplicate entries, 106 articles proceeded to the title and abstract screening phase. At this stage, 34 studies were excluded because they were not directly related to the application of artificial vision or optical techniques in plant growth. Subsequently, 30 full-text articles were selected for inclusion based on their methodological rigor, relevance, and contribution to the research questions.
The final selection emphasized high-impact studies, including the most cited articles in the field, to ensure scientific relevance. Furthermore, proceedings papers and conference abstracts were excluded. This exclusion was due to their typically limited methodological detail and lower peer review standards. The time frame for inclusion was unrestricted to capture the historical evolution of the topic. However, most of the included articles were published between 2000 and 2024, reflecting the recent technological advances in the area.

3.5. Data Management

The collected data were structured and managed in Microsoft Excel® (version 16.93.1), with studies categorized based on criteria such as authors, publication year, methodology, and key findings. This method allowed for the identification of trends and patterns in research on artificial vision and optical techniques used in plant development, facilitating a comprehensive comparative analysis.

3.6. Bias Risk Assessment

The risk of bias was assessed by considering aspects such as methodological quality, transparency of results presentation, and potential conflicts of interest among authors. Furthermore, the choice of databases and keywords was considered, as these could affect the diversity of the research included. To mitigate these risks, rigorous inclusion and analysis criteria were used throughout the process.

4. Results

The study’s findings are organized into subsections that address advancements in the application of artificial vision and optical methods for plant growth measurement. These include the primary methodologies employed, the sectors demonstrating the most significant progress in implementing these technologies, and the technical challenges that remain to be addressed. Furthermore, the role of these technologies in increasing agricultural efficiency and optimizing phenological monitoring is also examined.
Figure 7 depicts an examination of the evolution of publications linked to artificial vision and optical approaches for plant growth measurement, indicating a consistent increase in the number of scientific papers over the decades. From 1951 to the 1990s, scientific output in these domains was limited, with only intermittent publications. This restricted output can be attributed to the early phases of research in optical and artificial vision technology at that time. However, since the early 2000s, the number of publications has steadily increased, reflecting growing interest and advancements in optical and artificial vision technologies used in agricultural and crop monitoring.
A total of 69 publications were identified, an average of 1.97 articles per year. However, a significant acceleration has occurred over the last five years. The most articles were published in 2023 (six), followed by 2024 and 2022 (five each). This trend reflects an ongoing increase in research on artificial vision and optical technologies for monitoring plant growth, driven by advancements in sensor technology, image processing, and machine learning. The data indicate not only rising interest in the topic but also growth of the scientific community dedicated to applying these technologies in agriculture. This transition is consistent with the broader trend of digitalizing agriculture and integrating precision agricultural technologies.
The period from 2020 to 2024 represents a phase of consolidation, with 22 articles published. This period accounts for over 30% of all publications to date. This rise highlights the scientific community’s growing emphasis on integrating optical technologies and artificial intelligence in precision agriculture. Furthermore, the shift toward data-driven farming and real-time crop monitoring underscores the importance of these technologies in addressing global challenges such as climate change and food security.
Looking ahead, this trend is expected to continue. It is fueled by advances in artificial intelligence, the development of more precise sensors and ultra-fast cameras, and the growing demand for automated monitoring in agriculture. The integration of AI-driven models, such as deep learning algorithms, with optical and sensor technologies is expected to enhance predictive capabilities. This integration will enable more accurate forecasting, early disease detection, and optimized resource management. Consequently, it will contribute to the improvement of the sustainability of agricultural practices.
To enhance the relevance of the temporal analysis, a focused review of publication years from the last two decades (2005–2024) was emphasized, as this period captures the emergence and consolidation of optical and AI-based techniques in plant growth assessment. Most of the significant advances, such as the integration of hyperspectral imaging and AI algorithms like CNNs and SVMs, have occurred within this timeframe. It has been demonstrated how recent innovations in material science enable more efficient optical sensing [36]. Similarly, a recent study highlights current breakthroughs in luminescent materials for plant monitoring applications. Therefore, while the historical inclusion since the 1950s contextualized the roots of plant optical sensing, the emphasis is now placed on contemporary trends that reflect current capabilities and research directions.
Figure 8 presents an analysis of scientific publications by geographical distribution related to artificial vision and optical techniques for plant growth measurement. The results reveal that 21 countries have contributed at least one article, while 174 countries have not published in this area. This uneven distribution highlights a concentration of research in a limited number of countries. These countries are characterized by advanced scientific infrastructure and well-developed technological capabilities. The five leading countries in scientific production on this subject are the following:
  • United States (14 publications): The U.S. maintains its leadership in this field, supported by a robust academic infrastructure and close collaboration between universities, research centers, and the technological and agricultural industries. This synergy has enabled significant advances in the application of artificial vision and optical sensors in precision agriculture. Moreover, the country has pioneered the integration of emerging technologies, such as machine learning and artificial intelligence, into the monitoring and analysis of plant growth.
  • Japan (11 publications): Japan has distinguished itself with its innovative approach to miniaturization and hardware optimization for agricultural applications. Its ability to integrate advanced technologies into agriculture, with a focus on energy efficiency and sustainability, has led to significant progress in automation and plant growth monitoring. Additionally, Japan has been at the forefront of developing artificial vision systems for automated crop classification and harvesting.
  • China (5 publications): With its rapid growth in artificial intelligence and agricultural robotics, China is beginning to establish itself as a key player in research related to artificial vision for agriculture. Although its contribution to scientific publications remains limited, the country has demonstrated a strong drive for technological innovation. This trend is particularly evident in areas such as precision farming and AI-driven systems.
  • Russian Federation (5 publications): Russia’s contributions have focused on combining optical techniques with thermal sensors and spectroscopy for monitoring plant growth in extreme environments, such as cold or arid climates. Although these contributions are valuable, they have been constrained by limited international collaboration and a lack of resources for large-scale adoption of these technologies in recent years.
  • Canada (2 publications): Canada has developed significant research on the application of optical methods and artificial vision for the three-dimensional measurement of plant growth. These contributions are noteworthy in the field of agricultural phenotyping. However, the relatively low number of publications suggests that there is still room for further development in this area.
In contrast, some countries with minimal scientific production include Brazil, Mexico, Latvia, and Morocco (each with one publication). Although these countries have made contributions, their limited output suggests that research in this field is still in its early stages or faces constraints related to funding and technological resources. These constraints may be associated with a lack of specialized research infrastructure, the absence of innovative technological programs, or reliance on traditional agricultural methods, all of which hinder the adoption of new technologies.
The results indicate that approximately 174 countries have not contributed to this field, suggesting a significant gap in the scientific infrastructure for artificial vision and optical technologies applied to agriculture. Several factors, including the lack of scientific infrastructure, insufficient investment in advanced agricultural technologies, and the absence of specialized research and educational programs in these areas, can explain this gap. In many of these regions, agriculture remains traditional and has not yet effectively integrated technological innovations. This situation delays the advancement of agricultural productivity through more precise and sustainable methods.
The inclusion of both temporal and geographical analyses is essential to characterize the evolution and regional focus of research on optical and AI-based plant growth assessment. These elements are not presented as ends in themselves, but rather as part of the evaluation of the volume, distribution, and visibility of scientific production—fundamental indicators of research maturity and thematic consolidation. For example, the growing body of work originating in East and South Asia reveals not only technological advancements but also regional research priorities related to food security and sustainable agriculture [35,36]. By identifying leading regions and active publication periods, this analysis supports the review’s objective of mapping research trends and gaps.
Table 1 shows a comprehensive overview of the 30 papers selected for variable analysis, including information on the technique employed, applications addressed, and variables examined in each study. This systematic overview provides a solid foundation for discussion and conclusions. It enables a clear understanding of the key factors influencing the research and helps identify opportunities for future investigation.
According to the findings of the reviewed studies, critical characteristics were identified in the development of optical and artificial vision methods for evaluating plant growth, as shown in Figure 9. These findings highlight the areas that require the most development and implementation. The identified characteristics were grouped into seven main thematic categories based on their technological and functional nature:
  • Imaging and Sensor Technologies, which includes tools such as RGB cameras, depth sensors, multispectral sensors, hyperspectral sensors, fiber optic sensors, quantum sensors, electro-optical sensors, thermal imaging systems, chlorophyll fluorescence, and terahertz imaging technologies.
  • AI and Predictive Models, which comprises predictive machine learning models and neural networks for phenological pattern recognition.
  • Artificial Vision Techniques, oriented toward image processing using optical flow, image correlation, and contour analysis.
  • Advanced Optical Techniques, such as inductively coupled plasma optical emission spectroscopy (ICP-OES), photoluminescence, near-infrared spectroscopy, and ultraviolet-induced fluorescence.
  • Environmental Enhancement Tools, which includes elements such as structured lighting, artificial lighting systems, and passive fiber optic lighting systems.
  • Smart Monitoring Systems, offering functionalities such as automatic alerts to environmental changes.
  • Other Optical Approaches, which encompasses diverse techniques such as remote sensing, protoplast monitoring, self-aligning optical particle analyzers, spectral absorption analysis, and the use of LIDAR to assess structural growth.
In terms of technological development, the growing application of multispectral optical sensors has notably improved the precision in detecting physiological variables of plants. This advancement has enhanced the capabilities for monitoring plant growth. Moreover, the incorporation of machine learning models has facilitated the automation of image analysis, enabling the identification of growth patterns and anomalies. On the other hand, the use of RGB and depth cameras has been fundamental for obtaining three-dimensional representations of plant structures. These imaging techniques have contributed to a better understanding of their morphological development [4].
However, some characteristics appear less frequently in the literature, indicating less explored areas that nonetheless hold significant potential for future research. These include LIDAR for structural analysis (1 mention), terahertz imaging for water content measurement (1), quantum dot sensors for nutrient tracking (1), and ultraviolet-induced fluorescence for disease detection (1). The limited presence of these technologies can be attributed to high implementation costs or the lack of applied studies in agricultural settings. Nevertheless, their incorporation in future research could offer new insights into enhancing the precision and efficiency of plant growth monitoring [18]. These observations emphasize the need to combine modern sensors with artificial intelligence and automated monitoring networks, an integration essential for improving agricultural productivity [2].
Regarding image and data processing techniques, the results identified approximately 20 methods used for measuring plant growth through artificial vision and optics, as presented in Figure 10. Among the most commonly used techniques are the following: machine learning models for predictive analysis (4 mentions); image segmentation for structural growth analysis (3); IoT (Internet of Things) data integration for real-time monitoring (3); optical flow analysis for motion tracking (3); and hyperspectral imaging for physiological assessment (3). These techniques can be grouped into seven key categories:
  • Machine learning and AI models: This category includes the use of machine learning models, neural networks, support vector machines (SVMs), random forests, deep learning, and XGBoost for classification and biomass estimation tasks.
  • Image analysis techniques: This group comprises image segmentation, edge detection, digital image correlation, and texture analysis, all of which are applied to assess leaf health and structural characteristics.
  • Three-dimensional and motion tracking: Techniques such as optical flow analysis and 3D reconstruction are used to assess volumetric growth and the motion tracking of plant structures.
  • Signal and dimensionality analysis: This includes methods like principal component analysis (PCA), Fourier transforms, and wavelet transforms, which serve to reduce dimensionality and eliminate noise from signals.
  • Spectral data processing: Multispectral image fusion and hyperspectral imaging techniques are employed for detailed physiological analysis of plant health and stress responses.
  • IoT and data integration: This category involves integrated real-time monitoring platforms utilizing smart sensors to collect and process environmental and plant-related data continuously.
  • Unsupervised classification: This group includes clustering algorithms designed for species differentiation and grouping based on spectral and visual patterns.
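Of the categories above, the signal and dimensionality analysis group lends itself to a compact illustration. The sketch below is a minimal, self-contained example (synthetic data, not drawn from any reviewed study) of how projecting hyperspectral pixel spectra onto their leading principal components reduces dimensionality while discarding noise concentrated in the low-variance components:

```python
import numpy as np

def pca_reduce(spectra, n_components=3):
    """Project spectral pixel vectors onto their top principal components.

    spectra: array of shape (n_pixels, n_bands).
    Returns the component scores, the components, and the band means.
    """
    # Center the data: one row per pixel, one column per spectral band.
    mean = spectra.mean(axis=0)
    centered = spectra - mean
    # Eigen-decomposition of the band covariance via SVD.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:n_components]          # (n_components, n_bands)
    scores = centered @ components.T        # (n_pixels, n_components)
    return scores, components, mean

# Synthetic example: 100 pixels x 50 bands with rank-2 structure plus noise.
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 50))
spectra = base + 0.01 * rng.normal(size=(100, 50))
scores, components, mean = pca_reduce(spectra, n_components=2)
# Reconstruction from only 2 components recovers most of the variance.
recon = scores @ components + mean
```

In practice, the retained components feed downstream classifiers or regression models in place of the full band set, which is the dimensionality-reduction role the reviewed studies assign to PCA.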
These techniques reflect a growing trend towards automation and integration of artificial intelligence in plant growth characterization [59,60]. The most commonly used techniques allow the capture, processing, and analysis of plant images with high accuracy. Image segmentation is used to extract key structural features, such as leaf area and plant morphology, facilitating detailed monitoring of plant development. Likewise, the integration of IoT sensors with image processing techniques has enabled real-time monitoring of crops, thereby optimizing decision-making in agricultural management [19,64].
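As a concrete baseline for the segmentation step just described, the classic Excess Green index is often used to separate vegetation from background before leaf area is measured. The sketch below is illustrative only (the function name, threshold, and toy image are assumptions, standing in for the more sophisticated learned segmenters surveyed here):

```python
import numpy as np

def leaf_area_fraction(rgb, threshold=0.1):
    """Estimate the fraction of vegetated pixels in an RGB image using
    the Excess Green index (ExG = 2g - r - b on chromatic coordinates),
    a common baseline for plant/soil segmentation."""
    img = rgb.astype(np.float64)
    total = img.sum(axis=2)
    total[total == 0] = 1.0                 # avoid division by zero
    r, g, b = (img[..., i] / total for i in range(3))
    exg = 2.0 * g - r - b
    mask = exg > threshold                  # boolean vegetation mask
    return mask.mean(), mask

# Toy image: left half leaf-like green, right half soil-like brown.
img = np.zeros((10, 10, 3), dtype=np.uint8)
img[:, :5] = (30, 180, 40)
img[:, 5:] = (120, 90, 60)
fraction, mask = leaf_area_fraction(img)
```

The resulting mask area, scaled by the ground sampling distance of the camera, yields the leaf area estimate that downstream growth models consume.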
On the other hand, some less frequent techniques in the literature include digital image correlation for deformation measurement (2 mentions), feature extraction by deep learning (2), clustering algorithms for plant species differentiation (2), 3D reconstruction for volumetric growth assessment (2), and wavelet transform for noise reduction in spectral data (2). Although these techniques have been applied to a lesser extent, they represent promising tools for improving the accuracy of plant growth measurement. Their lower adoption may be due to their computational complexity and the need for large volumes of data for their effective implementation [62,63].
Regarding the key contributions of artificial vision and optics to plant growth measurement, 20 different contributions can be observed in Figure 11. Among the most recurrent are 3D growth measurement (4 mentions), leaf area estimation (3), root structure analysis (3), biomass quantification (3), and stress detection in plants (3). These applications have enabled more detailed monitoring of plant development, with a focus on enhancing agricultural efficiency. In this regard, studies have explored the use of optical sensors for growth monitoring in controlled environments [50], while 3D image reconstruction has optimized the assessment of plant morphology over time [48].
Another key aspect of applying artificial vision and optics in agriculture is the ability to measure leaf area and quantify biomass more accurately. These metrics are essential for assessing crop productivity and health. Several studies have demonstrated that hyperspectral sensors, when combined with deep learning models, can significantly enhance the accuracy of these measurements. For example, studies have shown that artificial intelligence applied to spectral image segmentation can detect variations in growth with high accuracy [60,65,66]. In addition, wide-field optical systems can provide real-time data on biomass distribution in both greenhouse and open-field crops [63].
On the other hand, emerging contributions have been identified that, although less frequent, demonstrate significant potential for advanced crop monitoring. These contributions involve the study of light absorption across different spectral ranges (2 mentions), nutrient uptake analysis (2), assessment of plant health status (2), evaluation of photosynthesis (2), and prediction of growth rates (2). These advanced technologies are being implemented to analyze the relationship between spectral reflectance and plant metabolism [64]. Furthermore, neural network-based prediction models have been employed to estimate growth rates in various crops [59].
The analysis of the reviewed studies identifies 10 key sectors where artificial vision and optics have driven significant advances in measuring plant growth, as presented in Figure 12. Among the sectors with the most significant development are smart agriculture (5 mentions), precision horticulture (4), vertical farming (4), forestry monitoring (3), and greenhouse automation (3). The contributions associated with these sectors can be grouped into seven categories:
  • Structural Analysis, which includes three-dimensional growth measurement, crown density analysis, and root structure.
  • Monitoring and Integration, which encompasses sensor integration and real-time monitoring with multichannel fusion.
  • Growth Quantification, aimed at estimating leaf area and predicting growth rates.
  • Resource and Biomass Estimation, with emphasis on biomass quantification, nutrient uptake analysis, water content, and soil moisture.
  • Plant Health Assessment, with emphasis on disease and stress detection, health assessment, and chlorophyll fluorescence.
  • Physiological Response Analysis, which includes photosynthesis assessments, light absorption studies, and temperature response.
  • Phenological and Trait Analysis, focused on detecting phenological phases such as flowering and trait mapping using hyperspectral sensors.
These sectors have integrated advanced optical imaging technologies and artificial intelligence algorithms to improve productivity and optimize crop management. The application of optical imaging in detecting and monitoring growth in forestry systems is highlighted in [48], while [59] demonstrated how recurrent neural network models can predict crop sorting in agriculture 4.0 systems. Additionally, advances in artificial vision have enabled more precise control of the growing environment in precision horticulture and vertical farming, thereby improving the quality and quantity of agricultural production. Digital image correlation is a key tool for evaluating root structure and its interaction with the soil, optimizing the use of fertilizers and water resources [48]. On the other hand, an optical system based on growth spheres is proposed in [63], which allows capturing spectral data to improve crop planning in greenhouses and vertical systems.
On the contrary, sectors with less development but high potential include biomass energy crops (2 mentions), agroforestry systems (2), urban agriculture (2), genetic trait analysis (2), and light-assisted growth systems (2). Although these applications have been explored to a lesser extent, they present significant opportunities for agricultural sustainability. For example, fiber optic sensors for genetic trait monitoring in experimental crops were developed by [62], while the integration of artificial vision with artificial lighting systems improved plant growth efficiency in urban and controlled environments [19].

5. Discussion

The discussion of this research is organized into key sections to contextualize and analyze the findings on the application of artificial vision and optics in plant growth measurement. First, the results obtained are examined to identify trends, limitations, and opportunities in the development of these technologies. These findings are then compared with previous studies to identify coincidences and differences with the existing literature, as well as to explore their possible causes. Based on the results, a conceptual framework is developed. This framework synthesizes the essential elements for the application of artificial vision and optics in precision agriculture, providing a structured foundation for future research and applications. Subsequently, theoretical and practical implications are analyzed. These considerations address the impact of the findings on research, technological advancement, and agricultural productivity. Finally, the study’s limitations are identified, including methodological and technological constraints. Lines of future research are proposed to address these limitations and to expand knowledge in this area.

5.1. Results Analysis

The results obtained in this research demonstrate a wide range of approaches in artificial vision and optics applied to measuring plant growth. Among the most commonly used methods are three-dimensional photogrammetry, hyperspectral image segmentation, and the use of deep neural networks for phenological classification. These findings are consistent with the work of [64], who applied digital image correlation for the assessment of root structures in agricultural soils. Furthermore, the influence of optical sensors on the early detection of crop growth in controlled environments was evaluated in [50].
In the context of precision agriculture, the use of drones equipped with multispectral sensors has enabled advances in detecting growth patterns and identifying nutritional deficiencies in crops. The application of LIDAR data has been demonstrated to improve vegetation characterization in forest plantations [48]. Additionally, the use of induced plasma spectroscopy has been highlighted for assessing metal uptake in plant tissues [47]. This method presents potential for adaptation in the nutritional analysis of commercial crops.
In terms of sectoral advances, significant developments in artificial vision and optics for agriculture are focused on greenhouse automation and crop production under controlled environments. Research such as [63] has proposed optical sensor-based systems for continuous measurement of plant growth, while fiber optic technologies for detecting physiological variations in large-scale crops were developed in [62].
The geographical distribution of the analyzed studies reveals that most publications originate from countries with a strong investment in agricultural research, including the United States, Japan, and China. However, a low participation of developing countries was identified, suggesting that technological and economic barriers hinder the implementation of these technologies in regions with limited infrastructure. Some studies have emphasized the importance of digitalization in emerging agriculture [65,67]. In addition, other authors have highlighted the role of artificial intelligence in optimizing agricultural models in emerging markets [61,68].
Table 2 summarizes the main applications investigated in the field of artificial vision and optical technologies for plant growth measurement. Each application is associated with the technological tools employed, the relevant application sectors, and the limitations identified based on the studies analyzed.
Based on the results obtained, Figure 13 presents the proposed flowchart for measuring plant growth using artificial vision and optical sensors. The model is structured as a set of interconnected processes that enable the capture, processing, and analysis of data in real time. In the first stage, data requirements are identified. These data may originate from hyperspectral sensors, RGB cameras, LIDAR systems, or fluorescence spectroscopy, and can provide information on biomass, plant morphology, and the nutritional status of the crop, enabling a highly accurate characterization of plant growth. In the second phase, the acquired data are processed using deep learning algorithms and image segmentation techniques: CNN-based models perform phenological classification of crops, while digital image correlation techniques detect growth patterns in parallel. Finally, in the analysis and prediction phase, the processed data are used to generate predictive models of plant development. These models contribute to the optimization of agricultural management and enable the early detection of anomalies or adverse conditions.
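The three stages of this flowchart can be sketched as composable functions. All names and the trivial thresholding stand-ins below are illustrative assumptions, not the pipeline of any reviewed study; they only make the data flow explicit:

```python
import numpy as np

def acquire(n_pixels=64, n_bands=10, seed=0):
    """Stage 1 (illustrative): stand-in for sensor capture, returning a
    matrix of spectral measurements, one row per pixel."""
    rng = np.random.default_rng(seed)
    return rng.random((n_pixels, n_bands))

def process(spectra):
    """Stage 2 (illustrative): segmentation stand-in that labels a pixel
    'plant' when its mean reflectance exceeds the scene average, in place
    of a trained CNN or digital-image-correlation step."""
    brightness = spectra.mean(axis=1)
    return brightness > brightness.mean()

def predict(mask):
    """Stage 3 (illustrative): derive a simple growth indicator (here,
    the vegetated fraction) that a forecasting model would consume."""
    return float(mask.mean())

growth_indicator = predict(process(acquire()))
```

Keeping the stages decoupled in this way mirrors the flowchart: each block can be swapped (e.g., a CNN for the threshold) without disturbing acquisition or prediction.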
Table 3 shows the main challenges associated with progress in the development of artificial vision and photonics systems applied to plant growth measurement. One of the main challenges in applying artificial vision and optics to plant growth measurement is the scalability of data processing. This imposes limitations on the capacity to analyze large volumes of information generated by hyperspectral sensors and high-resolution cameras. Research has shown that digital correlation algorithms can optimize image analysis in root structural assessment [64]. However, their large-scale implementation remains a challenge due to computational demands and massive data storage requirements. Additional studies highlight the need for efficient storage systems to manage three-dimensional images generated by optical sensors [66]. These studies suggest the adoption of hybrid artificial intelligence architectures to improve processing speed and efficiency [40].
Another relevant challenge is early disease detection. Optical and artificial vision technologies have shown significant advances in this area. However, they still face obstacles related to the accuracy and reliability of predictive models. The authors in [73] demonstrated the effectiveness of LIDAR data in detecting variations in leaf structure. Nevertheless, integration with deep learning models remains an area under active development. Similarly, the work presented in [19] developed an algorithm based on convolutional neural networks for disease identification in citrus. The study noted that variability in environmental conditions can affect model accuracy. This finding highlights the need for more robust approaches to data acquisition and analysis.

5.2. Main Findings and Limitations

One of the main constraints is the availability and quality of data used to train segmentation and prediction models. Data collection in agricultural environments presents significant challenges, including the variability of climatic conditions, the heterogeneity of crops, and the difficulty in obtaining accurately and homogeneously labeled datasets.
Another key limitation lies in the generalizability of machine learning models. Advanced algorithms have been developed for analyzing images and point clouds in agricultural contexts. However, their performance can decline when applied to conditions differing from those used during training. Environmental factors, including illumination, foliage density, and data noise, can reduce the accuracy of plant segmentation and classification.
Additionally, the implementation of these systems in real-world scenarios encounters operational and economic constraints. The adoption of advanced technologies in small- and medium-scale agricultural environments remains limited due to high hardware costs, the need for connectivity infrastructure, and a lack of technical expertise for operating these tools.
Moreover, the proposed technological solutions still fall short on scalability and long-term sustainability. Many current developments have been validated under controlled conditions, yet their performance in dynamic and highly variable environments remains an open challenge. Monitoring crops under extreme conditions, such as drought or elevated temperatures resulting from climate change, exemplifies this difficulty and highlights the need for more robust and adaptive approaches aimed at improving the resilience of agricultural systems to increasingly unpredictable scenarios.

5.3. Future Research Directions

A promising line of research involves optimizing and adapting segmentation models for use in diverse crops and agricultural environments. Achieving this objective requires the development of more robust deep learning models that incorporate transfer learning techniques and advanced neural networks to improve accuracy across multiple scenarios. Variability in plant morphology, along with differences in foliage density and distribution across crops, presents significant challenges for current models. Consequently, adapting these models to different field conditions could substantially enhance their applicability in precision agriculture.
On the other hand, a crucial aspect of current agricultural research involves analyzing plant behavior under extreme conditions, given the increasing impact of climate change. Drought, heat stress, soil salinity, and exposure to extreme weather events are factors that can affect crop physiology and development. The integration of artificial vision models and advanced sensors allows for the assessment of plant responses to these factors, enabling the identification of patterns of adaptation and vulnerability. This approach not only facilitates the prediction of crop performance under adverse scenarios but also contributes to the development of more resilient and sustainable agricultural management strategies.
A relevant line of work involves evaluating the performance of these models on low-power embedded platforms. The aim is to develop accessible and efficient solutions for decision-making in both agricultural and urban environments. To address real-time processing latency, which remains a significant challenge in current AI-based plant growth monitoring systems, future research should focus on implementing optimized inference engines tailored for embedded platforms. Integrating quantized neural networks (QNNs) or pruning techniques on edge devices, such as the NVIDIA Jetson Nano or Raspberry Pi, has demonstrated the capacity to reduce model complexity without a substantial loss in accuracy. This reduction consequently decreases inference time and power consumption [74,75].
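To make the quantization idea concrete, the following sketch implements uniform affine (asymmetric) post-training quantization of a weight tensor in plain NumPy. It is a didactic approximation of what deployment toolchains perform on devices such as the Jetson Nano, not a production recipe, and the mock weights are synthetic:

```python
import numpy as np

def quantize_affine(weights, num_bits=8):
    """Map a float weight tensor to unsigned integers with a shared
    scale and zero point, as in post-training affine quantization."""
    qmin, qmax = 0, 2**num_bits - 1
    w_min, w_max = float(weights.min()), float(weights.max())
    scale = (w_max - w_min) / (qmax - qmin) or 1.0   # guard all-equal case
    zero_point = int(round(qmin - w_min / scale))
    q = np.clip(np.round(weights / scale) + zero_point, qmin, qmax)
    return q.astype(np.uint8), scale, zero_point

def dequantize_affine(q, scale, zero_point):
    """Recover approximate float weights from the integer tensor."""
    return (q.astype(np.float64) - zero_point) * scale

rng = np.random.default_rng(42)
w = rng.normal(scale=0.1, size=(64, 64))       # mock layer weights
q, scale, zp = quantize_affine(w)
w_hat = dequantize_affine(q, scale, zp)
max_err = np.abs(w - w_hat).max()              # bounded by roughly the scale
```

Storing `q` instead of `w` cuts memory fourfold versus float32, and integer arithmetic on the quantized values is what yields the inference-time and power savings cited above.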
Moreover, overcoming the heterogeneity of environmental data, which complicates on-device model generalization, requires the exploration of adaptive learning techniques. Embedding feedback mechanisms to adjust parameters based on locally sensed environmental cues could enhance the responsiveness and contextual relevance of growth models. This gap remains underexplored in plant-oriented embedded AI applications and represents a critical step toward achieving robust, low-latency performance [4,76].
In addition, thermal management and energy efficiency should be explicitly incorporated into the design of embedded optical systems. Phosphor-based light sources doped with Eu3+ and Mn4+ exhibit high thermal stability and fulfill both illumination and sensing functions, thereby minimizing component redundancy [77,78]. These materials support the development of passive, low-power, and multifunctional platforms, making them suitable for integration into agricultural IoT devices.
Collaborative efforts should aim to develop standardized datasets and benchmarks for real-time plant growth assessment on low-power platforms. These efforts must include defining latency thresholds, power budgets, and accuracy trade-offs under diverse field conditions. Establishing such parameters would strengthen reproducibility and facilitate the deployment of solutions beyond experimental settings [46,77].
Recent developments in optical sensing combined with AI-driven models have demonstrated considerable improvements in the accuracy of plant monitoring. For instance, the integration of hyperspectral imaging with lightweight neural networks has enabled high-resolution classification of physiological traits, including chlorophyll content, particularly in precision crops like tea and maize. The study in [18] demonstrated the complementary strengths of synthetic aperture radar (SAR), which can penetrate cloud cover, and NDVI-derived data for robust, year-round monitoring. Similarly, the findings reported in [35] emphasized the advantages of integrating optical indices (e.g., PRI and NDVI) with growth modeling to assess the impact of soil amendments.
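For reference, the two optical indices mentioned are simple normalized band ratios. The sketch below computes them from reflectance values (the sample reflectances are illustrative, not from the cited studies):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / np.maximum(nir + red, 1e-12)

def pri(r531, r570):
    """Photochemical Reflectance Index: (R531 - R570) / (R531 + R570),
    computed from reflectance at 531 nm and 570 nm."""
    r531, r570 = np.asarray(r531, float), np.asarray(r570, float)
    return (r531 - r570) / np.maximum(r531 + r570, 1e-12)

# Healthy vegetation reflects strongly in the NIR and absorbs red light,
# so NDVI approaches 1; sparse or stressed canopies score lower.
dense = ndvi(0.60, 0.08)     # illustrative dense-canopy reflectances
sparse = ndvi(0.25, 0.20)    # illustrative sparse-canopy reflectances
```

Both indices operate per pixel, so they apply unchanged to whole multispectral rasters, which is how they are fused with growth models in the studies above.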
However, sensor-AI combinations are not without limitations. As demonstrated in [36], material degradation and signal instability under environmental stress can reduce accuracy in outdoor agricultural settings, particularly when using fluorescence spectroscopy. In contrast, the findings in [78] emphasized the durability and cost-effectiveness of perovskite-integrated photodetectors. These characteristics make them viable for long-term field deployment, especially when coupled with convolutional neural networks (CNNs).
Another important aspect relates to the latency and power consumption of AI-driven processing, which remains a critical barrier to real-time analysis. The study presented in [79] addresses this bottleneck by proposing optimization strategies based on pruning and quantization of AI models. Additionally, the authors in [72] demonstrate that decision-tree-based classifiers require substantially less processing time. This characteristic makes them suitable for edge computing in agricultural applications, where minimal energy consumption is essential.
The need for crop-specific sensor-AI customization has also been emphasized. For example, the analysis in [80] reveals that convolutional models tuned for spectral patterns in rice exhibit poor performance when applied to broadleaf crops unless retrained with domain-specific datasets. Similarly, the study in [81] found that incorporating temperature gradients enhances prediction accuracy. However, the complexity associated with thermal-optical data fusion imposes limitations on the real-time scalability in large-scale plantations.
Overall, while recent advancements show promise, consistent limitations remain. These include challenges related to data generalization, deployment in heterogeneous environments, and the interpretability of deep models. Future work should focus on explainable AI approaches that enable end-users, including agronomists, to validate model decisions based on interpretable features, as suggested in [71].

6. Conclusions

This review provides a comprehensive analysis of the current state of the art in optical and AI-based approaches for plant growth assessment. The study presents several key findings related to the research questions addressed, offering a valuable foundation for future directions in automatic plant growth monitoring, and highlights both the transformative potential of these technologies and the challenges that remain.
The integration of advanced optical sensors, including RGB and hyperspectral imaging, fluorescence spectroscopy, and LiDAR, with artificial intelligence techniques has significantly enhanced the capacity to monitor, model, and understand plant development. These systems deliver high spatial, spectral, and temporal resolution and support non-invasive, real-time, and scalable phenotyping, opening new possibilities for precision agriculture, plant breeding, and sustainable resource management.
The results of the review reveal a growing trend toward the convergence of artificial vision, sensor miniaturization, and data-driven models. Recent innovations in deep learning, image segmentation, and embedded sensing platforms have enabled increasingly accurate predictions of biomass, leaf area, and physiological stress, as well as early disease detection and automated phenological classification. Nevertheless, significant limitations remain, including restricted data availability and labeling, high environmental variability, and limited model generalization across diverse crop species and field conditions.
As future directions, research should prioritize the development of robust AI-based models capable of operating under varied environmental conditions and diverse plant morphologies. Additionally, there is a critical need to advance low-power sensing solutions suitable for deployment in resource-limited contexts. Finally, investigating plant responses to extreme environmental stress through integrated optical and AI-based systems will contribute to the development of more resilient and sustainable agricultural solutions.

Author Contributions

Conceptualization, J.B.-V., V.G.-P. and E.R.-V.; methodology, J.Z.-L., J.B.-V., V.G.-P. and E.R.-V.; software, J.Z.-L.; validation, J.Z.-L., J.B.-V., V.G.-P. and E.R.-V.; formal analysis, J.B.-V., V.G.-P., E.R.-V. and R.H.-G.; investigation, J.Z.-L., J.B.-V., V.G.-P. and E.R.-V.; resources, J.B.-V., E.R.-V. and R.H.-G.; data curation, J.Z.-L. and V.G.-P.; writing—original draft preparation, J.Z.-L., J.B.-V., V.G.-P. and E.R.-V.; writing—review and editing, J.B.-V., E.R.-V. and R.H.-G.; visualization, V.G.-P.; supervision, J.B.-V., E.R.-V. and R.H.-G.; project administration, J.B.-V. and E.R.-V.; funding acquisition, J.B.-V. and R.H.-G. All authors have read and agreed to the published version of the manuscript.

Funding

This work has been carried out under contract RC130-2024, corresponding to project code 109922, titled “Diversificación de fuentes de proteínas para uso alimentario mediante el empleo de terrazas de cultivo aeropónicas o hidropónicas, integradas con sistemas automatizados, inteligencia artificial y energía renovable para la creación de comunidades autosostenibles”, led by the “Grupo de Sistemas de Control y Robótica COL0123701” of the “Instituto Tecnológico Metropolitano de Medellín”.

Data Availability Statement

The data and materials that supported this bibliometric study are publicly available and can be accessed in the OSF repository: Zapata-Londoño, J.D., Botero-Valencia, J.S., García-Pineda, V., Vera, E.R., & Hernández-García, R. (19 June 2025). A Comprehensive Review of Optical and AI-based Approaches for Plant Growth Assessment. https://doi.org/10.17605/osf.io/g4hkq (accessed on 18 June 2025).

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
CNN	Convolutional Neural Network
DTI	Difference Texture Index
IoT	Internet of Things
LAI	Leaf Area Index
LiDAR	Light Detection and Ranging
PRISMA	Preferred Reporting Items for Systematic Reviews and Meta-Analyses
SVM	Support Vector Machines
ToF	Time-of-Flight
TCSPC	Time-Correlated Single-Photon Counting
UAVs	Unmanned Aerial Vehicles

References

  1. Morales-Guerra, J.; Suarez-Cortez, S.; Morales-Duran, J.; Reyes-Vera, E.; Botero-Valencia, J. SmartGrow DataControl: An IoT architecture for the acquisition of environmental physiological parameters in Cannabis sativa cultivations. SoftwareX 2024, 27, 101880. [Google Scholar] [CrossRef]
  2. Lai, X.; Ge, X.; Li, J.; Zhu, J.; Du, P. Regulating the luminescence properties of Eu2W3O12 red-emitting phosphor via rare-earth ions doping for optical thermometry, white light-emitting diode and plant growth applications. J. Lumin. 2024, 275, 120815. [Google Scholar] [CrossRef]
  3. Peng, Y.; Ma, X.; Wang, Y.; Li, M.; Gao, F.; Zhou, K.; Aemixay, V. Energy performance assessment of photovoltaic greenhouses in summer based on coupled optical-electrical-thermal models and plant growth requirements. Energy Convers. Manag. 2023, 287, 117086. [Google Scholar] [CrossRef]
  4. Chakradhar, S.P.; Krushna, B.R.; Sharma, S.; Tripathi, S.; Indhu, C.; Jaiganesh, I.; Manjunatha, K.; Wu, S.Y.; Das, B.; Nagabhushana, H. Novel red-emitting CDs@LaCaAl3O7:Eu3+ nanocomposites: A sustainable breakthrough for optical thermometry, indoor plant growth and intelligent security labels. Mater. Chem. Phys. 2025, 335, 130540. [Google Scholar] [CrossRef]
  5. Dandrifosse, S.; Bouvry, A.; Leemans, V.; Dumont, B.; Mercatoris, B. Imaging wheat canopy through stereo vision: Overcoming the challenges of the laboratory to field transition for morphological features extraction. Front. Plant Sci. 2020, 11, 96. [Google Scholar] [CrossRef] [PubMed]
  6. Sampaio, G.S.; Silva, L.A.; Marengoni, M. 3D Reconstruction of Non-Rigid Plants and Sensor Data Fusion for Agriculture Phenotyping. Sensors 2021, 21, 4115. [Google Scholar] [CrossRef] [PubMed]
  7. Song, P.; Li, Z.; Yang, M.; Shao, Y.; Pu, Z.; Yang, W.; Zhai, R. Dynamic detection of three-dimensional crop phenotypes based on a consumer-grade RGB-D camera. Front. Plant Sci. 2023, 14, 1097725. [Google Scholar] [CrossRef] [PubMed]
  8. Kim, W.S.; Lee, D.H.; Kim, Y.J.; Kim, T.; Lee, W.S.; Choi, C.H. Stereo-vision-based crop height estimation for agricultural robots. Comput. Electron. Agric. 2021, 181, 105937. [Google Scholar] [CrossRef]
  9. Zhu, Q.; Bai, M.; Yu, M. Maize Phenotypic Parameters Based on the Constrained Region Point Cloud Phenotyping Algorithm as a Developed Method. Agronomy 2024, 14, 2446. [Google Scholar] [CrossRef]
  10. Bao, Y.; Tang, L.; Srinivasan, S.; Schnable, P.S. Field-based architectural traits characterisation of maize plant using time-of-flight 3D imaging. Biosyst. Eng. 2019, 178, 86–101. [Google Scholar] [CrossRef]
  11. Wang, C.; Yang, S.; Zhu, P.; Zhang, L. Extraction of Winter Wheat Planting Plots with Complex Structures from Multispectral Remote Sensing Images Based on the Modified Segformer Model. Agronomy 2024, 14, 2433. [Google Scholar] [CrossRef]
  12. Franchetti, B.; Ntouskos, V.; Giuliani, P.; Herman, T.; Barnes, L.; Pirri, F. Vision based modeling of plants phenotyping in vertical farming under artificial lighting. Sensors 2019, 19, 4378. [Google Scholar] [CrossRef] [PubMed]
  13. Paulus, S. Measuring crops in 3D: Using geometry for plant phenotyping. Plant Methods 2019, 15, 103. [Google Scholar] [CrossRef] [PubMed]
  14. Akhtar, M.S.; Zafar, Z.; Nawaz, R.; Fraz, M.M. Unlocking plant secrets: A systematic review of 3D imaging in plant phenotyping techniques. Comput. Electron. Agric. 2024, 222, 109033. [Google Scholar] [CrossRef]
  15. Li, Y.; Li, C.; Cheng, Q.; Duan, F.; Zhai, W.; Li, Z.; Mao, B.; Ding, F.; Kuang, X.; Chen, Z. Estimating Maize Crop Height and Aboveground Biomass Using Multi-Source Unmanned Aerial Vehicle Remote Sensing and Optuna-Optimized Ensemble Learning Algorithms. Remote Sens. 2024, 16, 3176. [Google Scholar] [CrossRef]
  16. Sun, X.; Yang, Z.; Su, P.; Wei, K.; Wang, Z.; Yang, C.; Wang, C.; Qin, M.; Xiao, L.; Yang, W.; et al. Non-destructive monitoring of maize LAI by fusing UAV spectral and textural features. Front. Plant Sci. 2023, 14, 1158837. [Google Scholar] [CrossRef] [PubMed]
  17. Wang, W.; Xie, J.; Zhong, Y.; Su, T.; Li, X.; Pan, Y.; Wei, X.; Li, Y. BaGa12O19: Cr3+/Mn2+ phosphors for optical thermometry and plant growth lighting applications. J. Photochem. Photobiol. A Chem. 2025, 462, 116223. [Google Scholar] [CrossRef]
  18. Mitra, M.; Haldar, D.; Patel, N. Understanding tea plantation growth stages and cultural operations by synergy of optical and Synthetic Aperture Radar (SAR) indices. J. Spat. Sci. 2024, 1–12. [Google Scholar] [CrossRef]
  19. Varas, S.; Rodríguez, J.; Santos Gonzales, C. Development of an artificial vision algorithm to detect the Huanglonbing disease in the citrus lemon plant of the “Fundo Amada”. In Proceedings of the 22nd LACCEI International Multi-Conference for Engineering, Education and Technology (LACCEI 2024), San Jose, Costa Rica, 17–19 July 2024. [Google Scholar] [CrossRef]
  20. Gonzalez, R.C. Digital Image Processing; Springer: Berlin/Heidelberg, Germany, 2009. [Google Scholar]
  21. Sun, D.W. Hyperspectral Imaging for Food Quality Analysis and Control; Elsevier: Amsterdam, The Netherlands, 2010. [Google Scholar]
  22. Lu, B.; Dao, P.D.; Liu, J.; He, Y.; Shang, J. Recent advances of hyperspectral imaging technology and applications in agriculture. Remote Sens. 2020, 12, 2659. [Google Scholar] [CrossRef]
  23. Ram, B.G.; Oduor, P.; Igathinathane, C.; Howatt, K.; Sun, X. A systematic review of hyperspectral imaging in precision agriculture: Analysis of its current state and future prospects. Comput. Electron. Agric. 2024, 222, 109037. [Google Scholar] [CrossRef]
  24. Guo, Q.; Wu, F.; Pang, S.; Zhao, X.; Chen, L.; Liu, J.; Xue, B.; Xu, G.; Li, L.; Jing, H.; et al. Crop 3D—A LiDAR based platform for 3D high-throughput crop phenotyping. Sci. China Life Sci. 2018, 61, 328–339. [Google Scholar] [CrossRef] [PubMed]
  25. Forero, M.G.; Murcia, H.F.; Méndez, D.; Betancourt-Lozano, J. LiDAR platform for acquisition of 3D plant phenotyping database. Plants 2022, 11, 2199. [Google Scholar] [CrossRef] [PubMed]
  26. Ruigrok, T.; van Henten, E.J.; Kootstra, G. Stereo Vision for Plant Detection in Dense Scenes. Sensors 2024, 24, 1942. [Google Scholar] [CrossRef] [PubMed]
  27. Albani, J.R. Principles and Applications of Fluorescence Spectroscopy; John Wiley & Sons: Hoboken, NJ, USA, 2008. [Google Scholar]
  28. Zacharioudaki, D.E.; Fitilis, I.; Kotti, M. Review of fluorescence spectroscopy in environmental quality applications. Molecules 2022, 27, 4801. [Google Scholar] [CrossRef] [PubMed]
  29. Belasque Jr, J.; Gasparoto, M.; Marcassa, L.G. Detection of mechanical and disease stresses in citrus plants by fluorescence spectroscopy. Appl. Opt. 2008, 47, 1922–1926. [Google Scholar] [CrossRef] [PubMed]
  30. Xia, Q.; Tang, H.; Fu, L.; Tan, J.; Govindjee, G.; Guo, Y. Determination of Fv/Fm from chlorophyll a fluorescence without dark adaptation by an LSSVM model. Plant Phenomics 2023, 5, 0034. [Google Scholar] [CrossRef] [PubMed]
  31. Arya, S.; Sahoo, R.N.; Sehgal, V.; Bandyopadhyay, K.; Rejith, R.; Chinnusamy, V.; Kumar, S.; Kumar, S.; Manjaiah, K. High-throughput chlorophyll fluorescence image-based phenotyping for water deficit stress tolerance in wheat. Plant Physiol. Rep. 2024, 29, 278–293. [Google Scholar] [CrossRef]
  32. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef] [PubMed]
  33. Zapata-Londoño, J.D.; Botero-Valencia, J.S.; García-Pineda, V.; Vera, E.R.; Hernández-García, R. A Comprehensive Review of Optical and AI-based Approaches for Plant Growth Assessment. 2025. Available online: https://doi.org/10.17605/osf.io/g4hkq (accessed on 18 July 2025).
  34. Mongeon, P.; Paul-Hus, A. The journal coverage of Web of Science and Scopus: A comparative analysis. Scientometrics 2016, 106, 213–228. [Google Scholar] [CrossRef]
  35. Oyege, I.; Balaji Bhaskar, M.S. Evaluation of vermicompost and vermicompost tea application on corn (Zea mays) growth and physiology using optical plant sensors. J. Plant Nutr. 2025, 48, 1275–1293. [Google Scholar] [CrossRef]
  36. El-naggar, A.M.; Heiba, Z.K.; Kamal, A.M.; Mohamed, M.B. Adaptation and development in the optical and fluorescence features of PMMA-ZnMoO4 nanocomposites films. J. Mater. Sci. Mater. Electron. 2025, 36, 1006–1022. [Google Scholar] [CrossRef]
  37. Walter, A.; Scharr, H.; Gilmer, F.; Zierer, R.; Nagel, K.A.; Ernst, M.; Wiese, A.; Virnich, O.; Christ, M.M.; Uhlig, B.; et al. Dynamics of seedling growth acclimation towards altered light conditions can be quantified via GROWSCREEN: A setup and procedure designed for rapid optical phenotyping of different plant species. New Phytol. 2007, 174, 447–455. [Google Scholar] [CrossRef] [PubMed]
  38. Wang, M.; Han, Z.; Huang, J.; Liao, J.; Sun, Y.; Huang, H.; Wen, H.R. NaLaMgWO6:Mn4+/Pr3+/Bi3+ bifunctional phosphors for optical thermometer and plant growth illumination matching phytochrome PR and PFR. Spectrochim. Acta Part A Mol. Biomol. Spectrosc. 2021, 259, 119915. [Google Scholar] [CrossRef] [PubMed]
  39. MANDOLI, D.F.; BRIGGS, W.R. Fiber-optic plant tissues: Spectral dependence in dark-grown and green tissues. Photochem. Photobiol. 1984, 39, 419–424. [Google Scholar] [CrossRef]
  40. Barron, J.; Liptay, A. Measuring 3-D plant growth using optical flow. Bioimaging 1997, 5, 82–86. [Google Scholar] [CrossRef]
  41. Murabayashi, A.; Masuko, M.; Niikawa, M.; Shirane, N.; Furuta, T.; Hayashi, Y.; Makisumi, Y. Antifungal and plant growth inhibitory activities of stereo and optical isomers of 2-triazolylcycloalkanol derivatives. J. Pestic. Sci. 1991, 16, 419–427. [Google Scholar] [CrossRef]
  42. Barron, J.; Liptay, A. Optical flow to measure minute increments in plant growth. Bioimaging 1994, 2, 57–61. [Google Scholar] [CrossRef]
  43. Han, B.; Zhu, J.; Chu, C.; Yang, X.; Wang, Y.; Li, K.; Hou, Y.; Li, K.; Copner, N.; Teng, P. Sm3+-Mn4+ activated Sr2GdTaO6 red phosphor for plant growth lighting and optical temperature sensing. Sens. Actuators A Phys. 2023, 349, 114089. [Google Scholar] [CrossRef]
  44. Aoyagi, H.; Jitsufuchi, T.; Tanaka, H. Development of an optical method for monitoring protoplast formation from cultured plant cells. J. Ferment. Bioeng. 1993, 75, 201–206. [Google Scholar] [CrossRef]
  45. Kato, K.; Tanaka, S.; Fujii, S.; Katayama, M.; Kimoto, H. Preparation of optically active trifluoromethylated (3-indolyl) thiacarboxylic acids, novel plant growth regulators, through lipase-catalyzed enantioselective hydrolysis. J. Biosci. Bioeng. 1999, 87, 76–81. [Google Scholar] [CrossRef] [PubMed]
  46. Mi, R.; Liu, Y.g.; Mei, L.; Huang, Z.; Fang, M.; Wu, X.; Min, X. Multi-site occupancies and dependent photoluminescence of Ca9Mg1.5(PO4)7:Eu2+ phosphors: A bifunctional platform for optical thermometer and plant growth lighting. J. Rare Earths 2023, 41, 1503–1511. [Google Scholar] [CrossRef]
  47. Rofkar, J.R.; Dwyer, D.F.; Frantz, J.M. Analysis of arsenic uptake by plant species selected for growth in Northwest Ohio by inductively coupled plasma–optical emission spectroscopy. Commun. Soil Sci. Plant Anal. 2007, 38, 2505–2517. [Google Scholar] [CrossRef]
  48. Zhou, J.; Proisy, C.; Couteron, P.; Descombes, X.; Zerubia, J.; le Maire, G.; Nouvellon, Y. Tree crown detection in high resolution optical images during the early growth stages of eucalyptus plantations in Brazil. In Proceedings of the The First Asian Conference on Pattern Recognition, Beijing, China, 28 November 2011; pp. 623–627. [Google Scholar]
  49. Kato, K.; Katayama, M.; Fujii, S.; Kimoto, H. Effective preparation of optically active 4,4,4-trifluoro-3-(indole-3-)butyric acid, a novel plant growth regulator, using lipase from Pseudomonas fluorescens. J. Ferment. Bioeng. 1996, 82, 355–360. [Google Scholar] [CrossRef]
  50. Tikhomirov, A.; Sidko, F.Y. Optical characteristics of individual plant elements and plant canopies grown under radiation regimes of different spectral composition and intensity. Appl. Opt. 1983, 22, 2874–2881. [Google Scholar] [CrossRef] [PubMed]
  51. Aoyagi, H.; Takayanagi, T.; Jitsufuchi, T.; Tanaka, H. Development of an apparatus for monitoring protoplast isolation from plant tissues based on both dielectric and optical methods. J. Biosci. Bioeng. 1999, 87, 762–768. [Google Scholar] [CrossRef] [PubMed]
  52. Liew, O.W.; Chen, J.W.; Asundi, A.K. Development of fiber optic spectroscopy for in-vitro and in-planta detection of fluorescent proteins. In Advanced Photonic Sensors and Applications II; SPIE: Bellingham, WA, USA, 2001; Volume 4596, pp. 208–218. [Google Scholar]
  53. Halmann, M. Synthetic plant growth regulators. Adv. Agron. 1990, 43, 47–105. [Google Scholar]
  54. Tabacco, M.; Zhou, Q.; DiGiuseppe, T. Optical sensors for monitoring and control of plant growth systems. Adv. Space Res. 1994, 14, 223–226. [Google Scholar] [CrossRef] [PubMed]
  55. Bassini, A.; Musazzi, S.; Paganini, E.; Perini, U.; Ferri, F. Self-aligning optical particle sizer for the monitoring of particle growth processes in industrial plants. Rev. Sci. Instrum. 1998, 69, 2484–2494. [Google Scholar] [CrossRef]
  56. Ausanka, K.; Asiabanpour, B. Evaluation of a passive optical fiber daylighting system for plant growth. In Proceedings of the 2019 IEEE Texas Power and Energy Conference (TPEC), College Station, TX, USA, 7–8 February 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1–6. [Google Scholar]
  57. Leal, M.; Toranzo, A.; Lopez, Y.; Vargas, A. Smart agriculture: An alternative with more efficiency and quality. Ser. Científica Fac. Cienc. Técnicas 2024, 17, 86–98. [Google Scholar]
  58. Shimizu, H.; Heins, R. Computer-vision-based system for plant growth analysis. Trans. ASAE 1995, 38, 959–964. [Google Scholar] [CrossRef]
  59. Rabhi, L.; Jabir, B.; Falih, N.; Afraites, L.; Bouikhalene, B. Digital transformation metamodel in smart farming: Crop classification prediction based on recurrent neural network. Foods Raw Mater. 2025, 13, 107–118. [Google Scholar] [CrossRef]
  60. Bernotas, G.; Scorza, L.C.; Hansen, M.F.; Hales, I.J.; Halliday, K.J.; Smith, L.N.; Smith, M.L.; McCormick, A.J. A photometric stereo-based 3D imaging system using computer vision and deep learning for tracking plant growth. GigaScience 2019, 8, giz056. [Google Scholar] [CrossRef] [PubMed]
  61. Li, X.; Liu, Z.; Lin, H.; Wang, G.; Sun, H.; Long, J.; Zhang, M. Estimating the growing stem volume of Chinese pine and larch plantations based on fused optical data using an improved variable screening method and stacking algorithm. Remote Sens. 2020, 12, 871. [Google Scholar] [CrossRef]
  62. Pilipova, V.; Davydov, V.; Rud, V. Development of a fiber-optic system for testing instruments for monitoring nuclear power plants. J. Phys. Conf. Ser. 2021, 2086, 012160. [Google Scholar] [CrossRef]
  63. Mijares, N.; González, E.; Alaniz, D.; Olvera, E.; Ivanov, R.; De la Rosa, I. Growth Sphere for Optical Measurements in Plants. In Proceedings of the 2020 International Conference on Mechatronics, Electronics and Automotive Engineering (ICMEAE), Cuernavaca, Mexico, 16–21 November 2020; pp. 131–136. [Google Scholar]
  64. Wu, W.; Zhao, J. 2D Kinematic Quantification of Soil Particles around Growing Plant Root based on Optical Mechanics. Am. J. Biochem. Biotechnol. 2020, 16, 494–506. [Google Scholar] [CrossRef]
  65. Dutta, P.K.; Mitra, S. Application of agricultural drones and IoT to understand food supply chain during post COVID-19. In Agricultural Informatics: Automation Using the IoT and Machine Learning; Wiley: Hoboken, NJ, USA, 2021; pp. 67–87. [Google Scholar]
  66. Kawata, Y.; Koizumi, K.; Hashimoto, M.; Sakaya, H. 3D Modeling of Kanazawa City Center from Airborne LiDAR Data. In Proceedings of the 34th Asian Conference on Remote Sensing (ACRS 2013), Bali, Indonesia, 20–24 October 2013; pp. 1372–1379. [Google Scholar]
  67. Saha, J.K.; Dutta, A. A review of graphene: Material synthesis from biomass sources. Waste and Biomass Valorization; Springer: Berlin/Heidelberg, Germany, 2022; pp. 1–45. [Google Scholar]
  68. León León, R.A.; Nicolle, G.M.N.; Tirado Palacios, E.T. Development an artificial vision algorithm to detect the Alternaria Alternata disease in the citrus limon plant of the “Fundo Amada”. In Proceedings of the 22nd LACCEI International Multi-Conference for Engineering, Education, and Technology, LACCEI, San Jose, Costa Rica, 17–19 July 2024. [Google Scholar]
  69. Zheng, C.; Abd-Elrahman, A.; Whitaker, V. Remote sensing and machine learning in crop phenotyping and management, with an emphasis on applications in strawberry farming. Remote Sens. 2021, 13, 531. [Google Scholar] [CrossRef]
  70. Botero-Valencia, J.; Valencia-Aguirre, J.; Gonzalez-Montoya, D.; Ramos-Paja, C. A low-cost system for real-time measuring of the sunlight incident angle using IoT. HardwareX 2022, 11, e00272. [Google Scholar] [CrossRef] [PubMed]
  71. Zhao, X.; Ren, L.; Zhang, Y.; Liu, R.; Wang, X. Multi-modal sensing fusion for AI-based phenotyping: Challenges and future directions. Mater. Lett. 2025, 355, 134022. [Google Scholar]
  72. Yu, H.; Wang, K.; Xie, L.; Zhang, P.; Hu, Y. Sustainable irrigation control using embedded AI and low-cost optical sensors. J. Mater. Sci. Mater. Electron. 2024, 35, 14734–14749. [Google Scholar]
  73. Jo, H.; Kim, E. New Monte Carlo localization using deep initialization: A three-dimensional LiDAR and a camera fusion approach. IEEE Access 2020, 8, 74485–74496. [Google Scholar] [CrossRef]
  74. Yang, T.; Jay, S.; Gao, Y.; Liu, S.; Baret, F. The balance between spectral and spatial information to estimate straw cereal plant density at early growth stages from optical sensors. Comput. Electron. Agric. 2023, 215, 108458. [Google Scholar] [CrossRef]
  75. Ashooriyan, P.; Mohammadi, M.; Darzi, G.N.; Nikzad, M. Development of Plantago ovata seed mucilage and xanthan gum-based edible coating with prominent optical and barrier properties. Int. J. Biol. Macromol. 2023, 248, 125938. [Google Scholar] [CrossRef] [PubMed]
  76. Yang, C.; Long, J.; Li, B.; Ma, R.; Cao, B.; Deng, C.; Huang, W. Highly thermally stable CaLaMgSbO6:Sm3+ double perovskite phosphors for optical thermometer and plant growth. J. Alloys Compd. 2025, 1010, 177035. [Google Scholar] [CrossRef]
  77. Wang, F.; Chen, H. Optical properties of novel deep-red phosphor LaMg3SbO7:Mn4+ for temperature sensing and plant growth illumination. J. Alloy. Compd. 2025, 1020, 179582. [Google Scholar] [CrossRef]
  78. Wang, Z.; Zhou, R.; Liu, N.; Kong, J.; Wang, Y. Remarkable thermal stability of Ca9MgK(VO4)7:Sm3+ phosphor for sensitive optical thermometry and plant-growth lighting. Mater. Res. Bull. 2026, 193, 113647. [Google Scholar] [CrossRef]
  79. Liu, Q.; Li, X.; Zhang, J.; Sun, W.; Feng, L. Real-time implementation of machine learning-based weed detection using hyperspectral sensors. J. Environ. Chem. Eng. 2025, 13, 109621. [Google Scholar]
  80. Zhou, J.; He, Q.; Wu, S.; Zhang, L. Characterization of rice crop stages using deep learning with UAV-derived multispectral data. Opt. Laser Technol. 2024, 171, 110132. [Google Scholar]
  81. Martínez, D.; Valero, C.; Pastor, G.; Gómez, J. Optical-thermal fusion models for biomass prediction in grapevines. Biosyst. Eng. 2025, 230, 34–45. [Google Scholar] [CrossRef]
Figure 1. General flow of the capture process of RGB optical sensors.
Figure 2. Main components of a hyperspectral sensor.
Figure 3. Capture process of TOF cameras.
Figure 4. Stereo vision system based on the triangulation principle.
Figure 5. (a) Simplified Perrin-Jablonski diagram illustrating the main photophysical processes: vertical absorption from S0 to S2 (solid arrow), vibrational relaxation (solid arrow), followed by radiative emission pathways—fluorescence (dotted arrow from S1 to S0). (b) Example excitation (absorption) and emission spectra of a fluorophore, depicted as near-mirror images to emphasize the Stokes shift between the peaks λex (excitation wavelength) and λem (emission wavelength).
Figure 6. PRISMA flow diagram. Own creation based on data extracted from Scopus and Web of Science.
Figure 7. Evolution of publications per year.
Figure 8. Geographical distribution of publications.
Figure 9. Relevant characteristics associated with the measurement of plant growth from artificial vision and optics.
Figure 10. Main techniques (inner) and contributions (outer) associated with data and image processing in plant growth.
Figure 11. Major contributions of artificial vision and optics to plant growth measurement.
Figure 12. Main sectors with advances in the development of optical artificial vision for the measurement of plant growth.
Figure 13. Plant growth measurement model based on the use of artificial vision and optics.
Table 1. Comprehensive overview of state-of-the-art articles.
Work Title | Country | Methodology | Applications | Key Factors Analyzed
Dynamics of seedling growth acclimation towards altered light conditions can be quantified via GROWSCREEN: A setup and procedure designed for rapid optical phenotyping of different plant species [37] | Germany | Image analysis and color segmentation to assess seedling growth; color-based segmentation using HSV to distinguish plants from background | Rapid phenotyping of seedling growth under different light conditions | Optical phenotyping, Plant growth, Light conditions, Nutrients
NaLaMgWO6: Mn4+/Pr3+/Bi3+ bifunctional phosphors for optical thermometer and plant growth illumination matching phytochrome PR and PFR [38] | China | Synthesis by sol-gel method and spectroscopic analysis | Plant growth illumination and optical thermometry | Photoluminescence, Plant growth illumination, Rare-earth phosphors, Optical sensors
Fiber optic plant tissues: spectral dependence in dark-grown and green tissues [39] | United Kingdom | Optical experiments and spectrophotometry on etiolated and green plant tissues | Analysis of light transmission in plant tissues to understand phytochrome-mediated physiological responses | Optical properties, Spectrophotometry, Photobiology, Phytochemistry
Measuring 3D plant growth using optical flow [40] | Canada | Optical flow analysis in three-dimensional images | Non-contact 3D measurement of plant growth | Artificial vision, Plant growth, 3D analysis, Optical flow
Antifungal and Plant Growth Inhibitory Activities of Stereo and Optical Isomers of 2-Triazolylcycloalkanol Derivatives [41] | Japan | Comparison of stereochemical and optical isomers in growth inhibition assays | Evaluation of antifungal and growth-regulating activity in plants | Antifungal activity, Growth regulation, Optical isomerism, Triazolylcycloalkanol
Optical flow to measure minute increments in plant growth [42] | Canada | Optical flow analysis in imaging sequences for growth measurement | Accurate non-contact seedling growth measurement | Artificial vision, Plant growth, Image analysis, Optical flow
Sm3+-Mn4+ activated Sr2GdTaO6 red phosphor for plant growth lighting and optical temperature sensing [43] | China | Rare-earth phosphor synthesis and spectroscopic analysis | Illumination for plant growth and optical temperature sensing | Photoluminescence, Plant growth illumination, Optical thermometry, Rare-earth phosphors
Development of an optical method for monitoring protoplast formation from cultured plant cells [44] | Japan | Optical spectrophotometry for optical density monitoring in protoplast formation | Monitoring of cellular processes in plant biotechnology | Plant biotechnology, Protoplasts, Optical monitoring, Enzyme digestion
Preparation of optically active trifluoromethylated (3’-indolyl) thiacarboxylic acids, novel plant growth regulators, through lipase-catalyzed enantioselective hydrolysis [45] | Japan | Enantioselective enzymatic hydrolysis of fluorinated carboxylic acids | Plant growth regulators | Growth regulation, Chemical synthesis, Enantioselectivity, Fluorinated carboxylic acids
Multi-site occupancies and dependent photoluminescence of Ca9Mg1.5(PO4)7:Eu2+ phosphors: A bifunctional platform for optical thermometer and plant growth lighting [46] | China | Luminescent materials synthesis and spectroscopic analysis; Eu2+-doped phosphors for optical thermometry and plant growth lighting | Illumination for plant growth and optical thermometry | Photoluminescence, Optical thermometry, Plant growth illumination, Rare-earth phosphors
Analysis of arsenic uptake by plant species selected for growth in northwest Ohio by inductively coupled plasma-optical emission spectroscopy [47] | USA | Inductively coupled plasma-optical emission spectroscopy (ICP-OES) for the measurement of arsenic in plant tissues | Evaluation of arsenic phytoremediation in native Ohio plant species | Phytoremediation, Arsenic accumulation, Optical spectroscopy, Native species
Tree crown detection in high resolution optical images during the early growth stages of Eucalyptus plantations in Brazil [48] | Brazil | High resolution optical image analysis; multi-date tree crown detection using marked point process modeling | Tree canopy detection in growing eucalyptus plantations | Artificial vision, Optics, Plant growth
Effective preparation of optically active 4,4,4-trifluoro-3-(indole-3-)butyric acid, a novel plant growth regulator, using lipase from Pseudomonas fluorescens [49] | USA | Inductively coupled plasma optical emission spectroscopy | Phytoremediation of arsenic in plant species | Optics, Plant growth, Phytoremediation
Optical characteristics of individual plant elements and plant canopies grown under radiation regimes of different spectral composition and intensity [50] | USSR | Spectrophotometric reflectance and absorption measurements | Analysis of the impact of different radiation regimes on photosynthesis | Plant optics, Radiation regimes, Spectral analysis, Photosynthesis
Development of an apparatus for monitoring protoplast isolation from plant tissues based on both dielectric and optical methods [51] | Japan | Optical and dielectric method for protoplast isolation monitoring | Isolation and monitoring of protoplasts in cultures | Optics, Biotechnology, Plant growth
Development of fiber optic spectroscopy for in vitro and in planta detection of fluorescent proteins [52] | Singapore | Fiber optic spectroscopy for fluorescent protein detection | Protein monitoring in genetically engineered plants | Optics, Spectroscopy, Transgenic crops
Synthetic plant growth regulators [53] | Israel | Analysis of synthetic plant growth regulators | Optimization of crop growth by chemical regulators | Plant growth, Plant growth regulators, Agricultural chemistry
Optical sensors for monitoring and control of plant growth systems [54] | USA | Optical sensors with optical fibers and porous polymers | Nutrient and contaminant monitoring in plant growth systems | Optics, Crop monitoring, Plant growth
Self-aligning optical particle sizer for the monitoring of particle growth processes in industrial plants [55] | Italy | Optical particle measurement system; diffraction- and extinction-based particle sizing for process monitoring | Particulate growth monitoring in industrial plants | Optics, Particle measurement, Industry
Evaluation of a passive optical fiber daylighting system for plant growth [56] | USA | Passive fiber optic lighting system | Cultivation of plants in closed environments | Optics, Illumination, Plant growth
Smart agriculture: an alternative with more efficiency and quality [57] | Cuba | Intelligent farming system with sensors and machine learning | Resource optimization and agricultural production | Artificial vision, Automation, Plant growth
Computer-vision-based system for plant growth analysis [58] | USA | Computer vision and automatic analysis; infrared-based 3D computer vision for plant growth analysis | Computer vision analysis of plant growth | Artificial vision, Plant growth, Automated analysis
Digital transformation metamodel in smart farming: Crop classification prediction based on recurrent neural network [59] | Morocco | Recurrent neural networks for crop classification | Crop prediction and classification in digital agriculture | Digital agriculture, Neural networks, Crop classification, Business modeling
A photometric stereo-based 3D imaging system using computer vision and deep learning for tracking plant growth [60] | China | Photometric stereo and deep learning | Three-dimensional plant growth monitoring | Three-dimensional imaging, Deep learning, Artificial vision, Plant growth
Estimating the Growing Stem Volume of Chinese Pine and Larch Plantations based on Fused Optical Data Using an Improved Variable Screening Method and Stacking Algorithm [61] | Russia | Remote sensing and machine learning algorithms | Estimation of stem growth volume in forest plantations | Remote sensing, Machine learning, Forestry management
Regulating the luminescence properties of Eu2W3O12 red-emitting phosphor via rare-earth ions doping for optical thermometry [2]ChinaLuminescence and doping of rare earth ionsOptical thermometry and plant growth enhancementLuminescence, Optical thermometry, Plant growth, Rare-earth ions
Development of a fiber-optic system for testing instruments for monitoring nuclear power plants [62]RussiaFiber optics and pulsed laser radiationTesting of instruments for monitoring nuclear power plantsFiber optics, Laser radiation, Instrumentation, Nuclear monitoring
Growth Sphere for Optical Measurements in Plants [63]MexicoOptical measurement techniquesMeasuring plant growth with optical technologiesPlant growth, Optical measurements, Growth analysis
Development of an artificial vision algorithm to detect the Huanglonbing (HLB) disease in the citrus lemon plant of the “Fundo Amada” [19]PeruConvolutional neural networksDetection of citrus diseases by artificial visionartificial vision, Disease detection, Neural networks, Citrus plants
2D kinematic quantification of soil particles around growing plant root based on optical mechanics [64]ChinaDigital image correlation for ground deformation measurementAnalysis of soil deformation around roots using optical mechanicsSoil deformation, Root interaction, Digital image correlation, Kinematic analysis
Table 2. Main applications associated with the measurement of plant growth through the use of artificial and optical vision techniques.
| Applications | Vision AI Tools | Optical Tools | AI Techniques |
|---|---|---|---|
| Real-time Crop Monitoring [54] | Multi-spectral image classification | Near-infrared spectroscopy | Support Vector Machines (SVM) |
| Leaf Area Estimation [15] | Hyperspectral image processing | Multispectral cameras | Random Forest (RF) |
| Root Structure Analysis [69] | LIDAR-based root mapping | Laser scanning systems | Reinforcement Learning |
| Early Disease Detection [19] | CNN for disease identification | UV-induced fluorescence | Neural Networks |
| Photosynthesis Efficiency Assessment [60] | AI-driven chlorophyll fluorescence tracking | Passive optical fiber daylighting | Principal Component Analysis (PCA) |
| Soil Moisture Estimation [64] | Remote sensing image fusion | Thermal imaging | Clustering Algorithms |
| Automated Phenotyping [50] | AI-driven phenotypic trait analysis | Structured lighting systems | Self-learning AI models |
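As a concrete illustration of the optical preprocessing that typically precedes the classifiers listed above (SVM, Random Forest, clustering), the sketch below computes the normalized difference vegetation index, NDVI = (NIR − Red)/(NIR + Red), from per-pixel red and near-infrared reflectances. The reflectance values and the 0.4 canopy threshold are invented for the example, not taken from any of the cited studies.

```python
import numpy as np

# Synthetic per-pixel reflectances (values invented for illustration):
red = np.array([0.08, 0.25, 0.10, 0.30])  # red band
nir = np.array([0.60, 0.32, 0.55, 0.28])  # near-infrared band

# NDVI highlights photosynthetically active canopy, which reflects
# strongly in NIR and absorbs in red.
ndvi = (nir - red) / (nir + red)

# Hypothetical threshold separating canopy from soil/background pixels.
is_canopy = ndvi > 0.4

print(np.round(ndvi, 2))  # → [ 0.76  0.12  0.69 -0.03]
print(is_canopy)          # → [ True False  True False]
```

In practice, index maps like this one (or the raw multispectral bands) form the feature vectors fed to the AI techniques in the rightmost column of the table.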
Table 3. Main challenges associated with plant biology and horticulture from the use of artificial vision and optics.
| Challenge | Description | Sector |
|---|---|---|
| Real-time plant monitoring [1,70] | Ensuring continuous, real-time plant growth monitoring | Smart agriculture |
| High-resolution imaging costs [8] | Reducing the cost of high-resolution imaging technologies | Precision horticulture |
| Data processing scalability [48,69] | Managing large-scale plant imaging and processing efficiently | Data science in agriculture |
| Multi-sensor data fusion [71] | Integrating data from multiple imaging and sensor sources | Remote sensing |
| Light spectrum optimization [2] | Optimizing light conditions for maximum plant growth | Greenhouse automation |
| Disease early detection [19] | Detecting diseases at early stages for proactive treatment | Crop protection |
| Automated phenotyping [69] | Automating the collection of plant traits using AI | Plant breeding |
| Plant-water interaction analysis [72] | Analyzing how plants interact with water at a micro-level | Water resource management |
| Environmental variability impact [3] | Understanding how environmental changes affect plant growth | Climate adaptation studies |
| Non-invasive nutrient assessment [35] | Developing non-invasive methods for analyzing plant nutrients | Soil and nutrient management |
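The multi-sensor data fusion challenge listed above stems largely from sensors sampling asynchronously. A minimal sketch of one common remedy, nearest-neighbour temporal alignment before combining streams, is shown below; the two streams (a camera-derived greenness score and a soil-moisture probe) and all their values are hypothetical.

```python
import numpy as np

# Two asynchronous streams with their own timestamps (seconds; invented data):
t_cam = np.array([0.0, 10.0, 20.0, 30.0])       # camera frame times
greenness = np.array([0.31, 0.35, 0.40, 0.42])  # per-frame greenness score

t_probe = np.array([2.0, 9.0, 21.0, 33.0])      # probe sample times
moisture = np.array([0.22, 0.21, 0.19, 0.18])   # volumetric soil moisture

# For each camera frame, pick the probe reading closest in time
# (broadcasting builds a |t_probe| x |t_cam| matrix of time gaps).
idx = np.abs(t_probe[:, None] - t_cam[None, :]).argmin(axis=0)

# Fused record: one row per frame -> (time, greenness, matched moisture).
fused = np.column_stack([t_cam, greenness, moisture[idx]])
print(fused)
```

More robust pipelines interpolate or filter (e.g., with a Kalman filter) rather than taking the nearest sample, but the alignment step itself is the prerequisite either way.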
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Zapata-Londoño, J.; Botero-Valencia, J.; García-Pineda, V.; Reyes-Vera, E.; Hernández-García, R. A Comprehensive Review of Optical and AI-Based Approaches for Plant Growth Assessment. Agronomy 2025, 15, 1781. https://doi.org/10.3390/agronomy15081781

