Communication

Fluorescence Mapping of Agricultural Fields Utilizing Drone-Based LIDAR

by Vasily N. Lednev 1,*, Mikhail Ya. Grishin 1, Pavel A. Sdvizhenskii 1, Rashid K. Kurbanov 2, Maksim A. Litvinov 2, Sergey V. Gudkov 1 and Sergey M. Pershin 1

1 Prokhorov General Physics Institute of the Russian Academy of Sciences, 38 Vavilova Street, 119991 Moscow, Russia
2 FSBSI “Federal Scientific Agroengineering Center VIM”, 109428 Moscow, Russia
* Author to whom correspondence should be addressed.
Photonics 2022, 9(12), 963; https://doi.org/10.3390/photonics9120963
Submission received: 16 November 2022 / Revised: 6 December 2022 / Accepted: 8 December 2022 / Published: 10 December 2022
(This article belongs to the Topic Advances in Optical Sensors)

Abstract: A compact, lightweight LIDAR instrument has been developed for laser-induced fluorescence sensing of maize fields. The fluorescence LIDAR had to be installed on a small industrial drone, so its mass had to be below 2 kg and its power consumption below 5 W. The instrument uses a continuous-wave diode laser (405 nm, 150 mW) to induce fluorescence and a compact spectrometer to acquire the backscattered photons. For field testing, the LIDAR was mounted on a quadcopter for remote sensing of a maize field during three periods of the plants' vegetation cycle. The resulting fluorescence signal maps demonstrate that the average chlorophyll content is rather non-uniform over the field and tends to increase through the vegetation cycle. The field tests proved the feasibility and promise of autonomous drone-based LIDAR sensing of agricultural fields for detecting and locating plants under stress.

1. Introduction

Modern agriculture and farming require timely monitoring of vegetation to estimate crop conditions [1], increase yield [2], and respond promptly to climate changes. Conventional onsite monitoring instrumentation is rather powerful [3,4], but large-scale areas can be studied effectively only by remote sensing techniques [5,6,7,8]. The development of remote sensing techniques has made it possible to assess various kinds of cultivated and wild vegetation areas [9]. Remote sensing techniques include satellite- and airborne-installed instrumentation and are based on passive and active sounding. Laser remote sensing is an active sounding technique that was developed in the early 1960s, when pulsed lasers became available [10,11]. However, early laser remote sensing instruments (LIDAR—Light Detection And Ranging) were rather heavy and bulky, so they had to be installed on manned airplanes or helicopters. This limited the use of LIDAR for real-life applications ranging from fish school detection and forest canopy studies to precision agriculture [10,11,12,13]. Nowadays, the rapid development of unmanned aerial vehicles (UAVs) has completely changed the exploitation costs [14,15]. The drop in UAV exploitation costs has triggered a renaissance in LIDAR instrument development [16,17,18,19,20,21,22,23,24,25]. However, modern UAVs can carry only a few kilograms of payload with compact dimensions (~10 × 10 × 10 cm³) and low power consumption (<50 W), so the developed LIDAR instruments have to meet these requirements. UAVs equipped with non-laser sensing instruments, such as multispectral and hyperspectral cameras, are already very effective for precision agriculture and farming [26,27]. However, active sensing with LIDARs provides benefits over passive multispectral imaging: there is no need for precise calibration before every measurement; sunlight conditions have a smaller impact on the measurements; and synchronous pumping and sensing is an effective way to distinguish tiny differences in object properties.
The first LIDARs flown onboard UAVs were used to generate dense 3D point clouds of regions of interest such as growing plants, fields, and forests [28,29]. More recently, inelastic-scattering LIDARs have been used for remote fluorescence spectroscopy measurements [30,31,32,33]. Recently, Zhao et al. [33] demonstrated the possibility of multi-wavelength fluorescence LIDAR for 3D fluorescence imaging of plants. The same team used a 1 W UV laser in a compact LIDAR to estimate water quality in the Zhujiang River [34]. The same authors mounted a UV-laser-based LIDAR on a commercial drone to demonstrate the possibility of vegetation and marine monitoring. However, both systems cannot be used during the daytime because of the sunlight background [35,36]. In this study, we present, for the first time, the results of maize field fluorescence mapping with a lightweight LIDAR based on a low-power diode laser, operated during daytime from aboard an automated industrial drone.

2. Experiment

We developed a lightweight, compact fluorescence LIDAR (Figure 1) based on a continuous-wave diode laser (405 nm, 150 mW) and a miniature diffraction spectrometer (STS-VIS, Ocean Optics). The excitation wavelength was chosen for two reasons: (a) the 400–450 nm range is rather effective for inducing chlorophyll fluorescence [37,38], and (b) lasers in this range are available on the market with high output efficiency and at low cost. The spectrometer had to be very compact and resistant to vibrations while capturing spectra in the 400–800 nm range with a spectral resolution of at least 5 nm. The laser beam was deflected toward the target by a dichroic mirror (DMLP425, Thorlabs Inc., Newton, NJ, USA), and the same mirror separated the backscattered photons. The receiving telescope had a diameter of 20 mm and a focal length of 30 mm. The dichroic mirror was installed in front of the focusing lens, which focused the collected backscattered photons onto a fiberoptic input; the fiberoptic output was connected to the spectrometer entrance slit. A fiberoptic link between the telescope and the spectrometer was chosen so that these components could be spatially separated to meet the balance and windage requirements of the UAV. The spectrometer acquired spectra in the 350–820 nm range with a 1024-pixel linear diode array (pixel size 7.8 × 125 µm), providing a spectral resolution of 3.5 nm.
The laser and spectrometer were synchronized by a microcontroller (ATmega328P, Atmel, USA). Fluorescence spectra acquisition and instrument control were performed by a custom program developed in the LabVIEW (National Instruments, USA) environment, running on a small computer (Intel NUC, 5th generation). To suppress the impact of daylight on the measured fluorescence spectra, the following procedure was implemented: (a) the laser was turned on and the fluorescence spectrum was acquired and stored in the PC memory; (b) the laser was then turned off and a background spectrum was acquired; (c) the fluorescence spectrum was corrected for the background in the program, and the resulting data were stored on the PC. An external GPS module provided geographical coordinates, which were stored in the same spectral data file. Remote control of the LIDAR was carried out via the remote desktop protocol (RDP) over a Wi-Fi connection to the onboard computer (up to 120 m distance).
The LIDAR components were assembled in a small instrument case (Figure 1a) designed to meet the balance requirements of the UAV and minimize the impact of the installed instrument [39]. The case was 3D-printed from PLA (polylactide) plastic. The LIDAR dimensions were 10 cm × 15 cm × 5 cm, and its mass was 310 g. The control computer case was also replaced with a 3D-printed plastic one to preserve the drone's center of mass. The computer had nearly the same dimensions as the LIDAR (11 cm × 11 cm × 3 cm) but was heavier (600 g); in future LIDAR versions it will be replaced by a smaller and lighter alternative such as a Raspberry Pi. The combined power consumption of the LIDAR and the control computer was below 30 W. The LIDAR and the computer were installed on an industrial drone (Matrice 200 v2, DJI) capable of carrying a payload of up to 2 kg. The LIDAR was mounted on the bottom of the drone (Figure 2) in such a way that the downward-facing distance sensors and camera remained unobstructed, thus preserving flight safety.
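For illustration, the laser-on/laser-off background-subtraction cycle described above can be summarized in a short Python sketch. The actual control software was written in LabVIEW; the lidar_io module and its laser, spectrometer, and gps objects below are hypothetical placeholders for the hardware interfaces, and the 500 ms gate is the value used in the field experiments (Section 3).

```python
import time
import numpy as np

# Hypothetical hardware wrappers (the real system used a LabVIEW program,
# an ATmega328P for laser/spectrometer synchronization, an STS-VIS
# spectrometer, and an external GPS module).
from lidar_io import laser, spectrometer, gps

GATE_MS = 500  # acquisition gate used in the field experiments


def acquire_corrected_spectrum():
    """One measurement cycle: laser-on spectrum, laser-off background,
    background-subtracted result tagged with GPS coordinates."""
    laser.on()
    signal = np.asarray(spectrometer.acquire(integration_ms=GATE_MS))
    laser.off()
    background = np.asarray(spectrometer.acquire(integration_ms=GATE_MS))
    lat, lon = gps.position()
    return {
        "wavelength_nm": np.asarray(spectrometer.wavelengths()),
        "spectrum": signal - background,   # daylight suppression
        "background": background,
        "lat": lat,
        "lon": lon,
        "timestamp": time.time(),
    }


if __name__ == "__main__":
    # One record per second: 500 ms laser-on plus 500 ms laser-off acquisition.
    records = [acquire_corrected_spectrum() for _ in range(600)]
    np.save("maize_fluorescence_run.npy", records, allow_pickle=True)
```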

3. Results and Discussion

The developed LIDAR was used for diagnostics of a maize (Zea mays) field at different stages of growth: a month after planting, two months after planting, and a week before harvesting. First, we measured laser-induced fluorescence spectra for different parts of a maize plant from a distance of 2 m (Figure 3), including healthy leaves, yellow leaves, and the stem. The fluorescence spectra showed the familiar two-band structure of chlorophyll emission, with peaks at 680 and 740 nm. The healthy leaves had the lowest fluorescence intensity, while the damaged leaves had the highest. The intensity ratio of the 680 and 740 nm peaks is proportional to the chlorophyll concentration [40], so we were able to compare the relative chlorophyll content in different parts of the maize plant from their fluorescence spectra. The chlorophyll fluorescence signal was defined by the following formula:
S = I_{680} / I_{740}    (1)
where S is the chlorophyll signal, I680 is the background-corrected spectrum integral over the 660–700 nm range, and I740 is the background-corrected spectrum integral over the 720–760 nm range. For the spectra presented in Figure 3b, the chlorophyll signal was Sstem = 1.46 for the stem and Shealthy = 7.31 for a healthy leaf; thus, the chlorophyll concentration was approximately five-fold higher in the leaves than in the stem, in agreement with previously published data [40,41].
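For clarity, the band-ratio calculation of Formula (1) can be expressed as a minimal Python sketch. It assumes the spectrum has already been background-corrected as described in Section 2; the synthetic spectrum in the usage example is illustrative only and does not reproduce the measured data.

```python
import numpy as np


def chlorophyll_signal(wavelength_nm, intensity):
    """Chlorophyll signal S = I680 / I740 (Formula (1)): ratio of the
    background-corrected spectrum integrals over 660-700 nm and 720-760 nm."""
    wavelength_nm = np.asarray(wavelength_nm, dtype=float)
    intensity = np.asarray(intensity, dtype=float)

    def band_integral(lo, hi):
        mask = (wavelength_nm >= lo) & (wavelength_nm <= hi)
        return np.trapz(intensity[mask], wavelength_nm[mask])

    return band_integral(660, 700) / band_integral(720, 760)


# Usage example with a synthetic two-band spectrum (not measurement data):
wl = np.linspace(350, 820, 1024)  # wavelength axis similar to the STS-VIS range
spec = np.exp(-((wl - 680) / 12) ** 2) + 0.3 * np.exp(-((wl - 740) / 15) ** 2)
print(f"S = {chlorophyll_signal(wl, spec):.2f}")
```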
Field experiments on maize field sensing were carried out near Semenovskoe Village in the Pushkin area of the Moscow region. An aerial photograph of the maize field is presented in Figure 4. As described above, each laser-induced fluorescence measurement includes spectra acquired with the laser switched on and off so that the background can be subtracted. To obtain a high signal-to-noise ratio, spectra were acquired with a 500 ms gate, so a complete measurement could be taken once per second. Both the corrected fluorescence spectra and the background spectra were stored so that the impact of sunlight could be estimated. The sunlight contribution was rather significant, so measurements had to be carried out 1–2 h before sunset to obtain a good signal-to-noise ratio (Figure 5). The quadcopter altitude was kept at 2 m above the maize canopy to ensure high reproducibility of the LIDAR measurements. The quadcopter flew automatically over the scanning area (white rectangle in Figure 4) at a speed of 1.5 m/s, and the average duration of a single flight was 10 min.
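As a worked check of the spatial sampling implied by these parameters (this spacing is inferred from the stated values rather than quoted by the authors): one combined laser-on/laser-off acquisition takes about 1 s, so at a flight speed of 1.5 m/s the along-track distance between successive measurement points is approximately

\[ \Delta x \approx v \, t_{\mathrm{meas}} = 1.5~\text{m/s} \times 1~\text{s} = 1.5~\text{m}. \]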
The maize field was scanned by the LIDAR, and the spectral data were processed to construct signal maps. Each maize leaf fluorescence spectrum was quantified by the 680 and 740 nm band integrals and their ratio, as shown in Figure 6. The 680/740 nm ratio is proportional to the chlorophyll concentration [40,41]. Examples of LIDAR signal maps for the 680 nm (integral A), 740 nm (integral B), and 680/740 nm (A/B) signals are presented in Figure 7. Both the 680 and 740 nm signals tended to increase in the right part of the maps, which can be explained by the continuous rise of the field terrain in that direction. The 680/740 nm ratio was rather stable, and some problematic spots could be detected (center of Figure 7c). An increased value of the 680/740 nm ratio is a good indicator of plant stress.
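The mapping step itself is not detailed in the text, so the sketch below shows one simple way the GPS-tagged ratio values could be gridded into a map such as those in Figure 7: measurements are binned onto a regular latitude/longitude grid and averaged per cell. The grid size and function names are illustrative assumptions, not the authors' processing pipeline.

```python
import numpy as np


def grid_map(lat, lon, values, n_rows=40, n_cols=80):
    """Average GPS-tagged LIDAR signal values on a regular lat/lon grid.
    Cells without measurements are left as NaN."""
    lat, lon, values = (np.asarray(a, dtype=float) for a in (lat, lon, values))

    # Map coordinates to integer grid indices (equal-sized bins).
    row = np.minimum((n_rows * (lat - lat.min()) / (np.ptp(lat) + 1e-12)).astype(int), n_rows - 1)
    col = np.minimum((n_cols * (lon - lon.min()) / (np.ptp(lon) + 1e-12)).astype(int), n_cols - 1)

    total = np.zeros((n_rows, n_cols))
    count = np.zeros((n_rows, n_cols))
    np.add.at(total, (row, col), values)   # sum of signal values per cell
    np.add.at(count, (row, col), 1)        # number of measurements per cell

    with np.errstate(invalid="ignore"):
        return np.where(count > 0, total / count, np.nan)
```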
LIDAR fluorescence mapping of the maize field was carried out on 24 June, 17 August, and 27 September. These dates were chosen as characteristic points of the maize vegetation period: the start of plant growth, flowering, and a week before harvesting. Photographs and typical spectra of single maize plants from these three periods are presented in Figure 8.
One can see from Figure 8d that the chlorophyll concentration, represented by the ratio of the 680 and 740 nm band integrals, increases during the plant life cycle: according to Formula (1), SJun24 = 1.36, SAug17 = 7.73, and SSep27 = 10.23.
The maize field was mapped by the fluorescence LIDAR from the drone during the described periods, and Figure 9 shows the maps of the 680/740 nm integral ratio.
As follows from Figure 9, the chlorophyll distribution over the investigated maize field is rather non-uniform in June, which can be explained by the non-uniform height of the maize sprouts and the small average leaf area (i.e., the soil is not yet fully covered by maize leaves). In August, the average chlorophyll concentration over the field was higher than in June due to the larger leaf area. The chlorophyll distribution becomes most non-uniform in September, when the maize vegetation cycle is close to its end and the leaves start to wither. Looking back at the spectra presented in Figure 3b, after a maize plant has produced its fruit, the photosynthetic activity of its leaves diminishes, resulting in an increase in the 740 nm band, which in turn decreases the LIDAR fluorescence signal defined by the 680/740 nm intensity ratio. Thus, drone-based LIDAR fluorescence mapping is a good tool for a quick assessment of plant growth and health status, since the main indicator of plant stress (the 680/740 nm integral ratio [40]) can be measured and mapped automatically. This indicator reflects many aspects of possible plant stress, such as herbicide treatment and dehydration, and fast detection of changes in this indicator would help prevent possible crop losses.

4. Conclusions

A compact, low-power fluorescence LIDAR was developed for maize field diagnostics. The LIDAR was installed on an industrial quadcopter to map maize plants at different stages of their vegetation cycle. The spectral data were processed, and the 680/740 nm fluorescence band ratio was shown to be a good indicator of plant stress. The LIDAR chlorophyll fluorescence signal was defined as the ratio of the 680 and 740 nm band integrals, and maps of this signal were constructed for three periods of plant growth. The obtained maps demonstrate that the chlorophyll signal is non-uniform over the field during the period of maize canopy growth and later increases due to the larger leaf area. Overall, the chlorophyll signal tends to increase through the vegetation cycle, reflecting the changes that take place in maize plants during their life cycle. Spots of increased chlorophyll signal were also detected in the maps, which may indicate regions of the field with plants under stress (e.g., because of excess herbicide use). The field tests have proven the feasibility and promise of autonomous drone-based LIDAR sensing for detecting and locating areas with plants under stress.

Author Contributions

V.N.L.: Conceptualization, Methodology, Writing—Original Draft; M.Y.G.: Visualization, Writing—Review & Editing; P.A.S.: Investigation, Visualization, Writing—Review & Editing; R.K.K.: Investigation, Writing—Review & Editing; M.A.L.: Investigation, Writing—Review & Editing; S.V.G.: Writing—Review & Editing, Supervision; S.M.P.: Conceptualization, Writing—Review & Editing. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by a grant from the Ministry of Science and Higher Education of the Russian Federation (075-15-2022-315) for the organization and development of a world-class research center “Photonics”.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Chang, A.; Jung, J.; Maeda, M.M.; Landivar, J. Crop height monitoring with digital imagery from Unmanned Aerial System (UAS). Comput. Electron. Agric. 2017, 141, 232–237.
2. Rudorff, B.; Batista, G. Yield estimation of sugarcane based on agrometeorological-spectral models. Remote Sens. Environ. 1990, 33, 183–192.
3. Cozzolino, D. Use of Infrared Spectroscopy for In-Field Measurement and Phenotyping of Plant Properties: Instrumentation, Data Analysis, and Examples. Appl. Spectrosc. Rev. 2014, 49, 564–584.
4. Evangelista, C.; Basiricò, L.; Bernabucci, U. An overview on the use of near infrared spectroscopy (NIRS) on farms for the management of dairy cows. Agriculture 2021, 11, 296.
5. Brisco, B.; Brown, R.J.; Hirose, T.; McNairn, H.; Staenz, K. Precision agriculture and the role of remote sensing: A review. Can. J. Remote Sens. 1998, 24, 315–327.
6. Liaghat, S.; Balasundram, S.K. A review: The role of remote sensing in precision agriculture. Am. J. Agric. Biol. Sci. 2010, 5, 50–55.
7. Sishodia, R.P.; Ray, R.L.; Singh, S.K. Applications of remote sensing in precision agriculture: A review. Remote Sens. 2020, 12, 3136.
8. Kior, A.; Sukhov, V.; Sukhova, E. Application of reflectance indices for remote sensing of plants and revealing actions of stressors. Photonics 2021, 8, 582.
9. Khanal, S.; Kc, K.; Fulton, J.P.; Shearer, S.; Ozkan, E. Remote sensing in agriculture—Accomplishments, limitations, and opportunities. Remote Sens. 2020, 12, 3783.
10. Measures, R.M. Laser Remote Sensing: Fundamentals and Applications; John Wiley & Sons, Ltd.: New York, NY, USA, 1984; ISBN 0894646192.
11. Bunkin, A.; Voliak, K. Laser Remote Sensing of the Ocean: Methods and Applications; Wiley: New York, NY, USA, 2001; ISBN 0471389277.
12. Lednev, V.N.; Bunkin, A.F.; Pershin, S.M.; Grishin, M.Y.; Artemova, D.G.; Zavozin, V.A.; Sdvizhenskii, P.A.; Nunes, R.A. Remote Laser Induced Fluorescence of Soils and Rocks. Photonics 2021, 8, 411.
13. Yang, G.; Tian, Z.; Bi, Z.; Cui, Z.; Sun, F.; Liu, Q. Measurement of the Attenuation Coefficient in Fresh Water Using the Adjacent Frame Difference Method. Photonics 2022, 9, 713.
14. Mogili, U.M.R.; Deepak, B. Review on application of drone systems in precision agriculture. Procedia Comput. Sci. 2018, 133, 502–509.
15. Mohsan, S.A.H.; Khan, M.A.; Noor, F.; Ullah, I.; Alsharif, M.H. Towards the unmanned aerial vehicles (UAVs): A comprehensive review. Drones 2022, 6, 147.
16. Panday, U.S.; Pratihast, A.K.; Aryal, J.; Kayastha, R.B. A Review on Drone-Based Data Solutions for Cereal Crops. Drones 2020, 4, 41.
17. Christiansen, M.; Laursen, M.; Jørgensen, R.; Skovsen, S.; Gislum, R. Designing and Testing a UAV Mapping System for Agricultural Field Surveying. Sensors 2017, 17, 2703.
18. Grishin, M.Y.; Lednev, V.N.; Sdvizhenskii, P.A.; Pavkin, D.Y.; Nikitin, E.A.; Bunkin, A.F.; Pershin, S.M. Lidar Monitoring of Moisture in Biological Objects. In Proceedings of the Doklady Physics; Springer: Berlin, Germany, 2021; Volume 66, pp. 273–276.
19. Pershin, S.M.; Bunkin, A.F.; Klinkov, V.K.; Lednev, V.N.; Lushnikov, D.; Morozov, E.G.; Yul’metov, R.N. Remote sensing of Arctic Fjords by Raman lidar: Heat transfer screening by layer of glacier’s relict water. Phys. Wave Phenom. 2012, 20, 212–222.
20. Myasnikov, A.V.; Pershin, S.M.; Grishin, M.Y.; Zavozin, V.A.; Makarov, V.S.; Ushakov, A.A. Estimation of the Influence of Meteorological Factors on the Aerosol Lidar Signal in Tunnels above the Elbrus Volcano Chamber. Phys. Wave Phenom. 2022, 30, 119–127.
21. Pershin, S.M.; Sobisevich, A.L.; Zavozin, V.A.; Grishin, M.Y.; Lednev, V.N.; Makarov, V.S.; Petkov, V.B.; Ponurovskii, Y.Y.; Fedorov, A.N.; Artemova, D.G. LIDAR Detection of Aerosols in the Tunnel above the Elbrus Volcano Chamber. Bull. Lebedev Phys. Inst. 2022, 49, 36–41.
22. Pershin, S.M.; Sobisevich, A.L.; Grishin, M.Y.; Gravirov, V.V.; Zavozin, V.A.; Kuzminov, V.V.; Lednev, V.N.; Likhodeev, D.V.; Makarov, V.S.; Myasnikov, A.V.; et al. Volcanic activity monitoring by unique LIDAR based on a diode laser. Laser Phys. Lett. 2020, 17, 115607.
23. Pershin, S.M.; Grishin, M.Y.; Zavozin, V.A.; Kuzminov, V.V.; Lednyov, V.N.; Makarov, V.S.; Myasnikov, A.V.; Tyurin, A.V.; Fedorov, A.N.; Petkov, V.B. Lidar Sensing of Multilayer Fog Evolution in the Inclined Tunnel of the Baksan Neutrino Observatory. Bull. Lebedev Phys. Inst. 2019, 46, 328–332.
24. Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A review on UAV-based applications for precision agriculture. Information 2019, 10, 349.
25. Tao, H.; Xu, S.; Tian, Y.; Li, Z.; Ge, Y.; Zhang, J.; Wang, Y.; Zhou, G.; Deng, X.; Zhang, Z.; et al. Proximal and remote sensing in plant phenomics: 20 years of progress, challenges, and perspectives. Plant Commun. 2022, 3, 100344.
26. Zhang, C.; Kovacs, J.M. The application of small unmanned aerial systems for precision agriculture: A review. Precis. Agric. 2012, 13, 693–712.
27. Guo, W.; Carroll, M.E.; Singh, A.; Swetnam, T.L.; Merchant, N.; Sarkar, S.; Singh, A.K.; Ganapathysubramanian, B. UAS-Based Plant Phenotyping for Research and Breeding Applications. Plant Phenomics 2021, 2021, 9840192.
28. Lin, Y.-C.; Habib, A. Quality control and crop characterization framework for multi-temporal UAV LiDAR data over mechanized agricultural fields. Remote Sens. Environ. 2021, 256, 112299.
29. Yang, A.; Cao, J.; Cheng, Y.; Chen, C.; Hao, Q. Three-Dimensional Laser Imaging with a Variable Scanning Spot and Scanning Trajectory. Photonics 2021, 8, 173.
30. Wulder, M.A.; White, J.C.; Nelson, R.F.; Næsset, E.; Ørka, H.O.; Coops, N.C.; Hilker, T.; Bater, C.W.; Gobakken, T. Lidar sampling for large-area forest characterization: A review. Remote Sens. Environ. 2012, 121, 196–209.
31. Jin, S.; Sun, X.; Wu, F.; Su, Y.; Li, Y.; Song, S.; Xu, K.; Ma, Q.; Baret, F.; Jiang, D.; et al. Lidar sheds new light on plant phenomics for plant breeding and management: Recent advances and future prospects. ISPRS J. Photogramm. Remote Sens. 2021, 171, 202–223.
32. Grishin, M.Y.; Lednev, V.N.; Pershin, S.M.; Kapralov, P.O. Ultracompact Fluorescence Lidar Based on a Diode Laser (405 nm, 150 mW) for Remote Sensing of Waterbodies and the Underlying Surface from Unmanned Aerial Vehicles. Dokl. Phys. 2021, 66, 153–155.
33. Zhao, X.; Shi, S.; Yang, J.; Gong, W.; Sun, J.; Chen, B.; Guo, K.; Chen, B. Active 3D Imaging of Vegetation Based on Multi-Wavelength Fluorescence LiDAR. Sensors 2020, 20, 935.
34. Lu, J.; Yuan, Y.; Duan, Z.; Zhao, G.; Svanberg, S. Short-range remote sensing of water quality by a handheld fluorosensor system. Appl. Opt. 2020, 59, C1.
35. Wang, X.; Duan, Z.; Brydegaard, M.; Svanberg, S.; Zhao, G. Drone-based area scanning of vegetation fluorescence height profiles using a miniaturized hyperspectral lidar system. Appl. Phys. B Lasers Opt. 2018, 124, 207.
36. Duan, Z.; Li, Y.; Wang, X.; Wang, J.; Brydegaard, M.; Zhao, G.; Svanberg, S. Drone-Based Fluorescence Lidar Systems for Vegetation and Marine Environment Monitoring. In Proceedings of the EPJ Web of Conferences; EDP Sciences: Hefei, China, 2020; Volume 237.
37. Cerovic, Z.G.; Samson, G.; Morales, F.; Tremblay, N.; Moya, I. Ultraviolet-induced fluorescence for plant monitoring: Present state and prospects. Agronomie 1999, 19, 543–578.
38. Senesi, G.S.; De Pascale, O.; Marangoni, B.S.; Caires, A.R.L.; Nicolodelli, G.; Pantaleo, V.; Leonetti, P. Chlorophyll Fluorescence Imaging (CFI) and Laser-Induced Breakdown Spectroscopy (LIBS) Applied to Investigate Tomato Plants Infected by the Root Knot Nematode (RKN) Meloidogyne incognita and Tobacco Plants Infected by Cymbidium Ringspot Virus. Photonics 2022, 9, 627.
39. Polukhin, A.A.; Litvinov, M.A.; Kurbanov, R.K.; Klimova, S.P. Development of the Parrot Sequoia Multispectral Camera Mount for the DJI Inspire 1 UAV. In Smart Innovation in Agriculture; Popkova, E.G., Sergi, B.S., Eds.; Springer Nature: Singapore, 2022; pp. 217–225; ISBN 978-981-16-7633-8.
40. Lichtenthaler, H.K.; Rinderle, U. The Role of Chlorophyll Fluorescence in The Detection of Stress Conditions in Plants. CRC Crit. Rev. Anal. Chem. 1988, 19, S29–S85.
41. Ciganda, V.; Gitelson, A.; Schepers, J. Non-destructive determination of maize leaf and canopy chlorophyll content. J. Plant Physiol. 2009, 166, 157–167.
Figure 1. (a) Principal scheme of the LIDAR components; (b) general view of the ultracompact LIDAR.
Figure 2. Photo of the LIDAR installed on the quadcopter (a) and during maize field sensing (b).
Figure 3. (a) Photo of a maize (Zea mays) plant (arrows indicate the laser-induced fluorescence measurement points); (b) laser-induced fluorescence spectra for different parts of the maize plant (green leaf: top arrow in (a); stem: central arrow in (a); yellow leaf: bottom arrow in (a)).
Figure 4. Aerial photo of the maize field. The white rectangle (size ~50 m × 100 m) represents the area where LIDAR measurements were carried out.
Figure 5. Laser-induced fluorescence spectrum (black color) and background emission spectrum (red color). Spectra reproducibility estimated by 10 parallel measurements is indicated by grey and light red shaded areas.
Figure 6. Example of the maize leaf fluorescence spectrum. A two-shouldered band of chlorophyll fluorescence can be seen at ~650–800 nm. Shaded regions indicate the spectrum integrals used for calculating the metrics.
Figure 7. Fluorescence map for maize field acquired on 17 August 2021: (a) map of 680 nm band integral; (b) map of 740 nm band integral; (c) map of 680/740 nm bands ratio.
Figure 8. Maize photos taken during characteristic periods of the plant life cycle: leaf growth (a), flowering (b), and fruit ripening (c), and spectra of laser-induced fluorescence measured at corresponding periods (d).
Figure 9. Maps of the 680/740 nm integral ratio acquired during three periods of maize plant life cycle.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
