Search Results (67)

Search Parameters:
Keywords = NIR camera sensor

18 pages, 16142 KB  
Article
Unmanned Aerial Vehicles and Low-Cost Sensors for Monitoring Biophysical Parameters of Sugarcane
by Maurício Martello, Mateus Lima Silva, Carlos Augusto Alves Cardoso Silva, Rodnei Rizzo, Ana Karla da Silva Oliveira and Peterson Ricardo Fiorio
AgriEngineering 2025, 7(12), 403; https://doi.org/10.3390/agriengineering7120403 - 1 Dec 2025
Viewed by 691
Abstract
Unmanned Aerial Vehicles (UAVs) equipped with low-cost RGB and near-infrared (NIR) cameras represent an efficient and scalable technology for monitoring sugarcane crops. This study evaluated the potential of UAV imagery and three-dimensional crop modeling to estimate sugarcane height and yield under different nitrogen fertilization levels. The experiment comprised 28 plots subjected to four nitrogen rates, and images were processed using a Structure from Motion (SfM) algorithm to generate Digital Surface Models (DSMs). Crop Height Models (CHMs) were obtained by subtracting Digital Terrain Models (DTMs) from the DSMs. The most accurate CHM was derived from the combination of the reference DTM and the NIR-based DSM (R2 = 0.957; RMSE = 0.162 m), while the strongest correlation between height and yield was observed at 200 days after cutting (R2 = 0.725; RMSE = 4.85 t ha−1). The NIR-modified sensor, developed at a total cost of USD 61.59, demonstrated performance comparable to commercial systems that are up to two hundred times more expensive. These results demonstrate that the proposed low-cost NIR sensor provides accurate, reliable, and accessible data for three-dimensional modeling of sugarcane. Full article
(This article belongs to the Section Remote Sensing in Agriculture)
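The CHM step described in the abstract reduces to a per-pixel raster subtraction. A minimal NumPy sketch, using hypothetical elevation values rather than the study's actual rasters:

```python
import numpy as np

# Hypothetical elevation rasters in metres (not the study's data): the DSM
# comes from SfM processing of UAV imagery, the DTM from a terrain reference.
dsm = np.array([[102.1, 103.4],
                [101.8, 102.9]])  # canopy surface elevation
dtm = np.array([[100.0, 100.2],
                [100.1, 100.3]])  # bare-terrain elevation

# Crop Height Model: per-pixel surface elevation minus terrain elevation.
chm = dsm - dtm
```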

12 pages, 3845 KB  
Proceeding Paper
Exploring the Application of UAV-Multispectral Sensors for Proximal Imaging of Agricultural Crops
by Tarun Teja Kondraju, Rabi N. Sahoo, Selvaprakash Ramalingam, Rajan G. Rejith, Amrita Bhandari, Rajeev Ranjan and Devanakonda Venkata Sai Chakradhar Reddy
Eng. Proc. 2025, 118(1), 91; https://doi.org/10.3390/ECSA-12-26542 - 7 Nov 2025
Viewed by 227
Abstract
UAV-mounted multispectral sensors are widely used to study crop health. Utilising the same cameras to capture close-up images of crops can significantly improve crop health evaluations through multispectral technology. Unlike RGB cameras that only detect visible light, these sensors can identify additional spectral bands in the red-edge and near-infrared (NIR) ranges. This enables early detection of diseases, pests, and deficiencies through the calculation of various spectral indices. In this work, the ability to use UAV-multispectral sensors for close-proximity imaging of crops was studied. Images of plants were taken with a Micasense Rededge-MX from top and side views at a distance of 1 m. The camera has five sensors that independently capture blue, green, red, red-edge, and NIR light. The slight misalignment of these sensors results in a shift in the swath. This shift needs to be corrected to create a proper layer stack that could allow for further processing. This research utilised the Oriented FAST and Rotated BRIEF (ORB) method to detect features in each image. Random sample consensus (RANSAC) was used for feature matching to find similar features in the slave images compared to the master image (indicated by the green band). Utilising homography to warp the slave images ensures their perfect alignment with the master image. After alignment, the images were stacked, and the alignment accuracy was visually checked using true colour composites. The side-view images of the plants were perfectly aligned, while the top-view images showed errors, particularly in the pixels far from the centre. This study demonstrates that UAV-mounted multispectral sensors can capture images of plants effectively, provided the plant is centred in the frame and occupies a smaller area within the image. Full article

11 pages, 3390 KB  
Article
Material Sensing with Spatial and Spectral Resolution Based on an Integrated Near-Infrared Spectral Sensor and a CMOS Camera
by Ben Delaney, Sjors Buntinx, Don M. J. van Elst, Anne van Klinken, René P. J. van Veldhoven and Andrea Fiore
Sensors 2025, 25(11), 3295; https://doi.org/10.3390/s25113295 - 23 May 2025
Viewed by 1856
Abstract
Measuring the composition of materials at a distance is a key requirement in industrial process monitoring, recycling, precision agriculture, and environmental monitoring. Spectral imaging in the visible or near-infrared (NIR) spectral bands provides a potential solution by combining spatial and spectral information, and its application has seen significant growth over recent decades. Low-cost solutions for visible multispectral imaging (MSI) have been developed due to the widespread availability of silicon detectors, which are sensitive in this spectral region. In contrast, development in the NIR has been slower, primarily due to the high cost of indium gallium arsenide (InGaAs) detector arrays required for imaging. This work aims to bridge this gap by introducing a standoff material sensing concept which combines spatial and spectral resolution without the hardware requirements of traditional spectral imaging systems. It combines spatial imaging in the visible range with a CMOS camera and NIR spectral measurement at selected points of the scene using an NIR spectral sensor. This allows the chemical characterization of different objects of interest in a scene without acquiring a full spectral image. We showcase its application in plastic classification, a key functionality in sorting and recycling systems. The system demonstrated the capability to classify visually identical plastics of different types in a standoff measurement configuration and to produce spectral measurements at up to 100 points in a scene. Full article

30 pages, 12255 KB  
Article
Unmanned Aerial Vehicle-Based Hyperspectral Imaging for Potato Virus Y Detection: Machine Learning Insights
by Siddat B. Nesar, Paul W. Nugent, Nina K. Zidack and Bradley M. Whitaker
Remote Sens. 2025, 17(10), 1735; https://doi.org/10.3390/rs17101735 - 15 May 2025
Viewed by 3016
Abstract
The potato is the third most important crop in the world, and more than 375 million metric tonnes of potatoes are produced globally on an annual basis. Potato Virus Y (PVY) poses a significant threat to the production of seed potatoes, resulting in economic losses and risks to food security. Current detection methods for PVY typically rely on serological assays for leaves and PCR for tubers; however, these processes are labor-intensive, time-consuming, and not scalable. In this proof-of-concept study, we propose the use of unmanned aerial vehicles (UAVs) integrated with hyperspectral cameras, including a downwelling irradiance sensor, to detect the PVY in commercial growers’ fields. We used a 400–1000 nm visible and near-infrared (Vis-NIR) hyperspectral camera and trained several standard machine learning and deep learning models with optimized hyperparameters on a curated dataset. The performance of the models is promising, with the convolutional neural network (CNN) achieving a recall of 0.831, reliably identifying the PVY-infected plants. Notably, UAV-based imaging maintained performance levels comparable to ground-based methods, supporting its practical viability. The hyperspectral camera captures a wide range of spectral bands, many of which are redundant in identifying the PVY. Our analysis identified five key spectral regions that are informative in identifying the PVY. Two of them are in the visible spectrum, two are in the near-infrared spectrum, and one is in the red-edge spectrum. This research shows that early-season PVY detection is feasible using UAV hyperspectral imaging, offering the potential to minimize economic and yield losses. It also highlights the most relevant spectral regions that carry the distinctive signatures of PVY. This research demonstrates the feasibility of early-season PVY detection using UAV hyperspectral imaging and provides guidance for developing cost-effective multispectral sensors tailored to this task. Full article

36 pages, 26652 KB  
Article
Low-Light Image Enhancement for Driving Condition Recognition Through Multi-Band Images Fusion and Translation
by Dong-Min Son and Sung-Hak Lee
Mathematics 2025, 13(9), 1418; https://doi.org/10.3390/math13091418 - 25 Apr 2025
Viewed by 1397
Abstract
When objects are obscured by shadows or dim surroundings, image quality is improved by fusing near-infrared and visible-light images. At night, when visible and NIR lights are insufficient, long-wave infrared (LWIR) imaging can be utilized, necessitating the attachment of a visible-light sensor to an LWIR camera to simultaneously capture both LWIR and visible-light images. This camera configuration enables the acquisition of infrared images at various wavelengths depending on the time of day. To effectively fuse clear visible regions from the visible-light spectrum with those from the LWIR spectrum, a multi-band fusion method is proposed. The proposed fusion process subsequently combines detailed information from infrared and visible-light images, enhancing object visibility. Additionally, this process compensates for color differences in visible-light images, resulting in a natural and visually consistent output. The fused images are further enhanced using a night-to-day image translation module, which improves overall brightness and reduces noise. This night-to-day translation module is a trained CycleGAN-based module that adjusts object brightness in nighttime images to levels comparable to daytime images. The effectiveness and superiority of the proposed method are validated using image quality metrics. The proposed method significantly contributes to image enhancement, achieving the best average scores compared to other methods, with a BRISQUE of 30.426 and a PIQE of 22.186. This study improves the accuracy of human and object recognition in CCTV systems and provides a potential image-processing tool for autonomous vehicles. Full article

30 pages, 4911 KB  
Article
In-Field Forage Biomass and Quality Prediction Using Image and VIS-NIR Proximal Sensing with Machine Learning and Covariance-Based Strategies for Livestock Management in Silvopastoral Systems
by Claudia M. Serpa-Imbett, Erika L. Gómez-Palencia, Diego A. Medina-Herrera, Jorge A. Mejía-Luquez, Remberto R. Martínez, William O. Burgos-Paz and Lorena A. Aguayo-Ulloa
AgriEngineering 2025, 7(4), 111; https://doi.org/10.3390/agriengineering7040111 - 8 Apr 2025
Cited by 2 | Viewed by 2005
Abstract
Controlling forage quality and grazing are crucial for sustainable livestock production, health, productivity, and animal performance. However, the limited availability of reliable handheld sensors for timely pasture quality prediction hinders farmers’ ability to make informed decisions. This study investigates the in-field dynamics of Mombasa grass (Megathyrsus maximus) forage biomass production and quality using optical techniques such as visible imaging and near-infrared (VIS-NIR) hyperspectral proximal sensing combined with machine learning models enhanced by covariance-based error reduction strategies. Data collection was conducted using a cellphone camera and a handheld VIS-NIR spectrometer. Feature extraction to build the dataset involved image segmentation, performed using the Mahalanobis distance algorithm, as well as spectral processing to calculate multiple vegetation indices. Machine learning models, including linear regression, LASSO, Ridge, ElasticNet, k-nearest neighbors, and decision tree algorithms, were employed for predictive analysis, achieving high accuracy with R2 values ranging from 0.938 to 0.998 in predicting biomass and quality traits. A strategy to achieve high performance was implemented by using four spectral captures and computing the reflectance covariance at NIR wavelengths, accounting for the three-dimensional characteristics of the forage. These findings are expected to advance the development of AI-based tools and handheld sensors particularly suited for silvopastoral systems. Full article
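The predictive-modeling step in the abstract can be sketched with scikit-learn. The spectra and target values below are synthetic assumptions standing in for the field data, and only one of the paper's six model families (Ridge) is shown:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in: 120 samples x 200 VIS-NIR reflectance bands, with
# "biomass" driven by a few informative bands plus noise.
X = rng.uniform(0.0, 1.0, size=(120, 200))
y = 3.0 * X[:, 10] - 2.0 * X[:, 150] + 0.1 * rng.standard_normal(120)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = Ridge(alpha=1.0).fit(X_tr, y_tr)
r2 = r2_score(y_te, model.predict(X_te))
```

In practice each sensor's dataset would be modeled separately, as the authors did, and the covariance of repeated NIR captures added as extra features.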

18 pages, 6700 KB  
Article
NightHawk: A Low-Cost, Nighttime Light Wildfire Observation Platform and Its Radiometric Calibration
by Chase A. Fuller, Steve Tammes, Philip Kaaret, Jun Wang, Carlton H. Richey, Marc Linderman, Emmett J. Ientilucci, Thomas Schnell, William Julstrom, Jarret McElrath, Will Meiners, Jack Kelley and Francis Mawanda
Sensors 2025, 25(7), 2049; https://doi.org/10.3390/s25072049 - 25 Mar 2025
Viewed by 1859
Abstract
We present a low-cost prototype of a visible and near-infrared (VIS-NIR) remote sensing platform, optimized to detect and characterize natural flaming fire fronts from airborne nighttime light (NTL) observations, and its radiometric calibration. It uses commercially available CMOS sensor cameras and filters with roughly 100 nm bandwidths to effectively discriminate burning biomass from other sources of NTL, a critical ability for wildfire monitoring near populated areas. Our filter choice takes advantage of the strong potassium line emission near 770 nm present in natural flaming. The calibrated cameras operate at 20 ms of exposure time and boast radiance measurements with a sensitivity floor, depending on the filter, in the range 3–5 × 106 W m−2 sr−1 nm−1 with uncertainties lower than 5% and dynamic ranges near 3000–4000. An additional exposure time with a tenth of the duration is calibrated and extends the dynamic range by a factor of 10. We show images of a spatially resolved fire front from an airborne observation of flaming biomass within this radiance range. Full article
(This article belongs to the Section Remote Sensors)

22 pages, 6757 KB  
Article
Co-Registration of Multi-Modal UAS Pushbroom Imaging Spectroscopy and RGB Imagery Using Optical Flow
by Ryan S. Haynes, Arko Lucieer, Darren Turner and Emiliano Cimoli
Drones 2025, 9(2), 132; https://doi.org/10.3390/drones9020132 - 11 Feb 2025
Cited by 2 | Viewed by 2015
Abstract
Remote sensing from unoccupied aerial systems (UASs) has witnessed exponential growth. The increasing use of imaging spectroscopy sensors and RGB cameras on UAS platforms demands accurate, cross-comparable multi-sensor data. Inherent errors during image capture or processing can introduce spatial offsets, diminishing spatial accuracy and hindering cross-comparison and change detection analysis. To address this, we demonstrate the use of an optical flow algorithm, eFOLKI, for co-registering imagery from two pushbroom imaging spectroscopy sensors (VNIR and NIR/SWIR) to an RGB orthomosaic. Our study focuses on two ecologically diverse vegetative sites in Tasmania, Australia. Both sites are structurally complex, posing challenging datasets for co-registration algorithms with initial georectification spatial errors of up to 9 m planimetrically. The optical flow co-registration significantly improved the spatial accuracy of the imaging spectroscopy relative to the RGB orthomosaic. After co-registration, spatial alignment errors were greatly improved, with RMSE and MAE values of less than 13 cm for the higher-spatial-resolution dataset and less than 33 cm for the lower resolution dataset, corresponding to only 2–4 pixels in both cases. These results demonstrate the efficacy of optical flow co-registration in reducing spatial discrepancies between multi-sensor UAS datasets, enhancing accuracy and alignment to enable robust environmental monitoring. Full article

21 pages, 20183 KB  
Article
Predicting Particle Size and Soil Organic Carbon of Soil Profiles Using VIS-NIR-SWIR Hyperspectral Imaging and Machine Learning Models
by Karym Mayara de Oliveira, João Vitor Ferreira Gonçalves, Renato Herrig Furlanetto, Caio Almeida de Oliveira, Weslei Augusto Mendonça, Daiane de Fatima da Silva Haubert, Luís Guilherme Teixeira Crusiol, Renan Falcioni, Roney Berti de Oliveira, Amanda Silveira Reis, Arney Eduardo do Amaral Ecker and Marcos Rafael Nanni
Remote Sens. 2024, 16(16), 2869; https://doi.org/10.3390/rs16162869 - 6 Aug 2024
Cited by 9 | Viewed by 4178
Abstract
Modeling spectral reflectance data using machine learning algorithms presents a promising approach for estimating soil attributes. Nevertheless, a comprehensive investigation of the most effective models, parameters, wavelengths, and data acquisition techniques is essential to ensure optimal predictive accuracy. This work aimed to (a) explore the potential of the soil spectral signature obtained in different spectral bands (VIS-NIR, SWIR, and VIS-NIR-SWIR) and, by using hyperspectral imaging and non-imaging sensors, in the predictive modeling of soil attributes; and (b) analyze the accuracy of different ML models in predicting particle size and soil organic carbon (SOC) applied to the spectral signature of different spectral bands. Six soil monoliths, located in the central north region of Parana, Brazil, were collected and scanned via hyperspectral cameras (VIS-NIR camera and SWIR camera) and spectroradiometer (VIS-NIR-SWIR) in the laboratory. The spectral signature of the soils was analyzed and subsequently applied to ML models to predict particle size and SOC. Each set of data obtained by the different sensors was evaluated separately. The algorithms used were k-nearest neighbors (KNN), support vector machine (SVM), random forest (RF), linear regression (LR), artificial neural network (NN), and partial least square regression (PLSR). The most promising predictive performance was observed for the complete VIS-NIR-SWIR spectrum, followed by SWIR and VIS-NIR. Meanwhile, KNN, RF, and NN models were the most promising algorithms in estimating soil attributes for the dataset obtained from both sensors. The general mean R2 (determination coefficient) values obtained using these models, considering the different spectral bands evaluated, were around 0.99, 0.98, and 0.97 for sand prediction, and around 0.99, 0.98, and 0.96 for clay prediction. 
The lower performances, obtained for the datasets from both sensors, were observed for silt and SOC, with R2 results between 0.40 and 0.59 for these models. KNN demonstrated the best predictive performance. Integrating effective ML models with robust sample databases, obtained by advanced hyperspectral imaging and spectroradiometers, can enhance the accuracy and efficiency of soil attribute prediction. Full article
(This article belongs to the Special Issue Remote Sensing for Soil Environments)

14 pages, 13233 KB  
Communication
Radiometric Calibration of the Near-Infrared Bands of GF-5-02/DPC for Water Vapor Retrieval
by Yanqing Xie, Qingyu Zhu, Sifeng Zhu, Weizhen Hou, Liguo Zhang, Xuefeng Lei, Miaomiao Zhang, Yunduan Li, Zhenhai Liu, Yuan Wen and Zhengqiang Li
Remote Sens. 2024, 16(10), 1806; https://doi.org/10.3390/rs16101806 - 20 May 2024
Cited by 6 | Viewed by 1801
Abstract
The GaoFen (GF)-5-02 satellite is one of the new generations of hyperspectral observation satellites launched by China in 2021. The directional polarimetric camera (DPC) is an optical sensor onboard the GF-5-02 satellite. The precipitable water vapor (PWV) is a key detection parameter of DPC. However, the existing PWV data developed using DPC data have significant errors due to the lack of the timely calibration of the two bands (865, 910 nm) of DPC used for PWV retrieval. In order to acquire DPC PWV data with smaller errors, a calibration method is developed for these two bands. The method consists of two parts: (1) calibrate the 865 nm band of the DPC using the cross-calibration method, (2) calibrate the 910 nm band of the DPC according to the calibrated 865 nm band of the DPC. This method effectively addresses the issue of the absence of a calibration method for the water vapor absorption band (910 nm) of the DPC. Regardless of whether AERONET PWV data or SuomiNET PWV data are used as the reference data, the accuracy of the DPC PWV data developed using calibrated DPC data is significantly superior to that of the DPC PWV data retrieved using data before recalibration. This means that the calibration method for the NIR bands of the DPC can effectively enhance the quality of DPC PWV data. Full article

22 pages, 7249 KB  
Article
The Retrieval of Ground NDVI (Normalized Difference Vegetation Index) Data Consistent with Remote-Sensing Observations
by Qi Zhao and Yonghua Qu
Remote Sens. 2024, 16(7), 1212; https://doi.org/10.3390/rs16071212 - 29 Mar 2024
Cited by 44 | Viewed by 15135
Abstract
The Normalized Difference Vegetation Index (NDVI) is widely used for monitoring vegetation status, as accurate and reliable NDVI time series are crucial for understanding the relationship between environmental conditions, vegetation health, and productivity. Ground digital cameras have been recognized as important potential data sources for validating remote-sensing NDVI products. However, differences in the spectral characteristics and imaging methods between sensors onboard satellites and ground digital cameras hinder direct consistency analyses, thereby limiting the quantitative application of camera-based observations. To address this limitation and meet the needs of vegetation monitoring research and remote-sensing NDVI validation, this study implements a novel NDVI camera. The proposed camera incorporates narrowband dual-pass filters designed to precisely separate red and near-infrared (NIR) spectral bands, which are aligned with the configuration of sensors onboard satellites. Through software-controlled imaging parameters, the camera captures the real radiance of vegetation reflection, ensuring the acquisition of accurate NDVI values while preserving the evolving trends of the vegetation status. The performance of this NDVI camera was evaluated using a hyperspectral spectrometer in the Hulunbuir Grassland over a period of 93 days. The results demonstrate distinct seasonal characteristics in the camera-derived NDVI time series using the Green Chromatic Coordinate (GCC) index. Moreover, in comparison to the GCC index, the camera’s NDVI values exhibit greater consistency with those obtained from the hyperspectral spectrometer, with a mean deviation of 0.04, and a relative root mean square error of 9.68%. This indicates that the narrowband NDVI, compared to traditional color indices like the GCC index, has a stronger ability to accurately capture vegetation changes. 
Cross-validation between the camera-derived NDVI and PlanetScope satellite NDVI further confirms that the camera data are suitable for consistency analyses with remote-sensing NDVI products, highlighting their potential for quantitative applications. The research findings emphasize that the novel NDVI camera, based on a narrowband spectral design, not only enables the acquisition of real vegetation index (VI) values but also facilitates the direct validation of vegetation remote-sensing NDVI products. Full article

20 pages, 3010 KB  
Article
Yield Prediction Using NDVI Values from GreenSeeker and MicaSense Cameras at Different Stages of Winter Wheat Phenology
by Sándor Zsebő, László Bede, Gábor Kukorelli, István Mihály Kulmány, Gábor Milics, Dávid Stencinger, Gergely Teschner, Zoltán Varga, Viktória Vona and Attila József Kovács
Drones 2024, 8(3), 88; https://doi.org/10.3390/drones8030088 - 5 Mar 2024
Cited by 25 | Viewed by 7471
Abstract
This work aims to compare and statistically analyze Normalized Difference Vegetation Index (NDVI) values provided by GreenSeeker handheld crop sensor measurements and calculate NDVI values derived from the MicaSense RedEdge-MX Dual Camera, to predict in-season winter wheat (Triticum aestivum L.) yield, improving a yield prediction model with cumulative growing degree days (CGDD) and days from sowing (DFS) data. The study area was located in Mosonmagyaróvár, Hungary. A small-scale field trial in winter wheat was constructed as a randomized block design including Environmental: N-135.3, P2O5-77.5, K2O-0; Balance: N-135.1, P2O5-91, K2O-0; Genezis: N-135, P2O5-75, K2O-45; and Control: N, P, K 0 kg/ha. The crop growth was monitored every second week between April and June 2022 and 2023, respectively. NDVI measurements recorded by GreenSeeker were taken at three pre-defined GPS points for each plot; NDVI values based on the MicaSense camera Red and NIR bands were calculated for the same points. Results showed a significant difference (p ≤ 0.05) between the Control and treated areas by GreenSeeker measurements and Micasense-based calculated NDVI values throughout the growing season, except for the heading stage. At the heading stage, significant differences could be measured by GreenSeeker. However, remotely sensed images did not show significant differences between the treated and Control parcels. Nevertheless, both sensors were found suitable for yield prediction, and 226 DAS was the most appropriate date for predicting winter wheat’s yield in treated plots based on NDVI values and meteorological data. Full article
(This article belongs to the Special Issue Advances of UAV in Precision Agriculture)
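The MicaSense-based NDVI values described above come from the standard band ratio, NDVI = (NIR − Red) / (NIR + Red). A minimal sketch, with hypothetical reflectance values in place of the plot measurements:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, computed element-wise."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

# Hypothetical Red/NIR reflectances at the three GPS points of one plot.
values = ndvi([0.45, 0.50, 0.48], [0.08, 0.07, 0.09])
```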

9 pages, 2941 KB  
Communication
Near-Infrared Fluorescence Imaging Sensor with Laser Diffuser for Visualizing Photoimmunotherapy Effects under Endoscopy
by Toshihiro Takamatsu, Hideki Tanaka and Tomonori Yano
Sensors 2024, 24(5), 1487; https://doi.org/10.3390/s24051487 - 25 Feb 2024
Cited by 2 | Viewed by 2875
Abstract
The drug efficacy evaluation of tumor-selective photosensitive substances was expected to be enabled by imaging the fluorescence intensity in the tumor area. However, fluorescence observation is difficult during treatments that are performed during gastrointestinal endoscopy because of the challenges associated with including the fluorescence filter in the camera part. To address this issue, this study developed a device that integrates a narrow camera and a laser diffuser to enable fluorescence imaging through a forceps port. This device was employed to demonstrate that a laser diffuser with an NIR fluorescence imaging sensor could be delivered through a 3.2 mm diameter port. In addition, fluorescence images of Cetuximab-IR700 were successfully observed in two mice, and the fluorescence intensity confirmed that the fluorescence decayed within 330 s. This device is expected to have practical application as a tool to identify the optimal irradiation dose for tumor-selective photosensitive substances under endoscopy. Full article
(This article belongs to the Special Issue Technology Trends in Fluorescence Detection Based on Biosensor)

14 pages, 7138 KB  
Article
Design and Development of Large-Band Dual-MSFA Sensor Camera for Precision Agriculture
by Vahid Mohammadi, Pierre Gouton, Matthieu Rossé and Kossi Kuma Katakpe
Sensors 2024, 24(1), 64; https://doi.org/10.3390/s24010064 - 22 Dec 2023
Cited by 10 | Viewed by 3190
Abstract
The optimal design and construction of multispectral cameras can markedly reduce the cost of spectral imaging systems and efficiently decrease the amount of image processing and analysis required. Multispectral imaging also provides effective imaging information through higher-resolution images. This study aimed to develop novel multispectral cameras based on Fabry–Pérot technology for agricultural applications such as plant/weed separation, ripeness estimation, and disease detection. Two multispectral cameras were developed, covering the visible (VIS) and near-infrared (NIR) ranges from 380 nm to 950 nm. A monochrome image sensor with a resolution of 1600 × 1200 pixels was used, and two multispectral filter arrays (MSFAs) were developed and mounted on the sensors. The filter pitch was 4.5 μm, and each multispectral filter array consisted of eight bands. Band selection was performed using a genetic algorithm. Maximum RMS values of 0.0740 and 0.0986 were obtained for the VIS and NIR filters, respectively. The spectral response of the filters in the VIS range was strong; in the NIR range, however, the response decreased by half beyond 830 nm. In total, these cameras provide 16 spectral images in high resolution for agricultural purposes. Full article
(This article belongs to the Special Issue Perception and Imaging for Smart Agriculture)
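Band selection with a genetic algorithm, as mentioned in this entry, can be sketched compactly: each individual is a set of candidate band centers, and a fitness function scores how well that set serves the application. The sketch below is a generic illustration, not the authors' implementation; the 5 nm candidate grid and the evenly-spaced-bands fitness target are assumptions chosen for demonstration:

```python
import random

def ga_band_selection(candidates, k, fitness, pop=40, gens=100, pmut=0.3, seed=0):
    """Toy genetic algorithm: choose k distinct band centers from
    `candidates` that minimize `fitness` (lower is better)."""
    rng = random.Random(seed)
    population = [sorted(rng.sample(candidates, k)) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness)
        survivors = population[: pop // 2]           # elitist selection
        children = []
        while len(survivors) + len(children) < pop:
            a, b = rng.sample(survivors, 2)
            pool = sorted(set(a) | set(b))
            child = sorted(rng.sample(pool, k))      # crossover: mix parents' bands
            if rng.random() < pmut:                  # mutation: swap in a fresh band
                free = [c for c in candidates if c not in child]
                child[rng.randrange(k)] = rng.choice(free)
                child.sort()
            children.append(child)
        population = survivors + children
    return min(population, key=fitness)

# hypothetical target: 8 bands spread evenly over 380-950 nm, 5 nm grid
candidates = list(range(380, 951, 5))
ideal = [380 + i * (950 - 380) / 7 for i in range(8)]

def fitness(bands):
    return (sum((b - t) ** 2 for b, t in zip(bands, ideal)) / 8) ** 0.5

best = ga_band_selection(candidates, 8, fitness)
```

In the actual camera design, the fitness would instead score spectral reconstruction error against reference spectra for the target crops, which is where RMS figures such as 0.0740 (VIS) and 0.0986 (NIR) would come from.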

26 pages, 2173 KB  
Article
Uvsq-Sat NG, a New CubeSat Pathfinder for Monitoring Earth Outgoing Energy and Greenhouse Gases
by Mustapha Meftah, Cannelle Clavier, Alain Sarkissian, Alain Hauchecorne, Slimane Bekki, Franck Lefèvre, Patrick Galopeau, Pierre-Richard Dahoo, Andrea Pazmino, André-Jean Vieau, Christophe Dufour, Pierre Maso, Nicolas Caignard, Frédéric Ferreira, Pierre Gilbert, Odile Hembise Fanton d’Andon, Sandrine Mathieu, Antoine Mangin, Catherine Billard and Philippe Keckhut
Remote Sens. 2023, 15(19), 4876; https://doi.org/10.3390/rs15194876 - 8 Oct 2023
Cited by 10 | Viewed by 5180
Abstract
Climate change is undeniably one of the most pressing and critical challenges facing humanity in the 21st century. In this context, monitoring the Earth’s Energy Imbalance (EEI) is fundamental, in conjunction with greenhouse gases (GHGs), in order to comprehensively understand and address climate change. The French Uvsq-Sat NG pathfinder mission addresses this issue through the implementation of a Six-Unit CubeSat, which has dimensions of 111.3 × 36.6 × 38.8 cm in its unstowed configuration. Uvsq-Sat NG is a satellite mission spearheaded by the Laboratoire Atmosphères, Observations Spatiales (LATMOS) and supported by the International Satellite Program in Research and Education (INSPIRE). The launch of this mission is planned for 2025. One of the Uvsq-Sat NG objectives is to ensure the smooth continuity of the Earth Radiation Budget (ERB) measurements initiated via the Uvsq-Sat and Inspire-Sat satellites. Uvsq-Sat NG seeks to achieve broadband ERB measurements using state-of-the-art yet straightforward technologies. Another goal of the Uvsq-Sat NG mission is to conduct precise and comprehensive monitoring of atmospheric gas concentrations (CO2 and CH4) on a global scale and to investigate their correlation with Earth’s Outgoing Longwave Radiation (OLR). Uvsq-Sat NG carries several payloads, including Earth Radiative Sensors (ERSs) for monitoring incoming solar radiation and outgoing terrestrial radiation. A Near-Infrared (NIR) Spectrometer is onboard to assess GHG atmospheric concentrations through observations in the wavelength range of 1200 to 2000 nm. Uvsq-Sat NG also includes a high-definition camera (NanoCam) designed to capture images of the Earth in the visible range. The NanoCam will facilitate post-processing of the data acquired via the spectrometer by ensuring accurate geolocation of the observed scenes. It will also offer the capability of observing the Earth’s limb, thus providing the opportunity to roughly estimate the vertical temperature profile of the atmosphere.
We present here the scientific objectives of the Uvsq-Sat NG mission, along with a comprehensive overview of the CubeSat platform’s concepts and payload properties, as well as the mission’s current status. Furthermore, we describe a method for the retrieval of atmospheric gas columns (CO2, CH4, O2, H2O) from the Uvsq-Sat NG NIR Spectrometer data. The retrieval is based on spectra simulated for a range of environmental conditions (surface pressure, surface reflectance, vertical temperature profile, mixing ratios of primary gases, water vapor, other trace gases, and cloud and aerosol optical depth distributions) as well as spectrometer characteristics (Signal-to-Noise Ratio (SNR) and spectral resolution from 1 to 6 nm). Full article
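Retrieval simulations of this kind require degrading high-resolution model spectra to the instrument's spectral resolution and noise level. A minimal sketch, assuming a Gaussian instrument line shape whose FWHM equals the quoted resolution and Gaussian noise scaled by the SNR; the absorption line, wavelength grid, and parameter values below are made up for illustration:

```python
import math
import random

def instrument_spectrum(wl, spectrum, resolution_nm, snr, seed=0):
    """Degrade a high-resolution spectrum to instrument resolution
    (Gaussian convolution with FWHM = resolution_nm) and add Gaussian
    noise at the given signal-to-noise ratio."""
    rng = random.Random(seed)
    sigma = resolution_nm / (2.0 * math.sqrt(2.0 * math.log(2.0)))  # FWHM -> sigma
    step = wl[1] - wl[0]
    half = int(3 * sigma / step) + 1
    kernel = [math.exp(-0.5 * ((i * step) / sigma) ** 2)
              for i in range(-half, half + 1)]
    norm = sum(kernel)
    out = []
    for i in range(len(spectrum)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - half, 0), len(spectrum) - 1)  # clamp at edges
            acc += w * spectrum[idx]
        smooth = acc / norm
        noise = rng.gauss(0.0, smooth / snr) if snr else 0.0
        out.append(smooth + noise)
    return out

# synthetic absorption line at 1600 nm on a flat continuum, 0.2 nm grid
wl = [1590.0 + 0.2 * i for i in range(100)]
hires = [1.0 - 0.8 * math.exp(-0.5 * ((w - 1600.0) / 0.5) ** 2) for w in wl]
obs = instrument_spectrum(wl, hires, resolution_nm=3.0, snr=200)
```

Convolving to a 3 nm FWHM visibly shallows the narrow line while noise perturbs the continuum, which is the resolution/SNR trade-off a 1 to 6 nm retrieval study has to explore.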
