Article

All-Weather Drone Vision: Passive SWIR Imaging in Fog and Rain

1 Quantum Advanced Solutions Ltd., Oxford OX5 1RD, UK
2 Department of Computer Science & Engineering, University of Minnesota, Minneapolis, MN 55455, USA
3 Topodrone SA, 1820 Montreux, Switzerland
* Author to whom correspondence should be addressed.
The authors contributed equally to this work.
Drones 2025, 9(8), 553; https://doi.org/10.3390/drones9080553
Submission received: 22 June 2025 / Revised: 27 July 2025 / Accepted: 4 August 2025 / Published: 7 August 2025
(This article belongs to the Section Drone Design and Development)

Abstract

Short-wave-infrared (SWIR) imaging can extend drone operations into fog and rain, yet the optimum spectral strategy remains unclear. We evaluated a drone-borne quantum-dot SWIR camera inside a climate-controlled tunnel that generated calibrated advection fog, radiation fog, and rain. Images were captured with a broadband 400–1700 nm setting and three sub-band filters, each at four lens apertures (f/1.8–5.6). Entropy, structural-similarity index (SSIM), and peak signal-to-noise ratio (PSNR) were computed for every weather–aperture–filter combination. Broadband SWIR consistently outperformed all filtered configurations. The gain stems from higher photon throughput, which outweighs the modest scattering reduction offered by narrowband selection. Under passive illumination, broadband SWIR therefore represents the most robust single-camera choice for unmanned aerial vehicles (UAVs), enhancing situational awareness and flight safety in fog and rain.

1. Introduction

Unmanned aerial vehicles (UAVs) have become increasingly valuable for a wide range of civil and industrial applications, including infrastructure inspection, emergency response, precision agriculture, and surveillance. Many of these tasks depend on reliable visual perception systems that enable drones to operate safely and autonomously. However, environmental conditions such as fog, haze, and rain significantly degrade visibility in the visible spectrum, reducing image contrast and obscuring critical scene details. These challenges pose serious limitations to drone-based operations, especially in time-sensitive or safety-critical scenarios.
To overcome the limitations of visible-band imaging in degraded visual environments (DVEs), imaging systems operating in the short-wave infrared (SWIR) band, which typically spans wavelengths from 900 to 2500 nm, have received growing attention. SWIR wavelengths are scattered less by small water droplets than visible light, enabling clearer imaging through fog and haze due to favorable behavior in the Rayleigh and Mie scattering regimes [1]. Furthermore, the SWIR spectral region contains atmospheric transmission “windows” where absorption by water vapor is relatively low, most notably between 1.5 and 1.7 µm, allowing light to propagate with reduced attenuation in humid and foggy conditions [1]. Unlike thermal cameras, which image emitted heat and often lack fine detail, SWIR sensors capture reflected light much like visible cameras, offering sharper textures and improved scene comprehension when ambient illumination or active lighting is available [2].
Numerous studies have validated these advantages. Driggers et al. [1] conducted a foundational comparison of SWIR, visible, and near infrared (NIR) imaging under foggy conditions and found that longer wavelengths provided superior target/background contrast, haze penetration, and long-range detection. Lang et al. [2] demonstrated in multi-band experiments that SWIR imagery maintains high signal-to-noise ratio (SNR) and scene clarity over distances up to 20 km in haze, while visible images deteriorate quickly. Jobert et al. [3] further confirmed SWIR’s advantage over visible sensors in long-range field trials under diverse atmospheric conditions, emphasizing SWIR’s resilience in real-world scenarios. In the context of autonomous navigation, Judd et al. [4] compared SWIR, visible, long-wave infrared (LWIR), and light detection and ranging (LiDAR) under artificial fog and concluded that longer wavelengths may extend visibility. St-Laurent et al. [5] further demonstrated that combining SWIR with NIR and LWIR sensors improved navigation safety in snowy and foggy driving conditions.
Although fog has been more extensively studied, rain presents an equally significant obstacle to optical systems due to strong scattering and glare from larger water droplets. In controlled experiments, Willitsford et al. [6] demonstrated that a range-gated active SWIR system using 1550 nm laser illumination could suppress near-field backscatter and form clear images of targets at 10 km, even when ambient visibility dropped to 3–4 km. Similarly, automotive testing has shown that SWIR cameras suffer less degradation in light rain compared to red-green-blue (RGB) systems [7]. These results indicate that SWIR offers both passive and active imaging potential under rainfall conditions. Additionally, Sheeny et al. [8] explored polarization-based imaging in the thermal IR for degraded weather conditions, highlighting the potential of multispectral systems—including SWIR—for robust visibility under complex atmospheric scenarios.
Complementing hardware advances, algorithmic techniques have been developed to enhance SWIR utility in fog. Duan et al. [9] proposed a dual-band defogging model that fuses SWIR and visible images to reconstruct obscured scene details. Their approach uses the deeper penetration of SWIR imagery to guide the enhancement of visible-light features, resulting in improved clarity over traditional single-band defogging methods. Meanwhile, Pavlović et al. [10] developed a deep learning framework that enables accurate object detection in SWIR imagery captured under fog and haze by applying cross-spectral training and domain adaptation. These works demonstrate how image processing can leverage SWIR’s physical advantages to support automated perception in degraded environments.
Despite these developments, little research has been conducted on the quantitative performance of SWIR sensors mounted on UAVs operating in realistic fog and rain conditions. Most prior work has been ground-based or focused on automotive or manned systems [1,2,3,4,6,7,10]. While drone-compatible SWIR cameras have become increasingly available [11], their operational capabilities in repeatable, controlled atmospheric conditions remain poorly documented. This represents a key knowledge gap in enabling UAV missions in adverse weather.
In this study, we address that gap through a systematic evaluation of a SWIR imaging system tested in the CEREMA PAVIN tunnel—a unique full-scale climate test facility designed to simulate realistic fog (advection and radiative) and rain environments under controlled conditions (Figure 1a) [12]. This work tests three interrelated hypotheses regarding passive VNIR–SWIR imaging in fog and rain from an unmanned platform: (i) within the 0.4–1.7 µm window, specific sub-bands should yield higher perceptual detail than a broadband collection once scattering, absorption, and sensor throughput are considered; (ii) practical image quality requires sufficient photon flux, and narrow sub-bands, which block much of the incoming light, risk approaching this photon-starvation limit in low-illumination fog; (iii) stopping the lens down modestly can reduce optical aberrations and flare, potentially yielding a perceived-contrast gain in fog or rain. We evaluated a Q.Fly drone payload (Figure 1b) with integrated colloidal quantum dot (CQD) image sensors, which offer greater affordability and simpler/broader spectral tunability compared to epitaxial infrared sensors (Figure 1c). Unlike conventional SWIR systems, the camera core is an ultracompact, thermoelectric cooler (TEC)-less module that redefines size, weight, and power (SWaP) for field-deployable short-wave infrared imaging in VTOL UAV applications. We analyzed image quality across different lens apertures and spectral filters, using quantitative metrics such as entropy, structural similarity index, and peak signal-to-noise ratio. Our results provide one of the first controlled, quantitative assessments of UAV-mounted SWIR performance in low-visibility environments and support the adoption of broadband SWIR systems for robust, all-weather drone operations.

2. Materials and Methods

2.1. SWIR Camera

The quantum dot SWIR camera (Q.Cam, Quantum Advanced Solutions Ltd., Oxford, UK), as part of the UAV payload (Q.Fly, Quantum Advanced Solutions Ltd., Oxford, UK), was chosen owing to its ultracompact (35 × 25 × 25 mm³), lightweight (30 g), and power-efficient (1.3 W) core based on a 640 × 512-pixel front-illuminated and uncooled focal plane array with 5 µm pitch and a broad 400–1700 nm (VNIR-SWIR) spectral response, operated in global-shutter mode at 30–60 fps [13]. A SWIR-optimized lens with a focal length of 50 mm and a manual f-stop was paired with the sensor, providing a field of view (FOV) of 3.7°, suitable for long-range imaging. Under fixed-exposure, shot-noise-limited conditions, the SNR is proportional to the pupil diameter (D) and thus inversely proportional to the f-number. Accordingly, stopping down from f/1.8 to f/5.6 is expected to reduce the SNR by roughly 3.1-fold (the ratio of the two f-numbers). For outdoor reliability, the camera performs on-board corrections (bad-pixel replacement and two-point non-uniformity corrections for gain, offset, illumination, and temperature) in real time. The Q.Fly payload is a lightweight (600 g) multi-sensor module engineered for DJI Matrice 300/350/400 RTK platforms. The SWIR channel accepts quick-swap spectral filters, allowing band-selective acquisition. Three discrete filters were chosen to probe complementary regions of the SWIR spectrum while respecting the sensor’s responsivity curve: (i) 900 nm longpass captures both the first and second SWIR atmospheric windows and sits just beyond the visible/NIR transition; (ii) 1450 nm longpass marks a second atmospheric window and avoids a water-vapor absorption peak; (iii) 1550 nm narrowband coincides with the eye-safe telecom line used by most SWIR range-gated illuminators and drone-borne LiDARs. The SWIR camera is co-aligned with an RGB and thermal module for all-weather situational awareness. The payload is also equipped with a PPK GNSS module for precise geo-tagging.
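As a quick sanity check of the optics figures quoted above, the following minimal Python sketch (illustrative only, not the authors' calibration code) recovers the stated ~3.7° field of view from the 640 × 512-pixel, 5 µm-pitch array behind the 50 mm lens, and the ~3.1-fold shot-noise-limited SNR penalty of stopping down from f/1.8 to f/5.6.

import math

# Sensor and lens figures as quoted in the text
pitch_um, cols, focal_mm = 5.0, 640, 50.0
sensor_width_mm = cols * pitch_um * 1e-3                        # 3.2 mm active width
hfov_deg = 2 * math.degrees(math.atan(sensor_width_mm / (2 * focal_mm)))
print(f"horizontal FOV ~ {hfov_deg:.1f} deg")                   # ~3.7 deg, matching the text

# Fixed exposure, shot-noise-limited: SNR scales with pupil diameter D = f/N,
# i.e., inversely with the f-number N.
snr_penalty = 5.6 / 1.8
print(f"SNR penalty from f/1.8 to f/5.6 ~ {snr_penalty:.1f}x")  # ~3.1x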

2.2. Experiment Setup

The experimental pipeline consisted of a spectrally tunable payload mounted on a drone, climatic tunnel cycles producing calibrated rain and fog, varied optical settings of the camera, and offline batch computation of image quality metrics over multiple image stacks. All imaging trials were carried out inside CEREMA’s PAVIN climatic tunnel—a 50 m long enclosed test facility equipped to generate repeatable fog and rain environments under tightly controlled temperature, droplet-size, and droplet-density conditions [12]. A DJI Matrice 350 RTK UAV (SZ DJI Technology Co., Ltd., Shenzhen, China) was positioned on a static landing pad 100 cm above the floor at one portal of the tunnel and fitted with the Q.Fly payload described above. Stand-alone cameras were mounted on a tripod. The SWIR camera recorded image frames for selected runs (Figure 2). The distance between the cameras and targets was approximately 50 m.
Two fog regimes were produced in separate test blocks: advection (sea/maritime) fog and radiation (ground/continental) fog. The advection fog, generated with demineralized water, featured droplet radii with peaks at 0.8 µm and 8 µm, whereas the radiative fog, generated with mineralized (tap) water, had droplet radii peaking at approximately 0.5 µm. Each fog type was stepped through calibrated meteorological visibilities of 20, 40, 60, and 80 m, as measured by Konica-Minolta T-10A illuminance meters (Konica Minolta, Tokyo, Japan) installed at mid-tunnel. Rain trials followed, using the overhead sprinkler array to deliver uniform intensities of 30, 120, 180, and 250 mm/h. A high-contrast target, a human-silhouette, and road signs were mounted at the back of the tunnel. For every distance/obscurant combination, we captured images with four SWIR filter settings (broadband, 900 nm, 1450 nm, and 1550 nm) and variable f-stop (f/1.8, 2.8, 4.0, and 5.6). Environmental data (air temperature, relative humidity, droplet density, and rainfall rate) were logged. Representative images were saved in 16-bit TIFF, then processed offline to compute image quality metrics and target-contrast curves as a function of visibility. This controlled tunnel protocol ensured that all cameras viewed identical, repeatable fog and rain scenes, providing a robust basis for the performance comparisons reported in the following sections.

2.3. Image Evaluation and Content Loss

The selection of appropriate image quality assessment (IQA) metrics is crucial for objectively evaluating the performance of imaging systems and processing algorithms, particularly in challenging environments such as those involving SWIR cameras under varying weather and visibility conditions. Among the myriad of IQA metrics, entropy, SSIM, and PSNR have been widely adopted due to their distinct yet complementary approaches to quantifying image information.
Entropy (Equation (1)), rooted in information theory, provides a measure of the richness or randomness of information within an image. A higher entropy value generally indicates a greater amount of detail and textural complexity, which can be particularly relevant when assessing images captured in conditions where information might be obscured or degraded, such as in foggy [14] and rainy conditions [15]. Its application in computer vision and image processing allows for the quantification of information content, which is essential for tasks such as segmentation and feature extraction, making it a valuable tool for understanding the impact of environmental factors on image information.
$$H(x) = -\sum_{x} P(x)\,\log_2 P(x) \tag{1}$$
where $x$ denotes a pixel intensity level and $P(x)$ is the probability of that intensity occurring in the image (its normalized histogram frequency).
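For concreteness, a minimal NumPy implementation of Equation (1) is sketched below; it assumes a single-channel frame supplied as a NumPy array and computes P(x) from the normalized intensity histogram (an illustrative sketch, not the authors' processing code).

import numpy as np

def image_entropy(img: np.ndarray, bins: int = 256) -> float:
    """Shannon entropy (bits) of an image, per Equation (1)."""
    hist, _ = np.histogram(img, bins=bins)
    p = hist / hist.sum()          # P(x): probability of each intensity bin
    p = p[p > 0]                   # drop empty bins so that 0*log2(0) is treated as 0
    return float(-np.sum(p * np.log2(p)))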
While entropy focuses on the statistical distribution of pixel intensities, the structural similarity index (SSIM) (Equation (2)) offers a perceptually oriented assessment by comparing the structural information, luminance, and contrast between a reference and a distorted image. Developed as an improvement over traditional metrics like mean squared error (MSE), SSIM aims to mimic the human visual system’s ability to perceive structural similarities, making it a more reliable indicator of perceived image quality [16,17]. Its widespread use in evaluating the effects of compression, noise, and blur aligns well with the challenges encountered in outdoor imaging scenarios [18,19,20].
$$\mathrm{SSIM}(x,y) = \frac{(2\mu_x\mu_y + C_1)(2\sigma_{xy} + C_2)}{(\mu_x^2 + \mu_y^2 + C_1)(\sigma_x^2 + \sigma_y^2 + C_2)} \tag{2}$$
where $x$ and $y$ are the two images, $\mu_x$ and $\mu_y$ are the average pixel intensities, $\sigma_{xy}$ is the covariance of pixel intensities between the two images, $\sigma_x^2$ and $\sigma_y^2$ are the variances of the pixel intensities, and $C_1$ and $C_2$ are small constants preventing numerical instability.
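A single global SSIM value per Equation (2) can be sketched as follows; library implementations such as skimage.metrics.structural_similarity instead evaluate the index over local windows and average it, and the constants below follow the common convention C1 = (0.01·L)² and C2 = (0.03·L)² for a data range L (an assumption for illustration, not necessarily the authors' exact settings).

import numpy as np

def global_ssim(x: np.ndarray, y: np.ndarray, data_range: float = 65535.0) -> float:
    """Global SSIM of two equally sized images, per Equation (2)."""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    return float(((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2))
                 / ((mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)))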
Complementing these, the peak signal-to-noise ratio (PSNR) (Equation (3)) remains a widely used metric due to its simplicity and computational efficiency, providing a quantitative measure of the difference between an original and a processed (distorted) image. PSNR is defined via MSE and represents the ratio between the maximum possible power of a signal and the power of corrupting noise [21]. Although PSNR’s correlation with human perception can sometimes be limited, especially when compared to SSIM, it serves as a valuable benchmark for signal fidelity and is particularly useful for quantifying the level of noise or distortion introduced by various factors, including atmospheric conditions or sensor limitations. Its common application in evaluating lossy compression and image reconstruction quality makes it a standard metric for assessing the raw signal integrity in diverse imaging applications [18,19].
$$\mathrm{PSNR} = 20\cdot\log_{10}\!\left(\frac{\mathrm{MAX}_{x,y}}{\sqrt{\mathrm{MSE}(x,y)}}\right) \tag{3}$$
where $x$ and $y$ are the two images, $\mathrm{MSE}(x,y)$ is the mean squared error between them, and $\mathrm{MAX}_{x,y}$ is the maximum pixel intensity of the two images. The SSIM and PSNR were calculated relative to the clear image, without fog or rain applied.
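Equation (3) can likewise be sketched in a few lines; here the reference y is the clear (fog- and rain-free) frame described above, and MAX is taken as the largest intensity observed in either image, following the definition in the text (an illustrative sketch, not the authors' code).

import numpy as np

def psnr(x: np.ndarray, y: np.ndarray) -> float:
    """PSNR in dB between a degraded image x and a clear reference y, per Equation (3)."""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    mse = np.mean((x - y) ** 2)
    if mse == 0:
        return float("inf")                     # identical images
    max_val = max(x.max(), y.max())
    return float(20.0 * np.log10(max_val / np.sqrt(mse)))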
Entropy, SSIM, and PSNR were chosen because together they capture complementary aspects of image fidelity: statistical information content, perceptual similarity, and signal integrity. Entropy traces back to Shannon’s information theory [20] and quantifies the average surprise of pixel intensities; it rises with scene detail and falls as fog or rain obliterates texture [19]. SSIM, proposed by Wang et al. [16], compares local luminance, contrast, and structure, closely mirroring human visual perception. PSNR remains the de facto engineering metric for signal degradation; although less correlated with perception than SSIM, it provides an intuitive dB-scale measure of noise or blur relative to full-scale intensity [20,22]. In the literature, entropy, SSIM, and PSNR have been routinely applied to assess imagery quality in adverse-visibility research, including various UAV systems [23,24,25,26].
It is important to note that higher entropy does not necessarily indicate better visual quality in this context; it merely quantifies complexity. While both SSIM and PSNR improve with image quality (higher is better), they respond to distortions differently. By analyzing all three, we obtain a holistic view of how SWIR settings preserve (or lose) information as visibility degrades. The combined use of entropy, SSIM, and PSNR thus offers a multifaceted evaluation, capturing aspects of information content, perceptual similarity, and signal fidelity, all of which are critical for a thorough analysis of image quality under varying environmental conditions.
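Putting the three metrics together, the offline batch computation outlined in Section 2.2 could be organized along the lines of the sketch below; the directory layout, file naming, and use of the tifffile reader are assumptions for illustration, and image_entropy, global_ssim, and psnr are the helper functions sketched above.

import csv
from pathlib import Path

import tifffile   # assumed reader for the 16-bit TIFF frames

def evaluate_runs(run_dir: str, out_csv: str = "metrics.csv") -> None:
    """Score every degraded frame against the clear reference taken with the same settings."""
    rows = []
    for frame_path in sorted(Path(run_dir).glob("*/*.tif")):      # e.g. rain_120mmh/f2.8_1550nm.tif
        if frame_path.parent.name == "clear":
            continue                                              # skip the reference frames themselves
        ref_path = Path(run_dir) / "clear" / frame_path.name      # assumed clear-scene reference
        frame = tifffile.imread(frame_path)
        reference = tifffile.imread(ref_path)
        rows.append({
            "frame": str(frame_path),
            "entropy": image_entropy(frame),        # Equation (1)
            "ssim": global_ssim(frame, reference),  # Equation (2)
            "psnr": psnr(frame, reference),         # Equation (3)
        })
    with open(out_csv, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)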

3. Results

The selected metrics were derived from images captured with various optical filters (including broadband, 900 nm, 1450 nm, and 1550 nm) and apertures (f/1.8, 2.8, 4.0, and 5.6) under three types of weather conditions: rain, advection fog, and radiation fog. In practice, very high f-numbers, although they improve depth of focus, significantly reduce light intake and may require longer exposures—an important trade-off for real-time UAV operations. Therefore, we limited our analysis to moderate aperture values. Additionally, the captured images may contain various types of noise and optical distortions (e.g., lens aberrations, sensor noise, etc.), which can affect the computed values of entropy, SSIM, and PSNR and should be considered when interpreting the results.

3.1. Rain Conditions

Under rain conditions, SWIR imaging performance varied with filter configuration, exhibiting both expected trends and some anomalies. Figure 3 compares narrowband and broadband SWIR imagery during light, moderate, and heavy rain. Performance metrics were computed for all four filter types at four aperture settings. Overall, the broadband (unfiltered) and narrowband-filtered SWIR imagery provided consistent visibility of the target scene across rain intensities. Longpass SWIR did show occasional advantages in rain, but no firm conclusion can be drawn, as any advantage likely depends on other optical settings. These mixed results in rain highlight that while broadband SWIR performs robustly overall, there are scenarios where a SWIR filter can enhance specific image features. Still, during moderate rain conditions, both narrowband and broadband SWIR suffered degradation according to all the metrics, underscoring that a high density of small droplets and multiple scattering can overwhelm any passive optical imaging system. More generally, these findings highlight the importance of properly choosing camera settings for a particular application, an issue rarely discussed in the literature. Previous studies have generally examined only a narrow range of parameters without varying camera settings [15].

3.2. Advection Fog

Results for advection fog (a maritime-type fog with relatively large droplet sizes) showed modest differences between broadband and narrowband SWIR imagery (Figure 4). The SWIR cameras with all the varied settings were able to penetrate this fog to a degree, revealing objects that were barely perceptible in the visible-light reference at 40 m visibility; since the tunnel was 50 m long, some data points were omitted. Quantitatively, however, the image quality metrics in advection fog did not strongly favor either configuration. In fact, the faster optics slightly outperformed the slower ones under the most severe advection fog conditions. We attribute this to the fully open aperture collecting more light, thus avoiding the signal loss incurred with a partially closed lens. Contrary to our hypothesis (iii) that stopping down might improve image contrast, in advection fog, the wider aperture (f/1.8) produced better metrics due to greater photon collection. These results suggest that, for fog with larger droplets and moderate optical depth, neither a narrowband SWIR filter nor different aperture settings provided significant benefit. The broadband SWIR and fast optics already capitalize on the reduced scattering of longer wavelengths (relative to visible light) and benefit from a higher overall photon count, yielding comparable clarity in advection fog without the need for spectral filtering.

3.3. Radiation Fog

In radiation fog (a ground/continental fog with typically smaller droplet sizes but often high number density), the broadband SWIR configuration produced higher entropy, SSIM, and (in part) PSNR values than the narrower-band filters. Figure 4 presents all the metrics for the target images under varying radiation fog densities. Entropy and SSIM demonstrate that the broadband approach preserved image features more effectively, yielding the top values of all configurations in this fog type. However, as the visibility decreased, the difference between filters became less significant. Initially, we hypothesized that the narrowband filters might excel in radiation fog due to reduced scattering at those longer wavelengths. However, the data show the opposite effect. Nevertheless, considering the amount of light being blocked by filtering, one could argue that the 1550 nm filter showed the lowest relative decrease in each metric as radiation fog visibility decreased, which might be owing to less pronounced scattering. Notably, these findings apply to moderate fog conditions (visibility between 40 and 80 m) and limited illumination, essentially a photon-starved environment. Under dense fog (≤20 m), both SWIR and visible bands become saturated by scattering, leading to a near-complete white-out. In this regime, scattering becomes largely wavelength-independent, overwhelming any advantages from aperture or spectral filtering.

3.4. Statistical Evaluation

To assess whether the observed differences in image quality are statistically meaningful, we applied non-parametric procedures that do not assume normality or equal variances. For each image-quality metric (PSNR, entropy, and SSIM) we first ran a Kruskal–Wallis H-test to evaluate the null hypothesis that the metric comes from the same distribution across the levels of each experimental factor—weather condition, aperture, and spectral filter. Whenever a Kruskal–Wallis test was significant (α = 0.05), we performed post hoc Dunn pairwise comparisons with Benjamini–Hochberg false-discovery-rate (FDR) correction to control the expected proportion of false positives within that factor.
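The procedure can be reproduced along the following lines with SciPy and the scikit-posthocs package (the per-image CSV file and column names are assumptions for illustration, not the authors' actual analysis script).

import pandas as pd
import scikit_posthocs as sp
from scipy.stats import kruskal

df = pd.read_csv("image_quality_metrics.csv")   # hypothetical table: one row per captured image

for metric in ["psnr", "entropy", "ssim"]:
    for factor in ["weather", "aperture", "filter"]:
        groups = [g[metric].values for _, g in df.groupby(factor)]
        h_stat, p = kruskal(*groups)             # Kruskal-Wallis H-test across factor levels
        print(f"{metric} ~ {factor}: H = {h_stat:.2f}, p = {p:.4g}")
        if p < 0.05:                             # follow up only significant factors
            dunn = sp.posthoc_dunn(df, val_col=metric, group_col=factor,
                                   p_adjust="fdr_bh")   # Dunn pairwise tests, Benjamini-Hochberg FDR
            print(dunn.round(4))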
PSNR. Weather exerted a very strong effect on PSNR (Kruskal–Wallis p < 0.0001). Dunn tests confirmed that advection fog differed significantly from both radiation fog and rain (both adjusted p < 0.0001). Aperture and filter showed no significant influence on PSNR (p = 0.754 and p = 0.176, respectively).
Entropy. Aperture had a moderate main effect on entropy (p = 0.022), driven by lower entropy at f/5.6 compared with f/1.8 (p = 0.044) and f/2.8 (p = 0.024). Filter produced a highly significant main effect (p < 0.0001); key pairwise differences included broadband vs. 1450 nm and broadband vs. 1550 nm (both p < 0.0001) and 900 nm vs. 1450 nm (p = 0.0036). Weather had no significant impact on entropy (p = 0.252).
SSIM. Weather again showed a significant main effect (p = 0.0077), with advection fog differing from rain (p = 0.0067). Filter was significant as well (p = 0.0041); broadband vs. 1550 nm (p = 0.0081) and 1450 nm vs. 1550 nm (p = 0.042) were the principal contrasts. Aperture did not significantly influence SSIM (p = 0.577).
Overall, weather consistently governed PSNR and SSIM, whereas aperture and filter exerted metric-dependent effects. The divergent sensitivity of PSNR, SSIM, and entropy underscores their distinct formulations and justifies their joint use when quantifying content degradation under adverse conditions.

4. Discussion

The experimental results indicate that the broadband SWIR approach offers consistently high image quality in diverse obscurant conditions, calling for a reinterpretation of earlier assumptions about narrowband performance in fog. In fact, the broadband configuration marginally outperformed the narrowband filter in small-particle fog, which we interpret as evidence that maximizing photon capture generally improves image fidelity when scattering is not extreme. In large-droplet fog, however, the experiments produced mixed results. This finding shows that the narrowband’s theoretical advantage in fine-particle fog does not necessarily translate to superior real-world performance under our test conditions. Notably, across every spectral and optical setting, all quality metrics (entropy, SSIM, and PSNR) registered their highest values in rain. As theory predicts, the larger droplets and lower optical density of rain preserved more information.
Our tunnel results align with and, in several cases, extend the trends reported in recent flight trials and multispectral studies. Lang et al. [2] showed that single-band 1550 nm SWIR improves haze penetration versus RGB, yet their outdoor data also hinted at photon starvation, an effect we quantify here through the broadband–narrowband gap. Likewise, the multisensor benchmark of Bijelic et al. [14] showed that SWIR degrades more slowly than the visible band as the liquid-water path increases in a controlled fog chamber, mirroring our trends. Conversely, the active range-gated system of Willitsford et al. [6] achieved superior rain penetration by injecting 1550 nm laser power, highlighting that passive broadband and active narrowband are complementary rather than contradictory approaches. Taken together, these cross-study comparisons reinforce our conclusion that photon throughput, whether provided by broad passive collection or active illumination, dominates SWIR performance in degraded visual environments.
A key factor in broadband SWIR’s superior performance is likely the greater total light throughput and the associated improvement in SNR. The broadband sensor captures a wide range of wavelengths simultaneously, delivering a higher overall photon count to the detector. Even though shorter SWIR wavelengths within that band may experience more scattering, the sheer increase in collected light can enhance image signal and reduce noise after image processing. In contrast, the narrowband filters restrict the sensor to a much narrower slice of the spectrum, dramatically reducing the incoming light. In the low-illumination, high-scattering environment of fog, this reduced signal can lead to a lower SNR, effectively eroding image quality. Thus, any scattering reduction benefit at 1550 nm may be offset by the narrowband’s poorer photon statistics. In addition, no significant improvement in fog/rain image clarity was observed from stopping down the lens; on the contrary, the loss of light with higher f-number hurt SNR more than it helped image quality. This trade-off should be acknowledged when interpreting the image quality metrics in our results.
Radiation fog is characterized by a high concentration of small droplets, so one would expect longer wavelengths to have an edge. However, many of those droplets are still on the order of a few microns in diameter, meaning both 1.55 µm and the shorter wavelengths in the SWIR range undergo Mie scattering rather than Rayleigh scattering. The difference in scattering cross-section may not be large enough to significantly boost image clarity on its own. In addition, the spectral region around 1.4 µm coincides with a strong water absorption band in the atmosphere; a moisture-rich fog could further attenuate the narrowband signal. The broadband camera, by covering a continuum of wavelengths, likely captures portions of the spectrum with better transmission, for example, around 0.9–1.2 µm where water absorption is lower. This means the broadband system inherently leverages the penetration benefits of longer wavelengths and gains extra signal from wavelengths that are less affected by absorption, yielding a more robust image overall.
Revisiting our initial theoretical assumptions in light of these findings, we see that spectral optimization alone (choosing a single wavelength to minimize scattering) is not guaranteed to maximize image quality. These results underscore the importance of considering total system performance for real-world applications, as any theoretical gains in clarity from spectral filtering may be negated by the reduction in SNR. Ultimately, the consistent outperformance of the broadband approach suggests that collecting a wider spectral range is a more reliable approach for maintaining visibility in inclement conditions, provided that chromatic aberration is properly managed. Importantly, these conclusions apply primarily to passive imaging; systems using active illumination or other modalities may exhibit different wavelength-dependent behavior. It should be noted that adequate denoising models tailored to SWIR imaging, particularly for low-illumination and high-scattering environments, and image fusion with longer-wavelength bands such as MWIR and LWIR, or with SWIR LiDAR, have the potential to significantly enhance imagery outputs [27,28].

5. Conclusions

The main contributions of this study are the following: (1) the first controlled tunnel experiment comparing broadband vs. narrowband passive SWIR on a UAV in fog and rain; (2) quantitative evidence that broadband SWIR yields higher image quality than any single sub-band at moderate visibilities; (3) analysis of lens aperture effects, showing an approximately 3.1-fold SNR drop when stopping down from f/1.8 to f/5.6; and (4) an established baseline of SWIR image-quality metrics (SSIM, PSNR, and entropy) in degraded weather for the UAV-vision research community.
Broadband SWIR imaging emerged as a robust performer across all adverse visibility conditions examined in this study, demonstrating relatively high image-quality metrics in heavy rain, advection fog, and radiation fog alike, and in most cases outperforming the narrowband SWIR filters. Although all systems collapse in extremely dense fog (<20 m visibility), the broadband SWIR sustains a measurable advantage in moderate conditions, as quantified by SSIM, PSNR, and entropy, in the scientifically reproducible environment of the climatic CEREMA PAVIN tunnel. This finding revises our understanding of SWIR imaging in fog: it shows that casting a wider “net” over the SWIR spectrum can yield better results than isolating a single wavelength, even in situations where that wavelength was theoretically favored. It is noteworthy that the conclusions are drawn using a CQD imager and that the results could differ from those of other sensor technologies, though the qualitative trend (photon throughput matters) likely remains.
In summary, our results affirm that broadband VNIR-SWIR is a preferable strategy for enhancing visibility in fog and rain. By capturing a broad span of wavelengths, the broadband system maximizes light throughput and maintains higher image signal levels, preserving scene details more effectively under scattering conditions. This insight is valuable for the design of future imaging systems and visibility enhancement technologies, underscoring that maximizing SNR through broad spectral capture can outweigh the benefits of spectral selectivity in real-world fog and rain. We understand that other techniques such as polarization, active illumination, and multispectral fusion may further enhance visibility in ways that a passive broadband imager alone cannot; hence, future work will leverage this dataset to train multi-band fusion algorithms that jointly exploit RGB, SWIR, and LWIR imagery and to prototype tunable SWIR sensors that adapt their spectral response to changing weather.

Author Contributions

Conceptualization, A.B. and R.W.; formal analysis, A.R. and G.S.; funding acquisition, M.L.; investigation, A.B., R.W. and I.M.-S.; methodology, R.W.; resources, I.M.-S. and I.S.; software, D.G.; supervision, A.B.; validation, I.M.-S. and I.S.; visualization, A.R.; writing—original draft, A.B. and A.R.; writing—review and editing, A.B., A.R., R.W., G.S. and I.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The datasets and analysis code generated during this study are available from the corresponding author upon reasonable request.

Acknowledgments

The authors gratefully acknowledge the CEREMA PAVIN tunnel team for expertly coordinating the fog-and-rain facility and providing invaluable technical support throughout the experiments.

Conflicts of Interest

Authors Alexander Bessonov, Richard White, Galih Suwito, Ivonne Medina-Salazar and Marat Lutfullin were employed by the company Quantum Advanced Solutions Ltd.; Authors Dmitrii Gusev and Ilya Shikov were employed by the company Topodrone SA. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
SWIR: Short-Wave Infrared
UAV: Unmanned Aerial Vehicle
VTOL: Vertical Take-Off and Landing
DVE: Degraded Visual Environment
VIS: Visible (Spectral Band)
NIR: Near-Infrared
MWIR: Mid-Wave Infrared
LWIR: Long-Wave Infrared
IR: Infrared
LiDAR: Light Detection and Ranging
SSIM: Structural Similarity Index
PSNR: Peak Signal-to-Noise Ratio
SNR: Signal-to-Noise Ratio
CQD: Colloidal Quantum Dot
SWaP: Size, Weight and Power
FOV: Field of View
TEC: Thermoelectric Cooler
IQA: Image Quality Assessment
PPK: Post-Processed Kinematic
GNSS: Global Navigation Satellite System
RTK: Real-Time Kinematic
RGB: Red, Green, Blue

References

1. Driggers, R.G.; Hodgkin, V.A.; Vollmerhausen, R.H. What good is SWIR? Passive day comparison of VIS, NIR, and SWIR. Proc. SPIE Infrared Imaging Syst. Des. Anal. Model. Test. 2013, 8706, 87060L.
2. Lang, J.; Wang, Y.; Xiao, X.; Zhuang, X.; Wang, S.; Liu, J.; Wang, J. Study on short-wave-infrared long-distance imaging performance based on multiband imaging experiments. Opt. Eng. 2013, 52, 045008.
3. Jobert, G.; Vannier, N.; Pelletier, S.; Delubac, R.; Brenière, X.; Péré-Laperne, N.; Rubaldo, L. SWIR’s advantage over the visible in long-range imaging scenarios: Comparative field trials in a variety of atmospheric conditions. Proc. SPIE Electro-Opt. Infrared Syst. Technol. Appl. 2022, 12115, 121150H.
4. Judd, K.M.; Thornton, M.P.; Richards, A.A. Automotive sensing: Assessing the impact of fog on LWIR, MWIR, SWIR, visible, and LiDAR performance. Proc. SPIE Infrared Technol. Appl. 2019, 11002, 110021F.
5. St-Laurent, J.; Godbout, M.; Ross, P. Multispectral vision in winter driving conditions: NIR, SWIR, and LWIR for autonomous navigation. Proc. SPIE Integr. Infrared Vis. Imaging Sens. 2021, 11741, 117410A.
6. Willitsford, A.H.; Brown, D.M.; Baldwin, K.; Hanna, R.T.; Marinello, L. Range-gated active short-wave-infrared imaging for rain penetration. Opt. Eng. 2021, 60, 013103.
7. Bernini, N.; Bertozzi, M.; Cerri, P.; Fedriga, R.I. SWIR cameras for the automotive field: Two test cases. J. Sens. 2014, 2014, 858979.
8. Sheeny, M.; Wallace, A.; Emambakhsh, M.; Wang, S.; Connor, B. POL-LWIR vehicle detection: Convolutional neural networks meet polarised infrared sensors. In Proceedings of the 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Salt Lake City, UT, USA, 18–22 June 2018; pp. 1247–1253.
9. Duan, J.; Guo, P.; Mo, S.; Wang, J.; Yang, X.; Zang, X.; Zhu, W. Dual-band transmittance defogging model. Appl. Opt. 2025, 64, 262–272.
10. Pavlović, M.S.; Milanović, P.D.; Stanković, M.S.; Perić, D.B.; Popadić, I.V.; Perić, M.V. Deep learning-based SWIR object detection in long-range surveillance systems: An automated cross-spectral approach. Sensors 2022, 22, 2562.
11. Stark, B.; McGee, M.; Chen, Y. Short-wave-infrared (SWIR) imaging using small unmanned aerial systems (sUAS). In Proceedings of the IEEE Ubiquitous Computing, Electronics and Mobile Communication Conference (UEMCON), New York, NY, USA, 19–21 October 2015; pp. 495–501.
12. Cerema. PAVIN Tunnel—Experimental Platform for Visibility, Lighting and Fog. Available online: https://www.cerema.fr/en/innovation-recherche/innovation/offres-technologie/simulation-platform-adverse-climate-conditions (accessed on 22 May 2025).
13. MacDougal, M.; Mak, C.; Meitzner, J.; Strand, T.; Hood, A.; Geske, J.; Bessonov, A.; Medina-Salazar, I.; Lutfullin, M. Small-pixel SWIR imagers using InGaAs and CQDs. Proc. SPIE Infrared Technol. Appl. L 2024, 13046, 130460A.
14. Bijelic, M.; Gruber, T.; Ritter, W. Benchmarking image sensors under adverse weather conditions for autonomous driving. In Proceedings of the IEEE Intelligent Vehicles Symposium (IV), Changshu, China, 26–30 June 2018; pp. 834–841.
15. Hasirlioglu, S.; Kamann, A.; Doric, I.; Brandmeier, T. Test methodology for rain influence on automotive surround sensors. In Proceedings of the IEEE 19th International Conference on Intelligent Transportation Systems (ITSC), Rio de Janeiro, Brazil, 1–4 November 2016; pp. 2242–2247.
16. Wang, Z.; Bovik, A.C.; Sheikh, H.R.; Simoncelli, E.P. Image quality assessment: From error visibility to structural similarity. IEEE Trans. Image Process. 2004, 13, 600–612.
17. Renieblas, G.P.; Turrero Nogués, A.; Muñoz González, A.; Gómez-León, N.; Guibelalde del Castillo, E. Structural similarity index family for image quality assessment in radiological images. J. Med. Imaging 2017, 4, 035501.
18. Xie, Y.; Wei, H.; Liu, Z.; Wang, X.; Ji, X. SynFog: A photo-realistic synthetic fog dataset based on end-to-end imaging simulation for advancing real-world defogging in autonomous driving. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 17–23 June 2024; pp. 21763–21772.
19. Ogunrinde, I.; Bernadin, S. A review of the impacts of defogging on deep learning-based object detectors in self-driving cars. In Proceedings of the IEEE SoutheastCon 2021, Atlanta, GA, USA, 10–13 March 2021; pp. 1–8.
20. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423, 623–656.
21. Poobathy, D.; Manicka Chezian, R. Edge detection operators: Peak signal-to-noise ratio-based comparison. Int. J. Image Graph. Signal Process. 2014, 6, 55–61.
22. Gonzalez, R.C.; Woods, R.E. Digital Image Processing, 4th ed.; Pearson: London, UK, 2022; Chapter 4.
23. Li, C.; Zhou, S.; Wu, T.; Shi, J.; Guo, F. A dehazing method for UAV remote sensing based on global and local feature collaboration. Remote Sens. 2025, 17, 1688.
24. Feng, C.; Chen, Z.; Li, X.; Wang, C.; Yang, J.; Cheng, M.-M.; Dai, Y.; Fu, Q. HazyDet: Open-source benchmark for drone-view object detection in hazy scenes. arXiv 2025, arXiv:2409.19833.
25. Lee, G.Y.; Chen, J.; Dam, T.; Ferdaus, M.M.; Poenar, D.P.; Duong, V.N. Dehazing remote-sensing and UAV imagery: A review of models and metrics. arXiv 2024, arXiv:2405.07520.
26. Wang, Z.; Zhuang, J.; Ye, S.; Xu, N.; Xiao, J.; Peng, C. Image restoration quality assessment based on regional differential information entropy. Entropy 2023, 25, 144.
27. Liu, Y.; Zhao, G.; Fan, S.; Fei, C.; Liu, J.; Zhang, Z.; Wang, L.; Li, Y.; Zhao, X.; Liu, Z. Tri-band vehicle and vessel dataset for artificial intelligence research. Sci. Data 2025, 12, 592.
28. Sivaprakasam, V.; Lin, D.; Yetzbacher, M.K.; Gemar, H.E.; Portier, J.M.; Watnik, A.T. Multi-spectral SWIR lidar for imaging and spectral discrimination through partial obscurations. Opt. Express 2023, 31, 5443–5457.
Figure 1. (a) CEREMA’s PAVIN climatic tunnel view prior to the experiment. (b) SWIR payload mounted on a drone. (c) Spectral responsivity profile of SWIR channel with selected optically filtered bands.
Figure 2. Comparison of broadband and SWIR images captured at 50 m distance from the targets upon progressively attenuated visibility in radiative fog and rain. The lower the visibility, the higher the information loss.
Figure 3. Entropy, SSIM, and PSNR estimated for varied apertures and optical bands in different visibilities under rain conditions. The colored lines represent the average metric across a set of band filters.
Figure 4. Entropy, SSIM, and PSNR for varied apertures and optical filters in advection (left) and radiation (right) fog conditions. The filled regions represent standard deviation, color-coded for corresponding data points.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
