Article

Feasibility of Infrared-Based Pedestrian Detectability in Unlit Urban and Rural Road Sections Using Consumer Thermal Cameras

1 Department of Transport and Aircraft Equipment and Technologies, Technical University of Sofia, Branch Plovdiv, 25 Tsanko Dyustabanov Street, 4000 Plovdiv, Bulgaria
2 Center of Competence “Smart Mechatronic, Eco- and Energy-Saving Systems and Technologies”, Technical University of Sofia, Branch Plovdiv, 4000 Plovdiv, Bulgaria
3 Department of Mechanical and Instrument Engineering, Technical University of Sofia, Branch Plovdiv, 25 Tsanko Dyustabanov Street, 4000 Plovdiv, Bulgaria
* Author to whom correspondence should be addressed.
Vehicles 2026, 8(3), 61; https://doi.org/10.3390/vehicles8030061
Submission received: 28 January 2026 / Revised: 18 February 2026 / Accepted: 12 March 2026 / Published: 16 March 2026
(This article belongs to the Special Issue Novel Solutions for Transportation Safety, 2nd Edition)

Abstract

This study evaluates the feasibility of using two affordable thermal cameras (UNI-T UTi260M and UTi260T), which are not designed as automotive sensors, for observing pedestrians and warm objects during night-time driving under low-illumination conditions. The experimental setup includes mounting the camera on the vehicle body (e.g., side mirror area/roof), recording road scenes in urban and rural environments, and selecting representative frames for qualitative and quantitative analysis. The study assesses: (i) observable pedestrian detectability in unlit road sections and under oncoming headlight glare, where visible cameras often lose contrast; (ii) the influence of low ambient temperature and strong cold wind on image appearance (including “whitening”/contrast shifts); and (iii) workflow differences, where UTi260M relies on a smartphone application for streaming/recording, while UTi260T supports PC-based image analysis and temperature-profile visualization. In addition, a calibration-based geometric method is proposed for approximate pedestrian distance estimation from single frames using silhouette pixel height and a regression model based on $1/h_{px}$, valid for a specific mounting configuration and a known subject height. Results indicate that both cameras can highlight warm objects relative to the background and support visual pedestrian identification at low illumination, including in the presence of oncoming headlights, with UTi260M showing more stable behavior in parts of the tests. This work is a feasibility study and does not claim Advanced Driver Assistance Systems (ADAS) functionality; it outlines limitations, repeatability considerations, and a minimal set of metrics and procedures for future extension. All quantitative indicators derived from exported frames are explicitly treated as image-level proxy metrics, not as physical sensor characteristics.

1. Introduction

Road traffic safety during the dark part of the day remains a persistent challenge, largely driven by limited scene illumination, headlight glare, and reduced conspicuity of vulnerable road users such as pedestrians and cyclists. Conventional forward-looking visible-spectrum cameras strongly depend on ambient illumination and scene contrast. In unlit or poorly lit road sections—common not only in rural areas but also in peripheral urban streets—camera-based perception may degrade substantially, especially when dynamic-range limitations and flare reduce effective contrast in the region of interest [1,2].
Thermal infrared (IR) imaging provides a different sensing cue: it captures emitted radiation related to surface temperature and emissivity rather than reflected visible light. Consequently, thermal imaging can remain informative under low illumination and can be less sensitive to visible headlight glare, since it does not rely on the visible-light intensity distribution [3]. Glare is also a safety-relevant factor because it can reduce visual performance and cause temporary visual impairment/discomfort [4,5,6,7].
In parallel with automotive-grade solutions, the market offers low-cost handheld thermal cameras intended primarily for industrial inspection, building diagnostics, and maintenance. Although such devices are not designed as automotive perception sensors, their affordability and accessibility motivate a feasibility-level investigation: can consumer-grade thermal cameras support meaningful detectability of pedestrians/vehicles in unlit road segments, including challenging scenarios with oncoming headlight glare? If feasible, such a setup could serve as a low-cost research platform and a proof-of-concept for night-time situational awareness, especially for educational and experimental purposes.
Prior work has discussed thermal and night-vision approaches for pedestrian detection [8], as well as thermal-camera use cases and limitations for transportation safety and road-user protection [3]. The broader context includes comparisons of thermal and regular cameras across lighting and visibility conditions and the need for “around-the-clock” video-based sensing [9,10], along with advances in computer vision methods mostly developed for regular RGB video [11,12,13]. However, it remains non-trivial to assess what can be achieved using consumer handheld cameras that apply internal automatic gain control (AGC), palette mapping, compression, and overlays.
This study evaluates feasibility of using two consumer thermal cameras (UTi260M and UTi260T) in a road environment. The focus is pedestrian detectability in unlit or poorly lit urban and rural road sections, robustness under oncoming headlight glare, stability under cold ambient conditions, and practical workflow differences between smartphone-based acquisition (UTi260M) and PC-oriented post-processing (UTi260T).
The main contributions are as follows: feasibility evaluation of two consumer LWIR thermal cameras in night-time road scenarios (urban and rural, lit and unlit); qualitative evidence of pedestrian and vehicle detectability under low illumination and in the presence of oncoming headlights; workflow-oriented comparison between UTi260M (smartphone streaming/recording) and UTi260T (PC analysis tools); and a simple calibration-based method to approximate pedestrian distance from single frames using pixel-height measurements, explicitly stated as mounting-specific.

1.1. Visible and Thermal Imaging

Visible-spectrum sensing covers the range to which the human eye is sensitive (photopic vision in daylight and scotopic vision at low light levels). Under night-time conditions, visible cameras can suffer from low signal, high dynamic range, saturation from strong local light sources, and glare. Infrared radiation lies beyond the visible band. For practical sensing, mid-wave infrared (MWIR, ~3–5 µm) and long-wave infrared (LWIR, ~8–14 µm) are commonly considered. LWIR is particularly relevant for passive thermal imaging of objects near ambient temperature because it captures emitted thermal radiation without requiring external illumination [14,15,16].

1.2. Thermal Vision Systems

Thermal imaging systems generate images based on thermal radiation. Uncooled microbolometer sensors have become widely used due to reduced cost and compactness. In road scenes at night, pedestrians and vehicles often present measurable thermal contrast due to body heat and warmed surfaces (e.g., engine and exhaust areas), supporting detectability in unlit road sections [3,8]. Thermal imaging is also considered relatively robust against some visual limitations (e.g., low illumination and visible glare), and is frequently discussed as a complementary modality in automotive applications [17,18]. The approach can be further optimized, as discussed in related experimental and system integration studies [19,20].
At the same time, it is important to distinguish automotive-grade sensors from consumer handheld devices. Consumer devices often apply internal AGC, palette mapping, and overlays that affect the final exported image. Therefore, when such cameras are used in road experiments, quantitative metrics computed from screenshots/JPG/PNG must be interpreted as image-level proxy measures, not as physical sensor noise parameters or lab-measured NETD.
Unlike studies of automotive-grade night-vision systems, this research deliberately focuses on consumer handheld thermal devices and evaluates what level of pedestrian detectability and geometric reasoning is feasible under real road conditions.

2. Materials and Methods

2.1. Aim and Research Questions

The aim is to evaluate whether affordable (consumer-grade) thermal cameras can support night-time road-scene monitoring, including: (i) detectability and/or visual separation of pedestrians under low illumination; (ii) behavior under oncoming headlights (glare), where visible cameras often lose usable contrast; (iii) stability of thermal visualization at low temperatures and adverse conditions (strong cold wind); and (iv) practical-use differences between UTi260M and UTi260T (UNI-T, Plovdiv, Bulgaria).

2.2. Equipment Used, Data Acquisition and Processing Workflow

Technical differentiation between UTi260M and UTi260T. Although both devices employ a 256 × 192 pixel uncooled VOx microbolometer sensor with comparable thermal sensitivity (<50 mK) and an identical spectral range (8–14 µm), their data acquisition and analysis architectures differ substantially.
UTi260M is a smartphone-integrated thermal module that stores images directly on the mobile device in JPG format and provides temperature readout via on-screen spot/area tools. However, it does not provide radiometric raw data export. Therefore, quantitative analysis relies on display-level frame extraction.
UTi260T, in contrast, supports both WiFi-based mobile transmission and USB-based PC analysis software (UNI-T v2.3.15). The device allows structured on-screen analysis (multiple points, lines, and regions), gallery-based export, and PC-assisted visualization. This makes UTi260T more suitable for controlled post-processing and documentation.
The comparison of these two devices therefore does not aim to compare sensor performance per se, but rather to evaluate how different consumer-grade acquisition workflows influence feasibility for night-time road monitoring.
Although Table 1 and Table 2 show similar core thermal specifications (sensor resolution, spectral range, and sensitivity), the devices differ in acquisition architecture and data-handling workflow. UTi260M is designed as a smartphone-integrated module with display-level JPG storage and mobile-app temperature readout. UTi260T, in contrast, supports WiFi-based transmission, USB communication, and structured PC analysis software, enabling more controlled post-processing and documentation.
Therefore, the comparison in this study focuses not on intrinsic sensor performance but on feasibility differences arising from distinct consumer-level data acquisition pipelines.
The two devices were selected due to their positioning in the consumer low-cost segment. At the time of purchase, the approximate market prices were ~190 EUR for UTi260M and ~330 EUR for UTi260T, which are substantially lower than automotive-grade thermal imaging systems.
Both cameras offer a wide temperature measurement range (−20 °C to 550 °C), 256 × 192 LWIR resolution, and practical field deployability. The aim of this study is not to evaluate certified automotive sensors, but to assess whether affordable consumer-grade thermal cameras can provide meaningful night-time road-scene information under real conditions.
During road tests, real-time monitoring was performed either through the smartphone application (UTi260M and UTi260T hotspot mode) or directly on the UTi260T device display. Subsequently, exported frames were processed in a Windows-based environment for region of interest (ROI)-based statistical analysis.
No radiometric raw data export was available for UTi260M. Therefore, all quantitative evaluation is based on display-level frame extraction.

2.3. Mounting Configuration and Observation Geometry

To reduce influence from vehicle heat sources, the cameras were mounted outside the engine compartment and oriented toward the road lane and environment ahead.
UTi260T was mounted on the vehicle roof at approximately 1.7 m height in a horizontal orientation to capture the road environment (Figure 1a, position 1).
UTi260M was mounted near the side mirror at approximately 1 m height (Figure 1a, position 2).
In the reported tests, the practical observation range was approximately 200 m for high-contrast thermal targets, with a horizontal capture cone angle of approximately 56°. UTi260T transmitted images wirelessly to a smartphone (once a functional application connection was established), while UTi260M was connected to the smartphone through its supported interface.

2.4. Functional Verification of the System

Before road measurements, correct image transmission and thermal response were verified using a controlled heat source inside the vehicle (e.g., Heating, Ventilation and Air Conditioning (HVAC) outlet). This step confirmed stable streaming/recording and baseline thermal responsiveness prior to analyzing road scenes. Figure 2 illustrates the functional verification procedure of the UTi260T wireless transmission workflow. Subfigures (a) and (b) demonstrate the WiFi-based connection and live streaming of the external environment to an Android device. Subfigures (c) and (d) confirm stable image acquisition during scene alternation between the vehicle cabin and the external road environment. This procedure ensured reliable data transmission, synchronization, and thermal response prior to conducting night-time road experiments.

2.5. Frame Preparation for Analysis

Recordings were collected in urban and rural environments under night-time conditions. Representative frames were selected for analysis and assigned Case IDs. For each case, metadata were recorded: camera (UTi260M/UTi260T), scenario (urban/rural; lit/unlit; glare/no-glare), and notes on environmental conditions (cold wind; sky in frame). The UTi260T provides two synchronized image modalities acquired during the same scene: (i) a thermal image used for target-to-background temperature-contrast assessment, and (ii) a conventional visible-light frame used for contextual interpretation (lane geometry, surrounding objects, and lighting conditions). During post-processing, the manufacturer software allows the two channels to be displayed separately (side-by-side) or inspected sequentially, enabling direct qualitative comparison between thermal detectability and visible-scene readability under identical conditions. In this study, the thermal channel was used for detectability assessment (presence/absence of a warm target, contrast, and proxy noise/uniformity metrics), while the visible channel served as a reference to document illumination conditions (e.g., glare from headlights, unlit road segments, and background clutter) and to support figure annotation.
To minimize visual clutter, UI overlays (e.g., brand/date stamps) were removed by cropping where necessary, without altering the thermal palette or intensity values beyond the exported image’s native representation.

2.6. Extraction of Temperature Information

When radiometric raw data are unavailable, temperature-related information can only be extracted approximately from exported images. Where the interface provides numeric markers (e.g., min/max/spot values), those values were used as reference. Otherwise, a palette/legend mapping can be used to build an approximate color–temperature relationship for the displayed image, with the understanding that the result is dependent on the device’s AGC and palette settings.
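The legend-based mapping described above can be sketched as follows. This is a minimal illustration, not the manufacturer's software: the function names are illustrative, and it assumes a vertical legend strip cropped from the exported frame with the top corresponding to the on-screen maximum temperature marker and the bottom to the minimum.

```python
import numpy as np

def legend_mapping(legend_strip, t_min, t_max):
    """Build an approximate color->temperature lookup from a vertical
    legend strip (N x 3 RGB array; top row = t_max, bottom row = t_min)."""
    n = legend_strip.shape[0]
    temps = np.linspace(t_max, t_min, n)  # top-to-bottom temperature axis
    return legend_strip.astype(np.float64), temps

def pixel_temperature(rgb, legend_colors, legend_temps):
    """Nearest-neighbor color matching: return the legend temperature
    whose color is closest (Euclidean distance in RGB) to the pixel."""
    d = np.linalg.norm(legend_colors - np.asarray(rgb, dtype=float), axis=1)
    return float(legend_temps[np.argmin(d)])
```

As stated in the text, the result remains a display-level approximation: it inherits the frame's AGC state, palette choice, and min–max scaling, so values are only comparable within a single exported frame.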

2.7. Metrics for Thermal Image Quality and Detectability

Quantitative evaluation was performed using regions of interest (ROIs) over (i) an object region (e.g., pedestrian torso or hottest vehicle area) and (ii) a homogeneous background region (e.g., asphalt/road surface).
For statistical analysis, a fixed rectangular ROI was manually defined within a homogeneous road-background region of each frame. The ROI was kept identical in pixel coordinates across consecutive frames to ensure comparability. Regions containing vehicles, pedestrians, text overlays, and graphical elements were excluded to avoid bias in the noise and uniformity estimation.

2.8. Proxy Thermal Image Quality Metrics Derived from 8-Bit Exported Frames

The following procedure provides image-level proxy indicators for noise and non-uniformity computed from exported thermal frames (e.g., JPG/PNG/screenshot). The method does not attempt to recover the sensor’s physical noise characteristics; instead, it quantifies the stability and spatial consistency of the final displayed image, which is influenced by internal AGC, palette mapping, compression, and overlays. The adopted indicators follow common IR quality-analysis ideas (noise, uniformity, structured artifacts) and are adapted for a feasibility dataset with non-radiometric exports [21,22,23,24].
All calculations were performed on 8-bit grayscale representations derived from exported display images. Since UTi260M does not provide radiometric raw data export, the analysis relies exclusively on display-level pixel intensity values (digital numbers, DN).
1. Temperature mapping method (display-based, optional)
When radiometric raw data are unavailable, a pixel-wise temperature proxy can be obtained from the displayed frame by: (i) extracting the color legend (scale bar), (ii) building a color–temperature mapping from the legend, and (iii) applying nearest-neighbor color matching to each pixel in the thermal region. This enables approximate quantitative comparisons of thermal variations, while explicitly remaining dependent on the display settings (palette, AGC, and min–max scaling of the frame).
2. Spatial-noise estimation (local standard deviation)
Spatial noise is computed as the standard deviation within a local window centered at each pixel:
$$\sigma_{\mathrm{noise}}(i,j) = \sqrt{\frac{1}{N}\sum_{k=1}^{N}\left(T_k - \bar{T}\right)^2},$$
where $\sigma_{\mathrm{noise}}(i,j)$ is the local spatial-noise estimate for the window centered at pixel $(i,j)$; $T_k$ is the temperature (or DN value) of the $k$-th pixel inside the local analysis window, with $k = 1, 2, \ldots, N$; and $\bar{T}$ is the local mean computed over the same window.
This metric quantifies local thermal fluctuations and is the primary indicator of sensor stability in a single still frame. The spatial-noise proxy $\sigma_{\mathrm{noise}}$ represents the standard deviation of grayscale intensity values within the selected ROI and reflects the spatial fluctuation of pixel intensities in homogeneous background regions.
3. Global noise level
A global noise proxy is defined as the mean of the local noise values across the image (or across the analyzed thermal region):
$$\sigma_{\mathrm{global}} = \frac{1}{HW}\sum_{i,j}\sigma_{\mathrm{noise}}(i,j),$$
where $H$ is the image height and $W$ is the image width in pixels.
4. Noise distribution (histogram)
A histogram of $\sigma_{\mathrm{noise}}$ values is used to describe the distribution of local noise magnitudes. This helps identify whether the noise is approximately Gaussian, skewed, multimodal, or dominated by structured components (e.g., banding or fixed-pattern artifacts).
5. Vertical noise profile (row-wise)
Row-wise behavior is summarized by the mean local noise per row:
$$\sigma_{\mathrm{row}}(y) = \frac{1}{W}\sum_{x=1}^{W}\sigma_{\mathrm{noise}}(x,y).$$
6. Horizontal noise profile (column-wise)
Similarly, the column-wise profile is
$$\sigma_{\mathrm{col}}(x) = \frac{1}{H}\sum_{y=1}^{H}\sigma_{\mathrm{noise}}(x,y).$$
7. Local noise variability (window-size sensitivity)
Local noise is computed using small window sizes (e.g., 5 × 5 or 7 × 7 pixels). This exposes variations between homogeneous regions (asphalt/background) and areas affected by gradients, edges, interpolation, or compression artifacts.
8. Uniformity Index (UI)
To characterize frame-level uniformity in a simple way, the following index is used:
$$UI = 1 - \frac{\sigma_{\mathrm{global}}}{T_{\mathrm{max}} - T_{\mathrm{min}}},$$
where $T_{\mathrm{max}}$ and $T_{\mathrm{min}}$ are the maximum and minimum temperatures (or DN values) in the analyzed frame.
For display-level exported frames, the UI is used as a simple indicator of spatial consistency within the ROI and should be interpreted as a proxy measure rather than a physical sensor uniformity parameter.
9. Fixed-pattern noise (FPN)
Fixed-pattern artifacts are assessed through the variability of row/column means (banding-like structure). In the context of exported non-radiometric images, FPN is interpreted as a structured non-uniformity proxy that can originate from sensor non-uniformity and/or the display pipeline (AGC/NUC mapping).
10. Estimated noise-equivalent temperature difference (NETD, proxy)
NETD measurement typically requires radiometric temporal data. When only single exported frames are available, an approximate indicator may be defined from the global spatial noise:
$$NETD_{\mathrm{est}} \approx \alpha\,\sigma_{\mathrm{global}},$$
where $\alpha$ is a scaling factor dependent on the color–temperature mapping and display scaling. In this work, $NETD_{\mathrm{est}}$ is reported only as a proxy and is not interpreted as a sensor specification.
11. Signal-to-noise ratio (SNR)
For detectability assessment, SNR is defined as
$$SNR = \frac{\bar{T}_{\mathrm{obj}}}{\sigma_{\mathrm{global}}},$$
where $\bar{T}_{\mathrm{obj}}$ is the mean value within the selected object ROI and $\sigma_{\mathrm{global}}$ is the global noise proxy.

2.9. Distance-to-Pedestrian Estimation from a Single Frame (Monocular, Height-Based)

A simple monocular distance-estimation approach is considered for feasibility analysis using a known subject height. The inputs are: the real subject height $H$ (m); the measured subject pixel height $h_{px}$ (px) for each frame; and the frame dimensions $W_{img} \times H_{img}$ (px).
For calibration, the real distance $D$ from the camera to the person is measured in the field, together with the person's height $H$. For each calibration frame, the subject pixel height $h_{px}$ is extracted (e.g., from a bounding box). A convenient normalized ratio is $r = h_{px}/N_v$, where $N_v$ is the vertical resolution of the frame in pixels.
The approach uses the following relationships.
Calibration from real measurements (deriving an effective pixel focal length):
$$f_{px}(H, D) = \frac{h_{px}\,D}{H},$$
where $D$ is the distance from the camera to the person/object (m); $h_{px}$ is the measured object height in pixels; $H$ is the real height of the person/object (m); and $f_{px}(H, D)$ is the effective focal length expressed in pixels.
The distance model must first be calibrated using initial pairs ( H ,   D ) and the corresponding measured h p x . Once calibrated for a fixed mounting configuration and processing settings, the method can be used to estimate distance only under the same setup and for objects/persons consistent with the assumed height model.
Distance estimation (operational use):
$$D(H, h_{px}) = \frac{f_{px}\,H}{h_{px}}\ \text{(m)}.$$
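The calibration and estimation steps above reduce to two short functions. The sketch below averages $f_{px}$ over all calibration pairs; the function names are illustrative, and, as the text stresses, the result is valid only for the same mounting configuration and an assumed subject height.

```python
import numpy as np

def calibrate_fpx(heights_m, distances_m, pixel_heights):
    """Effective pixel focal length from calibration pairs (H, D, h_px),
    using f_px = h_px * D / H, averaged over all measurements."""
    H = np.asarray(heights_m, dtype=float)
    D = np.asarray(distances_m, dtype=float)
    h = np.asarray(pixel_heights, dtype=float)
    return float(np.mean(h * D / H))

def estimate_distance(f_px, subject_height_m, pixel_height):
    """Monocular distance estimate D = f_px * H / h_px (m), valid only
    for the calibrated mounting and the assumed subject height."""
    return f_px * subject_height_m / pixel_height
```

For an ideal pinhole model the two relations are exact inverses, so a quick consistency check is to re-estimate a calibration distance from its own pixel height.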

2.10. Pixel Footprint and Percentage Occupancy Ratio (POR)

From each selected frame, an approximate pedestrian silhouette is obtained by segmentation (thresholding by intensity/contrast followed by morphological operations). Then the bounding-box height $h_{px}$ is measured, together with the bounding-box width $\omega_{px}$ and area:
$$A_{bb} = \omega_{px}\,h_{px}.$$
The percentage occupancy is computed as $POR = A_{bb}/A_{frame}$, where $A_{frame}$ is the total frame area in pixels.
This yields a measurable dependency: as distance increases, both h p x and P O R decrease, which can be used as an additional geometric cue and as a consistency check for the distance-estimation procedure.
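Given a binary silhouette mask (the output of the thresholding and morphological steps mentioned above), the bounding-box measurements and POR follow directly. This is a minimal sketch with an illustrative function name; POR is returned as a fraction of the frame area, per the definition in the text.

```python
import numpy as np

def bounding_box_por(mask, frame_shape):
    """Bounding-box height/width (px) and occupancy ratio POR = A_bb / A_frame
    from a binary silhouette mask (True = pedestrian pixels)."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None  # no silhouette detected in this frame
    h_px = int(ys.max() - ys.min() + 1)
    w_px = int(xs.max() - xs.min() + 1)
    a_bb = h_px * w_px                        # bounding-box area (px^2)
    a_frame = frame_shape[0] * frame_shape[1]  # total frame area (px^2)
    return h_px, w_px, a_bb / a_frame
```

Tracking $h_{px}$ and POR over a frame sequence gives the expected monotonic decrease with distance, which is the consistency check described in the text.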

2.11. Entropy-Based Image Information Metric (Shannon Entropy)

Image entropy is used as an indicator of the informational content and intensity distribution within the selected region of interest (ROI). It reflects the degree of variability of grayscale values and provides an estimate of the complexity of the thermal scene representation.
The entropy is calculated using the Shannon formulation:
$$H = -\sum_{i=1}^{N} p(i)\,\log_2 p(i),$$
where $p(i)$ denotes the normalized probability of grayscale intensity level $i$ in the ROI histogram.
Higher entropy values indicate richer intensity variation and more complex image structure, while lower entropy values correspond to more uniform regions.
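The Shannon entropy of an 8-bit ROI can be computed as a histogram operation; empty bins are skipped, following the convention $0\log 0 = 0$. The function name is illustrative.

```python
import numpy as np

def roi_entropy(roi, levels=256):
    """Shannon entropy (bits) of the grayscale histogram of an 8-bit ROI."""
    hist = np.bincount(roi.ravel(), minlength=levels).astype(np.float64)
    p = hist / hist.sum()   # normalized probabilities p(i)
    p = p[p > 0]            # drop empty bins (0 * log 0 := 0)
    return float(-(p * np.log2(p)).sum())
```

A perfectly uniform ROI yields 0 bits, while an ROI splitting its pixels equally between two intensity levels yields exactly 1 bit, matching the interpretation given above.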

2.12. Data Acquisition Protocol During Road Tests

During night-time road experiments, both devices were operated under defined acquisition workflows.
The UTi260M device was connected via USB-C to an Android smartphone and operated exclusively through the manufacturer mobile application. Real-time streaming and recording were performed on the smartphone. Thermal frames were stored in JPG format on the mobile device. All quantitative analysis was conducted subsequently using exported frames.
During field experiments, the UTi260T device was mounted on the vehicle roof and operated either via (i) WiFi hotspot connection to an Android device for real-time monitoring, or (ii) local display observation. For quantitative analysis, thermal frames were exported and processed in a Windows-based environment using the manufacturer PC software.
Figure 3 and Figure 4 present representative thermal frames obtained during the experimental recordings and illustrate the typical appearance of pedestrians under low-illumination conditions.
The wireless transmission setup shown in Figure 5 was used both for functional verification and during selected road tests to ensure stable real-time monitoring. Quantitative proxy metrics were computed on exported static frames to ensure repeatable ROI selection.
All reported quantitative results are based on stationary acquisitions to guarantee frame stability and reproducibility.

3. Results

3.1. Qualitative Observations: Detectability Under Low Illumination and Glare

The analyzed frames show that thermal visualization enables clear separation of a pedestrian from the background in unlit/poorly lit road sections where the visible camera provides low contrast. Under oncoming headlights, the thermal image remains informative because it does not rely on visible-light intensity distribution; the pedestrian remains distinguishable as a warmer region relative to the environment.
In the performed tests, UTi260M showed more stable behavior in motion up to approximately 60 km/h (observationally), while UTi260T provided stronger post-processing tools (spot/line/graph instruments). Under field conditions, UTi260T performance is more sensitive to workflow factors (connection/app/modes), while UTi260M benefits from a simpler smartphone-based pipeline.

3.2. Issues and Artifacts Observed at Low Temperatures

Sky influence on AGC. When a large portion of the sky is included in the frame, very low apparent temperatures (radiative sky) can affect AGC and contrast distribution. Consequently, the scene may appear contrast-shifted relative to road objects (Figure 3).
Strong cold wind artifact. Under strong cold wind, degradation of the thermal image (“whitening”/reduced clarity) was observed. Likely causes include cooling of the housing/front optics, condensation/frost, or sensor thermal stability triggering AGC/NUC adaptations.

3.3. Proxy Noise and Uniformity Analysis

A quantitative proxy analysis was conducted on homogeneous regions of the scene (typically asphalt/background) to evaluate the fluctuation level and spatial non-uniformity of the displayed thermal output. The analysis is based on 8-bit exported images; therefore, results are reported in DN and describe the final visualized frame (including compression, AGC, and overlay elements), not microbolometer physical noise or sensor specification parameters such as NETD.
To ensure comparability, a homogeneous ROI in the road/background region was selected for each frame, avoiding text overlays, graphical elements, and bright thermal objects. The following proxy metrics were computed: intensity standard deviation ($\sigma_{\mathrm{noise}}$), noise range, noise mode, row/column non-uniformity indicators ($\sigma_{\mathrm{row}}$, $\sigma_{\mathrm{col}}$), UI, and a proxy FPN estimate. These metrics are suitable for comparative assessment across scenes/devices under the same procedure and dataset, but must not be interpreted as absolute thermographic sensor characteristics.
Representative UTi260T frames and corresponding software views used for visualization and analysis are shown in Figure 4 and Figure 5. The paired visible frames clarify whether limited detectability in the visible domain is caused by low ambient illumination, glare, or scene complexity. In several night scenes, oncoming headlights substantially reduced the usability of the visible frame (local overexposure and loss of contrast), while the thermal frame retained a stable depiction of warm objects with clear separation from the colder background. This dual-channel evidence supports the main feasibility claim: consumer thermal imaging can complement conventional visibility when the latter is compromised by glare or insufficient street lighting, provided that the thermal device remains within its operational constraints (mounting stability, wind/cooling effects, and limited resolution).

3.4. Comparative Analysis of Thermal Frames Captured with UTi260M Versus UTi260T

Figure 6 presents a thermal frame recorded with the UTi260M via a smartphone application at a road intersection under night-time/low-illumination conditions. The scene does not contain pedestrians; the purpose of this figure is to illustrate palette-dependent rendering behavior and scene-level thermal contrast characteristics rather than pedestrian-specific detection. In contrast to the UTi260T (Figure 4 and Figure 5), where a PC-based workflow enables more structured image inspection and presentation of thermal and visible channels, the UTi260M workflow is primarily oriented toward mobile viewing and recording, with more limited options for extracting radiometric information and conducting detailed post-processing.
Qualitative comparison (key observations):
Thermal contrast and “hot object” highlighting. The UTi260M clearly highlights objects with higher apparent temperature (e.g., parts of a vehicle), which is useful as a rapid visual cue when driving through poorly lit road sections. Under such conditions, the thermal view is less affected by oncoming headlights than a visible-light camera, which is typically impacted by glare and blooming.
Automatic gain/contrast (AGC) and the effect of cold regions (sky/open horizon). A typical behavior is observed: when the scene contains very cold regions (e.g., sky), the automatic scaling redistributes the dynamic range and may reduce local contrast on the roadway, visually perceived as “whitening” or grayscale compression. This dependence on scene composition is an important limitation for consistent visual interpretation across different environments.
Geometry/resolution and effective thermal detail. Although the smartphone-recorded output may have a higher pixel resolution due to screen capture, scaling, or video encoding, the effective thermal spatial detail remains limited by the sensor class (e.g., 256 × 192). This must be considered for any quantitative inference based on object size in pixels, edge sharpness, or fine structural details.
Software environment and traceability. The UTi260T provides a more transparent analysis workflow (point/line tools, statistics, paired visible frame, curves/plots), which supports repeatability and reporting. In the UTi260M case, the analysis is more practical “in-field,” but repeatability is harder to demonstrate when relying mainly on screen-recordings and frames with overlays.
Figure 7 and Figure 8 illustrate representative thermal frames used for qualitative assessment of pedestrian detectability under varying observation distances and environmental conditions.
Outcome of the comparison.
The UTi260M is more suitable for operational, real-time visual screening and mobile monitoring in low-illumination conditions, whereas the UTi260T is more suitable for documentation and post-analysis, where a structured software workflow and clearer traceability of measurement markers are required. In this study, no ADAS functionality is claimed; the work is positioned as a feasibility assessment of low-cost thermal devices as an auxiliary information source for detecting warm targets under low illumination. It should be noted that Figure 9 is included to demonstrate scene-level visualization behavior and workflow differences, while dedicated pedestrian examples are discussed separately in Section 3.5.
Table 3 summarizes proxy noise and uniformity metrics for UTi260T and UTi260M thermal frames and for a control night frame from a visible camera. The lowest σ_noise is observed for UTi260M (0.488 DN), while UTi260T values are higher (1.113–1.363 DN) for the analyzed scenes. UI is high for thermal frames (0.962–0.996), consistent with a more stable displayed background under low illumination. In contrast, the night visible frame shows substantially lower UI (0.601), reflecting the adverse impact of low signal and strong local light sources on uniformity and usable contrast. The SNR proxy, computed only for thermal frames, is highest for the scene with a clearly expressed pedestrian (561.77), supporting good detectability of a warm object relative to the background in unlit road sections.
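The proxy metrics of Table 3 can be reproduced on any homogeneous background ROI. The sketch below uses only the standard library; the ROI values and the exact SNR-proxy definition (target-to-background contrast over background σ) are our assumptions, not the paper's exact pipeline:

```python
import statistics

# Sketch of the image-level proxy metrics reported in Table 3, computed on 8-bit
# display DN values from a homogeneous background ROI. The ROI data and the
# SNR-proxy definition (target/background contrast over background sigma) are
# assumptions for illustration.
def sigma_noise(roi):
    """Population standard deviation of DN values in the ROI."""
    return statistics.pstdev(roi)

def uniformity_index(roi):
    """UI = 1 - sigma/mu over the ROI."""
    return 1 - sigma_noise(roi) / statistics.mean(roi)

def snr_proxy(target_mean_dn, roi):
    """Display-level contrast of a warm target relative to background noise."""
    return abs(target_mean_dn - statistics.mean(roi)) / sigma_noise(roi)

background = [115, 116, 116, 117, 116, 115, 117, 116]  # hypothetical ROI DN values
print(round(sigma_noise(background), 3))       # 0.707
print(round(uniformity_index(background), 3))  # 0.994
print(round(snr_proxy(230, background), 1))    # 161.2
```

Because all three indicators are computed on rendered DN values, they inherit every display-level effect (palette, AGC, compression), which is why the text treats them strictly as proxies.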

3.5. Geometric Measurements from UTi260M Frames (Pedestrian Height 195 cm)

For a sequence of UTi260M frames, the pedestrian pixel height h_px was extracted by silhouette segmentation and bounding-box measurement. Using relationships (8) and (9), a mounting-specific distance-estimation function can be obtained after calibration with measured distances, as shown in Table 4 and Figure 8.
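The ranging idea can be sketched compactly: under a pinhole model D ≈ f_px·H/h_px, so distance is linear in 1/h_px and the calibration reduces to a least-squares line fit. The calibration pairs below are hypothetical, not the Table 4 measurements:

```python
# Sketch of the calibration-based ranging idea: under a pinhole model
# D = f_px * H / h_px, so distance is linear in 1/h_px and can be fitted by
# ordinary least squares. Calibration pairs are hypothetical, not Table 4 data.
def fit_distance_model(h_px_values, distances_m):
    """Fit D = a * (1/h_px) + b by ordinary least squares."""
    x = [1.0 / h for h in h_px_values]
    n = len(x)
    mx = sum(x) / n
    my = sum(distances_m) / n
    a = (sum((xi - mx) * (di - my) for xi, di in zip(x, distances_m))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

def estimate_distance(h_px, a, b):
    """Single-frame distance estimate from silhouette pixel height."""
    return a / h_px + b

# Hypothetical calibration of a tall subject measured at known distances:
a, b = fit_distance_model([400, 200, 100], [5.0, 10.0, 20.0])
print(round(estimate_distance(160, a, b), 2))  # 12.5
```

The fitted slope a plays the role of f_px·H for the specific mounting configuration, which is why the method is valid only for that configuration and a known subject height.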
Table 4. Measured pedestrian pixel height and footprint metrics (preparation for distance calibration).
Frame | H, m | D, m | f_px | h_px
Figure 10a | 1.95 | 6 | 1187 | 384
Figure 10b | 1.95 | 9 | 1187 | 289
Figure 10c | 1.95 | 12 | 1187 | 192
Figure 10. Comparative statistical stability analysis of UTi260M and UTi260T based on six consecutive frames acquired under identical stationary night-time conditions: (a) entropy, (b) σ_noise, and (c) UI.
Vehicles 08 00061 g010

3.6. Practical Observation Range

Both UTi260M and UTi260T showed a similar practical observation range of approximately 200 m under the reported conditions and a capture cone angle of approximately 56° (Figure 9). Vehicle silhouettes, buildings, pedestrians, and other warm objects were distinguishable in the thermal imagery.

3.7. Statistical Evaluation over 6 Consecutive Frames

UTi260T exhibits higher entropy values (5.76 ± 1.10 bits) compared to UTi260M (4.39 ± 0.42 bits), suggesting greater dynamic contrast redistribution. However, this is accompanied by substantially higher variability in σ_noise (17.64 ± 7.91 vs. 7.68 ± 2.69) and a significantly lower UI (0.506 ± 0.236 vs. 0.892 ± 0.039).
The results confirm that UTi260T applies stronger scene-dependent AGC adjustments, leading to higher dynamic variability between consecutive frames. In contrast, UTi260M demonstrates more stable frame-to-frame behavior, which may favor repeatability in statistical analysis, despite slightly lower entropy levels.
The statistical evaluation of UTi260M over six consecutive frames demonstrates stable frame-to-frame behavior. Entropy values remain within a narrow range (4.39 ± 0.42 bits), indicating consistent informational content without abrupt dynamic-range redistribution. The σ_noise metric shows moderate variability (7.68 ± 2.69), with one higher value attributed to scene-dependent contrast changes rather than systematic instability. The UI remains high (0.892 ± 0.039), confirming a stable local contrast distribution within the selected ROI, as shown in Table 5.
Overall, UTi260M exhibits predictable and repeatable thermal imaging behavior under static vehicle conditions, which supports its suitability for consistent frame-based analysis.
The statistical evaluation of UTi260T reveals higher entropy values (5.76 ± 1.10 bits) compared to UTi260M, indicating richer informational content within the thermal frames. However, the variability between consecutive frames is significantly larger. The σ_noise metric (17.64 ± 7.91) demonstrates strong scene-dependent fluctuation, suggesting more aggressive AGC behavior, as shown in Table 6.
The UI exhibits substantial variability (0.506 ± 0.236), indicating dynamic redistribution of contrast within the ROI. This behavior suggests that UTi260T adapts more strongly to scene composition, which increases informational contrast but reduces frame-to-frame stability.
Figure 10 presents the statistical comparison of entropy, σ_noise, and UI over six consecutive frames for both devices. UTi260T demonstrates higher entropy peaks and significantly larger σ_noise values, indicating stronger scene-dependent contrast redistribution and more aggressive AGC behavior. However, this dynamic adaptation results in substantial variability of the UI, which fluctuates between 0.19 and 0.81.
In contrast, UTi260M exhibits lower but more stable entropy values, moderate σ_noise levels, and a consistently high UI (≈0.9), confirming repeatable frame-to-frame behavior. The results suggest that UTi260T prioritizes adaptive contrast enhancement, whereas UTi260M maintains more conservative and stable thermal visualization characteristics.
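The statistics used in this comparison can be reproduced with standard-library tools. The histogram entropy below is the usual Shannon entropy of the 8-bit gray-level distribution, and the summary uses the sample standard deviation (n - 1), which reproduces the reported 4.3858 ± 0.4192 for the UTi260M entropy series in Table 5:

```python
import math
from collections import Counter

# Shannon entropy (bits) of the gray-level histogram, as used for frame comparison.
def frame_entropy(pixels):
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(pixels).values())

# Mean and sample standard deviation (n - 1), matching the reported summaries.
def mean_std(values):
    mu = sum(values) / len(values)
    sd = (sum((v - mu) ** 2 for v in values) / (len(values) - 1)) ** 0.5
    return mu, sd

# Four equally likely gray levels carry exactly 2 bits of entropy.
mixed = [0] * 16 + [85] * 16 + [170] * 16 + [255] * 16
print(frame_entropy(mixed))  # 2.0

# Per-frame UTi260M entropy values from Table 5 reproduce the reported summary.
entropies_m = [4.4325, 4.7941, 4.1814, 3.6933, 4.3938, 4.8197]
mu, sd = mean_std(entropies_m)
print(round(mu, 4), round(sd, 4))  # 4.3858 0.4192
```

That the per-frame values in Table 5 reproduce the reported mean and standard deviation confirms the summaries were computed with the sample (n - 1) convention.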

4. Discussion and Limitations

4.1. Interpretation of Results and Applicability

The results demonstrate that both consumer-grade thermal devices (UTi260M and UTi260T) provide stable scene-level thermal contrast under night-time road conditions, enabling visual separability of pedestrians and other warm targets from the background. This is particularly relevant for (i) unlit or poorly lit sections where visible imagery has low contrast and (ii) oncoming headlights where visible cameras are sensitive to glare and local saturation. Because thermal imaging relies on emitted long-wave infrared radiation rather than reflected visible light, target-to-background separability is preserved even under glare or low-illumination conditions.
UTi260M offers a more practical field workflow due to direct smartphone visualization and recording. UTi260T provides stronger post-processing instruments (temperature profiles, analysis tools), but under field conditions, the workflow depends more strongly on app/software stability and data-handling.
The reported SNR proxy values quantify display-level contrast separability and should not be interpreted as physical detector-level SNR.

4.2. Environmental Effects and Observed Artifacts

At low ambient temperature, two effects were observed: (i) the inclusion of a large sky region can produce very low apparent temperatures and alter AGC, affecting road-region visibility; and (ii) strong cold wind was associated with temporary contrast degradation (“whitening”), likely caused by rapid front-optics cooling, altered sensor thermal equilibrium, and dynamic AGC redistribution. Consumer-grade microbolometer systems rely on internal non-uniformity correction (NUC) and AGC, both of which may react to rapid environmental temperature shifts. These findings motivate stabilization measures (protective housing, airflow reduction near optics, condensation control) and procedural standardization (fixed pose, limited sky in frame) for future work.

4.3. Limitations of Data and Methodology

As a feasibility-oriented investigation, the study has several methodological limitations that restrict direct generalization: (i) non-radiometric images and proxy metrics: the proxy indicators (σ_noise, UI, FPN proxy, SNR proxy) depend on compression, palette, AGC, and overlays; they characterize display-level variability and must not be interpreted as physical NETD or intrinsic detector specifications; (ii) limited environmental coverage: the tests do not comprehensively cover rain, fog, snow, or wet-asphalt reflectance; (iii) a limited number of scenes and limited standardization: some analyses use static frames because certain workflows do not reliably accept frames extracted from video; (iv) no ADAS claims: the work reports detectability/visual separability only, whereas automotive-grade ADAS would require multi-sensor integration, calibration, functional safety, and large-scale validation; and (v) mounting-specific distance estimation: pixel-height distance estimation requires calibration, consistent processing, and error reporting (e.g., MAE/RMSE), and frames with partial occlusion or unstable segmentation should be excluded.
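The error reporting suggested for the ranging method can be sketched directly; the distance values below are hypothetical, not measurements from this study:

```python
import math

# Sketch of the suggested error reporting for the ranging method: MAE and RMSE
# between estimated and reference (tape-measured) distances. Values are hypothetical.
def mae(estimated, reference):
    """Mean absolute error in metres."""
    return sum(abs(e - r) for e, r in zip(estimated, reference)) / len(reference)

def rmse(estimated, reference):
    """Root-mean-square error in metres."""
    return math.sqrt(sum((e - r) ** 2 for e, r in zip(estimated, reference)) / len(reference))

reference_m = [6.0, 9.0, 12.0]   # hypothetical tape-measured distances
estimated_m = [6.2, 8.5, 12.4]   # hypothetical single-frame estimates
print(round(mae(estimated_m, reference_m), 3))   # 0.367
print(round(rmse(estimated_m, reference_m), 3))  # 0.387
```

Reporting both metrics is useful because RMSE penalizes the occasional large error (e.g., from unstable segmentation) more strongly than MAE.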

4.4. Potential Image-Enhancement Approaches

Advanced enhancement techniques such as multi-scale wavelet filtering, contrast-limited adaptive histogram equalization (CLAHE), or low-frequency compensation methods may improve perceived contrast under challenging environmental conditions (e.g., cold wind or large sky regions triggering AGC redistribution). Such approaches could mitigate dynamic-range compression and enhance target-to-background separability. Future work may evaluate enhancement algorithms quantitatively (e.g., via entropy, contrast metrics, or structural similarity indices) to assess whether post-processing improves detectability without distorting thermal interpretability.
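As a minimal standard-library illustration of the histogram-equalization core (full CLAHE additionally applies tiling and a clip limit, e.g., OpenCV's createCLAHE), the sketch below remaps a hypothetical "whitened" compressed DN range back to full display contrast:

```python
from collections import Counter

# Global histogram equalization via the cumulative-histogram (CDF) mapping.
# A simplified stand-in for CLAHE, which adds tiling and a clip limit.
def equalize_8bit(pixels):
    """Remap 8-bit DN values so the output histogram is approximately uniform."""
    n = len(pixels)
    hist = Counter(pixels)
    running, cdf = 0, {}
    for level in sorted(hist):
        running += hist[level]
        cdf[level] = running
    cdf_min = min(cdf.values())
    scale = n - cdf_min
    lut = {lvl: round(255 * (c - cdf_min) / scale) if scale else 0
           for lvl, c in cdf.items()}
    return [lut[p] for p in pixels]

# A "whitened" frame crowded into DN 240-246 regains the full display range.
whitened = [240, 242, 244, 246, 240, 242, 244, 246]
print(equalize_8bit(whitened))  # [0, 85, 170, 255, 0, 85, 170, 255]
```

Because equalization only redistributes display DN values, it improves perceived contrast without altering the underlying (non-radiometric) thermal data, which is the property the text asks future evaluations to preserve.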

4.5. Cold-Wind-Induced Contrast Degradation

The observed “whitening” effect under strong cold headwind conditions is likely associated with rapid convective cooling of the front optics and camera housing. When exposed to cold airflow, the external lens surface may experience a sudden temperature reduction, potentially leading to transient thermal imbalance between the optics and the microbolometer sensor. Such imbalance can affect apparent scene contrast due to internal NUC cycles and AGC redistribution.
In addition, rapid cooling may alter the effective radiative balance between foreground objects and background regions, especially when combined with large sky portions in the frame. Since consumer-grade thermal cameras rely on automatic dynamic-range scaling, abrupt environmental changes can lead to temporary contrast compression or grayscale “whitening”.
No internal temperature logging of the sensor core was available; therefore, the explanation remains physically motivated but indirect. Controlled laboratory measurements of sensor and optics temperature under airflow exposure are proposed as future work.

4.6. Comparative AGC Behavior Between UTi260M and UTi260T

Although both devices employ similar LWIR microbolometer sensors, differences in AGC implementation appear to influence scene rendering and proxy metric variability. In dynamic road scenes where large cold regions (e.g., sky) alternate with warmer foreground regions (road surface, vehicles, pedestrians), the two devices exhibited different contrast redistribution characteristics.
UTi260M tended to maintain more stable foreground contrast during short-term scene alternations, likely due to its smartphone-integrated display scaling pipeline. In contrast, UTi260T, particularly when operated via PC software, showed more pronounced global dynamic-range redistribution when large cold regions dominated the frame. This behavior can increase apparent spatial variability and affect proxy noise metrics such as σ_noise and UI.
These differences are attributed not to intrinsic sensor performance, but to device-specific AGC algorithms and display-level processing. Since this study relies on exported non-radiometric frames, the reported proxy metrics characterize final rendered output rather than detector-level characteristics.

4.7. Statistical Stability and Device Behavior

The statistical evaluation over six consecutive stationary frames revealed distinct frame-to-frame behavior between the two thermal devices.
UTi260M demonstrated stable entropy values (4.39 ± 0.42), moderate σ_noise (7.68 ± 2.69), and a consistently high UI (0.892 ± 0.039). The low variability confirms conservative and repeatable AGC behavior under static night-time conditions.
In contrast, UTi260T exhibited higher mean entropy (5.76 ± 1.10), significantly higher σ_noise (17.64 ± 7.91), and a substantially lower and more variable UI (0.506 ± 0.236). The strong frame-to-frame fluctuations indicate more aggressive dynamic-range redistribution and scene-dependent AGC adaptation.
These findings suggest that UTi260T prioritizes adaptive contrast enhancement, potentially increasing the visual detectability of warm targets, whereas UTi260M favors the stability and repeatability of thermal representation.

4.8. Practical Implications

From an operational perspective: higher entropy and contrast adaptation (UTi260T) may improve the visual separation of warm objects; higher statistical stability (UTi260M) may improve repeatability for frame-based quantitative analysis.
The choice between devices therefore depends on whether dynamic enhancement or frame-to-frame stability is prioritized.

4.9. Limitations of the Statistical Assessment

The statistical analysis was performed under stationary vehicle conditions and over a limited number of frames. Although six consecutive frames allow preliminary stability assessment, broader environmental variability (moving vehicle, changing weather, dynamic scenes) may influence AGC behavior differently.
Additionally, the metrics are based on non-radiometric image outputs; therefore, σ_noise and UI should be interpreted as proxy indicators rather than physical sensor noise characteristics (e.g., NETD).

5. Conclusions

This work evaluated the feasibility of using two affordable thermal cameras (UTi260M and UTi260T) for night-time road monitoring, focusing on pedestrian detectability in unlit urban and rural road sections and robustness under oncoming headlights. The observations indicate that thermal visualization can remain informative in scenarios where visible cameras suffer from low contrast or glare, and pedestrians can remain distinguishable relative to the background under low ambient temperatures.
Given the non-radiometric nature of exported frames, a set of image-level proxy metrics (σ_noise, UI, FPN proxy, SNR proxy) was used to characterize the displayed image and enable comparisons across devices and scenes. A calibration-based geometric approach was also proposed to estimate pedestrian distance from single frames using pixel-height measurements for a fixed mounting configuration.
The study does not claim an automotive-certified system and should be interpreted as feasibility-level evidence. Future work should include controlled tests under rain/fog/snow, broader temperature ranges, a larger and more repeatable dataset, and publication of frames and scripts as supplementary material to support reproducibility.

Author Contributions

Conceptualization, Y.S.; methodology, Y.S.; investigation, Y.S.; data curation, Y.S.; formal analysis, Y.S.; visualization, Y.S.; writing—original draft preparation, Y.S.; writing—review and editing, Y.S., A.T. and P.M.; supervision, A.T.; resources, A.T. and P.M.; project administration, Y.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the European Regional Development Fund within the OP “Research, Innovation and Digitalization Programme for Intelligent Transformation 2021–2027”, Project No BG16RFPR002-1.014-0005 Center of competence “Smart Mechatronics, Eco-and Energy Saving Systems and Technologies”.

Institutional Review Board Statement

Ethical review and approval were waived for this study due to its retrospective character and the fact that it only involved contactless collected data. The study did not have a medical purpose and therefore does not fall under the jurisdiction of the ethics committee.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The datasets consist of thermal image frames recorded during controlled road experiments and processed analysis outputs.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
MWIR: Mid-Wave Infrared
IR: Infrared
LWIR: Long-Wave Infrared
AGC: Automatic Gain Control
NUC: Non-Uniformity Correction
ADAS: Advanced Driver Assist Systems
NETD: Noise-Equivalent Temperature Difference
ROI: Region of Interest
DN: Digital Numbers
FPN: Fixed-Pattern Noise
SNR: Signal-to-Noise Ratio
POR: Pixel Occupancy Ratio
FOV: Field of View
HVAC: Heating, Ventilation and Air Conditioning
CLAHE: Contrast-Limited Adaptive Histogram Equalization
UI: Uniformity Index
RGB: Red-Green-Blue

Figure 1. Camera mounting configuration and positions on the vehicle: (a) UTi260T (pos. 1) on roof; UTi260M (pos. 2) near side mirror; (b) example mounting view; (c) external mounting position of the thermal camera on the windshield enabling forward road monitoring; (d) external mounting configuration of the thermal camera on the vehicle body during night-time road experiments.
Vehicles 08 00061 g001
Figure 2. Functional verification of UTi260T wireless streaming and image acquisition workflow. (a) Initial WiFi connection between UTi260T and Android mobile device; (b) external-environment thermal capture during live wireless transmission; (c) in-cabin thermal capture confirming stable streaming under low-light conditions; (d) subsequent external-environment acquisition to verify switching between indoor and outdoor scenes without loss of signal or image stability.
Vehicles 08 00061 g002
Figure 3. Example illustrating sky-induced AGC/contrast shift and appearance changes at low-temperature conditions: (a) pedestrian detected near a parked vehicle in a residential street environment; (b) pedestrian observed at close range near a building façade; (c) pedestrian detected at a longer distance on an unlit road section.
Vehicles 08 00061 g003
Figure 4. UTi260T night-time frame and visualization in analysis software: (a) thermal frame with software overlay tools (processing/visualization); (b) corresponding baseline (“normal”) view.
Vehicles 08 00061 g004
Figure 5. UTi260T night-time frame for a different scene and visualization in analysis software: (a) thermal frame with activated processing/visualization tools; (b) baseline (“normal”) view for comparison.
Vehicles 08 00061 g005
Figure 6. Thermal frame captured with UTi260M at a night-time intersection under low illumination (no pedestrian present). The figure illustrates palette-dependent rendering differences using identical raw thermal data: (a) Gray scale; (b) Red hot; (c) Black white; (d) Lava; (e) Iron red. The purpose is to demonstrate how palette selection influences perceived contrast and visual interpretation in consumer-grade thermal imaging.
Vehicles 08 00061 g006
Figure 7. Thermal frame acquired with UTi260M used for monocular distance calibration and single-frame pedestrian ranging based on the “real height–pixel height” relationship. The pedestrian (≈1.95 m) is represented by a bounding box, from which the pixel height h_px is measured and used to estimate the distance D after prior calibration. The visible-light view (inset) is included for contextual reference under low illumination and oncoming headlight conditions: (a) pedestrian detected at short distance with clear thermal contrast relative to the background; (b) pedestrian detected at intermediate distance where thermal contrast begins to decrease; (c) pedestrian detected at longer distance where the pedestrian appears smaller and the thermal contrast is further reduced.
Vehicles 08 00061 g007
Figure 8. Visualization of the dependence of pedestrian pixel height on distance.
Vehicles 08 00061 g008
Figure 9. Examples demonstrating the practical observation range and scene coverage for UTi260M and synchronized dual-channel capture with UTi260T: (a) UTi260M with smartphone and application; (b) thermal frame used for temperature-contrast and detectability assessment; (c) simultaneous visible-light frame used to document illumination conditions and scene context (road geometry, traffic, and glare sources).
Vehicles 08 00061 g009
Table 1. UTi260M thermal imager specifications.
Sensor: Uncooled Vanadium Oxide
Range switching: Low temperature (−20–150 °C), high temperature (0–550 °C), auto switching
Modes: Industrial, human body
Emissivity: 0.95 (default), adjustable 0.01–1.00
IR resolution: 256 × 192 (49,152 pixels)
Infrared spectral bandwidth: 8–14 µm
Thermal sensitivity: <50 mK
Frame rate: 25 Hz
Table 2. UTi260T thermal imager specifications.
Sensor: Uncooled Vanadium Oxide
Temperature measurement range: −20~150 °C, 100~550 °C (manual shift)
Infrared response band: 8–14 µm
IR resolution: 256 × 192 (49,152 pixels)
Infrared spectral bandwidth: 8–14 µm
Thermal sensitivity (NETD): <50 mK
Frame rate: 25 Hz
Temperature measurement resolution: 0.1 °C
Table 3. Proxy noise and uniformity metrics computed on homogeneous background (ROIs) (8-bit, DN).
Image | σ_noise (DN) | Noise Range (DN) | Noise Mode (DN) | σ_row (DN) | σ_col (DN) | UI = 1 − σ/μ | FPN Proxy (DN) | SNR Proxy | NETD_est
UTi260T—thermal (pedestrian, frame, Figure 8) | 1.113 | 6 | 35 | 0.905 | 0.968 | 0.962 | 0.125 | 561.77 | n/a
UTi260T—thermal (parked vehicles/street, frame, Figure 7) | 1.363 | 8 | 150 | 1.273 | 1.197 | 0.993 | 0.229 | 120.16 | n/a
UTi260M—thermal via smartphone (frame, Figure 9) | 0.488 | 3 | 116 | 0.454 | 0.453 | 0.996 | 0.069 | 301.23 | n/a
Visible-light camera (night frame, Figure 8) | 1.293 | 8 | 4 | 1.283 | 0.893 | 0.601 | 0.260 | n/a | n/a
Table 5. Frame-to-frame statistical stability analysis of UTi260M based on six consecutive stationary acquisitions.
Frame | Entropy | σ_noise | UI
1 | 4.4325 | 6.9588 | 0.91237
2 | 4.7941 | 8.6050 | 0.85040
3 | 4.1814 | 6.9166 | 0.90105
4 | 3.6933 | 4.4479 | 0.93968
5 | 4.3938 | 6.7172 | 0.90938
6 | 4.8197 | 12.4493 | 0.83784
Summary (UTi260M): Entropy: 4.3858 ± 0.4192; σ_noise: 7.6825 ± 2.6862; UI: 0.8918 ± 0.0393.
Table 6. Frame-to-frame statistical stability analysis of UTi260T based on six consecutive stationary acquisitions.
Frame | Entropy | σ_noise | UI
1 | 4.5930 | 12.9281 | 0.35833
2 | 6.3559 | 23.8260 | 0.73215
3 | 6.3700 | 18.6376 | 0.39638
4 | 7.3739 | 29.6927 | 0.19188
5 | 4.7628 | 9.7564 | 0.81307
6 | 5.1313 | 11.0076 | 0.54351
Summary (UTi260T): Entropy: 5.7645 ± 1.1030; σ_noise: 17.6414 ± 7.9104; UI: 0.5059 ± 0.2364.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
