Search Results (5,769)

Search Parameters:
Keywords = location sensor

22 pages, 10000 KB  
Article
The Development of a Wildfire Early Warning System Using LoRa Technology
by Supawee Makdee, Ponglert Sangkaphet, Chanidapa Boonprasom, Buppawan Chaleamwong and Nawara Chansiri
Computers 2026, 15(2), 105; https://doi.org/10.3390/computers15020105 - 2 Feb 2026
Abstract
Sok Chan Forest, located in Lao Suea Kok District, Ubon Ratchathani Province, Thailand, is frequently affected by wildfires during the dry season, resulting in significant environmental degradation and adverse impacts on the livelihoods of local communities. In this study, we outline the development of a prototype wildfire early warning system utilizing LoRa technology to address the long-distance data transmission limitations that are commonly encountered when using conventional Internet of Things (IoT) solutions. The proposed system comprises sensor nodes that communicate peer-to-peer with a central node, which subsequently relays the collected data to a remote database server via the internet. Real-time alerts are disseminated through both a smartphone application and a web-based platform, thereby facilitating timely notification of authorities and community members. Field experiments in Sok Chan Forest demonstrated reliable single-hop communication with a 100% packet delivery ratio at distances up to 1500 m, positive SNR, and RSSI levels above receiver sensitivity, as well as sub-second end-to-end detection latency in both single- and two-hop configurations. A controlled alarm accuracy evaluation yielded an overall classification accuracy of 91.7%, with perfect precision for the Fire class, while a user study involving five software development experts and fifteen firefighters yielded an average effectiveness score of 3.84, reflecting a high level of operational efficacy. Full article
(This article belongs to the Special Issue Wireless Sensor Networks in IoT)
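
As a rough illustration of the data flow the abstract describes (sensor nodes reporting to a central node that computes delivery statistics and raises alerts), the Python sketch below tallies a per-node packet delivery ratio and applies a simple threshold rule. The packet fields, thresholds, and node names are hypothetical and are not taken from the paper.

```python
# Minimal sketch (not the authors' implementation): a central-node routine that
# tallies packet delivery ratio (PDR) per sensor node and raises a hypothetical
# threshold-based fire alert. Field names and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class Packet:
    node_id: str
    seq: int          # per-node sequence number
    temp_c: float     # temperature reading
    smoke_ppm: float  # smoke concentration reading

def packet_delivery_ratio(received: list[Packet], sent_per_node: dict[str, int]) -> dict[str, float]:
    """PDR = unique packets received / packets sent, per node."""
    got: dict[str, set[int]] = {}
    for p in received:
        got.setdefault(p.node_id, set()).add(p.seq)
    return {n: len(got.get(n, set())) / sent for n, sent in sent_per_node.items()}

def fire_alert(p: Packet, temp_thresh=55.0, smoke_thresh=300.0) -> bool:
    """Hypothetical rule: alert when both temperature and smoke exceed thresholds."""
    return p.temp_c >= temp_thresh and p.smoke_ppm >= smoke_thresh

if __name__ == "__main__":
    rx = [Packet("node-1", i, 24.0 + i, 10.0) for i in range(10)]
    rx.append(Packet("node-2", 0, 61.2, 420.0))
    print(packet_delivery_ratio(rx, {"node-1": 10, "node-2": 10}))
    print([fire_alert(p) for p in rx if p.node_id == "node-2"])
```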

5 pages, 770 KB  
Proceeding Paper
Monitoring Water Quality in Small Reservoirs Using Sentinel-2 Imagery and Machine Learning
by Victoria Amores-Chaparro, Fernando Broncano-Morgado, Pablo Fernández-González, Aurora Cuartero and Jesús Torrecilla-Pinero
Eng. Proc. 2026, 123(1), 7; https://doi.org/10.3390/engproc2026123007 - 2 Feb 2026
Abstract
This article investigates the estimation of water quality parameters, specifically chlorophyll-a, applying machine learning techniques to Sentinel-2 images. This study focuses on five small reservoirs located in the Extremadura region (Spain), as these are the ones for which continuous daily records from automatic in situ sensors are available. Chlorophyll-a estimates are obtained from two sources: (1) from the C2RCC atmospheric correction of Sentinel-2 images using Sen2Cor and radiometric calibration to ensure temporal consistency, and (2) from in situ data obtained from the official website of the Guadiana Basin Automatic Network Information System. The machine learning (ML)-based methodology significantly improves the predicted results for inland water bodies, enabling enhanced continuous assessment of water quality in small reservoirs. Full article
(This article belongs to the Proceedings of First Summer School on Artificial Intelligence in Cybersecurity)
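
For readers unfamiliar with the general workflow, the sketch below shows what a minimal reflectance-to-chlorophyll-a regression looks like in scikit-learn. The band choice, the synthetic data, and the random-forest model are assumptions for illustration; the paper's actual feature set and ML model may differ.

```python
# Illustrative sketch only: regressing chlorophyll-a against Sentinel-2 band
# reflectances with scikit-learn. Bands, model, and data are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 500
# Placeholder reflectances for four bands commonly used over inland waters.
X = rng.uniform(0.0, 0.2, size=(n, 4))
# Synthetic chl-a with a red-edge/red band-ratio signal plus noise (illustrative only).
chla = 50.0 * (X[:, 2] / (X[:, 1] + 1e-6)) + rng.normal(0, 2.0, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, chla, test_size=0.3, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("R2 on held-out samples:", round(r2_score(y_te, model.predict(X_te)), 3))
```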

28 pages, 32119 KB  
Article
NOAH: A Multi-Modal and Sensor Fusion Dataset for Generative Modeling in Remote Sensing
by Abdul Mutakabbir, Chung-Horng Lung, Marzia Zaman, Darshana Upadhyay, Kshirasagar Naik, Koreen Millard, Thambirajah Ravichandran and Richard Purcell
Remote Sens. 2026, 18(3), 466; https://doi.org/10.3390/rs18030466 - 1 Feb 2026
Abstract
Earth Observation (EO) and Remote Sensing (RS) data are widely used in various fields, including weather, environment, and natural disaster modeling and prediction. EO and RS performed with geostationary satellite constellations in these fields are limited to a smaller region, while sun-synchronous satellite constellations have discontinuous spatial and temporal coverage. This limits the usefulness of EO and RS data for near-real-time weather, environment, and natural disaster applications. To address these limitations, we introduce Now Observation Assemble Horizon (NOAH), a multi-modal, sensor fusion dataset that combines Ground-Based Sensor (GBS) data from weather stations with topography, vegetation (land cover, biomass, and crown cover), and fuel type data from RS sources. NOAH is collated using publicly available data from Environment and Climate Change Canada (ECCC), the Spatialized CAnadian National Forest Inventory (SCANFI), and the United States Geological Survey (USGS), which are well-maintained, documented, and reliable. Applications of the NOAH dataset include, but are not limited to, expanding RS data tiles, filling in missing data, and super-resolution of existing data sources. Additionally, Generative Artificial Intelligence (GenAI) or Generative Modeling (GM) can be applied to produce near-real-time model-generated or synthetic estimates for disaster modeling in remote locations. This can complement the use of existing observations by field instruments, rather than replacing them. A UNet backbone with Feature-wise Linear Modulation (FiLM) injection of GBS data was used to demonstrate initial proof-of-concept modeling in this research. This research also lists ideal characteristics for GM or GenAI datasets for RS. The code and a subset of the NOAH dataset (NOAH mini) are open-sourced. Full article
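
The FiLM injection mentioned in the abstract conditions image features on ground-based sensor readings through a per-channel scale and shift. The PyTorch snippet below is a generic sketch of that mechanism with arbitrary layer sizes, not the NOAH reference implementation.

```python
# Sketch of Feature-wise Linear Modulation (FiLM): ground-based sensor (GBS)
# readings condition convolutional feature maps via per-channel scale and shift.
# Sizes are arbitrary; this is not the NOAH reference model.
import torch
import torch.nn as nn

class FiLM(nn.Module):
    def __init__(self, cond_dim: int, n_channels: int):
        super().__init__()
        self.to_gamma_beta = nn.Linear(cond_dim, 2 * n_channels)

    def forward(self, feat: torch.Tensor, cond: torch.Tensor) -> torch.Tensor:
        gamma, beta = self.to_gamma_beta(cond).chunk(2, dim=-1)
        # Broadcast (B, C) scale/shift over the spatial dimensions of (B, C, H, W).
        return gamma[:, :, None, None] * feat + beta[:, :, None, None]

feat = torch.randn(2, 32, 64, 64)   # feature maps from a UNet encoder stage
gbs = torch.randn(2, 8)             # e.g., 8 weather-station variables
out = FiLM(cond_dim=8, n_channels=32)(feat, gbs)
print(out.shape)  # torch.Size([2, 32, 64, 64])
```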

21 pages, 6597 KB  
Article
Summertime Air Pollution Measurements from Temporary Events—Fireworks and Festival Cooking
by Daniel L. Mendoza, Erik T. Crosman, Mamta Chaudhari, Corbin Anderson and Shawn A. Gonzales
Environments 2026, 13(2), 79; https://doi.org/10.3390/environments13020079 - 1 Feb 2026
Abstract
Air pollution during mass-gathering events such as festivals and firework shows is a growing concern globally. Fireworks at festivals on average almost double the observed particulate pollution levels, while food trucks and associated diesel generators are known to create very localized air pollution hotspots that are an emerging area of research on sources of urban volatile organic compounds. This study adds to the body of scientific evidence on the impact of festival fireworks and cooking pollution in the USA by quantifying the impact of fireworks and cooking emissions during short summer festivals in June and July 2023 in Utah’s Salt Lake Valley, using paired PM2.5, ozone, and black carbon (BC) sensors located at two distances from the sources. Both fireworks and cooking increased PM2.5 and BC during the evening dinner and firework displays, while evening ozone was observed to drop during fireworks. The ozone concentration reductions during fireworks displays are likely associated with NOx titration due to fireworks and cooking emissions. Regulating fireworks and cooking emissions during annual festivals has resulted in significant reductions in PM2.5 pollution and corresponding benefits to human health. These findings can support policy decisions to reduce exposure to emissions locally. Full article
(This article belongs to the Special Issue Ambient Air Pollution, Built Environment, and Public Health)
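
A comparison of event-window and baseline concentrations like the one reported here can be expressed in a few lines of pandas. The series, timestamps, and the injected firework bump below are fabricated placeholders, not study data.

```python
# Toy sketch (not the study's analysis): compare PM2.5 during a firework window
# against the rest of the evening from a timestamped sensor series.
import numpy as np
import pandas as pd

idx = pd.date_range("2023-07-04 18:00", periods=360, freq="min")
pm25 = pd.Series(12 + np.random.default_rng(1).normal(0, 2, len(idx)), index=idx)
pm25.loc["2023-07-04 22:00":"2023-07-04 22:30"] += 15   # injected "firework" bump

event = pm25.loc["2023-07-04 22:00":"2023-07-04 22:30"]
baseline = pm25.drop(event.index)
print(f"event mean {event.mean():.1f} ug/m3 vs baseline {baseline.mean():.1f} ug/m3")
```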

15 pages, 3432 KB  
Article
Characterization and Impact of Meteorological Environmental Parameters on Gas Concentrations (NH3 and CH4) in a Maternity Pig Farm in Southeastern Spain
by Melisa Gómez-Garrido, Martire Angélica Terrero Turbí, Isabel María Fernández Bastida and Ángel Faz Cano
Agriculture 2026, 16(3), 349; https://doi.org/10.3390/agriculture16030349 - 1 Feb 2026
Abstract
Intensive pig production generates significant emissions of ammonia (NH3) and methane (CH4), gases with both environmental and health impacts, primarily originating from slurry storage lagoons and their management. This study monitored a maternity pig farm over a 360-day period, using sensors located next to the slurry storage lagoon (Sensor 4) and in the immediate external surroundings of the facility, while simultaneously recording environmental variables (temperature, relative humidity, wind, and precipitation). The results showed that concentrations at the lagoon were thousands to tens of thousands of times higher than those measured in the surrounding area, with temperature and relative humidity emerging as key factors that increase volatilization and microbial generation, especially in summer under medium humidity conditions. Precipitation and wind modulate concentrations through resuspension and dispersion processes. Overall, the slurry storage lagoon constitutes the primary hotspot of emissions, and proper sensor placement is essential to accurately estimate its real impact, while integrating climatic and spatial conditions is crucial for designing and implementing effective mitigation strategies in intensive pig production systems. Full article
(This article belongs to the Section Ecosystem, Environment and Climate Change in Agriculture)

22 pages, 9263 KB  
Article
On the Variability of the Barometric Effect and Its Relation to Cosmic-Ray Neutron Sensing
by Patrick Davies, Roland Baatz, Paul Schattan, Emmanuel Quansah, Leonard Kofitse Amekudzi and Heye Reemt Bogena
Sensors 2026, 26(3), 925; https://doi.org/10.3390/s26030925 - 1 Feb 2026
Abstract
Accurate estimation of the barometric coefficient (β) is important for correcting pressure effects in soil moisture data from cosmic-ray neutron sensing (CRNS) due to the barometric effect. To evaluate estimation strategies for β, we compared analytical and empirical approaches using 71 CRNS and 46 neutron monitor (NM) stations across the United States, Europe, and globally. Our results show spatio-temporal variation in the barometric effect, with β ranging from 0.66 to 0.82 %/hPa for NM and from 0.63 to 0.80 %/hPa for CRNS. These coefficients exhibit higher variability than previously published semi-analytical models. In addition, we found that the analytically determined β values were systematically lower compared with empirical estimates, with stronger agreement between the two empirical methods (r ≈ 0.67) than between empirical and analytical approaches. Furthermore, NM stations produced higher β values than CRNS, indicating that differences in detector energy sensitivity affected the values of β. Principal Component Analysis (PCA) further showed that the analytical and empirical β estimates clustered together, reflecting shared sensitivity to elevation. In contrast, soil moisture and atmospheric humidity (AH) projected nearly orthogonally to the β vectors, indicating negligible influence, while cut-off rigidity contributed to a separate, inverse gradient. Analytical β estimates were fully orthogonal to AH, while empirical methods showed only slight deviations beyond orthogonality. The barometric coefficient (β), therefore, varies with location, altitude, atmospheric conditions, and sensor type, highlighting the necessity of station-specific values for precise correction. Overall, our study emphasizes the need for atmospheric correction in CRNS measurements and introduces a method for deriving site- and sensor-specific β values for accurate soil moisture estimation. Full article
(This article belongs to the Section Environmental Sensing)
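
One common empirical route to β (not necessarily the paper's exact procedure) is to regress log neutron counts on the pressure deviation from a reference. The sketch below does this on synthetic counts; the study's specific empirical and analytical methods may differ.

```python
# Sketch of an empirical estimate of the barometric coefficient beta:
# regress log neutron counts on the pressure deviation from a reference.
# Synthetic data only.
import numpy as np

rng = np.random.default_rng(2)
p_ref = 1013.25                              # reference pressure (hPa)
pressure = p_ref + rng.normal(0, 8, 2000)    # hourly station pressure (hPa)
beta_true = 0.0072                           # fractional change per hPa (0.72 %/hPa)
counts = 1500 * np.exp(-beta_true * (pressure - p_ref)) * rng.normal(1, 0.01, 2000)

# ln(N) = ln(N_ref) - beta * (P - P_ref)  ->  the fitted slope gives -beta
slope, intercept = np.polyfit(pressure - p_ref, np.log(counts), 1)
print(f"estimated beta: {-slope * 100:.2f} %/hPa (true {beta_true * 100:.2f} %/hPa)")
```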

26 pages, 3401 KB  
Article
Toward an Integrated IoT–Edge Computing Framework for Smart Stadium Development
by Nattawat Pattarawetwong, Charuay Savithi and Arisaphat Suttidee
J. Sens. Actuator Netw. 2026, 15(1), 15; https://doi.org/10.3390/jsan15010015 - 1 Feb 2026
Abstract
Large sports stadiums require robust real-time monitoring due to high crowd density, complex spatial configurations, and limited network infrastructure. This research evaluates a hybrid edge–cloud architecture implemented in a national stadium in Thailand. The proposed framework integrates diverse surveillance subsystems, including automatic number plate recognition, face recognition, and panoramic cameras, with edge-based processing to enable real-time situational awareness during high-attendance events. A simulation based on the stadium’s physical layout and operational characteristics is used to analyze coverage patterns, processing locations, and network performance under realistic event scenarios. The results show that geometry-informed sensor deployment ensures continuous visual coverage and minimizes blind zones without increasing camera density. Furthermore, relocating selected video processing tasks from the cloud to the edge reduces uplink bandwidth requirements by approximately 50–75%, depending on the processing configuration, and stabilizes data transmission during peak network loads. These findings suggest that processing location should be considered a primary architectural design factor in smart stadium systems. The combination of edge-based processing with centralized cloud coordination offers a practical model for scalable, safety-oriented monitoring solutions in high-density public venues. Full article
(This article belongs to the Section Big Data, Computing and Artificial Intelligence)
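
The reported 50–75% uplink savings follow from simple arithmetic: raw video streams are replaced by detection metadata for whatever share of cameras is processed at the edge. The figures below (camera count, bitrates, edge share) are hypothetical, chosen only to show the calculation, not the stadium's actual configuration.

```python
# Back-of-the-envelope sketch: uplink bandwidth with cloud-only processing vs.
# a hybrid where part of the cameras are analyzed at the edge and send only
# metadata. All numbers are hypothetical.
N_CAMERAS = 40
STREAM_MBPS = 4.0        # per-camera encoded video bitrate
METADATA_MBPS = 0.05     # per-camera metadata after edge inference

cloud_uplink = N_CAMERAS * STREAM_MBPS
# Assume half of the cameras are processed at the edge, half still stream to the cloud.
edge_uplink = (N_CAMERAS / 2) * STREAM_MBPS + (N_CAMERAS / 2) * METADATA_MBPS

saving = 1 - edge_uplink / cloud_uplink
print(f"cloud-only uplink: {cloud_uplink:.0f} Mbps, hybrid: {edge_uplink:.1f} Mbps, "
      f"saving: {saving:.0%}")
```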

16 pages, 2470 KB  
Article
Integrated Methane Sensor Prototype Based on H-QEPAS Technique with a 3D-Printed Gas Chamber
by Jingze Cai, Yanjun Chen, Hanxu Ma, Shunda Qiao, Ying He, Qi Li, Tongyu Dai and Yufei Ma
Appl. Sci. 2026, 16(3), 1427; https://doi.org/10.3390/app16031427 - 30 Jan 2026
Abstract
In this paper, a heterodyne quartz-enhanced photoacoustic spectroscopy (H-QEPAS)-based integrated methane (CH4) sensor prototype is reported. The CH4 absorption line located at 1650.96 nm was selected as the target spectral line. The design features an integrated, 3D-printed gas chamber for reduced size and weight. To realize the coordinated operation of each hardware component, a control program was designed based on the LabVIEW platform, enabling the adjustment of various hardware parameters. The piezoelectric signal generated by the quartz tuning fork (QTF) was amplified via a trans-impedance amplifier (TIA), acquired by a data acquisition card (DAQ), and then transmitted to a virtual lock-in amplifier (LIA) on the PC terminal for processing. The dimensions of the integrated CH4 sensor prototype are 33 cm in length, 27 cm in width, and 15 cm in height. The final test results demonstrate that the sensor prototype exhibits an excellent linear concentration response, with a detection limit of 26.72 ppm and a short detection time of approximately 4 s. Full article
(This article belongs to the Special Issue Latest Applications of Laser Measurement Technologies)
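
The virtual lock-in amplifier mentioned in the abstract can be sketched in a few lines of NumPy: mix the digitized QTF signal with in-phase and quadrature references and low-pass the products. The sample rate, reference frequency, and test signal below are invented for illustration and are not the prototype's settings.

```python
# Minimal sketch of a software ("virtual") lock-in amplifier: demodulate a
# digitized signal at a reference frequency. All parameters are made up.
import numpy as np

fs = 200_000                      # DAQ sample rate (Hz), illustrative
f_ref = 32_768                    # demodulation frequency near a QTF resonance
t = np.arange(0, 0.1, 1 / fs)
signal = (1e-3 * np.sin(2 * np.pi * f_ref * t + 0.3)
          + 1e-4 * np.random.default_rng(3).normal(size=t.size))

i_comp = signal * np.cos(2 * np.pi * f_ref * t)   # in-phase mixing
q_comp = signal * np.sin(2 * np.pi * f_ref * t)   # quadrature mixing
# Crude low-pass: average over the record (a real LIA would use a proper filter).
amplitude = 2 * np.hypot(i_comp.mean(), q_comp.mean())
print(f"recovered amplitude: {amplitude:.2e} (true 1.00e-03)")
```
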
12 pages, 1282 KB  
Article
Assessing the Capabilities of Oil Detection Canines to Detect Submerged Weathered Oils in a Boreal Lake
by Vince Palace, Paul Bunker, Lauren Timlick, Christina Brewster, Ed Owens, James McCourt and David Dickins
Water 2026, 18(3), 355; https://doi.org/10.3390/w18030355 - 30 Jan 2026
Abstract
The efficacy of oil spill response depends on the speed of detecting the oil. Detecting submerged oil is more difficult than detecting oil on the water surface, because most conventional sensors are not effective. Oil Detection Canines (ODCs) have been reliably used to detect oil during shoreline spill surveys, and preliminary laboratory studies also showed promising results for detecting oil submerged under water. To confirm their potential, a field study was conducted in a boreal freshwater lake in Northwestern Ontario, Canada, to investigate the capability of an ODC to detect submerged weathered oils at depths of 1 to 5 m. Triplicate targets at each depth used weathered diluted bitumen (dilbit), Bunker C residual fuel oil, and Maya crude oil burn residue, with both the ODC and handler blinded to the location of each target. Boat-based searches were conducted, and the handler identified “alerts” based on ODC behaviour changes, which were compared to georeferenced oil target locations. The ODC positively identified seven (7) of the eight (8) dilbit targets at 1 to 5 m, five (5) of the six (6) Bunker C targets at 1 and 3 m, and none of the burn residue targets at 1-m depth. The ability of ODCs to detect submerged or sunken oil in shallow water was clearly demonstrated, adding another technique for submerged and sunken oil surveys with the advantages of real-time data returns, the ability to detect small oil deposits, and an operational capability in shallow waters with potential for detection in deeper water. Full article

22 pages, 4243 KB  
Article
Lumbar Shear Force Prediction Models for Ergonomic Assessment of Manual Lifting Tasks
by Davide Piovesan and Xiaoxu Ji
Appl. Sci. 2026, 16(3), 1414; https://doi.org/10.3390/app16031414 - 30 Jan 2026
Abstract
Lumbar shear forces are increasingly recognized as critical contributors to lower-back injury risk, yet most ergonomic assessment tools—most notably the Revised NIOSH Lifting Equation (RNLE)—do not directly estimate shear loading. This study develops and evaluates a family of linear mixed-effects regression models that statistically predict L4/L5 lumbar shear force exposure using traditional NIOSH lifting parameters combined with posture descriptors extracted from digital human models. A harmonized dataset of 106 peak-shear lifting postures was compiled from five controlled laboratory studies, with lumbar shear forces obtained from validated biomechanical simulations implemented in the Siemens JACK (Siemens software, Plano, TX, USA) platform. Twelve model formulations were examined, varying in fixed-effect structure and hierarchical random effects, to quantify how load magnitude, hand location, sex, and joint posture relate to simulated task-level anterior–posterior shear exposure at the lumbar spine. Across all models, load magnitude and horizontal reach emerged as the strongest and most stable predictors of shear exposure, reflecting their direct mechanical influence on anterior spinal loading. Hip and knee flexion provided substantial additional explanatory power, highlighting the role of whole-body posture strategy in modulating shear demand. Upper-limb posture and coupling quality exhibited minimal or inconsistent effects once load geometry and lower-body posture were accounted for. Random-effects analyses demonstrated that meaningful variability arises from individual movement strategies and task conditions, underscoring the necessity of mixed-effects modeling for representing hierarchical structure in lifting data. Parsimonious models incorporating subject-level random intercepts produced the most stable and interpretable coefficients while maintaining strong goodness-of-fit. Overall, the findings extend the NIOSH framework by identifying posture-dependent determinants of lumbar shear exposure and by demonstrating that simulated shear loading can be reliably predicted using ergonomically accessible task descriptors. The proposed models are intended as statistical predictors of task-level shear exposure that complement—rather than replace—comprehensive biomechanical simulations. This work provides a quantitative foundation for integrating shear-aware metrics into ergonomic risk assessment practices, supporting posture-informed screening of manual material-handling tasks in field and sensor-based applications. Full article
(This article belongs to the Special Issue Novel Approaches and Applications in Ergonomic Design, 4th Edition)
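
A minimal version of the modeling approach described above — a linear mixed-effects regression of peak shear on load and posture descriptors with subject-level random intercepts — can be written with statsmodels as shown below. The predictors, coefficients, and data are synthetic stand-ins, not the study's harmonized dataset or its twelve model formulations.

```python
# Sketch of the modeling idea (not the authors' fitted models): mixed-effects
# regression of peak L4/L5 shear on load and posture with a subject random
# intercept. All data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n, subjects = 106, 20
df = pd.DataFrame({
    "subject": rng.integers(0, subjects, n).astype(str),
    "load_kg": rng.uniform(5, 25, n),
    "horiz_reach_cm": rng.uniform(25, 60, n),
    "hip_flex_deg": rng.uniform(10, 90, n),
})
subj_offset = dict(zip(map(str, range(subjects)), rng.normal(0, 30, subjects)))
df["shear_N"] = (150 + 12 * df.load_kg + 4 * df.horiz_reach_cm + 1.5 * df.hip_flex_deg
                 + df.subject.map(subj_offset) + rng.normal(0, 25, n))

model = smf.mixedlm("shear_N ~ load_kg + horiz_reach_cm + hip_flex_deg",
                    data=df, groups=df["subject"]).fit()
print(model.summary())
```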

39 pages, 2222 KB  
Review
Digital Technologies and Machine Learning in Environmental Hazard Monitoring: A Synthesis of Evidence for Floods, Air Pollution, Earthquakes, and Fires
by Jacek Lukasz Wilk-Jakubowski, Artur Kuchcinski, Grzegorz Kazimierz Wilk-Jakubowski, Andrzej Palej and Lukasz Pawlik
Sensors 2026, 26(3), 893; https://doi.org/10.3390/s26030893 - 29 Jan 2026
Abstract
This review synthesizes the state of the art on the integration of digital technologies, particularly machine learning, the Internet of Things (IoT), and advanced image processing techniques, for enhanced hazard monitoring. Focusing on air pollution, earthquakes, floods, and fires, we analyze articles selected from Scopus published between 2015 and 2024. This study classifies the selected articles based on hazard type, digital technology application, geographical location, and research methodology. We assess the effectiveness of various approaches in improving the accuracy and efficiency of hazard detection, monitoring, and prediction. The review highlights the growing trend of leveraging multi-sensor data fusion, deep learning models, and IoT-enabled systems for real-time monitoring and early warning. Furthermore, we identify key challenges and future directions in the development of robust and scalable hazard monitoring systems, emphasizing the importance of data-driven solutions for sustainable environmental management and disaster resilience. Full article
(This article belongs to the Special Issue Smart Gas Sensor Applications in Environmental Change Monitoring)
21 pages, 5931 KB  
Article
Validation of Inertial Sensor-Based Step Detection Algorithms for Edge Device Deployment
by Maksymilian Kisiel, Arslan Amjad and Agnieszka Szczęsna
Sensors 2026, 26(3), 876; https://doi.org/10.3390/s26030876 - 29 Jan 2026
Abstract
Step detection based on measurements of inertial measurement units (IMUs) is fundamental for human activity recognition, indoor navigation, and health monitoring applications. This study validates and compares five fundamentally different step detection algorithms for potential implementation on edge devices. A dedicated measurement system based on the Raspberry Pi Pico 2W microcontroller with two IMU sensors (Waveshare Pico-10DOF-IMU and Adafruit ST-9-DOF-Combo) was designed. The implemented algorithms include Peak Detection, Zero-Crossing, Spectral Analysis, Adaptive Threshold, and SHOE (Step Heading Offset Estimator). Validation was performed across 84 measurement sessions covering seven test scenarios (Timed Up and Go test, natural and fast walking, jogging, and stair climbing) and four sensor mounting locations (thigh pocket, ankle, wrist, and upper arm). Results demonstrate that Peak Detection achieved the best overall performance, with an average F1-score of 0.82, while Spectral Analysis excelled in stair scenarios (F1 = 0.86–0.92). Surprisingly, upper arm mounting yielded the highest accuracy (F1 = 0.84), outperforming ankle placement. The TUG clinical test proved most challenging (average F1 = 0.68), while fast walking was easiest (F1 = 0.87). Additionally, a preliminary application to 668 clinical TUG recordings from the open-access FRAILPOL database revealed algorithm-specific failure modes when continuous gait assumptions are violated. These findings provide practical guidelines for algorithm selection in edge computing applications and activity monitoring systems. Full article
(This article belongs to the Collection Sensors for Gait, Human Movement Analysis, and Health Monitoring)
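
Of the five algorithms compared, peak detection is the simplest to sketch: count peaks in the acceleration magnitude that exceed a threshold and are separated by a minimum interval. The threshold, spacing, and synthetic walking signal below are illustrative choices, not the paper's tuned parameters.

```python
# Minimal peak-detection step counter of the kind compared in the paper.
# Threshold, spacing, and the synthetic signal are illustrative assumptions.
import numpy as np
from scipy.signal import find_peaks

fs = 100                                   # IMU sample rate (Hz)
t = np.arange(0, 20, 1 / fs)               # 20 s of walking at ~2 steps/s
acc_mag = (9.81 + 2.0 * np.maximum(0, np.sin(2 * np.pi * 2.0 * t))
           + 0.3 * np.random.default_rng(5).normal(size=t.size))

peaks, _ = find_peaks(acc_mag,
                      height=10.8,              # must rise clearly above gravity
                      distance=int(0.35 * fs))  # at most ~3 steps per second
print(f"detected steps: {len(peaks)} (expected ~40)")
```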

21 pages, 3516 KB  
Article
Visual Navigation Using Depth Estimation Based on Hybrid Deep Learning in Sparsely Connected Path Networks for Robustness and Low Complexity
by Huda Al-Saedi, Pedram Salehpour and Seyyed Hadi Aghdasi
Appl. Syst. Innov. 2026, 9(2), 29; https://doi.org/10.3390/asi9020029 - 27 Jan 2026
Abstract
Robot navigation refers to a robot’s ability to determine its position within a reference frame and plan a path to a target location. Visual navigation, which relies on visual sensors such as cameras, is one approach to this problem. Among visual navigation methods, Visual Teach and Repeat (VT&R) techniques are commonly used. To develop an effective robot navigation framework based on the VT&R method, accurate and fast depth estimation of the scene is essential. In recent years, event cameras have garnered significant interest from machine vision researchers due to their numerous advantages and applicability in various environments, including robotics and drones. However, the main gap is how these cameras are used in a navigation system. The current research uses the attention-based UNET neural network to estimate the depth of a scene using an event camera. The attention-based UNET structure leads to accurate depth detection of the scene. This depth information is then used, together with a hybrid deep neural network consisting of a Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM), for robot navigation. Simulation results on the DENSE dataset yield an RMSE of 8.15, which is an acceptable result compared to other similar methods. This method not only provides good accuracy but also operates at high speed, making it suitable for real-time applications and visual navigation methods based on VT&R. Full article
(This article belongs to the Special Issue AI-Driven Decision Support for Systemic Innovation)
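
The hybrid CNN-plus-LSTM idea — per-frame features extracted from depth maps, fed to a recurrent layer that produces a navigation output — can be outlined in PyTorch as below. Layer sizes and the single scalar output are placeholders rather than the architecture trained in the paper.

```python
# Rough sketch of a CNN feature extractor per depth frame followed by an LSTM
# over the sequence and a control head. Not the paper's trained model.
import torch
import torch.nn as nn

class CnnLstmNavigator(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),       # -> (B*T, 16)
        )
        self.lstm = nn.LSTM(input_size=16, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)                 # e.g., a steering correction

    def forward(self, depth_seq: torch.Tensor) -> torch.Tensor:
        b, t, c, h, w = depth_seq.shape                  # (B, T, 1, H, W) depth maps
        feats = self.cnn(depth_seq.reshape(b * t, c, h, w)).reshape(b, t, -1)
        out, _ = self.lstm(feats)
        return self.head(out[:, -1])                     # prediction at the last step

print(CnnLstmNavigator()(torch.randn(2, 5, 1, 96, 96)).shape)  # torch.Size([2, 1])
```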

24 pages, 4205 KB  
Article
Data Fusion Method for Multi-Sensor Internet of Things Systems Including Data Imputation
by Saugat Sharma, Grzegorz Chmaj and Henry Selvaraj
IoT 2026, 7(1), 11; https://doi.org/10.3390/iot7010011 - 26 Jan 2026
Abstract
In Internet of Things (IoT) systems, data collected by geographically distributed sensors is often incomplete due to device failures, harsh deployment conditions, energy constraints, and unreliable communication. Such data gaps can significantly degrade downstream data processing and decision-making, particularly when failures result in the loss of all locally redundant sensors. Conventional imputation approaches typically rely on historical trends or multi-sensor fusion within the same target environment; however, historical methods struggle to capture emerging patterns, while same-location fusion remains vulnerable to single-point failures when local redundancy is unavailable. This article proposes a correlation-aware, cross-location data fusion framework for data imputation in IoT networks that explicitly addresses single-point failure scenarios. Instead of relying on co-located sensors, the framework selectively fuses semantically similar features from independent and geographically distributed gateways, using summary-statistics-based and correlation-based screening to minimize communication overhead. The resulting fused dataset is then processed using a lightweight k-nearest neighbors (KNN) imputation with iterative PCA, which combines local neighborhood similarity with global covariance structure to generate synthetic data for missing values. The proposed framework is evaluated using real-world weather station data collected from eight geographically diverse locations across the United States. The experimental results show that the proposed approach achieves improved or comparable imputation accuracy relative to conventional same-location fusion methods when sufficient cross-location feature correlation exists and degrades gracefully when correlation is weak. By enabling data recovery without requiring redundant local sensors, the proposed approach provides a resource-efficient and failure-resilient solution for handling missing data in IoT systems. Full article
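
The imputation stage can be approximated as: seed missing cells with a KNN fill, then iterate a low-rank PCA reconstruction that overwrites only the missing entries. The sketch below follows that reading of the abstract on synthetic low-rank data; it is not the authors' implementation.

```python
# Hedged sketch of "KNN + iterative PCA" imputation on synthetic low-rank data.
import numpy as np
from sklearn.impute import KNNImputer
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
X_true = rng.normal(size=(200, 3)) @ rng.normal(size=(3, 8))   # rank-3 data
X = X_true.copy()
mask = rng.random(X.shape) < 0.15                              # 15% missing entries
X[mask] = np.nan

X_hat = KNNImputer(n_neighbors=5).fit_transform(X)             # initial KNN guess
for _ in range(20):                                            # iterative PCA refinement
    pca = PCA(n_components=3).fit(X_hat)
    X_low = pca.inverse_transform(pca.transform(X_hat))
    X_hat[mask] = X_low[mask]                                  # overwrite only missing cells

rmse = np.sqrt(np.mean((X_hat[mask] - X_true[mask]) ** 2))
print(f"imputation RMSE on missing cells: {rmse:.3f}")
```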

19 pages, 4306 KB  
Article
Sparse Reconstruction of Pressure Field for Wedge Passive Fluidic Thrust Vectoring Nozzle
by Zi Huang, Yunsong Gu, Qiuhui Xu and Linkai Li
Sensors 2026, 26(3), 811; https://doi.org/10.3390/s26030811 - 26 Jan 2026
Abstract
Fluidic thrust vectoring control (FTVC) enables highly agile flight without the mechanical complexity of traditional vectoring nozzles. However, a robust onboard identification of the jet deflection state remains challenging when only limited measurements are available. This study proposes a sparse reconstruction of the pressure field method for a wedge passive FTVC nozzle and validates the approach experimentally on a low-speed jet platform. By combining the proper orthogonal decomposition (POD) algorithm with an l1-regularized compressed sensing method, a full Coanda wall pressure distribution is reconstructed from the sparse measurements. A genetic algorithm is then employed to optimize the wall pressure tap locations, identifying an optimal layout. With only four pressure taps, the local pressure coefficient errors were maintained within |ΔCp| < 0.02. In contrast, conventional Kriging interpolation requires increasing the sensor count to 13 to approach the reconstruction level of the proposed POD–compressed sensing method using 4 sensors, yet still exhibits a reduced fidelity in capturing key flow structure characteristics. Overall, the proposed approach provides an efficient and physically interpretable strategy for pressure field estimation, supporting lightweight, low-maintenance, and precise fluidic thrust vectoring control. Full article
(This article belongs to the Topic Advanced Engines Technologies)
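
The reconstruction pipeline — a POD basis learned from pressure snapshots plus an l1-regularized fit of modal coefficients to a few tap readings — can be sketched as below. The snapshot data and tap positions are synthetic, and the genetic-algorithm optimization of tap locations from the paper is omitted.

```python
# Conceptual sketch of POD + l1-regularized reconstruction from sparse taps.
# Synthetic data; tap placement is arbitrary here (the paper optimizes it with a GA).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(7)
n_points, n_snaps = 120, 300
modes_true = np.stack([np.sin((k + 1) * np.linspace(0, np.pi, n_points)) for k in range(5)])
snapshots = rng.normal(size=(n_snaps, 5)) @ modes_true          # training pressure fields

# POD basis from the snapshot matrix (rows = snapshots).
_, _, vt = np.linalg.svd(snapshots, full_matrices=False)
phi = vt[:10]                                                   # keep 10 POD modes

taps = [10, 45, 80, 115]                                        # 4 assumed tap locations
field_true = rng.normal(size=5) @ modes_true                    # unseen test field
y = field_true[taps]                                            # sparse measurements

# Solve y ~= phi[:, taps].T @ a with an l1 penalty on the modal coefficients a.
lasso = Lasso(alpha=1e-3, fit_intercept=False).fit(phi[:, taps].T, y)
field_rec = lasso.coef_ @ phi
print(f"max |reconstruction error|: {np.max(np.abs(field_rec - field_true)):.3f}")
```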
