Search Results (297)

Search Parameters:
Keywords = IMU–GNSS

54 pages, 8516 KB  
Review
Interdisciplinary Applications of LiDAR in Forest Studies: Advances in Sensors, Methods, and Cross-Domain Metrics
by Nadeem Fareed, Carlos Alberto Silva, Izaya Numata and Joao Paulo Flores
Remote Sens. 2026, 18(2), 219; https://doi.org/10.3390/rs18020219 - 9 Jan 2026
Viewed by 385
Abstract
Over the past two decades, Light Detection and Ranging (LiDAR) technology has evolved from early National Aeronautics and Space Administration (NASA)-led airborne laser altimetry into commercially mature systems that now underpin vegetation remote sensing across scales. Continuous advancements in laser engineering, signal processing, and complementary technologies such as Inertial Measurement Units (IMUs) and Global Navigation Satellite Systems (GNSS) have yielded compact, cost-effective, and highly sophisticated LiDAR sensors. Concurrently, innovations in carrier platforms, including uncrewed aerial systems (UAS), mobile laser scanning (MLS), and Simultaneous Localization and Mapping (SLAM) frameworks, have expanded LiDAR’s observational capacity from plot- to global-scale applications in forestry, precision agriculture, ecological monitoring, Above Ground Biomass (AGB) modeling, and wildfire science. This review synthesizes LiDAR’s cross-domain capabilities across three themes: (a) quantifying vegetation structure, function, and compositional dynamics; (b) recent sensor developments encompassing discrete-return ALS (ALSD), full-waveform ALS (ALSFW), photon-counting LiDAR (PCL), emerging multispectral LiDAR (MSL), and hyperspectral LiDAR (HSL) systems; and (c) state-of-the-art data processing and fusion workflows integrating optical and radar datasets. The synthesis demonstrates that many LiDAR-derived vegetation metrics are inherently transferable across domains when interpreted within a unified structural framework. The review further highlights the growing role of artificial-intelligence (AI)-driven approaches for segmentation, classification, and multitemporal analysis, enabling scalable assessments of vegetation dynamics at unprecedented spatial and temporal extents. By consolidating historical developments, current methodological advances, and emerging research directions, this review establishes a comprehensive state-of-the-art perspective on LiDAR’s transformative role and future potential in monitoring and modeling Earth’s vegetated ecosystems. Full article
(This article belongs to the Special Issue Digital Modeling for Sustainable Forest Management)

10 pages, 2505 KB  
Proceeding Paper
Flight Test Performance Assessment of a Machine-Learning Software-Enhanced Inertial Navigation System
by Matthew Starkey, Carl Sequeira, Conrad Rider, Gabriel Furse and Dylan Palmer-Jorge
Eng. Proc. 2025, 88(1), 79; https://doi.org/10.3390/engproc2025088079 - 6 Jan 2026
Viewed by 148
Abstract
In this paper, Flare Bright presents flight test results gathered using a fixed-wing drone with a ~2 m wingspan to demonstrate the capability achieved with an Inertial Navigation System (INS) augmented by machine-learning-tuned software. INSs, built on Inertial Measurement Units (IMUs), are invaluable for position estimation in GNSS-compromised environments because no external information is required. However, with no absolute measurement of the vehicle’s position or attitude, INSs suffer from significant drift over time. The results from a robust flight test programme, covering multiple vehicles, terrains, and flight paths, show how Flare Bright combined a low-cost, low-SWaP (space, weight and power) IMU with its patent-pending software-only techniques to boost INS performance to the point of outperforming a ‘tactical grade’ IMU over ~20 min of flight. These results credibly demonstrate the value of Flare Bright’s solution as an effective, low-cost, and low-weight INS for extended flight operations of small uncrewed aerial systems in GNSS-compromised environments, with performance comparable to heavier, more expensive high-end IMUs. Full article
(This article belongs to the Proceedings of European Navigation Conference 2024)

26 pages, 3302 KB  
Article
An Autonomous Land Vehicle Navigation System Based on a Wheel-Mounted IMU
by Shuang Du, Wei Sun, Xin Wang, Yuyang Zhang, Yongxin Zhang and Qihang Li
Sensors 2026, 26(1), 328; https://doi.org/10.3390/s26010328 - 4 Jan 2026
Viewed by 383
Abstract
Drift-induced navigation errors in inertial systems built on low-cost sensors are among the main challenges for land vehicle navigation in Global Navigation Satellite System (GNSS)-denied environments. In this paper, we propose an autonomous navigation strategy with a wheel-mounted microelectromechanical system (MEMS) inertial measurement unit (IMU), referred to as the wheeled inertial navigation system (INS), to effectively suppress these drifting navigation errors. The position, velocity, and attitude (PVA) of the vehicle are predicted through the inertial mechanization algorithm, while gyro outputs are used to derive the vehicle’s forward velocity, which is treated as an observation together with non-holonomic constraints (NHCs) to estimate the inertial navigation error states. To establish a theoretical foundation for wheeled INS error characteristics, a comprehensive system observability analysis is conducted from an analytical point of view. The wheel rotation significantly improves the observability of gyro errors perpendicular to the rotation axis, which effectively suppresses azimuth, horizontal velocity, and position errors. This leads to the superior navigation performance of the wheeled INS over the traditional odometer (OD)/NHC/INS. Moreover, a hybrid extended particle filter (EPF), which fuses the extended Kalman filter (EKF) and the particle filter (PF), is proposed to update the vehicle’s navigation states. It offers two advantages: (1) it handles the system’s non-linearity and non-Gaussian noise, and (2) it simultaneously achieves high estimation accuracy and tolerable computational complexity. Kinematic field test results indicate that the proposed wheeled INS is able to provide an accurate navigation solution in GNSS-denied environments. Over a total travelled distance of more than 26 km, the maximum position drift rate is only 0.47% and the root mean square (RMS) of the heading error is 1.13°. Full article
(This article belongs to the Section Navigation and Positioning)
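The key observation in a wheel-mounted setup is that the gyro rate about the wheel's spin axis, scaled by the wheel radius, approximates the vehicle's forward speed, which can then be combined with the non-holonomic constraints (near-zero lateral and vertical body velocity) as a pseudo-measurement. The sketch below illustrates only this measurement construction under assumed axis conventions and an illustrative wheel radius; it is not the authors' filter, and all names are hypothetical.

```python
import numpy as np

def wheel_forward_velocity(gyro_rates_body, wheel_radius):
    """Approximate forward speed from a wheel-mounted IMU.

    gyro_rates_body : (3,) gyro output in rad/s, in the wheel frame; the
    component about the wheel's spin axis (taken here as x) times the
    wheel radius gives the vehicle's forward speed.
    """
    spin_rate = gyro_rates_body[0]          # rad/s about the wheel axle
    return spin_rate * wheel_radius         # m/s along the vehicle body x-axis

def nhc_observation(v_forward):
    """Non-holonomic constraint: lateral and vertical body velocities ~ 0."""
    return np.array([v_forward, 0.0, 0.0])  # pseudo-measurement of body velocity

# Example: 10 rad/s wheel spin with a 0.3 m wheel radius -> ~3 m/s forward speed
z = nhc_observation(wheel_forward_velocity(np.array([10.0, 0.02, -0.01]), 0.3))
print(z)
```

In an EKF/PF update, such a pseudo-measurement would be differenced against the INS-predicted body-frame velocity to estimate the inertial error states, in the spirit of the abstract's observation model.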

15 pages, 1584 KB  
Article
Curvature-Constrained Motion Planning Method for Differential-Drive Mobile Robot Platforms
by Rudolf Krecht and Áron Ballagi
Appl. Sci. 2026, 16(1), 322; https://doi.org/10.3390/app16010322 - 28 Dec 2025
Viewed by 320
Abstract
Compact heavy-duty skid-steer robots are increasingly used for city logistics and intralogistics tasks where high payload capacity and stability are required. However, their limited maneuverability and non-negligible turning radius challenge conventional waypoint-tracking controllers that assume unconstrained motion. This paper proposes a curvature-constrained trajectory planning and control framework that guarantees geometrically feasible motion for such platforms. The controller integrates an explicit curvature limit into a finite-state machine, ensuring smooth heading transitions without in-place rotation. The overall architecture integrates GNSS-RTK and IMU localization, modular ROS 2 nodes for trajectory execution, and a supervisory interface developed in Foxglove Studio for intuitive mission planning. Field trials on a custom four-wheel-drive skid-steer platform demonstrate centimeter-scale waypoint accuracy on straight and curved trajectories, with stable curvature compliance across all tested scenarios. The proposed method achieves the smoothness required by most applications while maintaining the computational simplicity of geometric followers. Computational simplicity is reflected in the absence of online optimization or trajectory reparameterization; the controller executes a constant-time geometric update per cycle, independent of waypoint count. The results confirm that curvature-aware control enables reliable navigation of compact heavy-duty robots in semi-structured outdoor environments and provides a practical foundation for future extensions. Full article
(This article belongs to the Special Issue Sustainable Mobility and Transportation (SMTS 2025))
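As a rough illustration of curvature-constrained waypoint following, the sketch below clamps a pure-pursuit-style curvature command to a platform limit and maps it to differential wheel speeds. This is a generic geometric follower under assumed parameters (kappa_max, track_width), not the paper's finite-state-machine controller.

```python
import numpy as np

def curvature_limited_command(x, y, yaw, wx, wy, v, kappa_max, track_width):
    """Steer toward waypoint (wx, wy) with the commanded curvature clamped
    to the platform limit kappa_max (pure-pursuit-style geometry)."""
    dx, dy = wx - x, wy - y
    # Heading error to the waypoint, wrapped to [-pi, pi]
    heading_err = np.arctan2(dy, dx) - yaw
    heading_err = np.arctan2(np.sin(heading_err), np.cos(heading_err))
    lookahead = np.hypot(dx, dy)
    kappa = 2.0 * np.sin(heading_err) / max(lookahead, 1e-6)  # pure-pursuit curvature
    kappa = np.clip(kappa, -kappa_max, kappa_max)             # enforce feasibility
    omega = v * kappa                                         # yaw-rate command
    # Differential-drive wheel speeds (left, right)
    v_l = v - 0.5 * omega * track_width
    v_r = v + 0.5 * omega * track_width
    return v_l, v_r

print(curvature_limited_command(0, 0, 0, 5, 2, 1.0, kappa_max=0.5, track_width=0.6))
```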

26 pages, 4779 KB  
Article
MF-IEKF: A Multiplicative Federated Invariant Extended Kalman Filter for INS/GNSS
by Lebin Zhao, Tao Chen, Peipei Yuan, Xiaoyang Li and Yang Luo
Sensors 2026, 26(1), 127; https://doi.org/10.3390/s26010127 - 24 Dec 2025
Viewed by 452
Abstract
The integration of an inertial navigation system (INS) with the Global Navigation Satellite System (GNSS) is crucial for suppressing the error drift of the INS. However, traditional fusion methods based on the extended Kalman filter (EKF) suffer from geometric inconsistency, leading to biased estimates—a problem markedly exacerbated under large initial misalignment angles. The invariant extended Kalman filter (IEKF) embeds the state in the Lie group SE2(3) to establish a more consistent framework, yet two limitations remain. First, its standard update fails to synergize complementary error information within the left-invariant formulation, capping estimation accuracy. Second, velocity and position states converge slowly under extreme misalignment. To address these issues, a multiplicative federated IEKF (MF-IEKF) was proposed. A geometrically consistent state propagation model on SE2(3) is derived from multiplicative IMU pre-integration. Two parallel, mutually inverse left-invariant error sub-filters (ML1-IEKF and ML2-IEKF) cooperate to improve overall accuracy. For large-misalignment scenarios, a short-term multiplicative right-invariant sub-filter is introduced to suppress initial position and velocity errors. Extensive Monte Carlo simulations and KITTI dataset experiments show that MF-IEKF achieves higher navigation accuracy and robustness than ML1-IEKF. Full article
(This article belongs to the Section Intelligent Sensors)
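For readers unfamiliar with the SE2(3) extended-pose representation mentioned above, the sketch below embeds attitude, velocity, and position in a single 5x5 group element and performs one generic first-order strapdown propagation step. It is a simplified illustration of group-based state propagation, not the paper's multiplicative pre-integration or federated filter; all numerical values are placeholders.

```python
import numpy as np

def skew(w):
    return np.array([[0, -w[2], w[1]],
                     [w[2], 0, -w[0]],
                     [-w[1], w[0], 0]])

def se23_embed(R, v, p):
    """Embed (R, v, p) as a 5x5 SE_2(3) matrix."""
    X = np.eye(5)
    X[:3, :3] = R
    X[:3, 3] = v
    X[:3, 4] = p
    return X

def propagate(R, v, p, gyro, accel, g, dt):
    """One strapdown step: rotate, then integrate velocity and position."""
    dR = np.eye(3) + skew(gyro * dt)        # first-order exponential map
    a_nav = R @ accel + g                   # specific force rotated to the nav frame
    R_new = R @ dR
    v_new = v + a_nav * dt
    p_new = p + v * dt + 0.5 * a_nav * dt**2
    return R_new, v_new, p_new

R, v, p = np.eye(3), np.zeros(3), np.zeros(3)
g = np.array([0.0, 0.0, -9.81])
R, v, p = propagate(R, v, p, gyro=np.array([0.0, 0.0, 0.01]),
                    accel=np.array([0.1, 0.0, 9.81]), g=g, dt=0.01)
print(se23_embed(R, v, p))
```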

24 pages, 4196 KB  
Article
Real-Time Cooperative Path Planning and Collision Avoidance for Autonomous Logistics Vehicles Using Reinforcement Learning and Distributed Model Predictive Control
by Mingxin Li, Hui Li, Yunan Yao, Yulei Zhu, Hailong Weng, Huabiao Jin and Taiwei Yang
Machines 2026, 14(1), 27; https://doi.org/10.3390/machines14010027 - 24 Dec 2025
Viewed by 322
Abstract
In industrial environments such as ports and warehouses, autonomous logistics vehicles face significant challenges in coordinating multiple vehicles while ensuring safe and efficient path planning. This study proposes a novel real-time cooperative control framework for autonomous vehicles, combining reinforcement learning (RL) and distributed model predictive control (DMPC). The RL agent dynamically adjusts the optimization weights of the DMPC to adapt to the vehicle’s real-time environment, while the DMPC enables decentralized path planning and collision avoidance. The system leverages multi-source sensor fusion, including GNSS, UWB, IMU, LiDAR, and stereo cameras, to provide accurate state estimations of vehicles. Simulation results demonstrate that the proposed RL-DMPC approach outperforms traditional centralized control strategies in terms of tracking accuracy, collision avoidance, and safety margins. Furthermore, the proposed method significantly improves control smoothness compared to rule-based strategies. This framework is particularly effective in dynamic and constrained industrial settings, offering a robust solution for multi-vehicle coordination with minimal communication delays. The study highlights the potential of combining RL with DMPC to achieve real-time, scalable, and adaptive solutions for autonomous logistics. Full article
(This article belongs to the Special Issue Control and Path Planning for Autonomous Vehicles)
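The coupling described above, where an RL agent retunes the optimization weights that a distributed MPC then uses, can be sketched as follows. The policy, cost terms, and weights here are entirely hypothetical stand-ins (a trained network and a real MPC solver would replace them); the snippet only shows the interface between the two components.

```python
import numpy as np

def policy_weights(obs):
    """Stand-in RL policy: map observations (gap to nearest obstacle,
    path error) to MPC cost weights. A trained network would replace this."""
    gap, path_err = obs
    w_track = 1.0 + 2.0 * path_err          # emphasize tracking when off the path
    w_safety = 5.0 / max(gap, 0.5)          # emphasize safety when obstacles are close
    return {"tracking": w_track, "safety": w_safety}

def stage_cost(state_err, obstacle_dist, weights):
    """One stage of a DMPC objective using the RL-supplied weights."""
    return (weights["tracking"] * float(state_err @ state_err)
            + weights["safety"] / max(obstacle_dist, 1e-3))

w = policy_weights(obs=(1.2, 0.4))
print(w, stage_cost(np.array([0.3, -0.1]), obstacle_dist=1.2, weights=w))
```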

18 pages, 8006 KB  
Article
Optimal Low-Cost MEMS INS/GNSS Integrated Georeferencing Solution for LiDAR Mobile Mapping Applications
by Nasir Al-Shereiqi, Mohammed El-Diasty and Ghazi Al-Rawas
Sensors 2025, 25(24), 7683; https://doi.org/10.3390/s25247683 - 18 Dec 2025
Viewed by 397
Abstract
Mobile mapping systems using LiDAR technology, which integrate several advanced surveying technologies, are becoming a reliable surveying technique for generating accurate point clouds. This research investigated the development of a low-cost, accurate Microelectromechanical System (MEMS)-based INS/GNSS georeferencing system for LiDAR mobile mapping applications. The challenge of using a MEMS IMU is that its measurements are contaminated by high levels of noise and bias instability. To overcome this issue, new denoising and filtering methods were developed using a wavelet neural network (WNN) and an optimal maximum likelihood estimator (MLE) to achieve an accurate MEMS-based INS/GNSS integrated navigation solution for LiDAR mobile mapping. The final accuracy of the navigation solution was then compared against the ASPRS standards for geospatial data production. The proposed WNN denoising method improved the MEMS-based INS/GNSS integration accuracy by approximately 11%, and the optimal MLE method achieved approximately 12% higher accuracy than the forward-only navigation solution without GNSS outages. The proposed WNN denoising also outperforms the current state-of-the-art Long Short-Term Memory Recurrent Neural Network (LSTM-RNN) denoising model. Additionally, depending on the sensor–object distance, the accuracy of the optimal MLE-based MEMS INS/GNSS navigation solution with WNN denoising ranged from 1 to 3 cm for ground mapping and from 1 to 9 cm for building mapping, which fulfills the ASPRS standards of classes 1 to 3 and classes 1 to 9 for the ground and building mapping cases, respectively. Full article
(This article belongs to the Section Industrial Sensors)
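As a simplified stand-in for the denoising stage, the sketch below applies plain soft-threshold wavelet denoising to a single IMU channel using PyWavelets. The paper's wavelet neural network (WNN) learns this mapping rather than applying a fixed universal threshold, so this is only an illustration of the wavelet-domain idea, with an illustrative signal and parameters.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Soft-threshold wavelet denoising of a 1-D IMU channel."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Universal threshold estimated from the finest-scale detail coefficients
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

# Example: a noisy gyro channel sampled at 100 Hz
t = np.linspace(0, 10, 1000)
gyro = 0.02 * np.sin(0.5 * t) + 0.01 * np.random.randn(t.size)
print(wavelet_denoise(gyro)[:5])
```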

22 pages, 3132 KB  
Article
A Study on a Low-Cost IMU/Doppler Integrated Velocity Estimation Method Under Insufficient GNSS Observation Conditions
by Yinggang Wang, Hongli Zhang, Kemeng Li, Hanghang Xu and Yijin Chen
Sensors 2025, 25(24), 7674; https://doi.org/10.3390/s25247674 - 18 Dec 2025
Viewed by 517
Abstract
The Global Navigation Satellite System (GNSS)/Inertial Measurement Unit (IMU) Loosely Coupled (LC) integration framework has been widely adopted due to its simple structure, but it relies on complete GNSS position and velocity solutions, and the rapid accumulation of IMU errors can easily lead to navigation failure when fewer than four satellites are visible. In this paper, GNSS Doppler observations are fused with IMU attitude information within an LC framework. An inter-satellite differential Doppler model is introduced, and the velocity obtained from the differential Doppler solution is transformed into the navigation frame using the IMU-derived attitude, enabling three-dimensional velocity estimation in the navigation frame even when only two satellites are available. Analysis of real vehicle data collected by the GREAT team at Wuhan University shows that the Signal-to-Noise Ratio (SNR) and the geometric relationship between the Satellite Difference Vector (SDV) and the Receiver Motion Direction (RMD) are the dominant factors affecting velocity accuracy. A multi-factor threshold screening strategy further indicates that when SNR > 40 and SDV·RMD > 0.2, the Root Mean Square (RMS) of the velocity error is approximately 0.3 m/s and the data retention rate exceeds 44%, achieving a good balance between accuracy and availability. The results indicate that, while maintaining a simple system structure, the proposed Doppler–IMU fusion method can significantly enhance velocity robustness and positioning continuity within an LC architecture under weak GNSS conditions (when more than two satellites are visible but standalone GNSS positioning is still unavailable), and is suitable for constructing low-cost, highly reliable integrated navigation systems. Full article
(This article belongs to the Section Navigation and Positioning)
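The screening rule quoted above (SNR > 40 and SDV·RMD > 0.2) can be sketched as a simple per-pair test: keep an inter-satellite Doppler difference only when both signals are strong enough and the normalized satellite difference vector is sufficiently aligned with the receiver motion direction. The function below is an assumed formalization of that rule, not the authors' code; vectors and thresholds are illustrative.

```python
import numpy as np

def keep_pair(snr_pair, sat_i_unit, sat_j_unit, motion_dir,
              snr_min=40.0, align_min=0.2):
    """Screen one inter-satellite Doppler difference.

    sat_i_unit, sat_j_unit : unit line-of-sight vectors to the two satellites
    motion_dir             : unit vector of the receiver motion direction (RMD)
    The satellite difference vector (SDV) is the difference of the two
    line-of-sight vectors, normalized before the alignment test.
    """
    if min(snr_pair) <= snr_min:
        return False
    sdv = sat_i_unit - sat_j_unit
    sdv = sdv / np.linalg.norm(sdv)
    return float(np.dot(sdv, motion_dir)) > align_min

los_i = np.array([0.6, 0.3, 0.74]); los_i /= np.linalg.norm(los_i)
los_j = np.array([-0.2, 0.7, 0.69]); los_j /= np.linalg.norm(los_j)
rmd = np.array([1.0, 0.0, 0.0])
print(keep_pair((45.0, 43.0), los_i, los_j, rmd))
```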

24 pages, 7868 KB  
Article
An Indoor UAV Localization Framework with ESKF Tightly-Coupled Fusion and Multi-Epoch UWB Outlier Rejection
by Jianmin Zhao, Zhongliang Deng, Enwen Hu, Wenju Su, Boyang Lou and Yanxu Liu
Sensors 2025, 25(24), 7673; https://doi.org/10.3390/s25247673 - 18 Dec 2025
Viewed by 444
Abstract
Unmanned aerial vehicles (UAVs) are increasingly used indoors for inspection, security, and emergency tasks. Achieving accurate and robust localization under Global Navigation Satellite System (GNSS) unavailability and obstacle occlusions is therefore a critical challenge. Due to inherent physical limitations, Inertial Measurement Unit (IMU)-based localization errors accumulate over time; Ultra-Wideband (UWB) measurements suffer from systematic biases in Non-Line-of-Sight (NLOS) environments; and Visual–Inertial Odometry (VIO) depends heavily on environmental features, making it susceptible to long-term drift. We propose a tightly coupled fusion framework based on the Error-State Kalman Filter (ESKF). Using an IMU motion model for prediction, the method incorporates raw UWB ranges, VIO relative poses, and TFmini altitude measurements in the update step. To suppress abnormal UWB measurements, a multi-epoch outlier rejection method constrained by VIO is developed, which robustly eliminates NLOS range measurements and effectively mitigates the influence of outliers on the observation updates. This framework improves both observation quality and fusion stability. We validate the proposed method on a real-world platform in an underground parking garage. Experimental results demonstrate that, in complex indoor environments, the proposed approach exhibits significant advantages over existing algorithms, achieving higher localization accuracy and robustness while effectively suppressing UWB NLOS errors as well as IMU and VIO drift. Full article
(This article belongs to the Section Navigation and Positioning)
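A minimal sketch of the VIO-constrained, multi-epoch rejection idea: over a short window, compare each raw UWB range with the range predicted from VIO positions and a known anchor location, and drop ranges (or the whole window for that anchor) when residuals are persistently large, as expected under NLOS. The thresholds, window length, and anchor geometry below are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def reject_uwb_outliers(uwb_ranges, vio_positions, anchor_pos,
                        resid_thresh=0.5, min_ok_ratio=0.6):
    """Multi-epoch screening of UWB ranges against VIO-predicted ranges.

    uwb_ranges    : (N,) measured ranges to one anchor over N recent epochs
    vio_positions : (N, 3) VIO-estimated positions at the same epochs
    Returns a boolean mask of accepted epochs, or all-False when too few
    epochs in the window are consistent (suspected NLOS anchor).
    """
    predicted = np.linalg.norm(vio_positions - anchor_pos, axis=1)
    residuals = np.abs(uwb_ranges - predicted)
    ok = residuals < resid_thresh
    if ok.mean() < min_ok_ratio:          # anchor persistently inconsistent
        return np.zeros_like(ok, dtype=bool)
    return ok

anchor = np.array([5.0, 0.0, 2.5])
vio = np.array([[0, 0, 0], [0.2, 0, 0], [0.4, 0, 0]], dtype=float)
meas = np.array([5.6, 6.9, 5.5])          # middle epoch looks like an NLOS hit
print(reject_uwb_outliers(meas, vio, anchor))
```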

21 pages, 2192 KB  
Article
Development, Implementation and Experimental Assessment of Path-Following Controllers on a 1:5 Scale Vehicle Testbed
by Luca Biondo, Angelo Domenico Vella and Alessandro Vigliani
Machines 2025, 13(12), 1116; https://doi.org/10.3390/machines13121116 - 3 Dec 2025
Viewed by 461
Abstract
The development of control strategies for autonomous vehicles requires a reliable and cost-effective validation approach. In this context, testbeds enabling repeatable experiments under controlled conditions are gaining relevance. Scaled vehicles have proven to be a valuable alternative to full-scale or simulation-based testing, enabling experimental validation while reducing costs and risks. This work presents a 1:5 scale modular vehicle platform, derived from a commercial Radio-Controlled (RC) vehicle and adapted as an experimental testbed for control strategy validation and vehicle dynamics studies. The vehicle features an electric powertrain operated through a Speedgoat Baseline Real-Time Target Machine (SBRTM). The hardware architecture includes a high-performance Inertial Measurement Unit (IMU) with an embedded Global Navigation Satellite System (GNSS) receiver. An Extended Kalman Filter (EKF) is implemented to enhance positioning accuracy by fusing inertial and GNSS data, providing reliable estimates of the vehicle's position, velocity, and orientation. Two path-following algorithms, the Stanley Controller (SC) and the Linear Quadratic Regulator (LQR), are designed and integrated. Outdoor experimental tests enable the evaluation of tracking accuracy and robustness. The results demonstrate that the proposed scaled testbed constitutes a reliable and flexible platform for benchmarking autonomous vehicle controllers and enabling experimental testing. Full article
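One of the two integrated controllers, the Stanley Controller, admits a compact closed form: the steering command is the heading error plus the arctangent of a gain times the cross-track error divided by speed. The sketch below shows that law with an assumed gain, softening constant, and steering limit; the LQR counterpart would instead solve a Riccati equation offline.

```python
import numpy as np

def stanley_steering(heading_err, cross_track_err, speed, k=1.0, eps=0.1):
    """Stanley control law: combine heading error with a cross-track term.

    heading_err     : path heading minus vehicle yaw, in radians
    cross_track_err : signed lateral offset from the path, in meters
    eps             : softening constant to avoid division by zero at low speed
    """
    steer = heading_err + np.arctan2(k * cross_track_err, speed + eps)
    return float(np.clip(steer, -0.5, 0.5))   # clip to a +/- 0.5 rad steering range

print(stanley_steering(heading_err=0.05, cross_track_err=0.2, speed=2.0))
```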

28 pages, 82399 KB  
Article
Assessment of Smartphone GNSS Measurements in Tightly Coupled Visual Inertial Navigation
by Mehmet Fikret Ocal, Murat Durmaz, Engin Tunali and Hasan Yildiz
Appl. Sci. 2025, 15(23), 12796; https://doi.org/10.3390/app152312796 - 3 Dec 2025
Viewed by 2201
Abstract
Precise, seamless, and high-rate navigation remains a major challenge, particularly when relying on low-cost sensors. With the decreasing cost of cameras, Inertial Measurement Units (IMUs), and Global Navigation Satellite System (GNSS) receivers, tightly coupled fusion frameworks, such as GVINS, have gained considerable attention. GVINS is an optimization-based factor-graph framework that integrates visual and inertial measurements with single-frequency GNSS code pseudorange observations to provide robust and drift-free navigation. This study evaluated the potential of applying GVINS to low-cost, low-power, single-frequency GNSS receivers, particularly those embedded in smartphones, by integrating 1 Hz GNSS measurements collected in three challenging urban scenarios into the GVINS framework to produce seamless 10 Hz positioning estimates. The experiments were conducted using an Xsens MTi-1 IMU and global-shutter (GS) cameras, as well as a Samsung A51 smartphone and a u-blox ZED-F9P GNSS receiver. GVINS was modified to process 1 Hz GNSS measurements. Differential corrections from a nearby GNSS reference station were also incorporated to assess their impact on optimization-based filters such as GVINS. The performance of the GVINS and Differential GVINS (D-GVINS) solutions using smartphone measurements was compared against standard point positioning (SPP) and differential GPS (DGPS) results obtained from the same smartphone GNSS receiver, as well as the GVINS solution derived from u-blox ZED-F9P measurements sampled at 1 Hz. Experimental results show that GVINS operates effectively with smartphone GNSS measurements, reducing 3D RMS errors by 80.4%, 64.9%, and 83.8% for the sports field, campus-walking, and campus-driving datasets, respectively, when differential corrections are applied relative to the SPP solution. These results highlight the potential of smartphone GNSS receivers within the GVINS framework: even though they track fewer constellations and satellites and receive lower-quality signals, they can still achieve performance comparable to that of a relatively higher-end dual-frequency GNSS receiver, the u-blox ZED-F9P. Further studies will focus on adapting the GVINS algorithm to run directly on smartphones to utilize all the available measurements, including the camera, IMU, barometer, magnetometer, and additional ranging sensors. Full article
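The differential-correction step mentioned above can be sketched per satellite: the reference station's measured pseudorange minus the geometric range to its known position yields a correction (lumping orbit, clock, and atmospheric errors plus the station clock) that is subtracted from the rover's raw pseudorange before it is used. The snippet below is a simplified illustration with made-up coordinates, not the modified GVINS pipeline.

```python
import numpy as np

def differential_correction(ref_pos, sat_pos, ref_pseudorange):
    """Per-satellite correction from a reference station at a known position.
    The correction lumps satellite orbit, clock, and atmospheric errors
    together with the station clock offset."""
    geometric = np.linalg.norm(sat_pos - ref_pos)
    return ref_pseudorange - geometric

def correct_rover_pseudorange(rover_pseudorange, correction):
    """Apply the station-derived correction to the rover's raw pseudorange."""
    return rover_pseudorange - correction

sat = np.array([15600e3, 7540e3, 20140e3])      # illustrative ECEF coordinates, m
station = np.array([4027e3, 307e3, 4919e3])
corr = differential_correction(station, sat, ref_pseudorange=2.0419e7)
print(corr, correct_rover_pseudorange(2.0421e7, corr))
```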

30 pages, 7942 KB  
Article
Research on Agricultural Autonomous Positioning and Navigation System Based on LIO-SAM and Apriltag Fusion
by Xianping Guan, Hongrui Ge, Shicheng Nie and Yuhan Ding
Agronomy 2025, 15(12), 2731; https://doi.org/10.3390/agronomy15122731 - 27 Nov 2025
Viewed by 891
Abstract
The application of autonomous navigation in intelligent agriculture is becoming increasingly widespread. Traditional navigation schemes in greenhouses, orchards, and other agricultural environments often struggle with uneven illumination, complex layouts, highly repetitive and similar structures, and poor GNSS (Global Navigation Satellite System) signal reception. To address this problem, this paper proposes a new tightly coupled LiDAR (Light Detection and Ranging) inertial odometry SLAM (LIO-SAM) framework named April-LIO-SAM. The framework uses Apriltag, a two-dimensional fiducial marker widely used for precise positioning, pose estimation, and scene recognition, as a global positioning beacon that replaces GNSS and provides absolute pose observations. The system uses a three-dimensional LiDAR (VLP-16) and an IMU (inertial measurement unit) to collect environmental data and uses Apriltag markers as absolute coordinates instead of GNSS, solving the problem of unreliable GNSS signal reception in greenhouses, orchards, and other agricultural environments. The SLAM trajectories and navigation performance were validated in purpose-built greenhouse and orchard environments. The experimental results show that the navigation map developed by April-LIO-SAM yields a root mean square error of 0.057 m, and the average positioning errors are 0.041 m, 0.049 m, 0.056 m, and 0.070 m when the Apriltag spacing is 3 m, 5 m, and 7 m. The navigation results indicate that, at speeds of 0.4, 0.3, and 0.2 m/s, the average lateral deviation is less than 0.053 m, with a standard deviation below 0.034 m, and the average heading deviation is less than 2.3°, with a standard deviation below 1.6°. Positioning stability experiments under interference conditions such as varying illumination and occlusion verified that the system maintains good stability under complex external conditions, with positioning error fluctuations within 3.0 mm. The results confirm that the positioning and navigation accuracy of the mobile robot meets the continuity requirements of facility-agriculture operation. Full article
(This article belongs to the Special Issue Research Progress in Agricultural Robots in Arable Farming)
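The core substitution, using a surveyed Apriltag in place of GNSS as an absolute reference, amounts to composing the tag's known map pose with the inverse of the tag pose observed from the robot. The 2-D sketch below illustrates that composition with hypothetical poses; the actual system performs this in 3-D inside the tightly coupled LIO-SAM factor graph.

```python
import numpy as np

def se2(x, y, theta):
    """Homogeneous 2-D pose (rotation + translation) as a 3x3 matrix."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x], [s, c, y], [0, 0, 1]])

def robot_pose_from_tag(T_map_tag, T_robot_tag):
    """Absolute robot pose in the map frame from one Apriltag detection.

    T_map_tag   : known pose of the tag in the map (surveyed in advance)
    T_robot_tag : tag pose observed in the robot/camera frame
    T_map_robot = T_map_tag @ inv(T_robot_tag)
    """
    return T_map_tag @ np.linalg.inv(T_robot_tag)

T_map_tag = se2(10.0, 4.0, np.pi / 2)         # surveyed tag location
T_robot_tag = se2(2.0, 0.5, np.pi / 2 - 0.1)  # detection from the camera
T_map_robot = robot_pose_from_tag(T_map_tag, T_robot_tag)
print(T_map_robot[:2, 2], np.arctan2(T_map_robot[1, 0], T_map_robot[0, 0]))
```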

37 pages, 2062 KB  
Article
Neural Networks for Estimating Attitude, Line of Sight, and GNSS Ambiguity Through Onboard Sensor Fusion
by Raul de Celis and Luis Cadarso
Sensors 2025, 25(23), 7212; https://doi.org/10.3390/s25237212 - 26 Nov 2025
Viewed by 752
Abstract
Accurate estimation of attitude, line of sight (LOS), and carrier-phase ambiguity is essential for the performance of Guidance, Navigation, and Control (GNC) systems operating under highly dynamic and uncertain conditions. Traditional sensor fusion and filtering methods, although effective, often require precise modeling and high-grade sensors to maintain robustness. This paper investigates a deep learning-based estimation framework for attitude, LOS, and GNSS ambiguity through the fusion of onboard sensors—GNSS, IMU, and semi-active laser (SAL)—and remote sensing information. Two neural network estimators are developed to address the most critical components of the navigation chain: GNSS carrier-phase ambiguity and gravity-vector reconstruction in the body frame, which are integrated into a hybrid guidance and navigation scheme for attitude and LOS determination. These learning-based estimators capture nonlinear relationships between sensor measurements and physical states, improving generalization under degraded conditions. The proposed system is validated in a six-degree-of-freedom (6-DoF) simulation environment that includes full aerodynamic modeling of artillery guided rockets. Comparative analyses demonstrate that the learning-based ambiguity and gravity estimators reduce overall latency, enhance estimation accuracy, and improve guidance precision compared to conventional networks. The results suggest that deep learning-based sensor fusion can serve as a practical foundation for next-generation low-cost GNC systems, enabling precise and reliable operation in scenarios with limited observability or sensor degradation. Full article
(This article belongs to the Section Remote Sensors)
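As a shape-only illustration of the gravity-vector estimator described above, the sketch below defines a tiny feed-forward regressor that maps a fused sensor feature vector to a body-frame gravity direction scaled to the local gravity magnitude. The architecture, feature layout, and weights are placeholders, not the trained networks from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

class GravityMLP:
    """Tiny feed-forward regressor: sensor features -> body-frame gravity.
    Weights are random placeholders standing in for a trained network."""
    def __init__(self, n_in=9, n_hidden=32):
        self.W1 = rng.normal(0, 0.1, (n_hidden, n_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.1, (3, n_hidden))
        self.b2 = np.zeros(3)

    def __call__(self, features, g_mag=9.81):
        h = np.tanh(self.W1 @ features + self.b1)
        g_dir = self.W2 @ h + self.b2
        g_dir = g_dir / np.linalg.norm(g_dir)   # unit direction in the body frame
        return g_mag * g_dir                    # scale to the local gravity magnitude

# Features could stack accelerometer, gyro, and GNSS-derived velocity terms.
features = rng.normal(size=9)
print(GravityMLP()(features))
```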

23 pages, 4676 KB  
Article
A Study on a High-Precision 3D Position Estimation Technique Using Only an IMU in a GNSS Shadow Zone
by Yanyun Ding, Yunsik Kim and Hunkee Kim
Sensors 2025, 25(23), 7133; https://doi.org/10.3390/s25237133 - 22 Nov 2025
Viewed by 710
Abstract
In Global Navigation Satellite System (GNSS)-denied environments, reconstructing three-dimensional trajectories using only an Inertial Measurement Unit (IMU) faces challenges such as heading drift, stride error accumulation, and gait recognition uncertainty. This paper proposes a path estimation method based on a nine-axis inertial sensor that continuously and accurately estimates an agent's path without external support. The method detects stationary states and halts updates to suppress error propagation. During motion, gait modes including flat walking, stair ascent, and stair descent are classified using vertical acceleration with dynamic thresholds. Vertical displacement is estimated by combining the gait pattern and posture angle during stair traversal, while planar displacement is updated through adaptive stride length adjustment based on gait cycle and movement magnitude. Heading is derived from the attitude matrix aligned with magnetic north, enabling projection of displacements onto a unified frame. Experiments show planar errors below 3% for 100 m paths and vertical errors under 2% in stair environments of up to ten stories, with stable heading maintained. Overall, the method achieves reliable gait recognition and continuous three-dimensional trajectory reconstruction at low computational cost, using only a single inertial sensor and no additional devices. Full article
(This article belongs to the Section Navigation and Positioning)
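A stripped-down sketch of the stride-and-heading update underlying such IMU-only methods: freeze updates when the acceleration magnitude is nearly constant (stationary detection), estimate stride length adaptively from step frequency and acceleration amplitude, and advance the position along the current heading, with a vertical increment during stair traversal. The constants and the stride model below are assumptions and are much simpler than the paper's gait classification.

```python
import numpy as np

def is_stationary(accel_window, thresh=0.05):
    """Freeze updates when the acceleration magnitude barely varies."""
    mag = np.linalg.norm(accel_window, axis=1)
    return np.std(mag) < thresh

def stride_length(step_freq, accel_amp, a=0.3, b=0.15, c=0.1):
    """Adaptive stride model: longer strides for faster, stronger steps."""
    return a + b * step_freq + c * np.sqrt(accel_amp)

def step_update(pos, heading, step_freq, accel_amp, dz=0.0):
    """Advance the position by one stride along the current heading."""
    L = stride_length(step_freq, accel_amp)
    return pos + np.array([L * np.cos(heading), L * np.sin(heading), dz])

pos = np.zeros(3)
for _ in range(5):                                  # five flat-walking steps
    pos = step_update(pos, heading=np.deg2rad(30), step_freq=1.8, accel_amp=2.0)
print(is_stationary(np.tile([0.0, 0.0, 9.81], (50, 1)) + 0.001), pos)
```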

24 pages, 3456 KB  
Article
Field Testing of ADAS Technologies in Naturalistic Driving Conditions
by Adam Skokan
Vehicles 2025, 7(4), 135; https://doi.org/10.3390/vehicles7040135 - 21 Nov 2025
Viewed by 513
Abstract
This paper evaluates Advanced Driver Assistance Systems (ADASs) in test scenarios derived from naturalistic driving and crash data, mapped to ISO 26262, ISO/PAS 21448 (SOTIF), and ISO 34502. Of eight high-risk scenarios, the left turn across oncoming traffic is validated on a proving ground using a Škoda Superb iV against a soft Global Vehicle Target. The operational design domain (ODD) and spatiotemporal thresholds are parameterized, and speed/acceleration profiles from GNSS/IMU data are analyzed. Automatic Emergency Braking (AEB) and Forward Collision Warning (FCW) performance varies across nominally identical runs, driven by human-in-the-loop variability and target detectability. In successful interventions, peak deceleration reached −0.64 g, meeting UNECE R152 criteria; in other runs, late detection narrowed the time-to-collision (TTC) below intervention thresholds, leading to contact. Limitations in current protocols are identified, arguing for scenario catalogs with realistic context (weather, surface, masking) and latency-aware metrics. The results motivate extending validation beyond standard tracks toward mixed methods linking simulation, scenario databases, and instrumented field trials. Full article
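Two of the metrics reported above, time-to-collision and peak deceleration in g, reduce to short computations on the logged GNSS/IMU kinematics, as sketched below with illustrative numbers. The UNECE R152 comparison in the abstract is against a required deceleration level; only the metric computation is shown here.

```python
import numpy as np

def time_to_collision(gap_m, closing_speed_mps):
    """TTC = gap / closing speed; infinite when not closing."""
    return np.inf if closing_speed_mps <= 0 else gap_m / closing_speed_mps

def peak_deceleration_g(speeds_mps, dt):
    """Peak deceleration, in g, from a logged GNSS/IMU speed profile."""
    accel = np.gradient(speeds_mps, dt)
    return -accel.min() / 9.81

# Illustrative braking run from ~50 km/h sampled every 0.5 s
speeds = np.array([13.9, 13.9, 13.0, 11.2, 8.9, 6.1, 3.2, 0.8, 0.0])
print(time_to_collision(gap_m=18.0, closing_speed_mps=13.9),
      round(peak_deceleration_g(speeds, dt=0.5), 2))
```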
