Search Results (268)

Search Parameters:
Keywords = autonomous inertial navigation system

26 pages, 3302 KB  
Article
An Autonomous Land Vehicle Navigation System Based on a Wheel-Mounted IMU
by Shuang Du, Wei Sun, Xin Wang, Yuyang Zhang, Yongxin Zhang and Qihang Li
Sensors 2026, 26(1), 328; https://doi.org/10.3390/s26010328 - 4 Jan 2026
Viewed by 206
Abstract
Drift-induced navigation errors in inertial systems built on low-cost sensors are among the main challenges for land vehicle navigation in Global Navigation Satellite System (GNSS)-denied environments. In this paper, we propose an autonomous navigation strategy with a wheel-mounted microelectromechanical system (MEMS) inertial measurement unit (IMU), referred to as the wheeled inertial navigation system (INS), to effectively suppress these drifting navigation errors. The position, velocity, and attitude (PVA) of the vehicle are predicted through the inertial mechanization algorithm, while gyro outputs are utilized to derive the vehicle’s forward velocity, which is treated as an observation with non-holonomic constraints (NHCs) to estimate the inertial navigation error states. To establish a theoretical foundation for wheeled INS error characteristics, a comprehensive system observability analysis is conducted from an analytical point of view. The wheel rotation significantly improves the observability of gyro errors perpendicular to the rotation axis, which effectively suppresses azimuth, horizontal velocity, and position errors. This leads to the superior navigation performance of a wheeled INS over the traditional odometer (OD)/NHC/INS. Moreover, a hybrid extended particle filter (EPF), which fuses the extended Kalman filter (EKF) and the particle filter (PF), is proposed to update the vehicle’s navigation states. It offers two advantages: (1) it handles the system’s non-linearity and non-Gaussian noise, and (2) it achieves a high level of estimation accuracy at a tolerable computational cost. Kinematic field test results indicate that the proposed wheeled INS is able to provide an accurate navigation solution in GNSS-denied environments. Over a total traveled distance of more than 26 km, the maximum position drift rate is only 0.47% and the root mean square (RMS) heading error is 1.13°. Full article
(This article belongs to the Section Navigation and Positioning)
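A minimal sketch of the wheel-speed observation idea described above: with the IMU spinning on the wheel, the gyro axis aligned with the axle measures the wheel rate, so forward speed is roughly ω·r, and that speed together with the zero lateral/vertical velocities of the non-holonomic constraints can drive a standard Kalman-style velocity update. The wheel radius, 9-state error layout, and noise values below are illustrative assumptions, and the paper's hybrid EPF update is not reproduced here.

```python
import numpy as np

def wheel_gyro_velocity(gyro_spin_rate_rad_s, wheel_radius_m):
    """Forward speed inferred from the wheel-mounted gyro's spin-axis rate (v = omega * r)."""
    return gyro_spin_rate_rad_s * wheel_radius_m

def nhc_velocity_update(x, P, v_body_pred, v_forward_meas, R):
    """Kalman-style update using [forward speed, 0, 0] as the body-frame velocity
    observation; the zeros encode the non-holonomic constraints (no lateral/vertical slip).
    x, P        : error state and covariance (velocity errors assumed in components 0..2)
    v_body_pred : INS-predicted body-frame velocity, shape (3,)
    R           : 3x3 measurement noise covariance
    """
    z = np.array([v_forward_meas, 0.0, 0.0])            # observation vector
    H = np.zeros((3, x.size)); H[:, 0:3] = np.eye(3)    # selects the velocity error states
    y = z - v_body_pred                                  # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(x.size) - K @ H) @ P

# toy usage: a 2 rad/s spin on a 0.3 m wheel gives a 0.6 m/s forward-speed observation
v_fwd = wheel_gyro_velocity(2.0, 0.3)
x0, P0 = np.zeros(9), np.eye(9) * 0.1
x1, P1 = nhc_velocity_update(x0, P0, np.array([0.55, 0.02, -0.01]), v_fwd, np.eye(3) * 0.01)
```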

27 pages, 5167 KB  
Article
Autonomous Locomotion and Embedded Trajectory Control in Miniature Robots Using Piezoelectric-Actuated 3D-Printed Resonators
by Byron Ricardo Zapata Chancusig, Jaime Rolando Heredia Velastegui, Víctor Ruiz-Díez and José Luis Sánchez-Rojas
Actuators 2026, 15(1), 23; https://doi.org/10.3390/act15010023 - 1 Jan 2026
Viewed by 295
Abstract
This article presents the design, fabrication, and experimental validation of a centimeter-scale autonomous robot that achieves bidirectional locomotion and trajectory control through 3D-printed resonators actuated by piezoelectricity and integrated with miniature legs. Building on previous works that employed piezoelectric bimorphs, the proposed system replaces them with custom-designed 3D-printed resonant plates that exploit the excitation of standing waves (SW) to generate motion. Each resonator is equipped with strategically positioned passive legs that convert vibratory energy into effective thrust, enabling both linear and rotational movement. A differential drive configuration, implemented through two independently actuated resonators, allows precise guidance and the execution of complex trajectories. The robot integrates onboard control electronics consisting of a microcontroller and inertial sensors, which enable closed-loop trajectory correction via a PD controller and allow autonomous navigation. The experimental results demonstrate high-precision motion control, achieving linear displacement speeds of 8.87 mm/s and a maximum angular velocity of 37.88°/s, while maintaining low power consumption and a compact form factor. Furthermore, the evaluation using the mean absolute error (MAE) yielded a value of 0.83° in trajectory tracking. This work advances the field of robotics and automatic control at the insect scale by integrating efficient piezoelectric actuation, additive manufacturing, and embedded sensing into a single autonomous platform capable of agile and programmable locomotion. Full article
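A minimal sketch of the closed-loop correction described above, assuming a PD law on the inertially measured heading error and a differential mapping onto the two resonators' drive amplitudes; the gains, limits, and function names are illustrative, not the authors' controller.

```python
def pd_heading_correction(heading_err_deg, heading_rate_deg_s, kp=0.05, kd=0.01):
    """PD law on the heading error reported by the onboard inertial sensors."""
    return kp * heading_err_deg + kd * heading_rate_deg_s

def differential_drive(base_amplitude, correction, limit=1.0):
    """Split one correction term into left/right resonator drive amplitudes (clamped)."""
    left = max(0.0, min(limit, base_amplitude - correction))
    right = max(0.0, min(limit, base_amplitude + correction))
    return left, right

# toy usage: the robot points 5 deg left of the commanded heading
u = pd_heading_correction(heading_err_deg=5.0, heading_rate_deg_s=-1.0)
left_amp, right_amp = differential_drive(base_amplitude=0.6, correction=u)
```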

27 pages, 5037 KB  
Article
A TCN-BiLSTM and ANR-IEKF Hybrid Framework for Sustained Vehicle Positioning During GNSS Outages
by Senhao Niu, Jie Li, Chenjun Hu, Junlong Li, Debiao Zhang and Kaiqiang Feng
Sensors 2026, 26(1), 152; https://doi.org/10.3390/s26010152 - 25 Dec 2025
Viewed by 265
Abstract
The performance of integrated Global Navigation Satellite System and Inertial Navigation System (GNSS/INS) navigation often declines in complex urban environments due to frequent GNSS signal blockages. This poses a significant challenge for autonomous driving applications that require continuous and reliable positioning. To address this limitation, this paper presents a novel hybrid framework that combines a deep learning architecture with an adaptive Kalman Filter. At the core of this framework is a Temporal Convolutional Network and Bidirectional Long Short-Term Memory (TCN-BiLSTM) model, which generates accurate pseudo-GNSS measurements from raw INS data during GNSS outages. These measurements are then fused with the INS data stream using an Adaptive Noise-Regulated Iterated Extended Kalman Filter (ANR-IEKF), which enhances robustness by dynamically estimating and adjusting the process and observation noise statistics in real time. The proposed ANR-IEKF + TCN-BiLSTM framework was validated using a real-world vehicle dataset that encompasses both straight-line and turning scenarios. The results demonstrate its superior performance in positioning accuracy and robustness compared to several baseline models, thereby confirming its effectiveness as a reliable solution for maintaining high-precision navigation in GNSS-denied environments. Validated in 70 s GNSS outage environments, our approach enhances positioning accuracy by over 50% against strong deep learning baselines with errors reduced to roughly 3.4 m. Full article
(This article belongs to the Section Navigation and Positioning)
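As a hedged illustration of the adaptive-noise idea mentioned above, the sketch below estimates the observation noise covariance from a sliding window of filter innovations; the window size, the update rule, and the class name are assumptions for illustration and do not reproduce the paper's ANR-IEKF or its TCN-BiLSTM pseudo-measurements.

```python
import numpy as np
from collections import deque

class InnovationAdaptiveR:
    """Estimate the measurement-noise covariance from a sliding window of
    innovations d = z - H x, using R ≈ E[d d^T] - H P H^T."""
    def __init__(self, window=20):
        self.buf = deque(maxlen=window)

    def update(self, innovation, H, P):
        self.buf.append(np.outer(innovation, innovation))
        C = np.mean(np.asarray(self.buf), axis=0)       # windowed innovation covariance
        R_hat = C - H @ P @ H.T
        w, V = np.linalg.eigh(R_hat)                    # clip eigenvalues to keep R positive definite
        return V @ np.diag(np.clip(w, 1e-6, None)) @ V.T

# toy usage: one 2-D position innovation with H = I and a small prior covariance
adaptor = InnovationAdaptiveR(window=20)
R_new = adaptor.update(np.array([0.8, -0.3]), np.eye(2), np.eye(2) * 0.05)
```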

23 pages, 6988 KB  
Article
A Blended Extended Kalman Filter Approach for Enhanced AGV Localization in Centralized Camera-Based Control Systems
by Nopparut Khaewnak, Soontaree Seangsri, Siripong Pawako, Sorada Khaengkarn and Jiraphon Srisertpol
Automation 2026, 7(1), 4; https://doi.org/10.3390/automation7010004 - 24 Dec 2025
Viewed by 212
Abstract
This research presents a study on enhancing the localization and orientation accuracy of indoor Autonomous Guided Vehicles (AGVs) operating under a centralized, camera-based control system. We investigate and compare the performance of two Extended Kalman Filter (EKF) configurations: a standard EKF and a novel Blended EKF. The research methodology comprises four primary stages: (1) Sensor bias correction for the camera (CAM), Dead Reckoning (DR), and Inertial Measurement Unit (IMU) to improve raw data quality; (2) Calculation of sensor weights using the Inverse-Variance Weighting principle, which assigns higher confidence to sensors with lower variance; (3) Multi-sensor data fusion to generate a stable state estimate that closely approximates the ground truth (GT); and (4) A comparative performance evaluation between the standard EKF, which processes sensor updates independently, and the Blended EKF, which fuses CAM and DR measurements prior to the filter’s update step. Experimental results demonstrate that the implementation of bias correction and inverse-variance weighting significantly reduces the Root Mean Square Error (RMSE) across all sensors. Furthermore, the Blended EKF not only achieved a lower RMSE in certain scenarios but also produced trajectories as smooth as, or smoother than, those of the standard EKF under some weighting configurations. These findings indicate the significant potential of the proposed approach for developing more accurate and robust navigation systems for AGVs in complex indoor environments. Full article
(This article belongs to the Section Robotics and Autonomous Systems)
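A minimal sketch of the inverse-variance blending step described above, assuming both sensors report a position in the same frame: each measurement is weighted by its inverse covariance and the blended pair (z, R) is then handed to a single filter update. The shapes and numbers are illustrative, not the paper's configuration.

```python
import numpy as np

def inverse_variance_blend(z_cam, R_cam, z_dr, R_dr):
    """Fuse two measurements of the same quantity by inverse-variance weighting:
    R_blend = (R_cam^-1 + R_dr^-1)^-1, and z_blend weights each sensor by its inverse covariance."""
    W_cam, W_dr = np.linalg.inv(R_cam), np.linalg.inv(R_dr)
    R_blend = np.linalg.inv(W_cam + W_dr)
    z_blend = R_blend @ (W_cam @ z_cam + W_dr @ z_dr)
    return z_blend, R_blend

# toy usage: the camera fix is ten times more certain than dead reckoning
z_fused, R_fused = inverse_variance_blend(np.array([1.00, 2.00]), np.eye(2) * 0.01,
                                          np.array([1.20, 2.10]), np.eye(2) * 0.10)
```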

20 pages, 1200 KB  
Article
Tax Compliance and Technological Innovation: Case Study on the Development of Tools to Assist Sales Tax Inspections to Curb Tax Fraud
by Vera Lucia Reiko Yoshida Shidomi and Joshua Onome Imoniana
Technologies 2025, 13(12), 594; https://doi.org/10.3390/technologies13120594 - 17 Dec 2025
Viewed by 371
Abstract
This paper mainly studies tax inspection decision-making technology, aiming to improve the accuracy and robustness of target recognition, state estimation, and autonomous decision making in complex environments by constructing an application that integrates visual, radar, and inertial navigation information. Tax inspection is a universally complex phenomenon, but little is known about the use of innovative technology to arm tax auditors with tools for monitoring it. Thus, based on legitimacy theory, there is an agreement between taxpayers and the tax authorities regarding adequate compliance with tax legislation. The use of systemic controls by tax authorities is essential to track stakeholders’ contracts and ensure the upholding of this mandate. The case study is exploratory, using participant observation and an interventionist approach to tax auditing. The results indicated that the partnership between experienced tax auditors and IT tax auditors offered several tangible benefits to the in-house development and monitoring of an innovative application. They also indicated that OCR supports a data lake for inspectors in which stored information is available on standby during inspection. Furthermore, auditors’ use of mobile applications programmed with intelligent perception and tracking resources, instead of searches on mainframes, streamlined the inspection process. The integration of professional skepticism, empathy among users, and technological innovation created a surge in independence among tax auditors and ensured focus. This paper’s contribution lies in the discussion of the enhancement of tax inspection through target recognition, drawing on legitimacy theory to rethink the relationship between taxpayers and tax authorities regarding adequate compliance with tax legislation, and presenting an exploratory case study using a participant-observation, interventionist approach focused on a tax auditor. The implications of this study for policy makers, auditors, and academics are only the tip of the iceberg, as innovation in public administration presupposes efficiency. As a suggestion for future research, we recommend the infusion of AI into these tools for further efficacy and effectiveness in mitigating fraud involving the undue appropriation of taxes and undue competition. Full article
(This article belongs to the Section Information and Communication Technologies)

26 pages, 6776 KB  
Article
An Improved Adaptive Robust Extended Kalman Filter for Arctic Shipborne Tightly Coupled GNSS/INS Navigation
by Wei Liu, Tengfei Qi, Yuan Hu, Shanshan Fu, Bing Han, Tsung-Hsuan Hsieh and Shengzheng Wang
J. Mar. Sci. Eng. 2025, 13(12), 2395; https://doi.org/10.3390/jmse13122395 - 17 Dec 2025
Viewed by 351
Abstract
In the Arctic region, the navigation and positioning accuracy of shipborne and autonomous underwater vehicle (AUV) integrated Global Navigation Satellite System (GNSS) and Inertial Navigation System (INS) solutions is severely degraded due to poor satellite geometry, frequent ionospheric disturbances, non-Gaussian measurement noise, and strong multipath effects, as well as long-term INS-based dead-reckoning for AUVs when GNSS is unavailable underwater. In addition, the sparse ground-based augmentation infrastructure and the lack of reliable reference trajectories and dedicated test ranges in polar waters hinder the validation and performance assessment of existing marine navigation systems, further complicating the achievement of accurate and reliable navigation in this region. To improve the positioning accuracy of the GNSS/INS shipborne navigation system, this paper adopts a tightly coupled GNSS/INS navigation approach. To further enhance the accuracy and robustness of tightly coupled GNSS/INS positioning, this paper proposes an improved Adaptive Robust Extended Kalman Filter (IAREKF) algorithm to effectively suppress the effects of gross errors and non-Gaussian noise, thereby significantly enhancing the system’s robustness and positioning accuracy. First, the residuals and Mahalanobis distance are calculated using the Adaptive Robust Extended Kalman Filter (AREKF), and the chi-square test is used to assess the anomalies of the observations. Subsequently, the observation noise covariance matrix is dynamically adjusted to improve the filter’s anti-interference capability in the complex Arctic environment. However, the state estimation accuracy of AREKF is still affected by GNSS signal degradation, leading to a decrease in navigation and positioning accuracy. To further improve the robustness and positioning accuracy of the filter, this paper introduces a sliding window mechanism, which dynamically adjusts the observation noise covariance matrix using historical residual information, thereby effectively improving the system’s stability in harsh environments. Field experiments conducted on an Arctic survey vessel demonstrate that the proposed improved adaptive robust extended Kalman filter significantly enhances the robustness and accuracy of Arctic integrated navigation. In the Arctic voyages at latitudes 80.3° and 85.7°, compared to the Loosely coupled EKF, the proposed method reduced the horizontal root mean square error by 61.78% and 21.7%, respectively. Full article
(This article belongs to the Section Ocean Engineering)
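The gating idea mentioned above can be sketched as follows (a hedged illustration, not the paper's IAREKF): compute the Mahalanobis distance of the innovation, test it against a chi-square threshold, and inflate the observation noise covariance when the measurement looks anomalous. The significance level and inflation rule are assumptions.

```python
import numpy as np
from scipy.stats import chi2

def robust_gate(innovation, H, P, R, alpha=0.01):
    """Chi-square test on the innovation's Mahalanobis distance; on failure the
    observation noise covariance is inflated to down-weight the suspect measurement."""
    S = H @ P @ H.T + R                                       # innovation covariance
    d2 = float(innovation @ np.linalg.solve(S, innovation))   # squared Mahalanobis distance
    threshold = chi2.ppf(1.0 - alpha, df=innovation.size)
    if d2 > threshold:
        R = R * (d2 / threshold)                              # simple multiplicative inflation
    return R, d2, threshold

# toy usage: a 2-D innovation that fails the test gets a larger observation noise
R_adj, d2, thr = robust_gate(np.array([3.0, -2.5]), np.eye(2), np.eye(2) * 0.1, np.eye(2) * 0.2)
```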

22 pages, 5278 KB  
Article
Robust Navigation in Multipath Environments Using GNSS/UWB/INS Integration with Anchor Position Estimation Toward eVTOL Operations
by Atsushi Osaka and Toshiaki Tsujii
Sensors 2025, 25(24), 7419; https://doi.org/10.3390/s25247419 - 5 Dec 2025
Viewed by 538
Abstract
Emerging technologies such as urban air mobility and autonomous vehicles increasingly rely on Global Navigation Satellite Systems (GNSS) for accurate positioning. However, GNSS alone suffers from severe degradation in complex environments, particularly due to multipath effects caused by reflections from surrounding structures. These effects distort pseudo-range measurements and, in combination with signal attenuation and blockage, lead to significant positioning errors. To address this challenge, this study proposes a loosely integrated navigation framework that combines GNSS, ultra-wideband (UWB), and inertial navigation system (INS) data. UWB enables high-precision ranging, and we further extend its application to estimate the locations of UWB anchors themselves. This approach alleviates a major technical limitation of UWB systems, which typically require anchor positions near buildings to be precisely surveyed beforehand. Field experiments were conducted in multipath-prone outdoor environments using a drone equipped with GNSS, UWB, and INS sensors. The results demonstrate that the proposed GNSS/UWB/INS integration reduces positioning errors by up to approximately 90% compared with GNSS/INS integration. Moreover, in areas surrounded by UWB anchors (UWB-Anchored Area), submeter-level positioning accuracy was achieved. These findings highlight the robustness of the proposed method against multipath interference and its potential to overcome anchor-dependency issues, thereby contributing to safe and reliable navigation solutions for future urban applications such as eVTOL operations. Full article
(This article belongs to the Section Navigation and Positioning)
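As a hedged sketch of the anchor-position-estimation idea described above: if the vehicle's own position is known at several epochs (from GNSS/INS), an unseen UWB anchor's location can be recovered from the measured ranges by a small Gauss–Newton solve. The batch formulation below is an illustration only; the actual system would handle this inside the integrated filter.

```python
import numpy as np

def estimate_anchor_position(known_positions, ranges, x0=None, iters=10):
    """Gauss-Newton on range residuals r_i - ||x - p_i|| to locate one UWB anchor."""
    P = np.asarray(known_positions, dtype=float)   # (N, 3) vehicle positions
    r = np.asarray(ranges, dtype=float)            # (N,) measured UWB ranges
    x = P.mean(axis=0) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(iters):
        diff = x - P
        dist = np.linalg.norm(diff, axis=1)
        J = diff / dist[:, None]                   # Jacobian of ||x - p_i|| w.r.t. x
        dx, *_ = np.linalg.lstsq(J, r - dist, rcond=None)
        x = x + dx
    return x

# toy usage: four non-coplanar vehicle positions ranging to an anchor at (5, 2, 0)
poses = np.array([[0, 0, 1], [10, 0, 2], [0, 10, 3], [10, 10, 1.5]], dtype=float)
true_anchor = np.array([5.0, 2.0, 0.0])
meas = np.linalg.norm(poses - true_anchor, axis=1)
anchor_est = estimate_anchor_position(poses, meas)
```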

21 pages, 2192 KB  
Article
Development, Implementation and Experimental Assessment of Path-Following Controllers on a 1:5 Scale Vehicle Testbed
by Luca Biondo, Angelo Domenico Vella and Alessandro Vigliani
Machines 2025, 13(12), 1116; https://doi.org/10.3390/machines13121116 - 3 Dec 2025
Viewed by 409
Abstract
The development of control strategies for autonomous vehicles requires a reliable and cost-effective validation approach. In this context, testbeds enabling repeatable experiments under controlled conditions are gaining relevance. Scaled vehicles have proven to be a valuable alternative to full-scale or simulation-based testing, enabling experimental validation while reducing costs and risks. This work presents a 1:5 scale modular vehicle platform, derived from a commercial Radio-Controlled (RC) vehicle and adapted as an experimental testbed for control strategy validation and vehicle dynamics studies. The vehicle features an electric powertrain, operated through a Speedgoat Baseline Real-Time Target Machine (SBRTM). The hardware architecture includes a high-performance Inertial Measurement Unit (IMU) with an embedded Global Navigation Satellite System (GNSS) receiver. An Extended Kalman Filter (EKF) is implemented to enhance positioning accuracy by fusing inertial and GNSS data, providing reliable estimates of the vehicle position, velocity, and orientation. Two path-following algorithms, i.e., the Stanley Controller (SC) and the Linear Quadratic Regulator (LQR), are designed and integrated. Outdoor experimental tests enable the evaluation of tracking accuracy and robustness. The results demonstrate that the proposed scaled testbed constitutes a reliable and flexible platform for benchmarking autonomous vehicle controllers and enabling experimental testing. Full article
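For reference, the Stanley law used as one of the two controllers above has a compact closed form; the sketch below is a generic, hedged version with illustrative gains and saturation, not the testbed's tuned implementation.

```python
import math

def stanley_steering(heading_error_rad, cross_track_error_m, speed_m_s,
                     k_gain=1.0, eps=1e-3, max_steer_rad=0.5):
    """Stanley path-following law: steering = heading error + atan(k * e / v),
    saturated to the mechanical steering limit."""
    steer = heading_error_rad + math.atan2(k_gain * cross_track_error_m, speed_m_s + eps)
    return max(-max_steer_rad, min(max_steer_rad, steer))

# toy usage: 0.2 m to the left of the path with a 5 deg heading error at 2 m/s
delta = stanley_steering(math.radians(5.0), -0.2, 2.0)
```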

17 pages, 10859 KB  
Article
TSFNet: A Two-Stage Fusion Network for Visual–Inertial Odometry
by Shuai Wang, Yuntao Liang, Jiongxun Lin, Yuxi Gan, Mengping Zhong, Xia Yin and Bao Peng
Mathematics 2025, 13(23), 3842; https://doi.org/10.3390/math13233842 - 30 Nov 2025
Viewed by 352
Abstract
In autonomous operations of unmanned aerial vehicles (UAVs), accurate pose estimation is a core prerequisite for achieving autonomous navigation, obstacle avoidance, and task execution. To address the challenge of localization in GNSS-denied environments, Visual–Inertial Odometry (VIO) has emerged as a mainstream solution due to its outstanding performance. However, existing deep learning-based VIO methods exhibit limitations in their multi-modal fusion mechanisms. These methods typically employ simple concatenation or attention mechanisms for feature fusion. Furthermore, enhancements in accuracy are often accompanied by significant computational overhead. This makes it difficult for models to effectively handle complex, dynamic scenes while remaining lightweight. To this end, this paper proposes TSFNet (Two-stage Sequential Fusion Network), an efficient two-stage sequential fusion network. In the first stage, the network employs a lightweight visual backbone and a bidirectional recurrent network in parallel to extract spatial and motion features, respectively. A gated fusion unit is employed to achieve adaptive intra-frame feature fusion, dynamically balancing the contributions of different modalities. In the second stage, the fused features are organized into sequences and fed into a dedicated temporal network to explicitly model inter-frame motion dynamics. This decoupled fusion architecture significantly enhances the model’s representational capacity. Experimental results demonstrate that TSFNet achieves superior performance on both the EuRoC and Zurich Urban MAV datasets. Notably, on the Zurich Urban MAV dataset, it reduces the localization Root Mean Square Error (RMSE) by 62% compared to the baseline model, while simultaneously reducing the number of parameters and computational load by 76.65% and 24.30%, respectively. This research confirms that the decoupled two-stage fusion strategy is an effective approach for realizing high-precision, lightweight VIO systems. Full article
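A minimal sketch of a gated fusion unit in the spirit of the first-stage fusion described above: a sigmoid gate computed from the concatenated visual and inertial features decides, per channel, how much of each modality to keep. Layer sizes and names are assumptions; this is not TSFNet's actual architecture.

```python
import torch
import torch.nn as nn

class GatedFusion(nn.Module):
    """Adaptive intra-frame fusion of a visual and an inertial feature vector."""
    def __init__(self, dim_visual, dim_inertial, dim_out):
        super().__init__()
        self.proj_v = nn.Linear(dim_visual, dim_out)
        self.proj_i = nn.Linear(dim_inertial, dim_out)
        self.gate = nn.Sequential(nn.Linear(dim_visual + dim_inertial, dim_out), nn.Sigmoid())

    def forward(self, feat_visual, feat_inertial):
        g = self.gate(torch.cat([feat_visual, feat_inertial], dim=-1))   # per-channel weights in (0, 1)
        return g * self.proj_v(feat_visual) + (1.0 - g) * self.proj_i(feat_inertial)

# toy usage: a batch of 8 frames with 256-d visual and 64-d inertial features
fused = GatedFusion(256, 64, 128)(torch.randn(8, 256), torch.randn(8, 64))
```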

30 pages, 7942 KB  
Article
Research on Agricultural Autonomous Positioning and Navigation System Based on LIO-SAM and Apriltag Fusion
by Xianping Guan, Hongrui Ge, Shicheng Nie and Yuhan Ding
Agronomy 2025, 15(12), 2731; https://doi.org/10.3390/agronomy15122731 - 27 Nov 2025
Viewed by 798
Abstract
The application of autonomous navigation in intelligent agriculture is becoming increasingly widespread. Traditional navigation schemes in greenhouses, orchards, and other agricultural environments often suffer from problems such as the inability to cope with uneven illumination, complex layouts, and highly repetitive, similar structures, as well as difficulty in receiving GNSS (Global Navigation Satellite System) signals. To address these problems, this paper proposes a new tightly coupled LiDAR (Light Detection and Ranging) inertial odometry SLAM (LIO-SAM) framework named April-LIO-SAM. The framework innovatively uses Apriltag, a two-dimensional barcode widely used for precise positioning, pose estimation, and scene recognition of objects, as a global positioning beacon that replaces GNSS to provide absolute pose observations. The system uses a three-dimensional LiDAR (VLP-16) and an IMU (inertial measurement unit) to collect environmental data and uses Apriltags as absolute coordinate references instead of GNSS to solve the problem of unreliable GNSS signal reception in greenhouses, orchards, and other agricultural environments. The SLAM trajectories and navigation performance were validated in a carefully built greenhouse and orchard environment. The experimental results show that the navigation map developed by April-LIO-SAM yields a root mean square error of 0.057 m. The average positioning errors are 0.041 m, 0.049 m, 0.056 m, and 0.070 m, respectively, when the Apriltag spacing is 3 m, 5 m, and 7 m. The navigation experimental results indicate that, at speeds of 0.4, 0.3, and 0.2 m/s, the average lateral deviation is less than 0.053 m, with a standard deviation below 0.034 m. The average heading deviation is less than 2.3°, with a standard deviation below 1.6°. Positioning stability experiments under interference conditions such as illumination changes and occlusion were also carried out, verifying that the system maintains good stability under complex external conditions, with positioning error fluctuations within 3.0 mm. The results confirm that the positioning and navigation accuracy of the mobile robot satisfies the continuity requirements of in-facility operation. Full article
(This article belongs to the Special Issue Research Progress in Agricultural Robots in Arable Farming)
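The beacon idea above can be sketched in a few lines (a hedged illustration, not the April-LIO-SAM factor formulation): a tag with a surveyed map pose, once detected in the robot frame, yields an absolute robot pose that can correct the drifting LiDAR-inertial estimate. The transforms and the blending weight below are illustrative assumptions.

```python
import numpy as np

def robot_pose_from_tag(T_map_tag, T_robot_tag):
    """Absolute robot pose from a detected tag: T_map_robot = T_map_tag @ inv(T_robot_tag).
    Both arguments are 4x4 homogeneous transforms."""
    return T_map_tag @ np.linalg.inv(T_robot_tag)

def blend_position(p_odom, p_tag, w_tag=0.8):
    """Simple complementary correction of the odometry position with the tag-derived one."""
    return (1.0 - w_tag) * np.asarray(p_odom) + w_tag * np.asarray(p_tag)

# toy usage: a tag surveyed at (10, 5, 1) in the map is seen 2 m ahead of the robot
T_map_tag = np.eye(4); T_map_tag[:3, 3] = [10.0, 5.0, 1.0]
T_robot_tag = np.eye(4); T_robot_tag[:3, 3] = [2.0, 0.0, 0.0]
T_map_robot = robot_pose_from_tag(T_map_tag, T_robot_tag)     # robot at roughly (8, 5, 1)
p_corrected = blend_position([7.9, 5.2, 1.0], T_map_robot[:3, 3])
```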

24 pages, 15361 KB  
Article
UAV Sensor Data Fusion for Localization Using Adaptive Multiscale Feature Matching Mechanisms Under GPS-Deprived Environment
by Yu-Shun Wang and Chia-Hao Chang
Aerospace 2025, 12(12), 1048; https://doi.org/10.3390/aerospace12121048 - 25 Nov 2025
Viewed by 474
Abstract
The application of unmanned vehicles in civilian and military fields is increasingly widespread. Traditionally, unmanned vehicles primarily rely on Global Positioning Systems (GPSs) for positioning; however, GPS signals can be limited or completely lost in conditions such as building obstructions, indoor environments, or electronic interference. In addition, countries are actively developing GPS jamming and deception technologies for military applications, making precise positioning and navigation of unmanned vehicles in GPS-denied or constrained environments a critical issue that needs to be addressed. In this work, the authors propose a method based on Visual–Inertial Odometry (VIO), integrating the extended Kalman filter (EKF), an Inertial Measurement Unit (IMU), optical flow, and feature matching to achieve drone localization in GPS-denied environments. The proposed method uses the heading angle and acceleration data obtained from the IMU as the state prediction for the EKF, and estimates relative displacement using optical flow. It further corrects optical flow calculation errors through IMU rotation compensation, enhancing the robustness of visual odometry. Additionally, when re-selecting feature points for optical flow, it applies a KAZE feature matching technique for global position correction, reducing drift errors caused by long-duration flight. The authors also employ an adaptive noise adjustment strategy that dynamically adjusts the internal state and measurement noise matrices of the EKF based on the rate of change in heading angle and feature matching reliability, allowing the drone to maintain stable positioning in various flight conditions. According to the simulation results, the proposed method is able to effectively estimate the flight trajectory of drones without GPS. Compared to results that rely solely on optical flow or feature matching, it significantly reduces cumulative errors. This makes it suitable for urban environments, forest areas, and military applications where GPS signals are limited, providing a reliable solution for autonomous navigation and positioning of drones. Full article
(This article belongs to the Section Aeronautics)
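A minimal sketch of the IMU rotation compensation mentioned above, assuming a calibrated pinhole camera and small inter-frame rotations: the gyro-predicted rotational component of the image motion field is subtracted from the measured optical flow, leaving an approximately translation-only flow. Signs, frames, and names are illustrative assumptions rather than the authors' pipeline.

```python
import numpy as np

def rotational_flow(x, y, omega):
    """Rotational term of the pinhole image motion field at normalized coordinates (x, y),
    per unit time, for camera angular velocity omega = (wx, wy, wz)."""
    wx, wy, wz = omega
    u = x * y * wx - (1.0 + x * x) * wy + y * wz
    v = (1.0 + y * y) * wx - x * y * wy - x * wz
    return np.stack([u, v], axis=-1)

def compensate_flow(flow_meas, points_xy, gyro_omega, dt):
    """Subtract the gyro-predicted rotational flow so the remainder is (approximately)
    due to translation only."""
    x, y = points_xy[:, 0], points_xy[:, 1]
    return flow_meas - rotational_flow(x, y, np.asarray(gyro_omega, dtype=float)) * dt

# toy usage: two tracked features, a 5 deg/s body rate about the camera y-axis, 30 Hz frames
pts = np.array([[0.1, -0.05], [-0.2, 0.15]])
flow = np.array([[0.004, 0.001], [0.006, -0.002]])
flow_translational = compensate_flow(flow, pts, gyro_omega=[0.0, 0.087, 0.0], dt=1 / 30)
```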

19 pages, 3299 KB  
Article
GPLVINS: Tightly Coupled GNSS-Visual-Inertial Fusion for Consistent State Estimation with Point and Line Features for Unmanned Aerial Vehicles
by Xinyu Chen, Shuaixin Li, Ruifeng Lu and Xiaozhou Zhu
Drones 2025, 9(11), 801; https://doi.org/10.3390/drones9110801 - 17 Nov 2025
Viewed by 671
Abstract
The employment of linear features to enhance the positioning precision and robustness of point-based VIO (visual-inertial odometry) has attracted mounting attention, especially for UAV (unmanned aerial vehicle) applications where reliable 6-DoF pose estimation is critical for autonomous navigation, mission execution, and safety. This paper presents GPLVINS—GNSS (global navigation satellite system)-point-line-visual-inertial navigation system—a UAV-tailored enhancement of the nonlinear optimization-based GVINS (GNSS-visual-inertial navigation system). Unlike GVINS, which struggles with feature extraction in weak-texture environments and depends entirely on point features, GPLVINS innovatively integrates line features into its state optimization framework to enhance robustness and accuracy. While existing studies adopt the LSD (line segment detector) algorithm for line feature extraction, this approach often generates numerous short line segments in real-world scenes. Such an outcome not only increases computational costs but also degrades pose estimation performance. In order to address this issue, the present study proposes an NMS (non-maximum suppression) strategy for the refinement of LSD. The line reprojection residual is then formulated as the distance between point and line, which is incorporated into the nonlinear optimization process. Experimental validations on open-source datasets and self-collected UAV datasets across indoor, outdoor, and indoor–outdoor transition scenarios demonstrate that GPLVINS exhibits superior positioning performance and enhanced robustness for UAVs in environments with feature degradation or drastic lighting intensity variations. Full article
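The NMS refinement described above can be sketched roughly as follows (thresholds, the grouping rule, and the endpoint format are illustrative assumptions, not the paper's exact strategy): short segments are discarded, and within each cluster of near-parallel, nearby segments only the longest is kept.

```python
import numpy as np

def line_nms(segments, min_length=20.0, angle_thresh_deg=5.0, dist_thresh=10.0):
    """Suppress short and redundant LSD segments.
    segments: array (N, 4) of endpoints [x1, y1, x2, y2] in pixels."""
    segs = np.asarray(segments, dtype=float)
    d = segs[:, 2:] - segs[:, :2]
    lengths = np.linalg.norm(d, axis=1)
    angles = np.degrees(np.arctan2(d[:, 1], d[:, 0])) % 180.0   # undirected line orientation
    mids = 0.5 * (segs[:, :2] + segs[:, 2:])
    keep = []
    for i in np.argsort(-lengths):                               # longest segments first
        if lengths[i] < min_length:
            continue
        redundant = False
        for j in keep:
            dang = abs(angles[i] - angles[j])
            dang = min(dang, 180.0 - dang)                       # orientation wrap-around
            if dang < angle_thresh_deg and np.linalg.norm(mids[i] - mids[j]) < dist_thresh:
                redundant = True
                break
        if not redundant:
            keep.append(i)
    return segs[keep]

# toy usage: the short third segment and the near-duplicate second one are removed
kept = line_nms([[0, 0, 100, 2], [2, 3, 98, 5], [10, 10, 18, 11]])
```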

27 pages, 6674 KB  
Article
Design and Development of an Autonomous Mobile Robot for Unstructured Indoor Environments
by Ameur Gargouri, Mohamed Karray, Bechir Zalila and Mohamed Ksantini
Machines 2025, 13(11), 1044; https://doi.org/10.3390/machines13111044 - 12 Nov 2025
Viewed by 2160
Abstract
This research work presents the design and the development of a cost-effective autonomous mobile robot for locating misplaced objects within unstructured indoor environments. The tools integrated into the proposed system for perception and localization are a hardware architecture equipped with LiDAR, an inertial measurement unit (IMU), and wheel encoders. The system also includes an ROS2-based software stack enabling autonomous navigation via the NAV2 framework and Adaptive Monte Carlo Localization (AMCL). For real-time object detection, a lightweight YOLO11n model is developed and implemented on a Raspberry Pi 4 to enable the robot to identify common household items. The robot’s motion control is achieved by a fuzzy logic-enhanced PID controller that dynamically modifies gain values based on navigation conditions. Remote supervision, task management, and real-time status monitoring are provided by a user-friendly Flutter-based mobile application. Simulations and real-world experiments demonstrate the robustness, modularity, and responsiveness of the robot in dynamic environments. This robot achieves a 3 cm localization error and a 95% task execution success rate. Full article
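A minimal sketch of the fuzzy gain-adjustment idea described above, under assumed triangular membership functions and gain ranges (none of which come from the paper): the membership of the current error magnitude in 'small', 'medium', and 'large' sets blends the PID gains between a low and a high setting.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_pid_gains(abs_error, kp_range=(0.8, 2.0), kd_range=(0.05, 0.3)):
    """Blend kp/kd between their low and high values according to fuzzy membership of |error|."""
    small = tri(abs_error, -0.1, 0.0, 0.2)
    medium = tri(abs_error, 0.1, 0.3, 0.5)
    large = tri(abs_error, 0.4, 1.0, 10.0)
    w = (small + medium + large) or 1.0
    alpha = (0.0 * small + 0.5 * medium + 1.0 * large) / w   # 0 = small error, 1 = large error
    kp = kp_range[0] + alpha * (kp_range[1] - kp_range[0])   # larger error -> stronger proportional action
    kd = kd_range[1] - alpha * (kd_range[1] - kd_range[0])   # larger error -> less damping
    return kp, kd

# toy usage: a 0.35 m error falls mostly in the 'medium' set and yields mid-range gains
kp, kd = fuzzy_pid_gains(0.35)
```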

27 pages, 2523 KB  
Article
Robust Vehicle Pose Estimation Through Multi-Sensor Fusion of Camera, IMU, and GPS Using LSTM and Kalman Filter
by Tae-Hyeok Jeong, Yong-Jun Lee, Woo-Jin Ahn, Tae-Koo Kang and Myo-Taeg Lim
Appl. Sci. 2025, 15(22), 11863; https://doi.org/10.3390/app152211863 - 7 Nov 2025
Viewed by 762
Abstract
Accurate vehicle localization remains a critical challenge due to the frequent loss or degradation of sensor data, such as from visual, inertial, and GPS sources. In this study, we present a novel localization algorithm that dynamically fuses data from heterogeneous sensors to achieve stable and precise positioning. The proposed algorithm integrates a deep learning-based visual-inertial odometry (VIO) module with a Kalman filter for global data fusion. A key innovation of the method is its adaptive fusion strategy, which adjusts feature weights based on sensor reliability, thereby ensuring optimal data utilization. Extensive experiments across varied scenarios demonstrate the algorithm’s superior performance, consistently achieving lower RMSE values and reducing position errors by 79–91% compared to four state-of-the-art baselines—even under adverse conditions such as sensor failures or missing data. This work lays the foundation for deploying robust localization systems in real-world applications, including autonomous vehicles, robotics, and navigation technologies. Full article
(This article belongs to the Special Issue AI-Aided Intelligent Vehicle Positioning in Urban Areas)
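A hedged sketch of the reliability-based weighting idea described above (the scoring and inflation rule are illustrative; the paper's adaptive strategy is learned, not hand-set): each sensor's measurement noise is inflated as its reliability drops, and a sensor whose data is lost is skipped.

```python
import numpy as np

def fuse_measurement(x, P, z, H, R_base, reliability):
    """Kalman update whose measurement noise is scaled by the sensor's reliability in (0, 1];
    a reliability of zero (lost or degraded data) skips the update entirely."""
    if reliability <= 0.0:
        return x, P
    R = R_base / max(reliability, 1e-3)           # low confidence -> larger assumed noise
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(P.shape[0]) - K @ H) @ P
    return x, P

# toy usage: a healthy GPS position fix followed by a degraded VIO position estimate
x, P, H = np.zeros(2), np.eye(2), np.eye(2)
x, P = fuse_measurement(x, P, np.array([1.0, 2.0]), H, np.eye(2) * 0.5, reliability=0.9)
x, P = fuse_measurement(x, P, np.array([1.3, 1.8]), H, np.eye(2) * 0.5, reliability=0.2)
```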

20 pages, 8109 KB  
Article
Development of an Orchard Inspection Robot: A ROS-Based LiDAR-SLAM System with Hybrid A*-DWA Navigation
by Jiwei Qu, Yanqiu Gu, Zhinuo Qiu, Kangquan Guo and Qingzhen Zhu
Sensors 2025, 25(21), 6662; https://doi.org/10.3390/s25216662 - 1 Nov 2025
Viewed by 1191
Abstract
The application of orchard inspection robots has become increasingly widespread. However, achieving autonomous navigation in unstructured environments continues to present significant challenges. This study investigates the Simultaneous Localization and Mapping (SLAM) navigation system of an orchard inspection robot and evaluates its performance using Light Detection and Ranging (LiDAR) technology. A mobile robot that integrates tightly coupled multiple sensors is developed and implemented. The integration of LiDAR and Inertial Measurement Units (IMUs) enables the perception of environmental information. Moreover, the robot’s kinematic model is established, and coordinate transformations are performed based on the Unified Robotics Description Format (URDF). The URDF facilitates the visualization of robot features within the Robot Operating System (ROS). ROS navigation nodes are configured for path planning, where an improved A* algorithm, combined with the Dynamic Window Approach (DWA), is introduced to achieve efficient global and local path planning. A comparison of the simulation results with classical algorithms demonstrates that the implemented algorithm exhibits superior search efficiency and smoothness. The robot’s navigation performance is rigorously tested, focusing on navigation accuracy and obstacle avoidance capability. Results demonstrate that, during temporary stops at waypoints, the robot exhibits an average lateral deviation of 0.163 m and a longitudinal deviation of 0.282 m from the target point. The average braking time and startup time of the robot at the four waypoints are 0.46 s and 0.64 s, respectively. In obstacle avoidance tests, optimal performance is observed with an expansion radius of 0.4 m across various obstacle sizes. The proposed combined method achieves efficient and stable global and local path planning, serving as a reference for future applications of mobile inspection robots in autonomous navigation. Full article
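As a rough illustration of the local-planning half of the pipeline above, the sketch below scores Dynamic Window Approach velocity candidates with a one-step unicycle rollout and a weighted sum of goal heading, obstacle clearance, and speed; the weights, horizon, and grid-free obstacle model are assumptions, and the improved A* global planner is not reproduced here.

```python
import math

def dwa_score(v, w, pose, goal, obstacles, dt=0.5,
              w_goal=1.0, w_clear=0.6, w_speed=0.2, robot_radius=0.3):
    """Score one (v, w) candidate: roll the unicycle model forward one step, then
    combine goal heading, obstacle clearance, and speed into a single value."""
    x, y, th = pose
    th2 = th + w * dt
    x2, y2 = x + v * math.cos(th2) * dt, y + v * math.sin(th2) * dt
    ang = math.atan2(goal[1] - y2, goal[0] - x2) - th2
    heading = -abs(math.atan2(math.sin(ang), math.cos(ang)))        # wrapped heading error (less is better)
    clearance = min((math.hypot(ox - x2, oy - y2) for ox, oy in obstacles), default=5.0)
    if clearance < robot_radius:
        return float("-inf")                                        # collision: reject candidate
    return w_goal * heading + w_clear * min(clearance, 2.0) + w_speed * v

def best_velocity(pose, goal, obstacles, v_candidates, w_candidates):
    """Pick the admissible (v, w) pair with the highest score."""
    return max(((v, w) for v in v_candidates for w in w_candidates),
               key=lambda vw: dwa_score(vw[0], vw[1], pose, goal, obstacles))

# toy usage: drive toward (5, 0) while steering around an obstacle near (1.0, 0.1)
v_cmd, w_cmd = best_velocity((0.0, 0.0, 0.0), (5.0, 0.0), [(1.0, 0.1)],
                             v_candidates=[0.1, 0.2, 0.4], w_candidates=[-0.5, 0.0, 0.5])
```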
