Search Results (9)

Search Parameters:
Keywords = Lidar pitching motion

29 pages, 6572 KB  
Article
Robust Parking Space Recognition Approach Based on Tightly Coupled Polarized Lidar and Pre-Integration IMU
by Jialiang Chen, Fei Li, Xiaohui Liu and Yuelin Yuan
Appl. Sci. 2024, 14(20), 9181; https://doi.org/10.3390/app14209181 - 10 Oct 2024
Cited by 1 | Viewed by 2319
Abstract
Improving the accuracy of parking space recognition is crucial for Automated Valet Parking (AVP) in autonomous driving. In AVP, accurate free-space recognition significantly impacts the safety and comfort of both vehicles and drivers. To enhance parking space recognition and annotation in unknown environments, this paper proposes an automatic parking space annotation approach that tightly couples Lidar and an Inertial Measurement Unit (IMU). First, the pose of the Lidar frame was tightly coupled with high-frequency IMU data to compensate for vehicle motion, reducing its impact on the pose transformation of the Lidar point cloud. Next, simultaneous localization and mapping (SLAM) was performed using the compensated Lidar frames. By extracting two-dimensional polarized edge features and planar features from the three-dimensional Lidar point cloud, a polarized Lidar odometry was constructed, and the polarized Lidar odometry factor and loop closure factor were jointly optimized in iSAM2. Finally, the pitch angle of the constructed local map was evaluated to filter out ground points, and the regions of interest (ROI) were projected onto a grid map. The free space between adjacent vehicle point clouds was assessed on the grid map using convex hull detection and straight-line fitting. Experiments were conducted on both local and open datasets. The proposed method achieved an average precision and recall of 98.89% and 98.79%, respectively, on the local dataset, and 97.08% and 99.40% on the nuScenes dataset, while reducing storage usage by 48.38% without compromising running time. Comparative experiments on open datasets show that the proposed method adapts to various scenarios and exhibits strong robustness.
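As a rough illustration of the final free-space step described above (convex hull detection plus straight-line fitting on a grid map), the following Python sketch checks whether the gap between two adjacent vehicle clusters is wide enough for a parking slot. It is not the authors' implementation; the cluster arrays and the 2.5 m minimum slot width are assumptions.

```python
# Minimal sketch (not the paper's implementation): checking whether the gap
# between two adjacent vehicle clusters on a 2D grid map is wide enough to be
# a parking slot, using convex hulls and a straight line fitted through the
# cluster centroids. Cluster arrays and the 2.5 m slot width are assumptions.
import numpy as np
from scipy.spatial import ConvexHull

def free_space_between(cluster_a, cluster_b, min_slot_width=2.5):
    """cluster_a, cluster_b: (N, 2) arrays of vehicle points in map coordinates."""
    # Fit a line through the two centroids (direction of the parking row).
    c_a, c_b = cluster_a.mean(axis=0), cluster_b.mean(axis=0)
    direction = (c_b - c_a) / np.linalg.norm(c_b - c_a)

    # Project the convex-hull vertices of each cluster onto that direction.
    proj_a = (cluster_a[ConvexHull(cluster_a).vertices] - c_a) @ direction
    proj_b = (cluster_b[ConvexHull(cluster_b).vertices] - c_a) @ direction

    # Gap = distance between the facing hull extents along the row direction.
    gap = proj_b.min() - proj_a.max()
    return gap, gap >= min_slot_width

# Example with two synthetic vehicle footprints about 3 m apart.
rng = np.random.default_rng(0)
veh1 = rng.uniform([0.0, 0.0], [1.0, 2.0], size=(200, 2))
veh2 = rng.uniform([4.0, 0.0], [5.0, 2.0], size=(200, 2))
print(free_space_between(veh1, veh2))  # gap ≈ 3.0 m -> True
```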

20 pages, 3282 KB  
Article
Evaluating the Performance of Pulsed and Continuous-Wave Lidar Wind Profilers with a Controlled Motion Experiment
by Shokoufeh Malekmohammadi, Christiane Duscha, Alastair D. Jenkins, Felix Kelberlau, Julia Gottschall and Joachim Reuder
Remote Sens. 2024, 16(17), 3191; https://doi.org/10.3390/rs16173191 - 29 Aug 2024
Cited by 2 | Viewed by 2480
Abstract
While floating wind lidars provide reliable and cost-effective measurements, these measurements may be inaccurate due to the motion of the installation platforms. Prior studies have not distinguished between systematic errors associated with lidars and errors resulting from motion. This study fills that gap by examining the impact of platform motion on two types of profiling wind lidar systems: the pulsed WindCube V1 (Leosphere) and the continuous-wave ZephIR 300 (Natural Power). On a moving hexapod platform, both systems were subjected to 50 controlled sinusoidal motion cases in different degrees of freedom, with two reference lidars placed five meters from the platform. Motion-induced errors in the mean wind speed and turbulence intensity estimated by the lidars are analyzed, and the effectiveness of a motion correction approach in reducing these errors across various scenarios is evaluated. The results indicate that the presence of rotational motion leads to higher turbulence intensity (TI) estimates by the moving lidars. The absolute percentage error between lidars is highest when the lidars are exposed to yaw and heave motion and lowest when exposed to surge motion. The correlation between lidars, however, is lowest in the presence of pitch, yaw, and heave motion. Furthermore, applying motion compensation can recover the correlation drop and correct the erroneous TI estimation.
(This article belongs to the Special Issue Observation of Atmospheric Boundary-Layer Based on Remote Sensing)
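For readers unfamiliar with the quantities compared in this study, the sketch below shows how turbulence intensity and the absolute percentage error between a moving and a reference lidar could be computed; it is a minimal illustration with synthetic data, not the study's actual processing chain.

```python
# Minimal sketch (assumed quantities, not the study's processing chain):
# turbulence intensity from a 10 min wind-speed series and the absolute
# percentage error between a moving and a reference lidar.
import numpy as np

def turbulence_intensity(wind_speed):
    """TI = standard deviation / mean of the horizontal wind speed."""
    return np.std(wind_speed, ddof=1) / np.mean(wind_speed)

def absolute_percentage_error(moving, reference):
    return 100.0 * abs(moving - reference) / reference

# Example: sinusoidal platform motion inflating the apparent variance.
rng = np.random.default_rng(1)
t = np.arange(0, 600, 1.0)                       # 10 min at 1 Hz
u_ref = 8.0 + 0.3 * rng.standard_normal(t.size)  # reference lidar
u_mov = u_ref + 0.5 * np.sin(2 * np.pi * t / 8)  # motion-induced fluctuation
ti_ref, ti_mov = turbulence_intensity(u_ref), turbulence_intensity(u_mov)
print(ti_ref, ti_mov, absolute_percentage_error(ti_mov, ti_ref))
```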

19 pages, 8848 KB  
Article
Experimental Evaluation of the Motion-Induced Effects for Turbulent Fluctuations Measurement on Floating Lidar Systems
by Maxime Thiébaut, Nicolas Thebault, Marc Le Boulluec, Guillaume Damblans, Christophe Maisondieu, Cristina Benzo and Florent Guinot
Remote Sens. 2024, 16(8), 1337; https://doi.org/10.3390/rs16081337 - 10 Apr 2024
Cited by 4 | Viewed by 2142
Abstract
This study examines how motion influences turbulent velocity fluctuations using measurements obtained from a wind lidar profiler. Onshore tests were performed using a WindCube v2.1 lidar mounted on a hexapod to simulate buoy motion, with a fixed WindCube v2.1 lidar serving as a reference. To assess the motion-induced effects on velocity fluctuations measured by floating lidar systems, the root-mean-square error (RMSE) between the velocity fluctuations obtained from the fixed and mobile lidars was calculated. A comprehensive wind dataset spanning 22.5 h was analyzed, with a focus on regular motions involving single-axis rotations and combinations of rotations around multiple axes. The investigation of single-axis rotations revealed that the primary influencing factor was the alignment between the tilt direction of the mobile lidar and the wind direction. The highest RMSE values occurred when the mobile lidar tilted in the wind direction, corresponding to pitch motion, whereas the lowest RMSE values were observed when it tilted perpendicular to the wind direction, corresponding to roll motion. Moreover, adding motion around extra axes of rotation was found to increase the RMSE.
(This article belongs to the Section Atmospheric Remote Sensing)
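A minimal sketch of the RMSE metric used above, computed between the velocity fluctuations of a mobile and a fixed lidar; the synthetic series and the pitch-like modulation are assumptions, not the paper's data.

```python
# Minimal sketch (assumed arrays, not the paper's pipeline): RMSE between the
# velocity fluctuations u' = u - <u> measured by the mobile and fixed lidars.
import numpy as np

def fluctuation_rmse(u_mobile, u_fixed):
    """Both inputs are synchronized wind-speed series at the same range gate."""
    fluct_mobile = u_mobile - u_mobile.mean()
    fluct_fixed = u_fixed - u_fixed.mean()
    return np.sqrt(np.mean((fluct_mobile - fluct_fixed) ** 2))

rng = np.random.default_rng(2)
u_fix = 10.0 + rng.standard_normal(1000)
u_mob = u_fix + 0.2 * np.sin(np.linspace(0, 60 * np.pi, 1000))  # pitch-like motion
print(fluctuation_rmse(u_mob, u_fix))
```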

16 pages, 2082 KB  
Article
Fast Lidar Inertial Odometry and Mapping for Mobile Robot SE(2) Navigation
by Wei Chen and Jian Sun
Appl. Sci. 2023, 13(17), 9597; https://doi.org/10.3390/app13179597 - 24 Aug 2023
Viewed by 3128
Abstract
This paper presents a fast Lidar inertial odometry and mapping (F-LIOM) method for mobile robot navigation on flat terrain, providing real-time pose estimation, map building, and place recognition. Existing works on Lidar inertial odometry have mostly parameterized the keyframe pose as SE(3) even when the robot moves on flat ground, which complicates the motion model and is not conducive to real-time non-linear optimization. In this paper, F-LIOM is shown to be cost-effective in terms of model complexity and computational efficiency for robot SE(2) navigation, as the motions in the other degrees of freedom in 3D, namely roll, pitch, and z, are treated as noise terms that corrupt the pose estimation. For front-end place recognition, the smoothness information of the feature point cloud is introduced to construct a novel global descriptor that integrates geometry and environmental texture characteristics. Experiments under challenging scenarios, including self-collected and public datasets, were conducted to validate the proposed method. The results demonstrate that F-LIOM achieves competitive real-time performance and accuracy compared with state-of-the-art counterparts, and shows strong potential for deployment on resource-limited mobile robot systems.
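To illustrate why the SE(2) parameterization simplifies the motion model, the sketch below composes planar keyframe poses (x, y, yaw) only, treating roll, pitch, and z as discarded noise; it is a generic illustration, not F-LIOM's code.

```python
# Minimal sketch of the SE(2) parameterization idea (not F-LIOM itself):
# composing planar keyframe poses (x, y, yaw) instead of full SE(3), with
# roll/pitch/z treated as noise and dropped from the state.
import numpy as np

def se2_compose(pose_a, pose_b):
    """Compose two SE(2) poses given as (x, y, yaw) tuples: a ⊕ b."""
    xa, ya, tha = pose_a
    xb, yb, thb = pose_b
    c, s = np.cos(tha), np.sin(tha)
    return (xa + c * xb - s * yb,
            ya + s * xb + c * yb,
            (tha + thb + np.pi) % (2 * np.pi) - np.pi)  # wrap to (-pi, pi]

# Chain two odometry increments of 1 m forward with a 90 deg left turn each.
step = (1.0, 0.0, np.pi / 2)
print(se2_compose(se2_compose((0.0, 0.0, 0.0), step), step))  # ≈ (1.0, 1.0, -pi)
```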

18 pages, 2893 KB  
Article
Estimation of Wave Period from Pitch and Roll of a Lidar Buoy
by Andreu Salcedo-Bosch, Francesc Rocadenbosch, Miguel A. Gutiérrez-Antuñano and Jordi Tiana-Alsina
Sensors 2021, 21(4), 1310; https://doi.org/10.3390/s21041310 - 12 Feb 2021
Cited by 6 | Viewed by 4297
Abstract
This work proposes a new wave-period estimation (L-dB) method based on power-spectral-density (PSD) estimation of the pitch and roll motional time series of a Doppler wind lidar buoy, under the assumption of small angles (±22 deg) and slow yaw drifts (1 min), and neglecting translational motion. We revisit the buoy's simplified two-degrees-of-freedom (2-DoF) motional model and formulate the PSD associated with the eigenaxis tilt of the lidar buoy, which is modelled as a complex-valued random process. From this, we present the L-dB method, which estimates the wave period as the average period associated with the cutoff frequency span at which the spectral components drop off L decibels from the peak level. In the framework of the IJmuiden campaign (North Sea, 29 March–17 June 2015), the L-dB method is compared against the most common oceanographic wave-period estimation methods using a Triaxys™ buoy. Parametric analysis showed good agreement (correlation coefficient ρ = 0.86, root-mean-square error (RMSE) = 0.46 s, and mean difference MD = 0.02 s) between the proposed L-dB method and the oceanographic zero-crossing method when the threshold L was set to 8 dB.
(This article belongs to the Section Physical Sensors)
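The sketch below shows one possible reading of an L-dB-style estimate: the Welch PSD of the complex tilt series pitch + i·roll, then the mean period over the frequency span within L dB of the spectral peak. It is an illustrative approximation, not the authors' algorithm; the sampling rate, window length, and 8 dB threshold are assumptions.

```python
# Minimal sketch of an L-dB-style estimate (one possible reading of the method,
# not the authors' code): Welch PSD of the complex tilt series pitch + i*roll,
# then the mean period over the frequency span within L dB of the spectral peak.
import numpy as np
from scipy.signal import welch

def l_db_wave_period(pitch, roll, fs, L=8.0):
    """pitch, roll in radians, fs in Hz, L in dB."""
    tilt = pitch + 1j * roll                      # complex tilt process
    f, psd = welch(tilt, fs=fs, nperseg=512, return_onesided=False)
    f = np.abs(f)
    keep = (psd >= psd.max() * 10 ** (-L / 10)) & (f > 0)
    return np.mean(1.0 / f[keep])                 # mean period in the L-dB span

# Example: 7 s swell plus noise, sampled at 10 Hz for 20 min.
rng = np.random.default_rng(3)
t = np.arange(0, 1200, 0.1)
pitch = 0.1 * np.sin(2 * np.pi * t / 7) + 0.01 * rng.standard_normal(t.size)
roll = 0.1 * np.cos(2 * np.pi * t / 7) + 0.01 * rng.standard_normal(t.size)
print(l_db_wave_period(pitch, roll, fs=10.0))     # ≈ 7 s
```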

18 pages, 7935 KB  
Article
Pole-Like Object Extraction and Pole-Aided GNSS/IMU/LiDAR-SLAM System in Urban Area
by Tianyi Liu, Le Chang, Xiaoji Niu and Jingnan Liu
Sensors 2020, 20(24), 7145; https://doi.org/10.3390/s20247145 - 13 Dec 2020
Cited by 13 | Viewed by 4962
Abstract
Vision-based sensors such as LiDAR (Light Detection and Ranging) are widely adopted in SLAM (Simultaneous Localization and Mapping) systems. In a 16-beam LiDAR-aided SLAM system, due to the difficulty of detecting objects in sparse laser data, neither grid-based nor feature-point-based solutions can avoid interference from moving objects. In urban environments, pole-like objects are common, invariant, and have distinguishing characteristics, so they are well suited to provide more robust and reliable positioning results as auxiliary information for vehicle positioning and navigation. In this work, we propose a SLAM scheme that uses a GNSS (Global Navigation Satellite System), an IMU (Inertial Measurement Unit) and a LiDAR sensor, with the positions of pole-like objects as the SLAM features. The scheme combines a traditional preprocessing method and a small-scale artificial neural network to extract pole-like objects from the environment. First, a threshold-based method extracts pole-like object candidates from the point cloud; the neural network is then trained and applied for inference to obtain the pole-like objects. The results show that the accuracy and recall rate are sufficient to provide stable observations for the subsequent SLAM process. After extracting the poles from the LiDAR point cloud, their coordinates are added to the feature map, and the nonlinear optimization of the front end is carried out using the distance constraints corresponding to the pole coordinates; the heading angle and horizontal translation are then estimated. Ground feature points are used to improve the elevation, pitch, and roll accuracy. The performance of the proposed navigation system is evaluated through field experiments by checking the position drift and attitude errors during multiple two-minute simulated GNSS outages without additional IMU motion constraints such as the NHC (nonholonomic constraint). The experimental results show that the proposed scheme outperforms a conventional feature-point/grid-based SLAM with the same back end, especially at congested crossroads surrounded by slow-moving vehicles and rich in pole-like objects. The mean horizontal-plane position error during the two-minute GNSS outages was reduced by 38.5%, and the root mean square error by 35.3%. Therefore, the proposed pole-like-feature-based GNSS/IMU/LiDAR SLAM system can effectively fuse condensed information from these sensors to mitigate positioning and orientation errors, even in short-term GNSS-denied environments.
(This article belongs to the Special Issue Sensors and Sensor's Fusion in Autonomous Vehicles)
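As a simplified stand-in for the threshold-based first stage described above (the neural-network refinement is not shown), the sketch below flags clusters that are tall and thin as pole candidates; the height and radius thresholds are assumptions.

```python
# Minimal sketch of a threshold-style pole-candidate filter (a simplified
# stand-in for the paper's first stage; the neural-network refinement is not
# shown). Thresholds (height >= 1.5 m, radius <= 0.3 m) are assumptions.
import numpy as np

def is_pole_candidate(cluster, min_height=1.5, max_radius=0.3):
    """cluster: (N, 3) array of x, y, z points belonging to one segmented object."""
    height = cluster[:, 2].max() - cluster[:, 2].min()
    centroid_xy = cluster[:, :2].mean(axis=0)
    radius = np.linalg.norm(cluster[:, :2] - centroid_xy, axis=1).max()
    return height >= min_height and radius <= max_radius

# Example: a synthetic 3 m lamp-post-like cluster vs. a flat car-roof patch.
rng = np.random.default_rng(4)
pole = np.column_stack([0.05 * rng.standard_normal((300, 2)),
                        rng.uniform(0.0, 3.0, 300)])
car = np.column_stack([rng.uniform(-1.0, 1.0, (300, 2)),
                       1.5 + 0.05 * rng.standard_normal(300)])
print(is_pole_candidate(pole), is_pole_candidate(car))  # True False
```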

19 pages, 4239 KB  
Article
Design of a Predictive RBF Compensation Fuzzy PID Controller for 3D Laser Scanning System
by Minghui Zhao, Xiaobin Xu, Hao Yang and Zhijie Pan
Appl. Sci. 2020, 10(13), 4662; https://doi.org/10.3390/app10134662 - 6 Jul 2020
Cited by 12 | Viewed by 3626
Abstract
A new proportional-integral-derivative (PID) control method is proposed for a 3D laser scanning system converted from a 2D Lidar with a pitching motion device. It combines the advantages of a fuzzy algorithm, a radial basis function (RBF) neural network, and a predictive algorithm to control the pitching motion of the 2D Lidar quickly and accurately. The proposed method adopts the RBF neural network with feedback compensation to eliminate the unknown nonlinear part of the Lidar pitching motion, while the PID parameters are adaptively adjusted by the fuzzy algorithm. The predictive control algorithm is then adopted to optimize the overall controller output in real time. Simulation results show that the step response time of the Lidar pitching motion system using this control method is reduced from 15.298 s to 1.957 s with a steady-state error of 0.07°. The system also retains favorable response performance for sinusoidal and step inputs under model mismatch and large disturbances. The proposed control method can therefore improve system performance and effectively control the pitching motion of the 2D Lidar.
(This article belongs to the Section Mechanical Engineering)
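The sketch below illustrates the general structure of a PID output combined with an RBF compensation term; the fuzzy gain scheduling and the predictive optimization of the paper are omitted, and all gains, centres, and widths are illustrative assumptions.

```python
# Minimal sketch of the general idea (PID output plus an RBF compensation
# term); the paper's fuzzy gain scheduling and predictive optimization are
# omitted, and all gains, centres and widths below are illustrative assumptions.
import numpy as np

class RBFCompensatedPID:
    def __init__(self, kp, ki, kd, centres, widths, weights, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.centres, self.widths, self.weights = centres, widths, weights
        self.integral, self.prev_error = 0.0, 0.0

    def rbf(self, x):
        """Gaussian RBF network approximating the unmodelled nonlinearity."""
        phi = np.exp(-((x - self.centres) ** 2) / (2 * self.widths ** 2))
        return float(self.weights @ phi)

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        pid_out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return pid_out - self.rbf(measurement)   # subtract estimated nonlinearity

# One control step for a pitch-angle command of 10 deg from rest.
ctrl = RBFCompensatedPID(kp=2.0, ki=0.5, kd=0.1,
                         centres=np.linspace(-30, 30, 7),
                         widths=np.full(7, 10.0),
                         weights=np.zeros(7), dt=0.01)
print(ctrl.step(setpoint=10.0, measurement=0.0))
```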

14 pages, 9883 KB  
Article
Payload for Contact Inspection Tasks with UAV Systems
by L. M. González-deSantos, J. Martínez-Sánchez, H. González-Jorge, M. Ribeiro, J. B. de Sousa and P. Arias
Sensors 2019, 19(17), 3752; https://doi.org/10.3390/s19173752 - 30 Aug 2019
Cited by 17 | Viewed by 5992
Abstract
This paper presents a payload designed to perform semi-autonomous contact inspection tasks without any positioning system external to the UAV, such as a global navigation satellite system (GNSS) or a motion capture system, making inspection possible in challenging GNSS-denied sites. The payload includes two LiDAR sensors that measure the distance between the UAV and the target structure as well as their relative orientation angle. The system uses this information to control the approach of the UAV to the structure and the contact between them, actuating on the pitch and yaw signals. This control is performed using a hybrid automaton whose states represent all possible UAV statuses during the inspection tasks, with a different control strategy in each state. An ultrasonic gauge was used as the inspection sensor of the payload to measure the thickness of a metallic sheet; this sensor requires stable contact to collect reliable measurements. Several tests performed on the system produced accurate results, showing that it is able to maintain stable contact with the target structure.
(This article belongs to the Section Remote Sensors)
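A minimal sketch of how distance and relative yaw to a flat target could be recovered from two parallel rangefinders with a known baseline; the geometry and baseline value are assumptions, and this is not the payload's firmware.

```python
# Minimal sketch (assumed geometry, not the payload's firmware): recovering the
# distance to a flat target and the UAV's relative yaw angle from two parallel
# rangefinders separated by a known baseline.
import math

def distance_and_yaw(d_left, d_right, baseline=0.4):
    """d_left, d_right: ranges in metres; baseline: sensor separation in metres."""
    distance = 0.5 * (d_left + d_right)           # mean range to the surface
    yaw = math.atan2(d_right - d_left, baseline)  # positive = nose turned left
    return distance, math.degrees(yaw)

print(distance_and_yaw(1.02, 0.98))  # ≈ (1.0 m, -5.7 deg)
```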

19 pages, 3057 KB  
Article
Estimation of the Motion-Induced Horizontal-Wind-Speed Standard Deviation in an Offshore Doppler Lidar
by Miguel A. Gutiérrez-Antuñano, Jordi Tiana-Alsina, Andreu Salcedo and Francesc Rocadenbosch
Remote Sens. 2018, 10(12), 2037; https://doi.org/10.3390/rs10122037 - 14 Dec 2018
Cited by 26 | Viewed by 5268
Abstract
This work presents a new methodology to estimate the motion-induced standard deviation, and the related turbulence intensity, of the horizontal wind speed retrieved by means of the velocity-azimuth-display (VAD) algorithm applied to the conical scanning pattern of a floating Doppler lidar. The method considers a ZephIR™ 300 continuous-wave focusable Doppler lidar and does not require access to individual line-of-sight radial-wind information along the scanning pattern. It combines a software-based VAD and motion simulator with a statistical recursive procedure to estimate the horizontal wind speed standard deviation, as well as the turbulence intensity, due to floating lidar buoy motion. The motion-induced error is estimated on the simulator side using basic motional parameters, namely the roll/pitch angular amplitude and period of the floating lidar buoy, together with reference wind speed and direction measurements at the study height. The impact of buoy motion on the retrieved wind speed and related standard deviation is compared against a reference sonic anemometer and a reference fixed lidar over a 60-day period at the IJmuiden test site (the Netherlands). Individual case examples and an analysis of the overall campaign are presented. After the correction, the mean deviation in the horizontal wind speed standard deviation between the reference and the floating lidar improved by about 70%, from 0.14 m/s (uncorrected) to −0.04 m/s (corrected), which demonstrates the effectiveness of the method. Equivalently, the error in the estimated turbulence intensity (3–20 m/s range) was reduced from 38% (uncorrected) to 4% (corrected).
(This article belongs to the Special Issue Remote Sensing of Atmospheric Conditions for Wind Energy Applications)
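For reference, the sketch below is a minimal velocity-azimuth-display (VAD) retrieval of the horizontal wind speed from one conical scan; it is not the paper's motion simulator, and the 60° elevation (30° cone half-angle) and beam count are assumptions.

```python
# Minimal sketch of a VAD retrieval (not the paper's motion simulator):
# fit v_r(az) = u*cos(el)*sin(az) + v*cos(el)*cos(az) + w*sin(el) over one
# conical scan and recover the horizontal wind speed. The 60 deg elevation
# and 50 beams per revolution are assumptions.
import numpy as np

def vad_horizontal_speed(azimuths_deg, radial_speeds, elevation_deg=60.0):
    az = np.radians(azimuths_deg)
    el = np.radians(elevation_deg)
    A = np.column_stack([np.cos(el) * np.sin(az),
                         np.cos(el) * np.cos(az),
                         np.full(az.size, np.sin(el))])
    u, v, w = np.linalg.lstsq(A, radial_speeds, rcond=None)[0]
    return np.hypot(u, v)

# Example scan: 10 m/s westerly wind, no vertical motion, light noise.
rng = np.random.default_rng(5)
az = np.arange(0, 360, 7.2)                       # 50 beams per revolution
vr = 10.0 * np.cos(np.radians(60)) * np.sin(np.radians(az))
vr += 0.1 * rng.standard_normal(az.size)
print(vad_horizontal_speed(az, vr))               # ≈ 10 m/s
```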
