Figure 1.
The Mars rover takes photos and extracts the skyline.
Figure 2.
Flowchart of skyline localization of an unmanned boat [9]. Due to the turbulence of the unmanned boat, the skyline in the originally captured image is heavily distorted; the second step, calibration, corrects this distortion.
Figure 3.
A fish-eye camera captures urban images and extracts the skyline.
Figure 4.
Enumeration method for estimating camera orientation. Combinations of parameters within a predetermined range are tested, and the calibrated skyline with the highest matching score determines the camera orientation.
Figure 5.
Camera pose estimation based on semantic segmentation. Through semantic segmentation of the image, various geomorphic regions are identified, and the camera orientation is determined by matching them with a terrain model containing GIS data.
Figure 6.
Concavity features of the skyline.
Figure 7.
The effect of clustering the DEM skyline dataset. (Top): The terrain of the localization area; and (Bottom) results of clustering DEM skylines in the two directions, respectively. The green points are the sampling points of the DEM skyline, and the red points are the tolerance regions. The sampling points of a skyline differ between the two directions due to the influence of the terrain.
Figure 8.
The lapel point of a skyline: (left) left lapel point of skyline; and (right) right lapel point of skyline.
Figure 9.
The flow chart of the proposed method.
Figure 10.
The intense change of the skyline in a mountainous area. Although View1 and View2 are only 30 m apart, the skyline along the same direction changes greatly due to the proximity of the mountains.
Figure 11.
The flow chart of clustering lapel points.
Figure 12.
The angle from due north of a lapel point and the clockwise angle to the adjacent lapel point. The currently selected lapel point is the right lapel point indicated by the black arrow; its angle from due north is , and its clockwise angle to the adjacent lapel point is .
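As a rough illustration of the two angles referenced in the Figure 12 caption (the angle from due north and the clockwise angle to the adjacent lapel point), the sketch below computes both from local east/north coordinates. The function names and the coordinate convention are assumptions for illustration, not the paper's implementation.

```python
import math

def azimuth_from_north(observer, point):
    """Clockwise angle from due north (degrees) of `point` seen from `observer`.

    Both arguments are (east, north) coordinates in a local metric frame.
    Illustrative sketch only, not the paper's implementation.
    """
    de = point[0] - observer[0]   # east offset
    dn = point[1] - observer[1]   # north offset
    # atan2(east, north) gives 0 deg for due north, 90 deg for due east.
    return math.degrees(math.atan2(de, dn)) % 360.0

def clockwise_gap(azimuth_a, azimuth_b):
    """Clockwise angle from lapel point A to the adjacent lapel point B."""
    return (azimuth_b - azimuth_a) % 360.0

# Example: observer at the origin, two lapel points.
obs = (0.0, 0.0)
a = azimuth_from_north(obs, (100.0, 100.0))   # ~45 deg (north-east)
b = azimuth_from_north(obs, (100.0, -100.0))  # ~135 deg (south-east)
print(a, b, clockwise_gap(a, b))              # 45.0 135.0 90.0
```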
Figure 13.
(L-distance, R-distance) of lapel point: (left) DEM skyline with the position of a lapel point denoted by a red dot; and (right) the depth of the skyline in (left), where the red line denotes the difference in depth in (left), and the L-distance and R-distance are denoted by black arrows, representing the depth of left and right sides of lapel point, respectively.
Figure 14.
The lapel point matrix of the dataset. The size of the matrix is related to the DEM area and the skyline sampling density. The element in row i and column j of the matrix represents the number of lapel points on the skyline sampled at that grid cell.
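Figures 14 and 15 describe a grid-shaped lapel point matrix whose element at row i, column j stores how many lapel points the DEM skyline sampled at that cell contains. A minimal sketch of building such a matrix, assuming a hypothetical count_lapel_points helper and a dictionary of precomputed skylines keyed by (i, j):

```python
import numpy as np

def build_lapel_matrix(skylines, n_rows, n_cols, count_lapel_points):
    """Build the lapel point matrix: element (i, j) is the number of lapel
    points on the DEM skyline sampled at grid cell (i, j).

    `skylines` maps (i, j) -> skyline data for every DEM sampling point;
    `count_lapel_points` is an assumed helper that returns the number of
    lapel points of a single skyline.
    """
    matrix = np.zeros((n_rows, n_cols), dtype=int)
    for (i, j), skyline in skylines.items():
        matrix[i, j] = count_lapel_points(skyline)
    return matrix
```

Plotting this matrix as an image directly gives the heatmap of Figure 15.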
Figure 15.
The heatmap of the lapel points. The size of the heatmap is related to the DEM area and the skyline sampling density. Each dot in the heatmap represents the number of lapel points of the skyline in row i and column j.
Figure 16.
The heatmap layered according to the number of lapel points.
Figure 17.
The influence of the position of an observation point on the sequence of lapel points: (left) Point A is the current observation point; and (right) the lapel point sequence as observed from a slightly moved A. The dotted circle represents the original position of A.
Figure 18.
The clustering of the lapel point sequence: (left) Distribution of skylines with 8 lapel points; and (right) the result of clustering by lapel point sequence and distance. It can be seen that the skylines with 8 lapel points are divided into three categories.
Figure 19.
A cluster of lapel points: (left) The heatmap of lapel points within 1 km. The color of each point denotes the corresponding number of lapel points; and (right) the clustering result of (left). The color of each point denotes the corresponding category of lapel points.
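Figures 18 and 19 group skyline sampling points that share the same number of lapel points and lie close together. The paper does not name the clustering algorithm here, so the sketch below uses scikit-learn's DBSCAN as a stand-in; eps_m and min_samples are illustrative values.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_sampling_points(coords, lapel_counts, eps_m=100.0, min_samples=3):
    """Cluster skyline sampling points that share the same lapel point count
    and lie close together (as in the right panel of Figure 19).

    coords: (N, 2) array of east/north positions in metres.
    lapel_counts: length-N array with the number of lapel points per skyline.
    Returns an array of cluster labels (-1 means noise), one per point.
    """
    coords = np.asarray(coords, dtype=float)
    lapel_counts = np.asarray(lapel_counts)
    labels = np.full(len(coords), -1, dtype=int)
    next_label = 0
    for count in np.unique(lapel_counts):
        idx = np.where(lapel_counts == count)[0]
        sub_labels = DBSCAN(eps=eps_m, min_samples=min_samples).fit_predict(coords[idx])
        # Offset labels so clusters from different lapel counts do not collide.
        for k, lbl in zip(idx, sub_labels):
            labels[k] = next_label + lbl if lbl >= 0 else -1
        if sub_labels.max() >= 0:
            next_label += sub_labels.max() + 1
    return labels
```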
Figure 20.
Feature extraction of skyline lapel points: (First row) n skylines in the same category, which the clustering method considers to have the same geometry; (Second row) based on the lapel point sequence of the first skyline in the category, the lapel point sequences of the remaining skylines are adjusted to correct the changes caused by the positions of the observation points; (Third row) statistics of the type and angle of each lapel point; and (Fourth row) the skyline lapel point index, which records the type and angle range of each lapel point category.
Figure 21.
The effect of extracting the skyline and lapel points. Red curves denote the skyline and the ridge lines, where the top red curve is the skyline; blue squares denote lapel points after manual verification, with ambiguous or incorrect lapel points removed.
Figure 22.
The panoramic skyline retrieval process. The retrieval consists of four steps, with each step filtering out some candidate points, and the last step sorting them according to the matching results.
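The four-step retrieval in Figure 22 can be read as a cascade of increasingly selective filters followed by a final ranking. A minimal sketch of that control flow, with the filter predicates and the matching score left as placeholders for the criteria described in the paper:

```python
def retrieve_candidates(image_features, candidates, filters, match_score):
    """Staged retrieval: each filter removes candidate points that cannot
    match, and the surviving candidates are ranked by their matching score.

    `filters` is a list of callables (image_features, candidate) -> bool;
    `match_score` is a callable (image_features, candidate) -> float.
    Both are placeholders for the concrete criteria of the method.
    """
    for keep in filters:
        candidates = [c for c in candidates if keep(image_features, c)]
    return sorted(candidates, key=lambda c: match_score(image_features, c), reverse=True)
```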
Figure 23.
Coarse matching: (Left) the state before the DEM skyline is coarsely matched with the image skyline; and (Right) the state after coarse matching between DEM skyline and image skyline. Through the coarse matching, the candidate yaw in the image can be determined at the same time.
Figure 24.
For the same DEM skyline (enclosed in larger dotted red rectangles), the coarse matching of the image skyline may have multiple yaw candidates (enclosed in the smaller dotted red rectangles).
Figure 25.
Camera orientation causes sinusoidal distortion in the image. For a sloping path, the two sides of the picture are uplifted, whereas the middle part sinks to the right.
Figure 26.
Panoramic view from an unmanned aerial vehicle [3]. Due to the influence of sea waves, the images have a sinusoidal distortion.
Figure 27.
The camera orientation estimation process: (Top-Left) find the DEM skyline lapel points matching the image lapel points (i.e., the result of the coarse positioning outlined in Section 3.3.2); (Top-Right) calculate the differences of lapel point heights; (Bottom-Left) fit the sine curve; and (Bottom-Right) the physical meaning of the sine curve parameters. The variable A represents the magnitude of the superposition of the roll and pitch vectors in the camera orientation, and the phase represents the direction of the superposition.
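The sine fitting step in Figure 27 (Bottom-Left) can be sketched with scipy.optimize.curve_fit. The model y = A·sin(x + φ) + h follows the caption's description, with amplitude A for the magnitude of the roll/pitch superposition, phase φ for its direction, and a vertical offset h as discussed with Figure 28; the paper's exact parameterization may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

def sine_model(x, A, phi, h):
    """Sinusoidal distortion model: amplitude A, phase phi, vertical offset h."""
    return A * np.sin(x + phi) + h

def fit_lapel_height_difference(azimuths_deg, height_diffs):
    """Fit the sine curve to the height differences between matched image and
    DEM lapel points, sampled over the azimuth range. Returns (A, phi, h)."""
    x = np.radians(np.asarray(azimuths_deg, dtype=float))
    popt, _ = curve_fit(sine_model, x, np.asarray(height_diffs, dtype=float),
                        p0=[1.0, 0.0, 0.0])
    return popt
```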
Figure 28.
The sine fitting results of mismatched skyline pairs: (Top row) the vertical offset h is large and the entire sine function shifts upward; and (Bottom row) the vertical offset of the sine function is normal.
Figure 29.
Calibration effect of sinusoidal distortion: (Left) The sine curve as fitted in Section 3.3.3; and (Right) the effect of calibration. It can be seen that there is a significant difference between the DEM skyline and the image skyline before and after calibration.
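A sketch of the calibration step illustrated in Figure 29: the distortion predicted by the fitted sine curve is subtracted from the image skyline at every azimuth before it is compared with the DEM skyline. The same A, phi, h parameterization as in the fitting sketch above is assumed.

```python
import numpy as np

def calibrate_skyline(azimuths_deg, skyline_heights, A, phi, h):
    """Remove the fitted sinusoidal distortion from an image skyline.

    The distortion predicted by the fitted sine curve is subtracted from the
    skyline height at every azimuth, yielding the calibrated skyline that is
    then compared against the DEM skyline (right panel of Figure 29).
    """
    x = np.radians(np.asarray(azimuths_deg, dtype=float))
    distortion = A * np.sin(x + phi) + h
    return np.asarray(skyline_heights, dtype=float) - distortion
```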
Figure 30.
An example of matching failure after sine correction. The features of the image skyline and the DEM skyline are matched in the two maps, but after correction the difference between the skyline pairs is significant, indicating that the DEM sampling points used for comparison are not the locations where the images were taken.
Figure 31.
The DEM data of the experimental area. The red dots in the figure represent the places where the pictures were taken in the experiments.
Figure 32.
Rendering of DEM skyline generated by 3D model: (Top) Rendering of skyline generated from DEM data at selected sampling points, where the white area is the sky, the non-white area is the peak, and the gray scale represents the distance between the mountain and the observation point; (Bottom) pseudo-color map of (Top) to facilitate the viewing of details of the peak.
Figure 33.
Equipment used in the experiment: (Left) The camera can be rotated and images are captured on command; and (Right) installation of the camera on the experimental vehicle.
Figure 34.
Images captured by the experimental vehicle. The FoV of the camera is and an image was captured after every rotation of the camera. The resolution of each image is . It can be seen that there are some overlaps between adjacent images.
Figure 35.
The panoramic image. Using the image stitching algorithm, the sub-images in Figure 34 are merged into a panorama. The left side is the image captured when the optical axis of the camera coincides with the central axis of the vehicle.
Figure 36.
Orientation estimation error. It can be seen that the estimated orientation is very close to that of the inertial navigation system.
Figure 37.
The positioning accuracy and efficiency when the orientation is estimated versus when the orientation is provided.
Figure 38.
Comparing the efficiency of the proposed orientation estimation method with the enumeration method in [12]. The ranges of roll and pitch of the enumeration method are both [−5°, 5°] with a step size of 0.1°, and the yaw step size is 1°.
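Using the ranges stated in the Figure 38 caption, a quick count shows how many pose hypotheses the enumeration method in [12] must evaluate per candidate location, which is what the proposed sine-fitting approach avoids. The assumption that yaw spans a full 360° is ours, based on the panoramic setting.

```python
# Pose hypotheses evaluated by the enumeration method in [12], using the
# ranges stated in Figure 38: roll and pitch in [-5 deg, 5 deg] with a
# 0.1 deg step; yaw with a 1 deg step (assumed here to cover 360 deg).
roll_steps = pitch_steps = round(10 / 0.1) + 1   # 101 values from -5 to 5 deg
yaw_steps = 360                                  # 1 deg step over a full turn
total = roll_steps * pitch_steps * yaw_steps
print(total)  # 3,672,360 skyline matchings per candidate location
```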
Figure 39.
The localization accuracy of the proposed method is similar to that of the traversal method.
Figure 40.
Comparing the efficiency of lapel point clustering with the traversal method.
Figure 41.
The effect of the number of lapel points on the retrieval efficiency. If the lapel points in the skyline are removed, then the retrieval efficiency is greatly reduced.
Figure 42.
The influence of adjacent angle noise on localization accuracy.
Figure 43.
The influence of incorrect lapel points: (Left) Influence of incorrect lapel point type on localization accuracy; and (Right) influence of surplus lapel points on localization accuracy.
Table 1.
DEM lapel points information list.
| Item Name | Value Range | Item Explanation |
|---|---|---|
| Type | {Left, Right} | Lapel type |
| Angle | [0, 360] | Angle from due north |
| Angle | [0, 360] | Angle between adjacent lapels, clockwise |
| (L-Dis, R-Dis) | ([0, ∞], [0, ∞]) | Distance between left and right sides of the lapel |
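Table 1 can be mirrored by a simple in-memory record per DEM lapel point. The sketch below is a possible representation; the field names are paraphrased from the table (the two angle items correspond to the angle from due north and the clockwise angle between adjacent lapels).

```python
from dataclasses import dataclass
from typing import Literal

@dataclass
class LapelPoint:
    """One DEM lapel point record, mirroring the fields in Table 1."""
    lapel_type: Literal["Left", "Right"]   # Type: left or right lapel
    angle_north: float                     # Angle from due north, [0, 360)
    angle_adjacent: float                  # Clockwise angle to the adjacent lapel, [0, 360)
    l_distance: float                      # Depth on the left side of the lapel
    r_distance: float                      # Depth on the right side of the lapel
```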
Table 2.
Node contents of the index.
| Item Name | Value Range | Item Explanation |
|---|---|---|
| Type | {Left, Right} | Lapel type |
| (Min, Max) | ([0, 360], [0, 360]) | Angle range between adjacent lapels, clockwise |
Table 3.
The orientation of the image in Figure 35.
| Parameter | Value |
|---|---|
| Longitude | |
| Latitude | |
| Pitch | |
| Roll | |