Article

DEM Generation With a Scale Factor Using Multi-Aspect SAR Imagery Applying Radargrammetry

by Shanshan Feng, Yun Lin, Yanping Wang, Yanhui Yang, Wenjie Shen, Fei Teng and Wen Hong *
1 Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100190, China
2 Key Laboratory of Technology in Geo-spatial Information Processing and Application System, Chinese Academy of Sciences, Beijing 100190, China
3 University of Chinese Academy of Sciences, Beijing 101408, China
4 School of Electronic and Information Engineering, North China University of Technology, Beijing 100144, China
5 Beijing Institute of Remote Sensing Equipment, Beijing 100854, China
* Author to whom correspondence should be addressed.
Remote Sens. 2020, 12(3), 556; https://doi.org/10.3390/rs12030556
Submission received: 6 January 2020 / Revised: 30 January 2020 / Accepted: 5 February 2020 / Published: 7 February 2020
(This article belongs to the Special Issue Airborne SAR: Data Processing, Calibration and Applications)

Abstract

Digital elevation model (DEM) generation from multi-aspect synthetic aperture radar (SAR) imagery using radargrammetry has become a research hotspot. The traditional radargrammetric method solves the rigorous radar projection equations to obtain the three-dimensional coordinates of targets. In this paper, we propose a new DEM generation method based on the offset between multi-aspect images formed on the ground plane. A ground object is projected to different positions under different viewing aspect angles whenever its height differs from the height of the imaging plane. We derive the linear relationship between the offset of imaging positions and the height of the object and obtain a scale factor that expresses it. Height information can then be retrieved directly from the offset of imaging positions through the DEM extraction model presented in this paper, so the point-by-point solution of nonlinear equations is avoided. Real C-band airborne circular SAR images are used to verify the proposed approach. When the extracted DEM is applied in the multi-aspect imaging process, the superposition of multi-aspect images no longer defocuses, enabling finer observation of the scanned scene.

Graphical Abstract

1. Introduction

Synthetic aperture radar (SAR) imagery is widely utilized in remote sensing because of its all-day, all-weather, and high-resolution imaging capability [1]. Digital elevation model (DEM) generation using high-resolution SAR images has accordingly become an important application.
DEM generation techniques based on SAR imagery can be roughly divided into two categories: interferometry [2,3] and radargrammetry [4,5]. In the interferometric configuration, phase data are converted into height values [6]. This places demanding requirements on the system and the experimental conditions; for example, the two images of the same area must be acquired with slightly different incidence angles [7].
The radargrammetric approach was first applied in the 1950s [8]. Compared with interferometry, radargrammetry makes use of image intensity rather than phase information. It can be applied to images acquired with varying incidence angles or varying aspect angles. Varying incidence angles are common in radargrammetric applications, while varying aspect angles along the SAR trajectory remain to be explored for DEM generation with the emergence of the multi-aspect imaging mode [9]. Multi-aspect SAR refers to a wide angular-aperture imaging mode in which the radar beam always points to the center of the scene at different viewing aspect angles [10]. This imaging mode satisfies the increasing demand for finer observation, where some targets have to be studied from multiple aspect angles. For example, circular SAR (CSAR), which performs 360-degree observation, is one of the representative multi-aspect SAR systems [11].
Multi-aspect images used for DEM generation can also employ the conventional radargrammetric method. The traditional way is to find the imaging positions of the target on a correlation-matched image pair formed on the slant range plane or the ground plane from different viewing angles [12,13]. The three-dimensional coordinates of the target are taken as unknowns and solved from the nonlinear equations given by the range-Doppler (RD) model [14] or the rational polynomial coefficient (RPC) model [15]. This process is computationally expensive. From another point of view, some authors estimate, at each resolution cell, the height that maximizes the correlation coefficient between two image patches over all possible height values [16]. However, evaluating the correlation along a set of height values at every resolution cell is time consuming.
When radar images from different aspect angles are processed on the slant range plane, each image has its own imaging coordinate system. However, if the multi-aspect images are all formed on the ground plane, images from different aspect angles share the same coordinate system. In this case, the height difference between the true height of the target and the fixed-elevation imaging plane is considered the only cause of the shift of imaging positions [9]. If the height of the imaging plane is not equal to the height of the object, the imaging positions from different aspect angles will differ [17]. Here we propose a new DEM generation method for multi-aspect images generated on the ground plane. We derive the linear relationship between the offset of imaging positions and the height difference and deduce a scale factor to express it. The offset between images from different aspect angles can thus be used to retrieve the elevation of the scene, and the solution of the RD equations required by the traditional procedure is avoided.
CSAR images are used to validate the proposed method in this paper. If the extracted DEM applied in the imaging process equals the true height of the target, the target is projected to the same position from every aspect angle. The superposition of images over the entire circle, obtained by magnitude accumulation, is then no longer defocused, and a high-quality all-aperture image carrying terrain information is obtained [18]. In this way, the presented method is verified to be effective.
The rest of the paper is organized as follows. In Section 2, we present the multi-aspect airborne SAR observation geometry and the basic principles of DEM generation, including the quantitative relationship between the offset of imaging positions and the height difference. The DEM extraction method based on multi-aspect SAR imagery is proposed in Section 3. Experiments and results are shown in Section 4. A discussion is given in Section 5, and conclusions are drawn in Section 6.

2. Basic Principle of DEM Generation

Herein, we introduce the geometry of the elevation extraction model for multi-aspect SAR. As mentioned above, well-known radargrammetry [19,20] solves the range-Doppler equations. Our method instead targets the offset of imaging positions in images formed directly on the ground plane. In ground coordinates, the offset of imaging positions between different aspect images is proportional to the height of the target. We therefore establish a model in which a scale factor k describes the relationship between the height difference and the offset of imaging positions from different aspect angles. To make the derivation clearer, the definitions of the equivalent incidence angle and the equivalent aspect angle are given in this section.

2.1. Multi-Aspect Airborne SAR Observation Geometry

The observation geometry of multi-aspect airborne SAR is shown in Figure 1. A_1(X_1, Y_1, Z_1) and A_2(X_2, Y_2, Z_2) are the positions of the SAR platform at two viewing aspect angles. The velocity vector varies in both the yaw direction and the pitch direction: the yaw direction refers to the component of the velocity vector in the plane parallel to the scene plane, and the pitch direction refers to the component outside that plane. We consider a target P(x_p, y_p, z_p) above the imaging plane, whose height is H_ref. Δh denotes the height difference between the true height of the target P and the height of the imaging plane. B_i is the point on the extension line passing through position A_i at which the radial velocity relative to the target is zero.
Determining the imaging position in image space for the target P(x_p, y_p, z_p) in object space is a projection from a three-dimensional space to a two-dimensional space. This SAR imaging mechanism can be described by the range-Doppler model, and the imaging position of each point in the scene is computed from the intersection of two radar rays with two different viewing aspect angles. This principle contains two basic equations [21].
When the target is located in the plane perpendicular to the flying direction of the SAR platform, its Doppler frequency is zero; there is also a slant-range constraint among the SAR platform, the target, and the imaging position. In a local coordinate system, this group of equations can be expressed as
v_{sx}\,(X_s - X_p) + v_{sy}\,(Y_s - Y_p) + v_{sz}\,(Z_s - Z_p) = 0
(X_s - X_p)^2 + (Y_s - Y_p)^2 + (Z_s - Z_p)^2 = R^2    (1)
where (X_p, Y_p, Z_p) are the coordinates of the target in the local coordinate system, (X_s, Y_s, Z_s) are the coordinates of the SAR platform, (v_sx, v_sy, v_sz) are the components of the SAR platform velocity, and R is the distance between the SAR platform and the target.
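To make the projection of (1) onto a fixed-height imaging plane concrete, the following sketch numerically solves the zero-Doppler and slant-range constraints for the imaging position of a known scatterer. It is a minimal illustration under assumed inputs (the variable names, the example geometry, and the use of scipy.optimize.fsolve are ours), not the implementation used in this paper.

```python
import numpy as np
from scipy.optimize import fsolve

def imaging_position(sar_pos, sar_vel, target, h_ref):
    """Solve (1) for the imaging position of `target` on the plane z = h_ref.

    sar_pos, sar_vel : platform position and velocity at the zero-Doppler
                       instant for this target (3-vectors, local coordinates)
    target           : true 3-D coordinates of the scatterer
    h_ref            : height of the imaging (reference) plane
    """
    R = np.linalg.norm(sar_pos - target)          # slant range to the true target

    def equations(xy):
        p = np.array([xy[0], xy[1], h_ref])       # candidate point on the imaging plane
        doppler = np.dot(sar_vel, sar_pos - p)    # zero-Doppler condition
        rng = np.linalg.norm(sar_pos - p) - R     # slant-range condition
        return [doppler, rng]

    x, y = fsolve(equations, [target[0], target[1]])   # start at the target's ground position
    return np.array([x, y, h_ref])

# Example geometry (made up): target 15 m above a 20 m reference plane,
# platform placed so that the zero-Doppler condition holds for the true target.
sar_pos = np.array([3000.0, -4000.0, 3000.0])
sar_vel = np.array([80.0, 60.0, 0.0])
target  = np.array([0.0, 0.0, 35.0])
print(imaging_position(sar_pos, sar_vel, target, h_ref=20.0))
```

With the reference plane 15 m below the target, the returned position is displaced from (0, 0) toward the sensor by roughly the height difference divided by the tangent of the incidence angle, which is exactly the effect quantified in Section 2.2.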
Under different viewing geometries, the imaging positions P_1 and P_2 of the target determined by (1) deviate from its true position. The distance between the imaging positions P_1 and P_2 is called the offset Δr. We have found that, on the ground plane, the offset Δr is proportional to the height difference Δh. The specific derivation of the scale factor, denoted by k, is given in Section 2.2.

2.2. Derivation of Scale Factor

If the target point is observed from different aspect angles, the imaging positions are not aligned when a height difference exists. The imaging position therefore depends on the aspect angle, the incidence angle, and the height difference. In practice, the incidence angle and the aspect angle of the radar beam differ for targets at different positions. To better describe the imaging positions of a target point, the definitions of the equivalent incidence angle and the equivalent aspect angle corresponding to a specific target point are first introduced in this part.
Some detailed geometric relations under a certain viewing aspect angle are introduced first. As shown in Figure 2, when the SAR platform is at A_i, θ_i is the incidence angle and P_i is the imaging position. P′(x_p, y_p, H_ref) is the point with the same two-dimensional coordinates as the target P. Under the far-field condition, the circular arc (drawn as a black arc) between the target P and the imaging position P_i can be approximately replaced by a right angle (drawn with a red line). Here i = 1, 2 corresponds to the two aspect angles.
The intersection line refers to the line between the zero-Doppler plane passing through the target and the scene plane for the viewing position A_i. Line 1 denotes the line connecting the target P and the imaging position P_i. Line 2 denotes the line connecting P_i and the point P′ with the same two-dimensional coordinates as the target P in the XOY plane. According to the geometric relationship shown in Figure 2, the angle between Line 1 and Line 2 equals the incidence angle θ_i, so this angle is defined as the equivalent incidence angle. The equivalent aspect angle φ_i is defined as the angle between Line 2 and the positive direction of the X-axis.
With these two angles defined, |P_1 P′| and |P_2 P′| (in Figure 1) can be expressed by the height difference Δh and the equivalent incidence angles, as shown in (2) and (3).
|P_1 P'| = \frac{|\Delta h|}{\tan\theta_1}    (2)
|P_2 P'| = \frac{|\Delta h|}{\tan\theta_2}    (3)
Under the two viewing aspect angles, the coordinates of the imaging positions P_1 and P_2 are, respectively,
P_1 = \left( x_p + |P'P_1|\cos\varphi_1,\; y_p + |P'P_1|\sin\varphi_1,\; H_{ref} \right)    (4)
P_2 = \left( x_p + |P'P_2|\cos\varphi_2,\; y_p + |P'P_2|\sin\varphi_2,\; H_{ref} \right)    (5)
The distance Δr between P_1 and P_2 is:
\Delta r = |P_1 P_2| = \left| \vec{P'P_2} - \vec{P'P_1} \right|
= \left| \left( |P'P_2|\cos\varphi_2 - |P'P_1|\cos\varphi_1,\; |P'P_2|\sin\varphi_2 - |P'P_1|\sin\varphi_1,\; 0 \right) \right|
= \sqrt{ |P'P_2|^2 + |P'P_1|^2 - 2\,|P'P_2|\,|P'P_1| \left( \sin\varphi_2\sin\varphi_1 + \cos\varphi_2\cos\varphi_1 \right) }
= \sqrt{ |P'P_2|^2 + |P'P_1|^2 - 2\,|P'P_2|\,|P'P_1|\cos(\varphi_2 - \varphi_1) }
= \sqrt{ \left(\frac{|\Delta h|}{\tan\theta_2}\right)^2 + \left(\frac{|\Delta h|}{\tan\theta_1}\right)^2 - 2\,\frac{|\Delta h|}{\tan\theta_2}\cdot\frac{|\Delta h|}{\tan\theta_1}\cos(\varphi_2 - \varphi_1) }
= |\Delta h| \cdot \sqrt{ \frac{1}{\tan^2\theta_2} + \frac{1}{\tan^2\theta_1} - \frac{2\cos(\varphi_2 - \varphi_1)}{\tan\theta_2\,\tan\theta_1} }    (6)
It can be rewritten as
|\Delta h| = \frac{\tan\theta_1 \cdot \tan\theta_2}{\sqrt{\tan^2\theta_1 + \tan^2\theta_2 - 2\tan\theta_1\tan\theta_2\cos(\varphi_1 - \varphi_2)}} \cdot \Delta r    (7)
Here, we let
k = \frac{\tan\theta_1 \cdot \tan\theta_2}{\sqrt{\tan^2\theta_1 + \tan^2\theta_2 - 2\tan\theta_1\tan\theta_2\cos(\varphi_1 - \varphi_2)}}    (8)
where k is the height-to-relative-shift scale factor. Essentially, this scale factor depends only on the equivalent incidence angles θ_1, θ_2 and the equivalent aspect angles φ_1, φ_2 of each target point. These two equivalent angles depend on the position of the target point, so the scale factor k in (8) actually varies spatially with the target position. We have found that within a certain range the scale factor k changes little and its variation can be neglected; this is verified in the experiment section. Hence, according to (7), the relative shift between imaging positions Δr (in meters) is linearly related to the height difference |Δh| through the factor k within a certain range. Moreover, the scale factor is derived from the instantaneous velocity vector, so its accuracy is high. According to the required elevation precision, the scale factor can be solved region by region; the solution of the scale factor k at any position in the scene is given in the next section.
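Before that, a quick numerical illustration of (7) and (8): the sketch below computes k from a pair of equivalent incidence angles and an aspect-angle difference and converts an offset between imaging positions into a height difference. The angle and offset values are invented for the example and do not come from the experiment in Section 4.

```python
import numpy as np

def scale_factor(theta1, theta2, dphi):
    """Scale factor k of (8): |dh| = k * dr.

    theta1, theta2 : equivalent incidence angles (radians)
    dphi           : difference of the equivalent aspect angles (radians)
    """
    t1, t2 = np.tan(theta1), np.tan(theta2)
    return t1 * t2 / np.sqrt(t1**2 + t2**2 - 2.0 * t1 * t2 * np.cos(dphi))

# hypothetical values: 45 and 50 degree incidence, 60 degree aspect separation
k = scale_factor(np.deg2rad(45.0), np.deg2rad(50.0), np.deg2rad(60.0))
dr = 3.0                 # measured offset between imaging positions, metres
print(k, k * dr)         # k and the retrieved |dh| in metres, per (7)
```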

3. Methodology

In the previous section, we derived the linear relationship between the offset of imaging positions at different aspect angles and the height difference for a single target point. We now calculate the height values for the whole scene using images from different aspect angles.
When radar images from different aspect angles are generated on the ground plane, the multi-aspect images share the same coordinate system. The height difference between the true height of the target and the fixed-elevation imaging plane then shifts the imaging positions on the imaging plane, and the offset between imaging positions grows as the height difference increases. We have found that the height difference |Δh| is proportional to the offset Δr between imaging positions. Based on the derivation of the scale factor k given above, this section presents an approach for DEM generation that uses the offsets of imaging positions in multi-aspect images together with the scale factor. The procedure, shown in Figure 3, is described in detail below. The BP algorithm [22] used in Step 1 is not introduced in this paper; the subsequent parts focus on Steps 2 to 4.
Step 1: Each image is formed directly in ground coordinates by the back projection (BP) algorithm.
Step 2: According to the velocity vector and position vector of the SAR platform at the instantaneous position, determine the imaging position of the target, calculate the equivalent incidence angle and equivalent aspect angle, and work out the scale factor k for the current scene. To simplify the computation of the scale factor k, its spatial variability needs to be analysed on real data for the application at hand.
Step 3: Normalized cross-correlation (NCC) is performed on two aspect images to obtain the offset between imaging positions.
Step 4: Based on (7), the absolute value of the height difference |Δh| is obtained, and the real positions, i.e., the three-dimensional coordinates of the targets, can be solved.
If N groups of aspect-angle image pairs are used, the offset with the maximum correlation coefficient is taken as the more reliable value, and the N height maps are combined into a final complete DEM.
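The conversion in Step 4 and the combination of the N height maps can be written compactly as follows. This is an illustrative sketch with assumed array conventions (NaN marking unmatched cells, −10 m as the fill value used later in Section 4, a single global k), not the authors' code.

```python
import numpy as np

def dem_from_offsets(offset_map, sign_map, k, h_ref, invalid=-10.0):
    """Step 4: convert a per-cell offset map (metres) into a height map.

    offset_map : offset between imaging positions, np.nan where unmatched
    sign_map   : +1 above / -1 below the imaging plane (see Section 3.3.1)
    k          : scale factor of (8), taken here as one global value
    h_ref      : height of the reference imaging plane (metres)
    """
    dem = h_ref + sign_map * k * offset_map
    return np.where(np.isnan(offset_map), invalid, dem)

def merge_height_maps(height_maps, corr_maps):
    """Keep, per cell, the height whose NCC correlation coefficient is largest."""
    heights = np.stack(height_maps)            # shape (N, rows, cols)
    corrs = np.stack(corr_maps)
    best = np.argmax(corrs, axis=0)            # index of the most reliable pair
    return np.take_along_axis(heights, best[None], axis=0)[0]
```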

3.1. Calculation of the Scale Factor

As shown in (8), the scale factor k is related to the equivalent incidence angles and the difference between the equivalent aspect angles. The coordinates (x_i, y_i) of the imaging position when the target is observed from a certain aspect angle can be obtained from the range-Doppler principle.

3.1.1. Calculation of Equivalent Incidence Angle

According to the geometric relationship shown in Figure 4, the equivalent incidence angle θ i corresponding to a viewing aspect angle can be obtained by (9).
\tan\theta_i = \frac{|PP'|}{|P'P_i|}    (9)
Then, we can get
\theta_i = \arctan\left(\frac{|PP'|}{|P'P_i|}\right) = \arctan\left(\frac{z_p}{\sqrt{(x_p - x_i)^2 + (y_p - y_i)^2}}\right)    (10)
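A direct transcription of (10), assuming that z_p is the target height measured above the reference imaging plane, so that it equals |PP′|; the function and argument names are our own.

```python
import numpy as np

def equivalent_incidence_angle(x_p, y_p, z_p, x_i, y_i):
    """Equation (10): z_p is the target height above the imaging plane,
    (x_i, y_i) the imaging position on that plane."""
    horizontal = np.hypot(x_p - x_i, y_p - y_i)   # |P'P_i|
    return np.arctan2(z_p, horizontal)            # radians
```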

3.1.2. Calculation of Difference Equivalent Aspect Angle

According to (8), it is not necessary to solve for the specific values of the equivalent aspect angles φ_1 and φ_2; only the difference between the two is needed. Figure 5 is the top view of the reference imaging plane. P_1(x_p1, y_p1) and P_2(x_p2, y_p2) are the imaging positions from the two viewing aspect angles, obtained from the range-Doppler imaging model, and φ_1 and φ_2 are the equivalent aspect angles under the two viewing aspect angles. The intersection line refers to the intersection of the zero-Doppler plane and the scene plane under one certain viewing aspect angle. Let Δφ = φ_1 − φ_2.
As is shown in Figure 5, (11) can be obtained by the law of cosines.
|P'P_1|^2 + |P'P_2|^2 - |P_1 P_2|^2 = 2\,|P'P_1|\,|P'P_2|\cos(\Delta\varphi)    (11)
Therefore, the difference between the two aspect angles, Δφ, can be solved from (12).
\Delta\varphi = \arccos\left(\frac{x_{p1}\,x_{p2} + y_{p1}\,y_{p2}}{\sqrt{x_{p1}^2 + y_{p1}^2}\cdot\sqrt{x_{p2}^2 + y_{p2}^2}}\right)    (12)
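Equation (12) is the angle between the two in-plane rays from the ground projection P′ to the imaging positions; the sketch below makes the reference point explicit (in (12) it is taken as the origin). Function and argument names are our own.

```python
import numpy as np

def aspect_angle_difference(p1_xy, p2_xy, p_ground_xy=(0.0, 0.0)):
    """Equation (12): angle between the rays P'->P1 and P'->P2 in the ground plane."""
    v1 = np.asarray(p1_xy, float) - np.asarray(p_ground_xy, float)
    v2 = np.asarray(p2_xy, float) - np.asarray(p_ground_xy, float)
    cos_dphi = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.arccos(np.clip(cos_dphi, -1.0, 1.0))   # clip guards against round-off
```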
To simplify the computation of the scale factor k, its spatial variability needs to be analysed on real data for the application. From an engineering point of view, point targets are set in the ground coordinates in accordance with the coordinate system used in the imaging process. Under any two viewing aspect angles, we compute the scale factor k point by point, obtaining the precise scale factor of each point target. Finally, we compare the scale factors of target points at different positions with the k obtained at the center of the scene and analyse the variation of the proportionality coefficient over the given scene area.

3.2. Calculation of Offset between the Imaging Positions in Different Aspect Images

In this part, image registration is conducted to find homologous points in the multi-aspect images, and the offset is calculated. NCC [23] is applied to obtain the best matching patches. (x_i, y_i) is the center of a sub-area in the primary image. The template window traverses the search area in the secondary image, and the normalized cross-correlation coefficient ρ between the template window and each corresponding area is calculated by (13).
\rho = \frac{\sum_{u=1}^{2n+1}\sum_{v=1}^{2n+1}(I_A - \bar{I}_A)(I_B - \bar{I}_B)}{\sqrt{\sum_{u=1}^{2n+1}\sum_{v=1}^{2n+1}(I_A - \bar{I}_A)^2 \cdot \sum_{u=1}^{2n+1}\sum_{v=1}^{2n+1}(I_B - \bar{I}_B)^2}}, \quad \rho \in (-1, 1)    (13)
where I_A and I_B are the intensities of the template window in the primary image and of the corresponding window in the secondary image, respectively, and Ī_A and Ī_B are the means of the corresponding areas.
The pixel corresponding to the maximum correlation coefficient in the secondary image gives the matching area, and the offset between the corresponding areas is
\Delta r = \sqrt{\beta_x^2\,(x_i - x_j)^2 + \beta_y^2\,(y_i - y_j)^2}    (14)
where β_x and β_y are the actual sizes of the imaging grid cell in the two directions.
If the homologous points obtained by image registration are precise, the offset between the two points is exact. However, a target can only be illuminated from certain aspects, so multiple pairs of aspect images are used to obtain homologous points. For each ground cell of the imaging grid, the patches with the maximum correlation coefficient are considered the more reliable ones.
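A minimal, brute-force matcher in the spirit of (13) and (14) is sketched below; it assumes single-channel intensity arrays and that the template and search windows stay inside the image, and it returns both the best correlation coefficient (used when merging pairs) and the offset in metres. A practical implementation would rather use the fast FFT-based NCC of [23].

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation coefficient of two equally sized patches, as in (13)."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return 0.0 if denom == 0 else float((a * b).sum() / denom)

def match_offset(primary, secondary, center, n, search, pixel_size=(0.5, 0.5)):
    """Best-match offset (metres) around `center`, following (14).

    primary, secondary : intensity images from two aspect angles (same grid)
    center             : (row, col) of the template centre in the primary image
    n                  : template half-size; the window is (2n+1) x (2n+1)
    search             : half-size of the search range in the secondary image
    pixel_size         : ground sample spacing (beta for rows, beta for columns), metres
    """
    r0, c0 = center
    template = primary[r0 - n:r0 + n + 1, c0 - n:c0 + n + 1]
    best = (-2.0, r0, c0)                          # (rho, row, col) of the best match
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r, c = r0 + dr, c0 + dc
            patch = secondary[r - n:r + n + 1, c - n:c + n + 1]
            rho = ncc(template, patch)
            if rho > best[0]:
                best = (rho, r, c)
    rho, r, c = best
    offset = np.hypot(pixel_size[0] * (r - r0), pixel_size[1] * (c - c0))
    return rho, offset
```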

3.3. Elevation Computation and Solution to Three-Dimensional Coordinates

The offset is converted to the absolute value of the height difference directly by the elevation extraction model proposed in Section 2.2.
The three-dimensional coordinates then need to be solved for reconstruction. Δh can be positive or negative depending on whether the target is above or below the imaging plane. First, the method for discriminating the sign of Δh is given, from which the Z coordinate of the target is obtained. The two-dimensional coordinates are then solved from a system of equations; two sets of solutions are obtained, and the correct one is selected from the relative relationship between the target and the imaging plane.

3.3.1. Determination of Z Coordinate

Suppose that at the two viewing aspect angles 1 and 2, the planimetric coordinates of the SAR platform are A(x_A, y_A) and B(x_B, y_B), respectively. A target above or below the imaging plane is projected to different positions, so the angle between the SAR platform position vector \vec{AB} and the imaging-position vector can be used to determine whether the target is above or below the imaging plane.
As shown in Figure 6, when the target P(x_P, y_P, z_P) is above the imaging plane, z_P > H_g. P_1 and P_2 are the imaging positions when the planimetric coordinates of the SAR platform are A and B, respectively. It can easily be obtained that
\vec{P_1 P_2} \cdot \vec{AB} > 0    (15)
For a target Q(x_Q, y_Q, z_Q) below the imaging plane, i.e., z_Q < H_g, Q_1 and Q_2 are the imaging positions from the two viewing aspect angles, with Q_1 the imaging position when the SAR platform is at position 1. Then
\vec{Q_1 Q_2} \cdot \vec{AB} < 0    (16)
Therefore, when the imaging-position vector and the radar position vector satisfy (15), i.e., the angle between the two vectors is less than 90°, the target is above the imaging plane and its Z coordinate is H_g + |Δh|.
When they satisfy (16), i.e., the angle between the two vectors is greater than 90°, the target is below the imaging plane and its Z coordinate is H_g − |Δh|.
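Conditions (15) and (16) reduce to the sign of a two-dimensional dot product, which also fixes the Z coordinate; a small sketch with assumed argument conventions:

```python
import numpy as np

def is_above_plane(p1_xy, p2_xy, a_xy, b_xy):
    """True if the target is above the imaging plane: (P1->P2) . (A->B) > 0, see (15).
    P1, P2 are the imaging positions obtained with the platform at A and B."""
    p12 = np.asarray(p2_xy, float) - np.asarray(p1_xy, float)
    ab = np.asarray(b_xy, float) - np.asarray(a_xy, float)
    return float(np.dot(p12, ab)) > 0.0

def z_coordinate(h_g, dh_abs, above):
    """Z coordinate of the target, H_g +/- |dh| depending on the sign test."""
    return h_g + dh_abs if above else h_g - dh_abs
```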

3.3.2. Solution to Planimetric Coordinates

After the Z coordinate of the target is derived, its planimetric coordinates, the X and Y coordinates, must be solved to obtain the complete three-dimensional coordinates. Assume that the true three-dimensional coordinates of the target are P(x_p, y_p, z_p). According to the analysis in Section 2.2, the geometric relationship among the target, the imaging positions, and the projections from the two viewing aspect angles is shown in Figure 7.
The equations for x_p and y_p are
(x_{p1} - x_p)^2 + (y_{p1} - y_p)^2 = \left(\frac{|\Delta h|}{\tan\theta_1}\right)^2
(x_{p2} - x_p)^2 + (y_{p2} - y_p)^2 = \left(\frac{|\Delta h|}{\tan\theta_2}\right)^2    (17)
where (x_p1, y_p1) and (x_p2, y_p2) are the plane coordinates of the imaging positions obtained from viewing aspect angles 1 and 2, respectively. Two groups of solutions are obtained from the above equations.
The method for determining the correct group of solutions is as follows.
As shown in Figure 6, if the target P is above the imaging plane, so that its Z coordinate is H_g + |Δh|, the correct solution (x_p, y_p) satisfies (18):
(x_{p1} - x_A)^2 + (y_{p1} - y_A)^2 < (x_p - x_A)^2 + (y_p - y_A)^2    (18)
If the target Q is below the imaging plane, so that its Z coordinate is H_g − |Δh|, the correct solution (x_Q, y_Q) satisfies (19):
(x_{Q1} - x_A)^2 + (y_{Q1} - y_A)^2 > (x_Q - x_A)^2 + (y_Q - y_A)^2    (19)
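System (17) is the intersection of two circles centred at the imaging positions, and (18)/(19) select the correct root. The sketch below solves it in closed form; the names and the handling of edge cases are our own choices, and the two circles are assumed to intersect.

```python
import numpy as np

def circle_intersections(c1, r1, c2, r2):
    """Both intersection points of two circles (assumed to intersect)."""
    c1, c2 = np.asarray(c1, float), np.asarray(c2, float)
    d = np.linalg.norm(c2 - c1)
    a = (d**2 + r1**2 - r2**2) / (2.0 * d)
    h = np.sqrt(max(r1**2 - a**2, 0.0))
    u = (c2 - c1) / d                       # unit vector from c1 to c2
    perp = np.array([-u[1], u[0]])          # perpendicular unit vector
    mid = c1 + a * u
    return mid + h * perp, mid - h * perp

def planimetric_position(p1_xy, p2_xy, dh_abs, theta1, theta2, a_xy, above):
    """Solve (17) for (x_p, y_p) and pick the root according to (18)/(19)."""
    r1 = dh_abs / np.tan(theta1)            # |P'P1|, see (2)
    r2 = dh_abs / np.tan(theta2)            # |P'P2|, see (3)
    s1, s2 = circle_intersections(p1_xy, r1, p2_xy, r2)
    d_img = np.linalg.norm(np.asarray(p1_xy, float) - np.asarray(a_xy, float))

    def keep(s):
        d_target = np.linalg.norm(s - np.asarray(a_xy, float))
        # above plane: imaging position is closer to A than the target, see (18)
        return d_img < d_target if above else d_img > d_target

    return s1 if keep(s1) else s2
```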

4. Experimental Results

In this section, we experiment on C-band airborne CSAR data (see Table 1 for the acquisition characteristics). The dataset was acquired over a test site located in Zhuhai, a coastal city in Guangdong province of southern China, as shown in Figure 8a; an optical photograph of the scene is shown in Figure 8b. The circular flight experiment was conducted by the Aerospace Information Research Institute, Chinese Academy of Sciences. The terrain in this area is relatively flat. CSAR provides multi-aspect images, called sub-aperture images [24], which are all formed in one ground coordinate system. The track of the SAR platform, which realizes the multi-aspect observation, is shown in Figure 9; the velocity vector changes in both the yaw and pitch directions. The following experiments were conducted with MATLAB R2018a on a computer with an Intel Core 3.2-GHz processor and 8.0 GB of physical memory.
The BP algorithm [22] is used to form the sub-aperture images in ground coordinates. The reference imaging height plane is set to 20 m and the pixel spacing to 0.5 m. In this experiment, two pairs of sub-aperture images are chosen; each sub-aperture image covers an aspect-angle span of 6° via magnitude accumulation, as shown in Figure 10 and Figure 11. It is obvious that the direction of illumination is different at each viewing aspect angle.
SAR images from two viewing aspect angles, labelled with different colors, are shown in Figure 12: purple and green denote the images acquired at the two viewing aspect angles. The offset between the two images is obvious. The main difference between the two aspect images is a translation, indicating that the terrain is relatively flat.
To simplify the computation of the scale factor k, its spatial variability is analysed on the C-band airborne CSAR data in the experiment. From an engineering point of view, point targets (one every 50 m in the plane and every 5 m in height), covering an area 1 km in length and 50 m in height, are set in the ground coordinates in accordance with the coordinate system used in the BP imaging process, as shown in Figure 13. Under any two viewing aspect angles, the scale factor k is computed point by point according to (8), giving the precise scale factor of each point target. Finally, all scale factors of target points at different positions are compared with the k obtained at the center of the scene, and the deviation between them is calculated.
The deviation of the scale factor k between any point and the target P(0, 0, 1) at the scene center is shown in Figure 14; the deviation lies between −0.2 and 0.2.
According to (7), the height sensitivity versus the pixel offset between the images can be deduced from (20):
\delta h = k \cdot \delta r    (20)
where δr is the offset of imaging positions in ground coordinates between the two viewing aspect angles. As defined before, the scale factor k is related to the equivalent incidence angles and the difference of the equivalent aspect angles; these angles depend on the position in the scene, which makes the scale factor spatially varying.
Therefore, the theoretical precision of the height estimation depends on the precision of the offset Δr and of the scale factor k. The pixel size in this experiment is 0.5 m. The scale factor k for the first pair of sub-aperture images is 2.0012; for one-pixel accuracy in the matching process, the height error caused by mismatching is 1.0006 m, while the height error caused by the deviation of k ranges from −0.2001 m to 0.2001 m. The scale factor k for the second pair is 0.76968; for one-pixel matching accuracy, the height error caused by mismatching is 0.3848 m, while the height error caused by the deviation of k ranges from −0.077 m to 0.077 m.
From the above analysis, although the scale factor k varies from place to place within the set range, the error caused by this variation is small compared with the error caused by the matching process. In this experiment, we therefore only calculate the scale factor at the scene center instead of point by point, and the scale factor k of the target P(0, 0, 1) at the scene center is chosen as a global value.
The NCC algorithm is used to obtain the offset between imaging positions in each pair of images; the normalized correlation coefficient threshold is set to 0.3. The offset between the imaging positions is then converted to the absolute value of the height difference by the elevation extraction model proposed in Section 2.2.
The preliminary elevation map of the scene generated by each pair of sub-aperture images is shown in Figure 15a,b. Areas that are matched well yield relatively accurate results; the dark blue areas are unknown altitudes, set to −10 m. The radar beam covers a different area at each viewing aspect angle: some areas not illuminated in the first group of images are illuminated in the second group, so the height information of such areas obtained from the second group is relatively accurate. The height information obtained from the two groups of aspect images therefore provides a relatively complete height map. For example, for the area in the elliptic region, the result is better in Figure 15a than in Figure 15b, while for the area in the rectangular region, the height information in Figure 15b is better than in Figure 15a.
Taking advantage of the different viewing aspect angles, the elevation with the larger correlation coefficient is assigned to each target, so the final elevation value is obtained by combining the information of the two pairs of aspect angles. After interpolation, we obtain the final DEM, shown in Figure 16. The terrain in this scene lies between −3 m and 2 m.
The effectiveness of the extracted DEM is evaluated in two ways. The first is to compare the extracted elevation values with the TanDEM-X 90 m DEM provided by the German Aerospace Center (DLR) (https://tandemx-90m.dlr.de/, 2016). A brief quantitative analysis of the estimation errors with respect to this reference height information is given: the mean value and the root mean square error (RMSE), listed in Table 2, are used to quantify the estimation accuracy. The RMSE is calculated by
RMSE(h) = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(h_i - \hat{h}_i\right)^2}    (21)
where N denotes the number of pixels in the SAR image, h_i is the height provided by DLR, and ĥ_i is the height obtained by the proposed method.
As shown in Table 2, the average height of this scene obtained by the proposed method is −0.4688 m, while the mean value calculated from the TanDEM-X 90 m DEM is −1.6961 m. The RMSE of the heights obtained by our method is 2.0036 m.
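For reference, (21) in code form; the masking of the −10 m fill value for unmatched cells is our assumption about how invalid cells are excluded.

```python
import numpy as np

def rmse(h_reference, h_estimated, invalid=-10.0):
    """Equation (21), computed over cells where an estimate exists."""
    mask = h_estimated != invalid
    diff = h_reference[mask] - h_estimated[mask]
    return float(np.sqrt(np.mean(diff**2)))
```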
The second way is to re-image the CSAR data using the BP algorithm with the extracted DEM. The image formed via magnitude accumulation of all the sub-aperture images is shown in Figure 17a. Compared with the image whose imaging reference plane is 20 m (Figure 17b), defocusing is obviously improved: the imaging positions in the different sub-aperture images are projected to the right position. The extracted DEM thus helps to improve the quality of the all-aperture image, which validates its effectiveness.
Some details are shown next. As shown in Figure 8, there are buildings in regions A and B. In Figure 18b and Figure 19b, the defocusing is so severe that it is impossible to distinguish the objects, and some false targets appear, while in Figure 18a and Figure 19a the outline, location, and number of the objects can be seen.

5. Discussion

The method proposed in this paper is mainly aimed at the new multi-aspect imaging mode, which is beneficial for finer observation. The common way is to establish a stereoscopic image pair using SAR images formed on the slant plane or the ground plane and then obtain the location of the target by solving the RD equations.
Actually, when the images are formed on the ground plane, the imaging coordinate system is unified. From the imaging perspective, a target above or below the imaging plane is projected to different positions from different aspect angles, as shown in Figure 17b, whereas when the height of the target equals that of the imaging plane, it is projected to the right position whatever the aspect angle, as shown in Figure 17a. We deduced that the offset between imaging positions is proportional to the height difference and obtained the scale factor describing this linear relationship.
So for multi-aspect images such as CSAR images and wide-angular spotlight images, which are usually generated on the ground plane, there is no need to solve the range-Doppler equations point by point: the height value can be retrieved directly once the offset between imaging positions is acquired. For a lawn with similar terrain height, for instance, the offset over the area is roughly the same, so only the overall offset needs to be solved instead of locating targets point by point. The method is thus more efficient for the multi-aspect imaging mode.
When the DEM map is applied in the imaging process, the imaging positions of the target from different aspect angles are projected to the same position, and a high-quality wide-angular-aperture image is obtained after the different aspect images are incoherently superimposed, as shown in Figure 18a and Figure 19a.
Since the scale factor is determined by the velocity vector and position vector of the SAR platform at the instantaneous time, the proposed method can retrieve the elevation from any given two SAR images with high accuracy. In this experiment, the scale factor at the scene center is chosen as a global value; from the analysis of the variation of the scale factor over the scene, shown in Figure 14, the error caused by the scale factor can be neglected. If the scene is rather large, the scale factor can be solved in several smaller regions.
The proposed method could be applied in urban areas where the studied objects have to be observed from different points of view. Special phenomena such as shadow regions, layover, and strong reflectors in high-resolution SAR images make SAR imagery hard to interpret because of the synthetic aperture [25], and urbanized spaces are particularly affected by these effects. Therefore, omni-directional observation and image-sequence processing can be taken into consideration [26]. The DEM generated by our method is suited to the multi-aspect imaging mode and can greatly improve the multi-aspect imaging results; high-quality multi-aspect images can in turn help urban SAR imagery understanding [27].
However, due to the characteristics of urban environments, registration of the images is difficult. For SAR images acquired at different aspects in particular, the backscattering characteristics of the same object vary greatly with aspect [28], so it may easily be recognized as different objects in multi-aspect images; the intensities of multi-aspect images can also change considerably. Besides, urban areas contain many corner reflectors, which are extremely sensitive to the viewing aspect. This makes images over urban areas especially difficult to match [29], and further investigation remains a challenging task.

6. Conclusions

In this paper, we presented a new DEM generation method that takes advantage of images from different aspect angles formed on the ground plane. The technique is based on radargrammetry, meaning that we use only the radar image intensity. In previous work, the estimation required solving nonlinear equations point by point or performing massive correlation evaluations along the height direction. With the derivation of the scale factor, we established the relationship between the offset of imaging positions and the height difference, which simplifies the estimation process. Besides, the DEM generated in the multi-aspect imaging mode can also improve the quality of the imaging result, which makes the multi-aspect imaging process more efficient.

Author Contributions

S.F. proposed the idea of the method and wrote the paper; Y.L. and W.H. supervised the work and provided suggestions; Y.W. and Y.Y. provided suggestions about the experiments; W.S. and F.T. provided suggestions about the revision of the paper. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Natural Science Foundation of China under Grant 61571421, Grant 61431018 and Grant 61501210.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Xiang, Y.; Wang, F.; Wan, L.; You, H. SAR-PC: Edge detection in SAR images via an advanced phase congruency model. Remote Sens. 2017, 9, 209.
  2. Mapelli, D.; Guarnieri, A.M.; Giudici, D. Generation and Calibration of High-Resolution DEM From Single-Baseline Spaceborne Interferometry: The “Split-Swath” Approach. IEEE Trans. Geosci. Remote Sens. 2013, 52, 4858–4867.
  3. Moreira, A.; Krieger, G.; Hajnsek, I.; Hounam, D.; Werner, M.; Riegger, S.; Settelmeyer, E. TanDEM-X: A TerraSAR-X add-on satellite for single-pass SAR interferometry. In Proceedings of the IGARSS 2004 IEEE International Geoscience and Remote Sensing Symposium, Anchorage, AK, USA, 20–24 September 2004.
  4. Méric, S.; Fayard, F.; Pottier, É. Radargrammetric SAR image processing. In Geoscience and Remote Sensing; IntechOpen: London, UK, 2009; pp. 421–454.
  5. Leberl, F.; Mayr, W.; Domik, G.; Kobrick, M. SIR-B stereo-radargrammetry of Australia. Int. J. Remote Sens. 1988, 9, 997–1011.
  6. Zebker, H.A.; Goldstein, R.M. Topographic mapping from interferometric synthetic aperture radar observations. J. Geophys. Res. Solid Earth 1986, 91, 4993–4999.
  7. Simonetto, E.; Oriot, H.; Garello, R. Rectangular building extraction from stereoscopic airborne Radar images. IEEE Trans. Geosci. Remote Sens. 2005, 43, 2386–2395.
  8. Nascetti, A.; Capaldo, P.; Pieralice, F.; Porfiri, M.; Fratarcangeli, F.; Crespi, M. Radargrammetric Digital Surface Models Generation from High Resolution Satellite SAR Imagery: Methodology and Case Studies. Int. Assoc. Geodesy Symp. 2015.
  9. Palm, S.; Oriot, H.; Cantalloube, H. Radargrammetric DEM Extraction Over Urban Area Using Circular SAR Imagery. IEEE Trans. Geosci. Remote Sens. 2012, 50, 4720–4725.
  10. Zhang, F.; Fu, Z.; Zhou, Y.; Hu, W.; Hong, W. Multi-aspect SAR target recognition based on space-fixed and space-varying scattering feature joint learning. Remote Sens. Lett. 2019, 10, 998–1007.
  11. Zhang, J.; Suo, Z.; Li, Z.; Zhang, Q. DEM Generation Using Circular SAR Data Based on Low-Rank and Sparse Matrix Decomposition. IEEE Geosci. Remote Sens. Lett. 2018, 15, 724–728.
  12. Toutin, T. Airborne SAR stereo restitution in a mountainous area of Costa Rica: First results. IEEE Trans. Geosci. Remote Sens. 1995, 33, 500–504.
  13. Mercer, J.B.; Griffiths, S.; Thornton, S. Large Area Topographic Mapping Using Stereo SAR. In Proceedings of the First International Airborne Remote Sensing Conference and Exhibition, Strasbourg, France, 12–15 September 1994.
  14. Yang, S.; Huang, G.; Zhou, Z. Generation of SAR stereo image pair. In Proceedings of the International Radar Conference, Guilin, China, 20–22 April 2009.
  15. Sowmya, D.; Rao, S.A.; Shenoy, P.D.; Venugopal, K. Generation of Digital Elevation Map for Steep Terrain Region Using Landsat-7 ETM+ Imagery. In Proceedings of the 2018 International Conference on Data Science and Engineering (ICDSE), Kochi, India, 7–9 August 2018.
  16. Capaldo, P.; Crespi, M.; Fratarcangeli, F.; Nascetti, A.; Pieralice, F. High-resolution SAR radargrammetry: A first application with COSMO-SkyMed spotlight imagery. IEEE Geosci. Remote Sens. Lett. 2011, 8, 1100–1104.
  17. Lin, Y.; Hong, W.; Tan, W.; Wang, Y.; Xiang, M. Airborne circular SAR imaging: Results at P-band. In Proceedings of the 2012 IEEE International Geoscience and Remote Sensing Symposium, Munich, Germany, 22–27 July 2012.
  18. Ponce, O.; Prats, P.; Scheiber, R.; Reigber, A.; Moreira, A. Multibaseline 3-D circular SAR imaging at L-band. In Proceedings of the EUSAR 2012 9th European Conference on Synthetic Aperture Radar, Nuremberg, Germany, 23–26 April 2012.
  19. Fayard, F.; Meric, S.; Pottier, E. Matching stereoscopic SAR images for radargrammetric applications. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium, Barcelona, Spain, 23–28 July 2007.
  20. Wu, J.; Lin, D.C. Radargrammetric parameter evaluation of an airborne SAR image. Photogramm. Eng. Remote Sens. 2000, 66, 41–48.
  21. Leberl, F.W. Radargrammetric Image Processing; Technical Report; Artech House: Norwood, MA, USA, 1990.
  22. Ponce, O.; Prats, P.; Rodriguez-Cassola, M.; Scheiber, R.; Reigber, A. Processing of circular SAR trajectories with fast factorized back-projection. In Proceedings of the 2011 IEEE International Geoscience and Remote Sensing Symposium, Vancouver, BC, Canada, 24–29 July 2011.
  23. Yoo, J.C.; Han, T.H. Fast normalized cross-correlation. Circuits Syst. Signal Process. 2009, 28, 819.
  24. Pinheiro, M.; Prats, P.; Scheiber, R.; Nannini, M.; Reigber, A. Tomographic 3D reconstruction from airborne circular SAR. In Proceedings of the 2009 IEEE International Geoscience and Remote Sensing Symposium, Cape Town, South Africa, 12–17 July 2009.
  25. Zhang, Y.; Ding, C.; Chen, H.; Wang, H. Special Phenomena of the Shadow Region in the High Resolution Synthetic Aperture Radar Image due to Synthetic Aperture. J. Infrared Millimeter Terahertz Waves 2012, 33, 1052–1070.
  26. Sim, D.-G.; Park, R.-H.; Kim, R.-C.; Lee, S.-U.; Kim, I.-C. Integrated position estimation using aerial image sequences. IEEE Trans. Pattern Anal. Mach. Intell. 2002, 24, 1–18.
  27. Baselice, F.; Budillon, A.; Ferraioli, G.; Pascazio, V. Layover solution in SAR imaging: A statistical approach. IEEE Geosci. Remote Sens. Lett. 2009, 6, 577–581.
  28. Zhao, Y.; Lin, Y.; Hong, W.; Yu, L. Adaptive imaging of anisotropic target based on circular-SAR. Electron. Lett. 2016, 52, 1406–1408.
  29. Ostrowski, J.; Cheng, P. DEM extraction from stereo SAR satellite imagery. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium, Honolulu, HI, USA, 24–28 July 2000.
Figure 1. Geometric relation of multi-aspect airborne observation geometry.
Figure 2. Imaging geometry of one certain aspect angle. A target is above the imaging plane.
Figure 3. Flowchart of the digital elevation model (DEM) generation processing.
Figure 4. Schematic diagram of the equivalent incidence angle.
Figure 5. Top view of the ground coordinate system.
Figure 6. Discrimination of ±|Δh|.
Figure 7. Schematic diagram of the solution for the planimetric coordinate values.
Figure 8. Location of the test site and optical photograph of the test site.
Figure 9. Track of the synthetic aperture radar (SAR) platform.
Figure 10. The first pair of images. On each sub-aperture image, the radar line of sight is indicated by a red arrow.
Figure 11. The second pair of images. On each sub-aperture image, the radar line of sight is indicated by a red arrow.
Figure 12. A superposed graph of two different sub-aperture images. The offset between the two images is caused by the height difference.
Figure 13. A set of points in the scene. Blue points cover the scene; the red point is at the center of the scene.
Figure 14. Deviation of the scale factor k between any point and the target P(0, 0, 1). The Y-axis has no measured unit; the values are expressed in units equivalent to the pixel size.
Figure 15. Preliminary elevation map of the scene generated by each pair of images.
Figure 16. Final DEM generated by the presented method.
Figure 17. All-aperture image formed via magnitude accumulation of all the sub-aperture images.
Figure 18. Region A: all-aperture image formed via magnitude accumulation of all the sub-aperture images.
Figure 19. Region B: all-aperture image formed via magnitude accumulation of all the sub-aperture images.
Table 1. Acquisition characteristics.
Carrier Frequency: 5.4 GHz (C-Band)
Bandwidth: 560 MHz
PRF: 2000 Hz
Flight Height: 3 km
Flight Radius: 5 km
Polarization: Full polarization
Table 2. Quality assessment of the estimation accuracy (the proposed method).
Mean value: −0.4688 m
RMSE: 2.0036 m
