Article

Relative Radiometric Calibration Using Tie Points and Optimal Path Selection for UAV Images

1 Research Planning Department, Seoul Institute of Technology, Seoul 03909, Korea
2 Department of Geoinformatic Engineering, Inha University, Incheon 22212, Korea
3 Image Engineering Research Center, 3DLabs Co. Ltd., Incheon 21984, Korea
4 Climate Change and Agroecology Division, National Institute of Agricultural Sciences, Jeollabuk-do 55365, Korea
* Author to whom correspondence should be addressed.
Remote Sens. 2020, 12(11), 1726; https://doi.org/10.3390/rs12111726
Submission received: 1 April 2020 / Revised: 9 May 2020 / Accepted: 25 May 2020 / Published: 27 May 2020
(This article belongs to the Section Remote Sensing Image Processing)

Abstract

As the use of unmanned aerial vehicle (UAV) images rapidly increases, so does the need for precise radiometric calibration. For UAV images, relative radiometric calibration is required in addition to traditional vicarious radiometric calibration because of their small field of view. Some UAVs carry irradiance sensors for relative radiometric calibration, but most do not, so an intelligent scheme for relative radiometric calibration must be applied. In this study, a relative radiometric calibration method is proposed to improve the quality of a reflectance map without irradiance measurements. The proposed method, termed relative calibration by the optimal path (RCOP), uses tie points acquired during geometric calibration to define optimal paths. A calibrated image from RCOP was compared to validation data calibrated with irradiance measurements. The RCOP method produced seamless mosaicked images with uniform brightness and reflectance patterns. Therefore, the proposed method can be used for precise relative radiometric calibration of UAV images.

1. Introduction

Remote sensing is the science of acquiring information about targets without physical contact, using terrestrial, airborne, and spaceborne sensors. Rapidly advancing technologies are making sensors smaller, more precise, and more widely available, and the supply of unmanned aerial vehicle (UAV) images is increasing across remote sensing fields. UAV images have attracted attention as a means of efficiently observing limited-access areas [1] and are used to acquire three-dimensional spatial information and biophysical information. In agriculture and forestry, they are used to detect or monitor vegetation conditions, such as vigor, growth, yield, and the effects of disasters and disease [2,3,4,5,6,7,8,9]. Recently, UAV images have also been used to analyze surface conditions with artificial intelligence techniques such as deep learning [10,11].
In order to acquire location and attribute information from remote sensing images, preprocessing such as geometric and radiometric calibration is required. In particular, radiometric calibration is required to convert digital numbers (DNs) into spectral reflectance, an important step in obtaining biophysical information from images. For radiometric calibration, vicarious calibration and the radiative transfer (RT) model are widely used [12,13,14,15,16,17]. Vicarious calibration converts DNs into spectral reflectance by installing ground reference panels with known reflectance and estimating conversion coefficients from the DNs of the targets and their reflectance [12,18]. The RT model converts DNs into reflectance based on a mathematical model. It does not require ground reference panels, but it has limitations such as complicated parameterization and low accuracy [19,20,21,22,23,24].
UAV images are taken at a relatively low altitude compared to aerial or satellite images, so they may not suffer significant radiometric distortions. On the other hand, their small field of view makes mosaicking a necessary procedure. Each image may experience different turbulence, a different incidence angle, different illumination, or a different signal processing chain. As a result, the radiometric properties of UAV images may vary significantly. Relative radiometric properties among UAV images therefore have to be adjusted so that the mosaicked image has consistent spectral reflectance.
Therefore, UAV images need relative radiometric calibration in addition to vicarious calibration, unless ground reference panels are installed within the field of view of each image. Images with ground reference panels are calibrated through vicarious calibration. Images without ground reference panels are first calibrated relative to the DNs of the images with panels and are then calibrated through vicarious calibration [25,26]. Some UAV cameras include an irradiance sensor to measure the amount of sunlight for each image and use the irradiance measurements for relative radiometric calibration [27]. For UAV cameras without an irradiance sensor, conversion coefficients have been estimated through regression analysis between the DNs of pixels in the overlapping regions of two images [28]. However, this approach is sensitive to geometric distortions and to pixels with radiometric anomalies [29], and it often results in visual discontinuities between adjacent scenes [30]. Precise relative radiometric calibration of UAV images thus remains in high demand.
This paper proposes a new method of relative radiometric calibration for UAV images acquired without an irradiance sensor. The proposed method uses tie points obtained during the geometric calibration process. Tie points are image points in different images that correspond to the same ground point. Using the DNs of tie points recorded in different images, coefficients of relative radiometric calibration between two images are estimated. DNs of images without ground reference panels are then converted to equivalent DNs of a reference image with ground reference panels. Some images do not overlap the reference image and have no tie points corresponding directly with it; their DNs cannot be converted to equivalent DNs of the reference image in a single conversion. Therefore, such images must be linked to the reference image in a cascaded manner. The proposed method establishes an image network using tie points and finds an optimal path from each image to the reference image. The proposed method was verified by comparing its results with those from an image calibrated using an irradiance sensor.

2. Acquiring the Data

2.1. UAV Images

UAV images were acquired on 15 May 2019 from 15:32 h to 15:36 h at 50 degrees solar elevation under a clear sky. The UAV flew seven courses at an altitude of 100 m and acquired 108 images at a spatial resolution of 3 cm. Figure 1 shows the study area and locations of the images acquired. The area is above the campus of the National Institute of Agricultural Sciences in Wanju-gun, Jeollabuk-do, Korea. It includes trees, grass, soil, and sidewalk blocks as its major cover types.
A fixed-wing UAV (Figure 2a) equipped with a RedEdge-M camera (Figure 2b) was used for the experiment. The camera captures images in five spectral bands: blue, green, red, red-edge, and near-infrared. It also has an irradiance sensor, a downwelling light sensor (DLS), for radiometric calibration. The irradiance measured by the DLS was used to produce the validation data for this study.

2.2. Reflectance of Ground Reference Panels

Seven ground reference panels were deployed on the grass of a soccer field for vicarious radiometric calibration of the UAV images (Figure 3a). The panels were made of a specially coated fabric sized 1.2 m × 1.2 m to maintain constant reflectance through a spectral range of 435 nm to 1100 nm. Spectral reflectance of each panel was measured using a FieldSpec-3 spectro-radiometer (Figure 3b). The panels used for the experiment provided 3%, 5%, 11%, 22%, 33%, 44%, and 55% reflectance.

2.3. Validation Data

For validation, a reference reflectance map was generated using the measured irradiance data and the reference panels [31]. An image map with intensity as its pixel values was generated first through standard processing procedures: tie point extraction, bundle adjustment, digital surface model generation, ortho-rectification, and image resampling. A reflectance map with reflectance as its pixel values was then generated by converting the DNs of the mosaicked image into reflectance values. For this conversion, the measured irradiance data and the DNs on the ground reference panels were used to calculate conversion coefficients. We used commercial software (SW), Pix4D mapper 4.1 (Pix4D S.A., Switzerland), to produce the reflectance map. Figure 4 shows the reflectance map generated as validation data, as natural color and pseudo-infrared color composites. In the pseudo-infrared color composite, the near-infrared (NIR), red, and green bands are assigned to red, green, and blue, respectively, for display.
The mean difference between the reflectance map and the measured reflectance of the ground reference panels was computed. In this study, only ground reference panels were used; in-situ reflectance measurements were not available. Table 1 shows the mean differences of the reflectance map for each panel. All differences were small, with absolute values between 0.001 and 0.101. The visible bands had relatively lower differences than the red-edge and NIR bands.

3. The Proposed Method

3.1. Relative Radiometric Calibration Procedure

This section explains the procedure for relative radiometric calibration of UAV images based on tie points and minimum-distance path selection. Figure 5 summarizes the procedure. First, tie points are extracted automatically from the UAV images through structure from motion (SfM)-based geometric processing [32,33,34]. Commercial or open-source SW can be used for this process; in this study, Pix4D mapper 4.1 was used to extract tie points. Second, the image containing the ground reference panels is selected as the reference image. DN values of the targets within the reference image are measured, and coefficients for vicarious calibration are estimated through regression analysis.
Next, an image network is formed by defining each image as a node. When there is a sufficient number of tie points between two images, a link between the two corresponding nodes is defined. After an image network is formed, we can find an optimal path from one image to another by following links between image nodes. In this experiment, we used the Dijkstra algorithm [35,36] to obtain the optimal path.
Next, tie points among the images within the optimal path are processed to estimate coefficients for relative radiometric calibration. DNs of an image are converted to equivalent DNs in the reference image using these relative calibration coefficients and eventually they are converted to reflectance using the vicarious calibration coefficients. Finally, a geometric mosaicking process is carried out on the reflectance images to generate a mosaicked reflectance map.

3.2. Vicarious Radiometric Calibration of the Reference Image

DNs of the reference image are converted to spectral reflectance using the vicarious radiometric calibration method. For each ground reference panel, the locations of sample pixels were identified manually. To minimize errors, the locations were selected to avoid undulating parts of the panels. Figure 6 shows an example of the reference image and a zoomed view of the locations of the ground targets. The average DN around each location was calculated per spectral band and per reference panel.
DNs of the reference image are converted to reflectance using the following equation:

$R = a_a \, DN + b_a$ (1)

where R is reflectance, DN is the digital number of an image, $a_a$ is the absolute radiometric gain, and $b_a$ is the absolute radiometric offset. These coefficients are estimated for each band via linear regression between the DNs and the reflectances of the pixels corresponding to the ground reference panels (Figure 7). All DNs of the reference image were then converted to reflectance using Equation (1).
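As an illustration, the coefficients $a_a$ and $b_a$ can be obtained with an ordinary least-squares fit. The following is a minimal NumPy sketch under assumed inputs; the panel DNs are hypothetical placeholder values, not the measurements of this study:

```python
import numpy as np

# Hypothetical per-panel mean DNs for one band (placeholder values) and
# the nominal reflectances of the seven ground reference panels.
panel_dn = np.array([4200.0, 6900.0, 13800.0, 24100.0, 34200.0, 44800.0, 55300.0])
panel_reflectance = np.array([0.03, 0.05, 0.11, 0.22, 0.33, 0.44, 0.55])

# Least-squares fit of Equation (1): R = a_a * DN + b_a.
a_a, b_a = np.polyfit(panel_dn, panel_reflectance, deg=1)

# Convert every DN of the reference image (here a dummy array) to reflectance.
reference_dn = np.full((1024, 1280), 20000.0)
reflectance = a_a * reference_dn + b_a
print(f"a_a = {a_a:.3e}, b_a = {b_a:.4f}")
```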

3.3. Optimal Path Selection

Generally, UAV cameras acquire hundreds of images per mission, all with extensive overlaps. It is very difficult to consider all the images for relative radiometric calibration, since there are too many combinations of image pairs, and errors in relative radiometric calibration accumulate with the number of image pairs in a chain. One image may overlap many other images, and there are many paths from one image to another that connect overlapping images. Therefore, an intelligent scheme for selecting images for relative radiometric calibration is required to minimize radiometric errors. This paper suggests an optimal path selection method that covers the entire study area. It minimizes error accumulation by reducing the number of steps from the reference image to the last image at the region boundary.
The Dijkstra algorithm [35] finds a path that minimizes the sum of weights from one starting point to all other points. It has been used in various fields, such as navigation, to search for an optimal path [36]. In this paper, the Dijkstra algorithm is used to select an optimal path to the reference image from an image not overlapping the reference image.
An image network is formed by defining each image as a node. Links between nodes are defined based on the number of tie points. When there is a sufficient number of tie points between two images, a link with a weight of 1 is defined between the two corresponding nodes. When the number of tie points is less than a certain threshold, a link with infinite weight is defined. All links have the number of tie points as an attribute.
For the threshold on the number of tie points, a fixed value of 100 was used. At this stage, the threshold value was not critical to overall performance, since tie points were filtered further based on the DN differences between the two images (Section 3.4). For valid image pairs, the number of tie points was on the order of 100; the threshold served only to eliminate false image pairs from consideration.
The Dijkstra algorithm finds an optimal path from one image to the reference image, which is the path with the minimum weight. If there are multiple paths with the same minimum weight, the path with the largest number of tie points is selected.
Figure 8a shows a schematic diagram of an image network. Here, S indicates the image from which the optimal path search starts, which in our case is an image to be calibrated. F is the final destination of the search, which in our case is the reference image. The symbols a–d represent images distributed between S and F, and the lines between images indicate that tie points exist between them. Figure 8b shows the number of tie points between each image pair, and Figure 8c shows the weights of the links based on a threshold of 100.
The sum of the weights equals 2 for the two shortest paths, S-a-F and S-d-F, which are represented in Figure 8a by red and green lines, respectively. The number of tie points is then considered to select between them. Figure 9a shows the number of tie points along the two paths. Since path S-a-F has 400 tie points and path S-d-F has 550, S-d-F is selected as the optimal path, as shown in Figure 9b.
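The path search can be implemented compactly with a priority queue. The sketch below is our own illustration, not the authors' code: it builds the thresholded network and runs Dijkstra on a lexicographic cost, so that among paths with the fewest links the one with the most tie points wins. The tie-point counts are hypothetical values chosen to be consistent with the Figure 8 example:

```python
import heapq

def optimal_path(tie_counts, start, reference, min_tie_points=100):
    """Fewest-links path from start to the reference image; ties between
    equal-length paths are broken by the larger total number of tie points."""
    # Image network: a link (weight 1) exists only when an image pair shares
    # at least min_tie_points tie points; pairs below the threshold get no
    # link, which is equivalent to an infinite weight.
    adjacency = {}
    for (a, b), n in tie_counts.items():
        if n >= min_tie_points:
            adjacency.setdefault(a, []).append((b, n))
            adjacency.setdefault(b, []).append((a, n))

    # Dijkstra on the lexicographic cost (number of links, -total tie points).
    heap = [((0, 0), start, [start])]
    best = {start: (0, 0)}
    while heap:
        cost, node, path = heapq.heappop(heap)
        if node == reference:
            return path
        for nxt, n in adjacency.get(node, []):
            nxt_cost = (cost[0] + 1, cost[1] - n)
            if nxt_cost < best.get(nxt, (float("inf"), 0)):
                best[nxt] = nxt_cost
                heapq.heappush(heap, (nxt_cost, nxt, path + [nxt]))
    raise ValueError("no path to the reference image")

# Hypothetical counts reproducing the Figure 8 example: S-a-F and S-d-F
# both use two links, but S-d-F carries 550 tie points against 400.
ties = {("S", "a"): 150, ("a", "F"): 250,
        ("S", "d"): 250, ("d", "F"): 300,
        ("S", "b"): 90, ("b", "c"): 120, ("c", "F"): 130}
print(optimal_path(ties, "S", "F"))  # ['S', 'd', 'F']
```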
All images along the optimal path are selected as the optimal images for relative radiometric calibration of the image of interest. Two successive images along the optimal path are used to find the coefficients for relative radiometric calibration. In Figure 9, the optimal images for the relative radiometric calibration of S are d and F. Therefore, coefficients for relative radiometric calibration are estimated for S-d and d-F. The DNs of S are converted to equivalent DNs of d using the S-d conversion and then to equivalent DNs of F using the d-F conversion.

3.4. Relative Radiometric Calibration

As mentioned earlier, tie points are obtained from the geometric calibration process. They are extracted initially from two images using traditional tie point extraction algorithms [37,38]. Initial tie points contain many outliers, most of which are removed by incremental bundle adjustment based on SfM [32,39]. However, some tie points that are geometrically correct may still contain radiometric anomalies due to shadow, saturation, or sun glint. To improve the quality of relative radiometric calibration, the tie points therefore undergo radiometric filtering.
In order to remove radiometric outliers among the tie points, an appropriate selection threshold must be defined. The threshold was determined by checking R² values. The mean and standard deviation of the DN differences between the reference and adjacent images were calculated over all tie points. We then selected the tie points whose DN differences fell within mean ± n × standard deviation and calculated R² for the selected tie points. We varied n, monitored the variation of R², and fixed the threshold at the value of n where R² saturated. Table 2 shows an example of the ratio of selected points to the total number of tie points, the number of selected tie points, and their R² for each n. When n was 1 and the ratio was 68%, R² saturated at 0.98 for most images. Figure 10 shows an example of the distribution of DN values for all tie points (black crosses) and the accepted tie points (red crosses). Figure 11 shows an example of accepted tie points (red ×'s) among all tie points.
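A minimal sketch of this n-sigma selection, assuming the tie points of one image pair are given as paired DN arrays (the data below are synthetic, not the study's tie points):

```python
import numpy as np

def filter_tie_points(dn_ref, dn_adj, n):
    """Keep tie points whose DN difference lies within mean +/- n * std."""
    diff = dn_adj - dn_ref
    keep = np.abs(diff - diff.mean()) < n * diff.std()
    return dn_ref[keep], dn_adj[keep]

def r_squared(x, y):
    """Coefficient of determination of the linear fit y = a*x + b."""
    a, b = np.polyfit(x, y, deg=1)
    return 1.0 - np.var(y - (a * x + b)) / np.var(y)

# Synthetic tie points for one image pair: a linear relation plus noise
# and a handful of radiometric outliers (e.g., sun glint or shadow).
rng = np.random.default_rng(0)
dn_ref = rng.uniform(5000, 50000, 1135)
dn_adj = 1.05 * dn_ref + 300 + rng.normal(0, 800, 1135)
dn_adj[:50] += rng.uniform(5000, 15000, 50)  # injected outliers

# Sweep n as in Table 2 and watch where R^2 saturates.
for n in (3.0, 2.5, 2.0, 1.0, 0.68, 0.39, 0.13):
    x, y = filter_tie_points(dn_ref, dn_adj, n)
    print(f"n = {n:4}: kept {x.size:4d} points, R^2 = {r_squared(x, y):.3f}")
```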
Using the accepted tie points, coefficients of relative radiometric calibration are estimated using Equation (2):
$DN' = a_r \, DN + b_r$ (2)

where DN is the digital number in the original image, $DN'$ is the equivalent digital number in the adjacent image, $a_r$ is the relative radiometric gain, and $b_r$ is the relative radiometric offset. Using the estimated conversion coefficients, the DNs of the original image are converted to equivalent DNs of the other image. Using successive image pairs along the optimal path, the DNs of the original image are converted sequentially to equivalent DNs of the reference image. They are then converted to reflectance using Equation (1).
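Putting Equation (2) to work, the per-pair coefficients can again be estimated by least squares and then chained along the optimal path. The sketch below, our illustration with hypothetical coefficient values, also shows that the cascade of affine conversions collapses into a single gain and offset:

```python
import numpy as np

def relative_coefficients(dn_src, dn_dst):
    """Fit Equation (2), DN' = a_r * DN + b_r, over accepted tie points."""
    a_r, b_r = np.polyfit(dn_src, dn_dst, deg=1)
    return a_r, b_r

def compose_along_path(pairwise_coeffs):
    """Collapse a chain of affine conversions into one gain/offset.

    Applying a2 * (a1 * DN + b1) + b2 equals (a2 * a1) * DN + (a2 * b1 + b2),
    so the conversions along the optimal path can be composed once and
    then applied to every pixel of the image being calibrated."""
    a, b = 1.0, 0.0
    for a_r, b_r in pairwise_coeffs:
        a, b = a_r * a, a_r * b + b_r
    return a, b

# Hypothetical coefficients for the path S -> d -> F of Figure 9.
s_to_d = (1.04, 120.0)   # S to d
d_to_f = (0.97, -80.0)   # d to F (the reference image)
a, b = compose_along_path([s_to_d, d_to_f])

dn_s = np.array([10000.0, 25000.0, 40000.0])
dn_equivalent_f = a * dn_s + b   # equivalent DNs in the reference image
print(a, b, dn_equivalent_f)
# Reflectance then follows from Equation (1): R = a_a * dn_equivalent_f + b_a.
```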

3.5. Validation of the Proposed Method

For validation, a mosaicked image (the reference reflectance map) was generated separately using irradiance measurements. For quantitative validation, the reflectance obtained from relative radiometric calibration was compared to the validation data at the same points. A total of 200 test samples were collected by random sampling (Figure 12). Error was calculated as the root mean square error (RMSE) using the following equation:
$RMSE = \sqrt{\frac{1}{n} \sum_{i=1}^{n} \left( R_i - \hat{R}_i \right)^2}$ (3)

where n is the number of samples, and $R_i$ and $\hat{R}_i$ represent the reflectance of the calibrated image and of the validation data, respectively, at the i-th validation point.
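For completeness, the RMSE computation amounts to a few lines; the arrays below are illustrative stand-ins for one band at the 200 validation points, not the study's data:

```python
import numpy as np

def rmse(calibrated, validation):
    """Root mean square error between calibrated and validation reflectance."""
    diff = np.asarray(calibrated) - np.asarray(validation)
    return float(np.sqrt(np.mean(diff ** 2)))

# Illustrative values only: 200 randomly sampled reflectance pairs.
rng = np.random.default_rng(1)
r_validation = rng.uniform(0.0, 0.6, 200)
r_calibrated = r_validation + rng.normal(0.0, 0.03, 200)
print(f"RMSE = {rmse(r_calibrated, r_validation):.3f}")  # about 0.03
```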
In order to check the effectiveness of using optimal images for relative calibration, we tested two additional methods. The first applies a series of relative calibrations in the order of image acquisition. The second applies the vicarious calibration coefficients obtained from the reference image to all other images without any relative calibration. For clarity, we call the proposed method relative calibration by the optimal path (RCOP), the first alternative relative calibration by acquisition sequence (RCAS), and the second no relative calibration (NoRC).

4. Results and Discussion

4.1. Visual Interpretation of Calibration Results

The three relative radiometric calibration methods (RCOP, RCAS, and NoRC) were applied to the UAV images without irradiance measurements. Figure 13 shows the reflectance maps as natural color composites (a–c) and pseudo-infrared color composites (d–f) from RCOP, RCAS, and NoRC, respectively. To compare colors, each band was stretched over the same range of reflectance.
The results from RCOP showed colors similar to the validation data calibrated using irradiance measurements (Figure 4). The mosaic was smooth, without noticeable color differences at the boundaries between images. The result from RCAS showed good calibration in the upper part but became darker towards the lower part; the accumulation of errors along the image acquisition sequence appears severe. Note that for RCOP, the optimal paths from all images to the reference image had fewer than five links, whereas for RCAS, the path from an image to the reference grew with the acquisition sequence. For relative calibration, the proposed method thus provides an efficient path for cascaded radiometric conversion.
The results from NoRC indicate that applying no relative calibration worked better than relative calibration in acquisition order. At a glance, the NoRC result looks similar to the validation data, but it deserves careful attention: the left (west) side is brighter than other areas. The likely reason is changes in camera exposure. The camera was set to adjust exposure automatically, which is a common setting in application fields, so the exposure changed with the amount of light reaching the camera. An exposure value (EV), which is a function of exposure time, International Organization for Standardization (ISO) value, and F-number, can indicate the degree of exposure [40]. When the EV is high, the exposure is larger and the image is brighter.
The whole mosaicked image was divided into 3 × 3 subregions (Figure 14), and the average EVs for each subregion are listed in Table 3. As shown in the table, the EVs of the left subregions are higher than those of the others, so the UAV images of those subregions were captured brighter. Therefore, the mosaicked image from NoRC appears brighter than the validation data in those subregions. From visual inspection and the analysis of EVs, the proposed RCOP provided better performance than the other methods.

4.2. Quantitative Accuracy Analysis

The results of the three relative calibration methods (RCOP, RCAS, and NoRC) were compared quantitatively with the validation data. Table 4 shows the RMSE of the reflectance from each method per band. The results from RCOP show lower RMSEs than RCAS and NoRC, with the exception of the red-edge and NIR bands. For RCOP, the RMSE of the visible (blue, green, red) bands was about 0.03, and those of the red-edge and NIR bands were about 0.06 and 0.10, respectively. The RMSE for NoRC was larger than for RCOP (except for the red-edge and NIR bands) and smaller than for RCAS. The RMSEs of the three methods agreed well with the visual inspection.
As shown in Table 4, there was an exception for the red-edge and NIR bands between the proposed RCOP and NoRC methods. There are two possible reasons. The first is the lower contrast of these bands, which affects the number of tie points. In this study, tie points were extracted from each band, and the relative radiometric calibration and mosaicking processes were carried out per band. This was done to avoid problems associated with band-to-band misalignment, since the camera captures the five spectral bands through separate optical paths. While tie points from the blue, green, and red bands were accurate and large in number, those from the red-edge and NIR bands were less accurate and fewer. Figure 15 shows an example of the accepted tie points for the red, red-edge, and NIR bands for one image pair. The numbers of tie points were 3775, 2108, and 1005 for the red, red-edge, and NIR bands, respectively; for the red-edge and NIR bands, the numbers decreased due to lower contrast. Figure 16 shows the distributions of DNs and the results of regression analysis between two images for the red, red-edge, and NIR bands. The DN distributions of the red-edge and NIR bands showed weaker linearity than that of the red band: R² decreased from 0.95 for the red band to 0.91 for red-edge and 0.76 for NIR. This weak linearity affected the accuracy of the relative radiometric calibration for the red-edge and NIR bands. This issue will be studied further in subsequent research, along with precise band-to-band alignment.
The second reason might be error related to land-cover type. Spectral reflectance depends on surface materials; in particular, the RMSE of the red-edge and NIR bands could be strongly affected by the high reflectance of vegetation. The 200 validation points were therefore grouped by land-cover type, and the RMSE of each type was compared. Table 5 shows the RMSEs of the validation points for each land-cover type. The RMSE of each class in each band was similar among the three relative radiometric calibration methods. This implies that the higher RMSEs of the red-edge and NIR bands are likely due not to spectral characteristics but to the insufficient number of tie points in those bands.
Despite the exceptions for the red-edge and NIR bands, the qualitative and quantitative analyses showed that the use of tie points and an optimal path can improve the quality of relative radiometric calibration.

5. Conclusions

In this study, we proposed a new method for relative radiometric calibration when irradiance measurements are not available during image acquisition. The method improves radiometric calibration quality using tie points and optimal path selection, and it showed higher reliability and stability in the calibration results compared to the other methods. Therefore, the proposed method can be used to obtain a precise reflectance map, improving the quality of relative radiometric calibration.
Most UAVs acquire images without irradiance measurements yet are used in applications where precise reflectance retrieval is crucial. The proposed method should contribute to improving the accuracy of biophysical factor estimation and classification using UAV images. In further studies, precise band alignment will be carried out, and we will examine how the exceptions reported in this paper can be resolved. Additional studies will also be carried out with images acquired under various weather conditions and exposure settings, with validation against well-distributed in-situ ground reflectance measurements.

Author Contributions

Conceptualization, J.-I.S. and T.K.; data curation, J.-I.S., P.-C.L., H.-M.L., H.-Y.A., and C.-W.P.; formal analysis, J.-I.S. and T.K.; funding acquisition, C.-W.P. and T.K.; investigation, Y.-M.C., P.-C.L., and H.-M.L.; methodology, J.-I.S., Y.-M.C., and P.-C.L.; project administration, H.-Y.A. and C.-W.P.; resources, P.-C.L., H.-Y.A., and C.-W.P.; software, Y.-M.C.; supervision, T.K.; validation, J.-I.S., Y.-M.C., and T.K.; visualization, Y.-M.C.; writing—original draft, J.-I.S., Y.-M.C., and H.-M.L.; writing—review and editing, T.K. All authors have read and agreed to the published version of the manuscript.

Funding

This study was carried out with the support of the Cooperative Research Program for Agriculture Science and Technology Development (PJ013500032020), Rural Development Administration, Republic of Korea.

Conflicts of Interest

The authors declare they have no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of the data; in the writing of the manuscript; or in the decision to publish the results.

References

1. Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A Review on UAV-Based Applications for Precision Agriculture. Information 2019, 10, 349.
2. Primicerio, J.; Gennaro, S.F.D.; Fiorillo, E.; Genesio, L.; Lugato, E.; Matese, A.; Vaccari, F.P. A flexible unmanned aerial vehicle for precision agriculture. Precis. Agric. 2012, 13, 517–523.
3. Berni, J.A.; Zarco-Tejada, P.J.; Suárez, L.; Fereres, E. Thermal and narrowband multispectral remote sensing for vegetation monitoring from an unmanned aerial vehicle. IEEE Trans. Geosci. Remote Sens. 2009, 47, 722–738.
4. Zhang, K.; Ge, X.; Shen, P.; Li, W.; Liu, X.; Cao, Q.; Zhu, Y.; Cao, W.; Tian, Y. Predicting Rice Grain Yield Based on Dynamic Changes in Vegetation Indexes during Early to Mid-Growth Stages. Remote Sens. 2019, 11, 387.
5. Merino, L.; Caballero, F.; Martínez-de-Dios, J.R.; Maza, I.; Ollero, A. An Unmanned Aircraft System for Automatic Forest Fire Monitoring and Measurement. J. Intell. Robot. Syst. 2012, 65, 533–548.
6. Smigaj, M.; Gaulton, R.; Barr, S.L.; Suárez, J.C. UAV-Borne Thermal Imaging for Forest Health Monitoring: Detection of Disease-Induced Canopy Temperature Increase. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, XL-3/W3, 349–354.
7. Maimaitijiang, M.; Sagan, V.; Sidike, P.; Hartling, S.; Esposito, F.; Fritschi, F.B. Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sens. Environ. 2020, 237, 111599.
8. Yu, N.; Li, L.; Schmitz, N.; Tian, L.F.; Greenberg, J.A.; Diers, B.W. Development of methods to improve soybean yield estimation and predict plant maturity with an unmanned aerial vehicle based platform. Remote Sens. Environ. 2016, 187, 91–101.
9. Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J. Hyperspectral imaging: A review on UAV-based sensors data processing and applications for agriculture and forestry. Remote Sens. 2017, 9, 1110.
10. Kerkech, M.; Hafiane, A.; Canals, R. Deep leaning approach with colorimetric spaces and vegetation indices for vine diseases detection in UAV images. Comput. Electron. Agric. 2018, 155, 237–243.
11. Liakos, K.; Busato, P.; Moshou, D.; Pearson, S.; Bochtis, D. Machine learning in agriculture: A review. Sensors 2018, 18, 2674.
12. Del Pozo, S.; Rodríguez-Gonzálvez, P.; Hernández-López, D.; Felipe-García, B. Vicarious radiometric calibration of a multispectral camera on board an unmanned aerial system. Remote Sens. 2014, 6, 1918–1937.
13. Moran, M.S.; Bryant, R.; Thome, K.; Ni, W.; Nouvellon, Y.; Gonzalez-Dugo, M.P.; Qi, J.; Clarke, T.R. A refined empirical line approach for reflectance factor retrieval from Landsat-5 TM and Landsat-7 ETM+. Remote Sens. Environ. 2001, 78, 71–82.
14. Smith, G.M.; Milton, E.J. The use of the empirical line method to calibrate remotely sensed data to reflectance. Int. J. Remote Sens. 1999, 20, 2653–2662.
15. Berni, J.A.J.; Zarco-Tejada, P.J.; Suárez, L.; González-Dugo, V.; Fereres, E. Remote sensing of vegetation from UAV platforms using lightweight multispectral and thermal imaging sensors. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, XXXVIII-1-4-7/W5. Available online: https://www.isprs.org/proceedings/xxxviii/1_4_7-W5/paper/Jimenez_Berni-155.pdf (accessed on 5 March 2020).
16. Garzonio, R.; Di Mauro, B.; Colombo, R.; Cogliati, S. Surface reflectance and sun-induced fluorescence spectroscopy measurements using a small hyperspectral UAS. Remote Sens. 2017, 9, 472.
17. Honkavaara, E.; Saari, H.; Kaivosoja, J.; Pölönen, I.; Hakala, T.; Litkey, P.; Mäkynen, J.; Pesonen, L. Processing and assessment of spectrometric, stereoscopic imagery collected using a lightweight UAV spectral camera for precision agriculture. Remote Sens. 2013, 5, 5006–5039.
18. Kelcey, J.; Lucieer, A. Sensor Correction of a 6-Band Multispectral Imaging Sensor for UAV Remote Sensing. Remote Sens. 2012, 4, 1462–1493.
19. Yang, G.; Li, C.; Wang, Y.; Yuan, H.; Feng, H.; Xu, B.; Yang, X. The DOM generation and precise radiometric calibration of a UAV-mounted miniature snapshot hyperspectral imager. Remote Sens. 2017, 9, 642.
20. Hakala, T.; Honkavaara, E.; Saari, H.; Mäkynen, J.; Kaivosoja, J.; Pesonen, L.; Pölönen, I. Spectral imaging from UAVs under varying illumination conditions. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2013, XL-1/W2, 189–194.
21. Aasen, H.; Bolten, A. Multi-temporal high-resolution imaging spectroscopy with hyperspectral 2D imagers–From theory to application. Remote Sens. Environ. 2018, 205, 374–389.
22. Honkavaara, E.; Khoramshahi, E. Radiometric correction of close-range spectral image blocks captured using an unmanned aerial vehicle with a radiometric block adjustment. Remote Sens. 2018, 10, 256.
23. Roosjen, P.; Suomalainen, J.; Bartholomeus, H.; Kooistra, L.; Clevers, J. Mapping reflectance anisotropy of a potato canopy using aerial images acquired with an unmanned aerial vehicle. Remote Sens. 2017, 9, 417.
24. Schneider-Zapp, K.; Cubero-Castan, M.; Shi, D.; Strecha, C. A new method to determine multi-angular reflectance factor from lightweight multispectral cameras with sky sensor in a target-less workflow applicable to UAV. Remote Sens. Environ. 2019, 229, 60–68.
25. Mafanya, M.; Tsele, P.; Botai, J.O.; Manyama, P.; Chirima, G.J.; Monate, T. Radiometric calibration framework for ultra-high-resolution UAV-derived orthomosaics for large-scale mapping of invasive alien plants in semi-arid woodlands: Harrisia pomanensis as a case study. Int. J. Remote Sens. 2018, 39, 5119–5140.
26. Agisoft PhotoScan User Manual: Standard Edition, Version 1.2. Available online: https://www.agisoft.com/pdf/photoscan_1_2_en.pdf (accessed on 5 March 2020).
27. Mamaghani, B.; Salvaggio, C. Multispectral Sensor Calibration and Characterization for sUAS Remote Sensing. Sensors 2019, 19, 4453.
28. Suomalainen, J.; Hakala, T.; Alves de Oliveira, R.; Markelin, L.; Viljanen, N.; Näsi, R.; Honkavaara, E. A Novel Tilt Correction Technique for Irradiance Sensors and Spectrometers On-Board Unmanned Aerial Vehicles. Remote Sens. 2018, 10, 2068.
29. Xu, K.; Gong, Y.; Fang, S.; Wang, K.; Lin, Z.; Wang, F. Radiometric Calibration of UAV Remote Sensing Image with Spectral Angle Constraint. Remote Sens. 2019, 11, 1291.
30. Liu, Q.; Liu, W.; Zou, L.; Wang, J.; Liu, Y. A new approach to fast mosaic UAV images. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2011, XXXVIII-1/C22, 271–276.
31. Pix4Dmapper 4.1 User Manual. Available online: https://support.pix4d.com/hc/en-us/articles/204272989-Offline-Getting-Started-and-Manual-pdf- (accessed on 20 November 2018).
32. Snavely, N.; Seitz, S.M.; Szeliski, R. Skeletal graphs for efficient structure from motion. In Proceedings of the 2008 IEEE Conference on Computer Vision and Pattern Recognition, Anchorage, AK, USA, 24–26 June 2008.
33. Kim, J.I.; Kim, T.; Shin, D.; Kim, S. Fast and robust geometric correction for mosaicking UAV images with narrow overlaps. Int. J. Remote Sens. 2017, 38, 2557–2576.
34. Rhee, S.; Kim, S.; Kim, T. A study on the possibility of using UAV stereo image for measuring tree height in urban area. Korean J. Remote Sens. 2017, 33, 1151–1157.
35. Dijkstra, E.W. A note on two problems in connexion with graphs. Numer. Math. 1959, 1, 269–271.
36. Yin, C.; Wang, H. Developed Dijkstra shortest path search algorithm and simulation. In Proceedings of the 2010 International Conference on Computer Design and Applications, Qinhuangdao, China, 25–27 June 2010.
37. Lowe, D.G. Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 2004, 60, 91–110.
38. Yoon, S.J.; Kim, T. Development of Stereo Visual Odometry Based on Photogrammetric Feature Optimization. Remote Sens. 2019, 11, 67.
39. Yoon, W.; Kim, H.G.; Rhee, S. Multi Point Cloud Integration based on Observation Vectors between Stereo Images. Korean J. Remote Sens. 2019, 35, 727–736.
40. Ray, S.F. Camera Exposure Determination. In The Manual of Photography: Photographic and Digital Imaging, 9th ed.; Jacobson, R.E., Ray, S.F., Atteridge, G.G., Axford, N.R., Eds.; Focal Press: Oxford, UK, 2000; pp. 310–335.
Figure 1. Flight trajectory of the UAV (green line) and the locations of the images acquired (red dots).
Figure 2. (a) The KD-2 fixed-wing unmanned aerial vehicle (UAV) and (b) the multispectral camera with irradiance sensor.
Figure 3. (a) Ground reference panels installed for vicarious radiometric calibration of UAV images and (b) spectral reflectance measured using a spectro-radiometer.
Figure 4. The reflectance map generated for validation: (a) a natural color composite and (b) a pseudo-infrared color composite.
Figure 5. Overall procedure for radiometric calibration of UAV images.
Figure 6. Example of a reference image with ground reference panels, and a close-up of the reference panels where blue boxes are locations of sample pixels used for vicarious calibration.
Figure 7. An example of linear regression analysis between the digital number and the reflectance for the blue band, used to estimate the vicarious calibration coefficients ($a_a$ and $b_a$) of a reference image.
Figure 8. (a) An image network in an area, and the two shortest paths from image S to F (red and green lines); (b) the number of tie points between each image pair, and (c) the weight of the image pairs and the two shortest paths (red circles and green triangles).
Figure 9. (a) Number of tie points in the two shortest paths and (b) selected optimal path from S to F (green line).
Figure 10. An example of the distribution of digital number (DN) values for all tie points (black crosses) and accepted tie points (red crosses).
Figure 11. An example of accepted tie points (red ×'s) among all tie points.
Figure 12. Distribution of validation points within the mosaicked image.
Figure 13. Reflectance map by natural color composites (a–c) and pseudo-infrared color composites (d–f) from relative calibration by the optimal path (RCOP) (top), relative calibration by acquisition sequence (RCAS) (middle), and no relative calibration (NoRC) (bottom).
Figure 14. Nine subregions of the mosaicked image (numbers indicate the line and column).
Figure 15. An example of accepted tie points for (a) red, (b) red-edge, and (c) NIR bands.
Figure 16. Distribution of DNs for an image pair in (a) red, (b) red-edge, and (c) NIR bands.
Table 1. Mean differences of reflectance between the reflectance map and the ground reference panels (columns: nominal panel reflectance).

Band          | 3%    | 5%    | 11%   | 22%   | 33%   | 44%   | 55%
Blue          | 0.018 | 0.017 | 0.031 | 0.027 | 0.033 | 0.051 | 0.024
Green         | 0.025 | 0.021 | 0.028 | 0.008 | 0.002 | 0.009 | −0.027
Red           | 0.019 | 0.016 | 0.025 | 0.010 | 0.006 | 0.016 | −0.022
Red-edge      | 0.037 | 0.034 | 0.039 | 0.016 | 0.008 | 0.009 | −0.027
Near-infrared | 0.101 | 0.092 | 0.090 | 0.046 | 0.019 | 0.001 | −0.047
Table 2. An example of the ratio, number, and R² of tie points by n (the multiple of the standard deviation).

n                    | 3    | 2.5  | 2    | 1    | 0.68 | 0.39 | 0.13
Ratio (%)            | 99   | 98   | 95   | 68   | 50   | 30   | 10
Number of tie points | 1135 | 1087 | 1066 | 979  | 884  | 682  | 275
R²                   | 0.93 | 0.97 | 0.97 | 0.98 | 0.98 | 0.98 | 0.98
Table 3. Exposure values (EVs) in the order of blue (B), green (G), red (R), red-edge (RE), and near-infrared (NIR) for the nine subregions.

Subregion Number | EVs for B, G, R, RE, NIR
(1, 1)           | 13.1, 12.5, 13.6, 13.6, 13.3
(1, 2)           | 12.8, 13.1, 12.4, 13.2, 13.3
(1, 3)           | 12.8, 13.3, 12.3, 12.7, 13.4
(2, 1)           | 13.2, 12.4, 13.5, 13.5, 13.4
(2, 2)           | 12.6, 12.8, 13.6, 13.5, 13.1
(2, 3)           | 12.8, 13.3, 12.3, 12.7, 13.5
(3, 1)           | 12.6, 12.5, 13.5, 13.5, 13.5
(3, 2)           | 12.0, 12.5, 13.1, 13.5, 13.3
(3, 3)           | 12.6, 13.0, 12.4, 13.7, 13.4
Table 4. Root mean square error (RMSE) of the three calibration methods against the validation data.

Band     | RCOP  | RCAS  | NoRC
Blue     | 0.031 | 0.084 | 0.060
Green    | 0.033 | 0.138 | 0.054
Red      | 0.037 | 0.096 | 0.050
Red-edge | 0.062 | 0.167 | 0.051
NIR      | 0.101 | 0.343 | 0.094
Table 5. RMSE of the three calibration methods by land-cover type.

Band     | Land-Cover Type | RCOP  | RCAS  | NoRC
Blue     | Vegetation      | 0.032 | 0.080 | 0.060
Blue     | Bare soil       | 0.029 | 0.083 | 0.067
Blue     | Concrete        | 0.030 | 0.097 | 0.057
Green    | Vegetation      | 0.034 | 0.135 | 0.055
Green    | Bare soil       | 0.032 | 0.134 | 0.056
Green    | Concrete        | 0.033 | 0.154 | 0.051
Red      | Vegetation      | 0.039 | 0.165 | 0.051
Red      | Bare soil       | 0.037 | 0.089 | 0.057
Red      | Concrete        | 0.031 | 0.105 | 0.042
Red-edge | Vegetation      | 0.065 | 0.095 | 0.051
Red-edge | Bare soil       | 0.057 | 0.156 | 0.057
Red-edge | Concrete        | 0.059 | 0.186 | 0.048
NIR      | Vegetation      | 0.107 | 0.342 | 0.092
NIR      | Bare soil       | 0.094 | 0.347 | 0.087
NIR      | Concrete        | 0.093 | 0.336 | 0.101
