Article

Data Fusion of Scanned Black and White Aerial Photographs with Multispectral Satellite Images

by Dimitris Kaimaris 1,*, Petros Patias 2, Giorgos Mallinis 3 and Charalampos Georgiadis 4

1 School of Spatial Planning and Development (Eng.), Aristotle University of Thessaloniki, GR-541 24 Thessaloniki, Greece
2 School of Rural & Surveying Engineering, Aristotle University of Thessaloniki, GR-541 24 Thessaloniki, Greece
3 Department of Forestry and Management of the Environment and Natural Resources, Democritus University of Thrace, GR-68200 Orestiada, Greece
4 School of Civil Engineering, Aristotle University of Thessaloniki, GR-541 24 Thessaloniki, Greece
* Author to whom correspondence should be addressed.
Submission received: 10 May 2019 / Accepted: 2 June 2019 / Published: 23 April 2020

Abstract:
To date, countless satellite image fusions have been performed, mainly at a panchromatic-to-multispectral spatial resolution ratio of 1/4; fewer fusions at lower ratios, and, relatively recently, fusions at much higher spatial resolution ratios have been published. Apart from this, a small number of publications study the fusion of aerial photographs with satellite images; in these, the year of image acquisition varies and the acquisition dates are not mentioned. In addition, in these publications, either no quantitative controls are performed on the composite images produced, or the aerial photographs are recent and in color and only the RGB bands of the satellite images are used for data fusion. The objective of this paper is to study the addition of multispectral information from satellite images to black and white aerial photographs of the 1980s (1980–1990) with a small difference (just a few days) in acquisition date, within the same year and season. Quantitative tests are performed in two case studies and the results are encouraging: the classification accuracy of the features and objects of the Earth’s surface is improved, and the automatic digital extraction of their form and shape from the archived aerial photographs becomes possible. This opens up a new field of use for black and white aerial photographs and archived multispectral satellite images of the same period in a variety of applications, such as mapping the temporal changes of cities, forests and archaeological sites.

1. Introduction

Data fusion is the result of combining two or more images and incorporating their information content so that the new composite image contains more information than was originally captured by either sensor. Image fusion methods and techniques are used to create a high spatial resolution image that attempts to maintain the spectral information of the lower spatial resolution original data. In the composite image, the accuracy of the geometric correction can be improved and intertemporal changes can be better defined; in addition, optimal visual interpretation and classification are made possible. Some wider areas of application are cartography, the environment, urban planning, town planning, etc. [1,2,3,4,5,6,7,8].
In general, satellite imagery providers have supplied, or still supply, multispectral (MS) images with a spatial resolution four times lower than the panchromatic (PAN) (e.g., Ikonos-2 at nadir: 1 m PAN, 4 m MS; QuickBird-2 at nadir: 0.65 m PAN, 2.6 m MS; WorldView-4 at nadir: 0.31 m PAN, 1.24 m MS). Numerous data fusion images have been created so far at this (1/4) spatial resolution ratio [9,10,11]. Data fusion images with a smaller spatial resolution ratio (e.g., 1/3) have also been created, but these come from different providers of satellite data, e.g., SPOT 10 m PAN with Landsat TM 30 m MS [12]. Finally, relatively recently, there have been image fusion attempts with a much higher spatial resolution ratio (e.g., 1/60), also from different satellite data providers, for example, EROS B 0.7 m PAN with Landsat-8 30 m MS [13] and WorldView-2 0.5 m PAN with Landsat-8 30 m MS [14]. For all the above-mentioned image fusion cases, different methodologies and techniques have been developed over time, which in many cases gave the same or even better results (in the sense of maintaining as much of, or more of, the initial spectral information) than previous image fusion methodologies and techniques.
Data fusion can be performed at three different levels: pixel level, feature level and decision level [15]. Most methodologies are based on techniques applied at the pixel level, the most important of which are Transformation Based Fusion (Principal Component Analysis/PCA), the Additive and Multiplicative Technique (Brovey Transform, Multiplicative Technique, Color Normalized Transformation), the Wavelet Method, the Filter Fusion Method (High-Pass Filter Fusion Method, Smoothing Filter-based Intensity Modulation), and Fusion Based on Interband Relation (Regression Fusion, Look-up-table Fusion) [6,14,16].
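As a concrete illustration of pixel-level component-substitution fusion, the following Python/NumPy sketch implements a minimal PCA-based merge (our own illustrative code, not the implementation of any specific software package): the MS bands, already resampled to the PAN grid, are transformed to principal components, the first component is replaced by a mean/std-matched PAN band, and the transform is inverted.

```python
import numpy as np

def pca_fusion(ms: np.ndarray, pan: np.ndarray) -> np.ndarray:
    """PCA-based component-substitution fusion (illustrative sketch).

    ms  -- MS cube already resampled to the PAN grid, shape (rows, cols, bands)
    pan -- high-resolution panchromatic image, shape (rows, cols)
    """
    r, c, b = ms.shape
    X = ms.reshape(-1, b).astype(float)
    mean = X.mean(axis=0)
    Xc = X - mean
    # eigen-decomposition of the band covariance gives the principal components
    eigval, eigvec = np.linalg.eigh(np.cov(Xc, rowvar=False))
    eigvec = eigvec[:, np.argsort(eigval)[::-1]]  # sort by decreasing variance
    pcs = Xc @ eigvec
    # match the PAN band's mean/std to the first component, then swap it in
    p = pan.reshape(-1).astype(float)
    pcs[:, 0] = (p - p.mean()) / (p.std() + 1e-12) * pcs[:, 0].std() + pcs[:, 0].mean()
    # inverse transform back to band space
    return (pcs @ eigvec.T + mean).reshape(r, c, b)
```

Swapping only the first component is what lets PCA fusion retain most of the MS spectral content: the remaining components, which carry the inter-band (color) differences, are left untouched.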
There are, however, only a few publications on the data fusion of aerial photographs with satellite images. In these publications, the acquisition dates of the aerial photographs and the satellite images differ by 2 to 4 years [17,18,19], with the result that there are smaller or larger differences in the features and objects of the Earth’s surface that are mapped. No acquisition dates are given, although this information is quite important, as different capture times introduce errors when fusing images. In addition, either quantitative controls are not performed in these tests [17], or the aerial photographs are recent and in color [18], or only the RGB bands are used for satellite image fusion [19]. Therefore, it can be stated that the addition of multispectral (MS) information from a satellite image to a black and white aerial photograph of the second half of the 20th century (1950–1999), with a small difference (just a few days) in acquisition date, within the same year and season, has not been studied in detail to date. A feasibility study of the effectiveness of this idea will open up a new field of use for aerial photographs and satellite images.
The geometry of aerial photographs and satellite images differs, whether they are recent or older. For example, black and white aerial photographs are central projections, that is, they result from the projection of rays from the Earth’s surface through the center of the lens onto the film plate. In the case of satellite images such as those of Landsat 5, where each pixel of the image is gradually captured by the rotation of the sensor mirror, the image is created by thousands of central projections. In addition, the acquisition altitude of aerial photographs and satellite images differs by hundreds of kilometers. Black and white aerial photographs provide no spectral information beyond the visible, but have a particularly high spatial resolution. As a result, it is not possible to apply classification algorithms to such an image and, consequently, it is impossible to automatically extract the land covers of the Earth’s surface. In contrast, many satellite images (such as those of Landsat 5) have bands with a variety of spectral information, but poor spatial resolution. In these images, classification can be performed, but, due to the poor spatial resolution, pixel-level generalizations are large. Therefore, the question that arises concerns the possibility of re-using archived black and white aerial photographs after they have been enhanced with the spectral information of archived satellite images. Obviously, it is important that the acquisition dates are identical or differ by just a few days, and that the images were acquired in the same year. Thus, the fusion of these archived data and the results of their classification are the objectives of this paper.

2. Data

Black and white (B/W) aerial photographs (high-sensitivity film, 250 lp/mm) of 1987 and 1990, covering the Sparta and Pyrgos areas respectively (Figure 1 and Figure 2), were used at a scale of 1:20,000 (Table 1) and scanned at 1200 dpi. In both areas the aerial photographs include urban and sub-urban zones, while the locations were selected as the outcome of the authors’ search for data with a small difference in acquisition date (just a few days), within the same year and season. In each study area the aerial photographs belong to one strip and have an overlap of 60%. All aerial photographs were accompanied by camera calibration data (Camera Calibration RMK A 15/23 and Camera Calibration RMK A). Moreover, atmospherically corrected Landsat 5 satellite images of 1987 (Figure 3) and 1990 were collected for the respective geographic areas (U.S. Geological Survey [20]). The 1987 aerial photographs were captured 7 days before, while the 1990 aerial photographs were taken 1 day after, the corresponding satellite images. Finally, for the aerial triangulation, Ground Control Points (GCPs) and Check Points (CPs) were used, whose horizontal coordinates were collected from the Hellenic Cadastre [21], i.e., the official public provider of cartographic base maps in Greece, with a horizontal accuracy of 1 m. The corresponding altitude information was collected from the DTM (5 × 5 m point grid, 1.5 m vertical accuracy), also from the Hellenic Cadastre.
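The nominal ground pixel size implied by these figures can be checked with a short calculation (an illustrative sketch; the helper name is ours): a 1200 dpi scan of a 1:20,000 photograph yields a ground sample distance of about 0.42 m, consistent with the 0.5 m resolution chosen for the orthophoto mosaics below.

```python
def scanned_photo_gsd(scan_dpi: float, photo_scale: float) -> float:
    """Ground sample distance (m) of a scanned analogue aerial photograph.

    scan_dpi    -- scanner resolution in dots per inch
    photo_scale -- scale denominator (e.g., 20000 for 1:20,000)
    """
    pixel_on_film_mm = 25.4 / scan_dpi              # one pixel on the film, in mm
    return pixel_on_film_mm / 1000.0 * photo_scale  # projected onto the ground, in m

print(round(scanned_photo_gsd(1200, 20000), 3))  # 0.423
```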

3. Methodology and Processing of Data/Products

After collecting the necessary data, the aerial triangulations of the aerial photographs (for the production of orthophoto mosaics) and the geometric corrections of the satellite images are carried out in the study areas. These are followed by the fusion of the orthophoto mosaics of the aerial photographs with the Landsat 5 orthoimages, as well as a quality check (correlation tables) of the fused spectral information of the multispectral (MS) satellite images. Finally, classifications performed both on the composite and on the MS orthoimages are checked for their capability to provide accurate area measurements of both built and open surfaces.

3.1. Aerial Triangulation

Aerial triangulation was performed with the Leica Photogrammetry Suite (LPS) of Erdas Imagine©. Using the camera calibration files, the interior orientations of the aerial photographs were restored, and 13 GCPs and 5 CPs were selected in the two study areas (Figure 4a and Figure 5a). A Digital Surface Model (DSM) with a spatial resolution of 5 m and the orthophoto mosaics (Figure 4b and Figure 5b) of the study areas with a spatial resolution of 0.5 m were produced.
After the completion of the aerial triangulations, the CPs were used to calculate the estimates of the differences $\hat{\Delta}_x$ and $\hat{\Delta}_y$ and the standard deviations $\hat{\sigma}_X$ and $\hat{\sigma}_Y$ (Table 2).
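These check-point statistics can be reproduced with a few lines of NumPy. This is a sketch under the assumption that the difference estimate is the mean absolute coordinate difference at the CPs and the standard deviation estimate is the sample standard deviation of those differences; the function name is ours.

```python
import numpy as np

def checkpoint_stats(xy_ortho: np.ndarray, xy_cp: np.ndarray):
    """Accuracy indices from check points.

    xy_ortho -- (n, 2) CP coordinates measured on the orthoimage
    xy_cp    -- (n, 2) reference (surveyed) CP coordinates
    Returns (dx_hat, dy_hat, sigma_x, sigma_y) in the input units.
    """
    diff = np.abs(xy_ortho - xy_cp)              # per-CP absolute differences
    dx_hat, dy_hat = diff.mean(axis=0)           # mean absolute difference per axis
    sigma_x, sigma_y = diff.std(axis=0, ddof=1)  # sample standard deviation per axis
    return dx_hat, dy_hat, sigma_x, sigma_y
```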

3.2. Geometric Correction of Satellite Images

For the collection of GCPs for the geometric correction of the satellite images (image resection using Erdas Imagine©), the products of the aerial photograph processing, namely the orthophoto mosaics and DSMs, were used. In essence, the satellite image was registered to another image (the orthophoto mosaic of the aerial photographs). In each study area, 10 GCPs and 5 CPs were used. After the geometric corrections, the CPs were used to calculate the estimates of the differences $\hat{\Delta}_x$ and $\hat{\Delta}_y$ and the estimates of the standard deviations $\hat{\sigma}_X$ and $\hat{\sigma}_Y$ (Table 2).
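The registration itself was done in Erdas Imagine©; as a simplified stand-in for illustration, a first-order polynomial (affine) mapping can be estimated from the GCP pairs by least squares (function names are ours):

```python
import numpy as np

def fit_affine(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Least-squares first-order polynomial (affine) mapping src -> dst.

    src, dst -- (n, 2) GCP coordinates in the uncorrected image and in the
    reference orthophoto mosaic, respectively (n >= 3).
    """
    A = np.hstack([src, np.ones((len(src), 1))])      # design rows: [x, y, 1]
    coeffs, *_ = np.linalg.lstsq(A, dst, rcond=None)  # (3, 2) coefficient matrix
    return coeffs

def apply_affine(coeffs: np.ndarray, pts: np.ndarray) -> np.ndarray:
    """Apply a fitted affine transform to (n, 2) points."""
    return np.hstack([pts, np.ones((len(pts), 1))]) @ coeffs
```

With 10 GCPs the system is overdetermined, so the residuals of the fit already give a first indication of the geometric quality before the CPs are checked.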

3.3. Fusion of Images

The fusion of the orthophoto mosaics of the aerial photographs with the Landsat 5 orthoimages, as well as the quality check of the fused spectral information of the multispectral (MS) satellite images, was carried out in Erdas Imagine©. Figure 6 and Figure 7 show the b/w orthophoto mosaics (Figure 6a and Figure 7a) and the multispectral satellite images (Figure 6b and Figure 7b) in rectangular sections of the study areas. Initially, Resolution Merge was performed using the Principal Component Analysis method, Bilinear Interpolation and the Output 8 bit data reconstruction technique, producing a data fusion image for each study area (Figure 6c,d and Figure 7c,d). The evaluation of a fused image is based on qualitative (visual) analysis and quantitative (statistical) analysis. The qualitative, visual analysis is subjective and directly related to the experience of the fused image creator (e.g., whether more details are recognizable in the image, or whether colors and contrasts are preserved) [7]. The quantitative, statistical analysis is objective and based on the spectral analysis of the images. The most commonly used measure is the correlation coefficient between the original bands of the MS image and the corresponding bands of the fused image. Correlation coefficient values range from −1 to 1. Usually, the values between the corresponding bands of the two images (MS and fused) must lie between 0.9 and 1 for the fused image to be usable for, e.g., the successful classification of the Earth’s surface covers and objects [7,22,23,24]. In order to create the correlation tables (Table 3 and Table 4) of the ortho multispectral satellite images with the composite images, the composite images were spatially degraded by a factor of 60 (image degradation) to match the spatial resolution of the MS images.
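The degradation-and-correlation check can be sketched as follows (our own illustrative code): the fused band is block-averaged back to the MS pixel size, and a Pearson correlation coefficient is computed against the original MS band.

```python
import numpy as np

def degrade(img: np.ndarray, factor: int) -> np.ndarray:
    """Block-average an image by an integer factor (spatial degradation)."""
    r, c = img.shape
    r2, c2 = (r // factor) * factor, (c // factor) * factor
    blocks = img[:r2, :c2].reshape(r2 // factor, factor, c2 // factor, factor)
    return blocks.mean(axis=(1, 3))

def band_correlation(ms_band: np.ndarray, fused_band: np.ndarray, factor: int) -> float:
    """Pearson correlation between an MS band and the fused band degraded
    back to the MS resolution (factor = resolution ratio, e.g., 60)."""
    d = degrade(fused_band.astype(float), factor)
    return float(np.corrcoef(ms_band.ravel(), d.ravel())[0, 1])
```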

3.4. Classifications and Area Measurements

The classifications that follow, both on the composite and on the MS orthoimages, are checked for their capability to provide correct area measurements of built and open surfaces within smaller geographical sub-areas (Figure 6c compared with Figure 8a, and Figure 7c with Figure 9a). The use of smaller geographic areas avoids the generalizations that become more apparent in geographically larger areas. As reference data for comparison, the results of photointerpretation and manual digitization of the composite images are used.
For the classification of the composite and MS satellite orthoimages [25], unsupervised classification (a pixel-based technique) with the ISODATA classifier was used. Thirty-five classes (chosen by estimation) were generated, then grouped into regions of built and open surface (Figure 8c,d and Figure 9c,d) and, finally, their areas were automatically calculated (Table 5). The above processing was performed with Erdas Imagine©.
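The unsupervised labelling and the subsequent grouping into built/open areas can be sketched as follows. Note that this uses plain k-means as a rough stand-in for ISODATA (ISODATA additionally splits and merges clusters during iteration); all names are ours.

```python
import numpy as np

def kmeans_classify(pixels: np.ndarray, k: int, iters: int = 20) -> np.ndarray:
    """Plain k-means labelling of (n, bands) pixel vectors (ISODATA stand-in)."""
    # farthest-point initialisation keeps the sketch deterministic
    centers = [pixels[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(pixels - c, axis=1) for c in centers], axis=0)
        centers.append(pixels[d.argmax()])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        dist = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):               # guard against empty clusters
                centers[j] = pixels[labels == j].mean(axis=0)
    return labels

def class_areas(labels: np.ndarray, groups: dict, pixel_size: float) -> dict:
    """Area (m^2) per grouped class; groups maps group name -> set of labels."""
    cell = pixel_size ** 2
    return {g: int(np.isin(labels, list(ids)).sum()) * cell for g, ids in groups.items()}
```

Grouping the 35 raw spectral classes into the two thematic groups then reduces to listing which labels belong to "built" and which to "open" before calling `class_areas`.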
In order to determine the actual areas of the built and open surfaces, the composite images were imported into ArcGIS©, where the digitization (Figure 8e and Figure 9e) and the automated calculation of their areas (Table 5) were performed.
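The area of a digitized polygon, as computed by any GIS, follows the shoelace formula; a minimal illustrative version:

```python
def polygon_area(vertices) -> float:
    """Area of a simple polygon from its vertex coordinates (shoelace formula).

    vertices -- sequence of (x, y) pairs in map units (e.g., metres)
    """
    area = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]  # wrap around to close the polygon
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0
```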

4. Results

The wider areas of two Greek cities are studied in this paper in order to check the effectiveness of fusing satellite images with b/w aerial photographs that have a small difference (just a few days) in acquisition date, within the same year and season, in areas with diverse land covers.
The geometric accuracy of the produced orthophoto mosaics and orthoimages is presented in Table 2. In particular, the achieved standard errors on the X and Y axes range from 0.4 to 1.8 m (estimates of the differences: 1.0–2.4 m) for the orthophoto mosaics of the aerial photographs, and from 3.6 to 5.7 m (estimates of the differences: 7.2–9.5 m) for the satellite orthoimages.
Satellite image providers supply MS images with a spatial resolution four times lower than the panchromatic. This was simulated by spatially degrading the orthophoto mosaics of the aerial photographs from 0.5 m to 7.5 m (one quarter of the 30 m spatial resolution of the MS image) and then carrying out the image fusion. The correlation tables created show that the rates of retention of the original spectral information of the ortho MS images in the composite images are similar to (if not better than) the corresponding percentages obtained without initial degradation of the orthophoto mosaics, which are presented in this paper. For this reason, the case of initial degradation of the orthophoto mosaics was not analyzed further in the paper.
In addition, during the fusion of the images (with or without initial degradation of the orthophoto mosaics of the aerial photographs), methods other than Principal Component Analysis (Transformation Based Fusion) were also used, such as the Multiplicative and Brovey Transform methods (Additive and Multiplicative Technique) [6,26,27,28,29,30,31,32,33,34]; these did not provide better results in terms of maintaining the spectral information and were therefore not analyzed further in the paper.
According to the correlation table (Table 3) for the wider area of the city of Sparta, the transfer of spectral information from the MS satellite orthoimage to the composite image is satisfactory, at 70–85% (except for NIR, at 53%). For the wider area of the city of Pyrgos, the transfer of spectral information from the MS satellite orthoimage to the composite image is likewise satisfactory, at 74–82%. Consequently, although the rates do not reach the ideal 90–100%, they are generally satisfactory and the composite images may be used for further classification.
Table 5 shows that in the case of the composite image, the overall estimate of built surfaces is about 20% underestimated, while the overall estimate of open surfaces is about 6% overestimated. In the case of the original MS satellite images, the overall estimate of built surfaces is approximately 31% overestimated, while the overall estimate of open surfaces is about 9% underestimated. The important conclusion is that the calculated areas of both the built and the open surfaces are much closer to reality in the case of the classification of the composite images. In addition, it is of major importance that, in the case of composite image classification, it is possible to automatically extract the geometry/shape of the Earth’s objects (compare Figure 8c with Figure 8e and Figure 9c with Figure 9e), which is not possible with the classification of the Landsat 5 satellite images (compare Figure 8d with Figure 8e and Figure 9d with Figure 9e). The above is obviously due to the very good spatial resolution of the composite images compared with the lower spatial resolution of the Landsat 5 images.
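The over/underestimation percentages quoted above are simple signed relative errors of the classified areas against the digitized reference areas; for clarity (illustrative helper, name ours):

```python
def area_error_pct(classified: float, reference: float) -> float:
    """Signed percentage error of a classified area against the digitised
    reference area (positive = overestimate, negative = underestimate)."""
    return (classified - reference) / reference * 100.0
```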

5. Discussion

Few publications have addressed the data fusion of aerial photographs with satellite images. In addition to their scarcity, the acquisition dates of the aerial photographs differ from those of the satellite images by 2 to 4 years. This clearly results in differences, small or large, in the features and objects of the Earth’s surface that are documented, causing difficulties and distortions during their study. Moreover, the acquisition dates are not mentioned, despite their importance, as different acquisition periods introduce errors in the fusion of the images. Lastly, three issues emerge, not necessarily simultaneously: only the RGB bands are used for satellite image fusion, no additional quantitative controls are conducted, and the aerial photographs are recent (21st century) and in color.
No new technique for image fusion is presented in this paper; however, for the first time in the international literature, the addition of multispectral information from satellite images to b/w aerial photographs with a small difference (just a few days) in acquisition date, within the same year and season, is presented.
In fact, in several cases of fusing modern PAN and MS images from the same satellite system, the acceptable limits for retaining the original MS spectral information in the composite image are either not met or only marginally met. All the more so when the images come from sensors as different as those in this paper, it could be expected that the above limits would not be met. Therefore, it is of primary interest to determine the percentage of spectral information retained in the composite image, and whether this percentage permits the classification of the new images, the digital identification of basic structures of the terrestrial surface (built and non-built areas) and the extraction of quantitative data. The results reached a satisfactory level of acceptance and, consequently, research on new techniques and methodologies for their improvement can now begin.

6. Conclusions

The paper highlights a new possibility for the use of archived b/w aerial photographs, since their spectral information can be enriched. It is a prerequisite that the corresponding multispectral satellite data of the same area have the same acquisition date as the aerial photographs, or differ from it by just a few days. New possibilities for the use of archived Landsat 5 satellite images are also highlighted.
In the past, flights for b/w aerial photographs served studies on cadastre, forestry, town planning, spatial planning, etc. Each flight was accompanied by the acquisition of hundreds of aerial photographs, which nowadays allow the enrichment with multispectral data of orthophoto mosaics that map spatially large areas. Therefore, the extraction of temporal information, e.g., of built and open areas over hundreds of square kilometers of land, will no longer be the result of a continuous process of photointerpretation and rendering of orthophoto mosaics, but of automatic classification. Finally, a variety of new applications opens up, such as the identification of the temporal changes of cities, forests, archaeological sites, etc.
Obviously, the data fusion of high spatial resolution images with low spatial resolution ones, as in our case, is accompanied by the problem of the large ratio of their spatial resolutions, which often causes spectral distortions. However, as the first results are encouraging, the development of optimizations to minimize these spectral distortions is now possible.

Author Contributions

D.K. proposed the core concept of the paper. The work presented in this paper was carried out in collaboration between all authors. All authors approved the submitted manuscript. All authors contributed to the scientific content, the processing of data, the interpretation of the results and manuscript revisions. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Panda, C.B. Remote Sensing. Principles and Applications in Remote Sensing, 1st ed.; Viva Books: New Delhi, India, 1995; pp. 234–267.
  2. Schowengerdt, R.A. Remote Sensing: Models and Methods for Image Processing, 2nd ed.; Academic Press: Orlando, FL, USA, 1997.
  3. Bethune, S.; Muller, F.; Donnay, P.J. Fusion of multi-spectral and panchromatic images by local mean and variance matching filtering techniques. In Proceedings of the Second International Conference on Fusion of Earth Data, Nice, France, 28–30 January 1998; pp. 31–36.
  4. Wald, L. Some terms of reference in data fusion. IEEE Trans. Geosci. Remote Sens. 1999, 37, 1190–1193.
  5. Gonzalez, R.; Woods, R. Digital Image Processing, 2nd ed.; Prentice Hall: Upper Saddle River, NJ, USA, 2002.
  6. Choodarathnakara, L.A.; Ashok Kumar, T.; Koliwad, S.; Patil, G.C. Assessment of Different Fusion Methods Applied to Remote Sensing Imagery. Int. J. Comput. Sci. Inf. Technol. 2012, 3, 5447–5453.
  7. Fonseca, L.; Namikawa, L.; Castejon, E.; Carvalho, L.; Pinho, C.; Pagamisse, A. Image Fusion for Remote Sensing Applications. In Image Fusion and Its Applications, 1st ed.; Zheng, Y., Ed.; IntechOpen: Rijeka, Croatia, 2011; pp. 153–178.
  8. Shi, W.; Zhu, C.; Tian, Y.; Nichol, J. Wavelet-based image fusion and quality assessment. Int. J. Appl. Earth Obs. Geoinf. 2005, 6, 241–251.
  9. Zhang, H.K.; Huang, B. A new look at image fusion methods from a Bayesian perspective. Remote Sens. 2015, 7, 6828–6861.
  10. Helmy, A.K.; El-Tawel, G.S. An integrated scheme to improve pan-sharpening visual quality of satellite images. Egypt. Inf. J. 2015, 16, 121–131.
  11. Jelének, J.; Kopačková, V.; Koucká, L.; Mišurec, J. Testing a modified PCA-based sharpening approach for image fusion. Remote Sens. 2016, 8, 794.
  12. Chavez, P.S.; Sides, S.C.; Anderson, J.A. Comparison of three different methods to merge multiresolution and multispectral data: Landsat TM and SPOT Panchromatic. Photogramm. Eng. Remote Sens. 1991, 57, 295–303.
  13. Fryskowska, A.; Wojtkowska, M.; Delis, P.; Grochala, A. Some Aspects of Satellite Imagery Integration from EROS B and LANDSAT 8. In Proceedings of the International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Prague, Czech Republic, 12–19 July 2016; pp. 647–652.
  14. Grochala, A.; Kedzierski, M. A Method of Panchromatic Image Modification for Satellite Imagery Data Fusion. Remote Sens. 2017, 9, 639.
  15. Pohl, C.; Van Genderen, J.L. Multisensor image fusion in remote sensing: Concepts, methods and applications. Int. J. Remote Sens. 1998, 19, 823–854.
  16. Aiazzi, B.; Baronti, S.; Selva, M. Improving component substitution pansharpening through multivariate regression of MS + Pan data. IEEE Trans. Geosci. Remote Sens. 2007, 45, 3230–3239.
  17. Erdogan, M.; Maras, H.H.; Yilmaz, A.; Özerbil, T.Ö. Resolution merge of 1:35000 scale aerial photographs with Landsat 7 ETM imagery. In Proceedings of the International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Beijing, China, 3–11 July 2008; Volume XXXVII, Part B7, pp. 1281–1286.
  18. Stabile, M.; Odeh, I.; McBratney, A. Fusion of high-resolution aerial orthophoto with Landsat TM image for improved object-based land-use classification. In Proceedings of the 30th Asian Conference on Remote Sensing 2009 (ACRS 2009), Beijing, China, 18–23 October 2009; pp. 114–119.
  19. Siok, K.; Jenerowicz, A.; Woroszkiewicz, M. Enhancement of spectral quality of archival aerial photographs using satellite imagery for detection of land cover. J. Appl. Remote Sens. 2017, 11, 036001.
  20. LandsatLook Viewer. Available online: https://landsatlook.usgs.gov/ (accessed on 29 May 2019).
  21. Hellenic Cadastre, Ortho images. Available online: http://gis.ktimanet.gr/wms/ktbasemap/default.aspx (accessed on 29 May 2019).
  22. Wald, L.; Ranchin, T.; Mangolini, M. Fusion of satellite images of different spatial resolutions: Assessing the quality of resulting images. Photogramm. Eng. Remote Sens. 1997, 63, 691–699.
  23. Ranchin, T.; Aiazzi, B.; Alparone, L.; Baronti, S.; Wald, L. Image fusion-The ARSIS concept and some successful implementation schemes. ISPRS J. Photogramm. Remote Sens. 2003, 58, 4–18.
  24. Selva, M.; Santurri, L.; Baronti, S. On the Use of the Expanded Image in Quality Assessment of Pansharpened Images. IEEE Geosci. Remote Sens. Lett. 2018, 15, 320–324.
  25. Li, M.; Zang, S.; Zhang, B.; Li, S.; Wu, C. A Review of Remote Sensing Image Classification Techniques: The Role of Spatio-contextual Information. Eur. J. Remote Sens. 2014, 47, 389–411.
  26. Gillespie, A.R.; Kahle, A.B.; Walker, E.R. Color enhancement of highly correlated images-II. Channel ratio and ‘chromaticity’ transformation techniques. Remote Sens. Environ. 1987, 22, 343–365.
  27. Liu, J.G.; Moore, J.M. Pixel block intensity modulation: Adding spatial detail to TM band 6 thermal imagery. Int. J. Remote Sens. 1998, 19, 2477–2491.
  28. Zhang, Y. A new merging method and its spectral and spatial effects. Int. J. Remote Sens. 1999, 20, 2003–2014.
  29. Liu, J.G. Smoothing filter-based intensity modulation: A spectral preserve image fusion technique for improving spatial details. Int. J. Remote Sens. 2000, 21, 3461–3472.
  30. González-Audícana, M.; Saleta, J.L.; Catalán, G.R.; García, R. Fusion of multispectral and panchromatic images using improved IHS and PCA mergers based on wavelet decomposition. IEEE Trans. Geosci. Remote Sens. 2004, 42, 1291–1299.
  31. Wang, Z.; Ziou, D.; Armenakis, C. A Comparative Analysis of Image Fusion Methods. IEEE Trans. Geosci. Remote Sens. 2005, 43, 1391–1402.
  32. Helmy, K.A.; Nasr, H.A.; El-Taweel, S.G. Assessment and Evaluation of Different Data Fusion Techniques. Int. J. Comput. 2010, 4, 107–115.
  33. Susheela, D.; Pradeep, K.G.; Mahesh, K.J. A comparative study of various pixel based image fusion techniques as applied to an urban environment. Int. J. Image Data Fusion 2013, 4, 197–213.
  34. Jong-Song, J.; Jong-Hun, C. Application Effect Analysis of Image Fusion Methods for Extraction of Shoreline in Coastal Zone Using Landsat ETM+. Atmos. Ocean. Sci. 2017, 1, 1–6.
Figure 1. Map of Greece with the location of Sparta and Pyrgos.
Figure 2. The aerial photographs of the Sparta area. Flight direction left to right, west to east.
Figure 3. The Landsat 5 satellite image, 10/06/1987, 30 m, B-G-NIR.
Figure 4. (a) The distribution of GCPs and CPs in the wider area of the city of Sparta. (b) The orthophoto mosaic of aerial photographs.
Figure 5. (a) The DSM of the wider area of the city of Pyrgos. (b) The orthophoto mosaic of aerial photographs.
Figure 6. Wider area of Sparta. (a) Rectangular section of the orthophoto mosaic of the aerial photographs. (b) The corresponding area of the Landsat 5 orthoimage. (c) The composite image (B-G-R). (d) The composite image (B-G-NIR).
Figure 7. Wider area of Pyrgos. (a) Rectangular section of the orthophoto mosaic of the aerial photographs. (b) The corresponding area of the Landsat 5 orthoimage. (c) The composite image (B-G-R). (d) The composite image (B-G-NIR).
Figure 8. (a) Part of the original composite image (B-G-R) of Sparta’s wider area, including built and open surfaces, urban and agricultural. (b) The corresponding part of the MS Landsat 5 orthoimage (B-G-R). (c) Classification of the composite image. (d) Classification of the MS Landsat 5 orthoimage. (e) Digitization of the composite image in ArcGIS©.
Figure 8. (a) Part of the original composite image (B-G-R) of Sparta’s wider area. It includes a built and open surface, urban and agricultural. (b) The corresponding part of the orthoimage MS Landsat 5 (B-G-R). (c) Classification of the composite image. (d) Classification of the orthoimage MS Landsat 5. (e) Digitization of the composite image in ArcGIS©.
Sci 02 00029 g008
Figure 9. (a) Part of the original composite image (B-G-R) of Pyrgos' wider area, including built and open surfaces, both urban and agricultural. (b) The corresponding part of the MS Landsat 5 orthoimage (B-G-R). (c) Classification of the composite image. (d) Classification of the MS Landsat 5 orthoimage. (e) Digitization of the composite image in ArcGIS©.
Table 1. Characteristics of aerial photographs and satellite images.

| Data | Location | Number of Images | Date of Capture | Spectral Resolution | Spatial Resolution | Radiometric Resolution |
|---|---|---|---|---|---|---|
| Aerial photographs | Sparta | 5 | 03/06/1987 | b/w, visible spectrum | 0.50 m | 8 bit |
| Aerial photographs | Pyrgos | 5 | 29/08/1990 | b/w, visible spectrum | 0.50 m | 8 bit |
| Satellite images Landsat 5 | Sparta | 1 | 10/06/1987 | 6 bands: R-G-B-NIR-SWIR1-SWIR2 | 30 m | 8 bit |
| Satellite images Landsat 5 | Pyrgos | 1 | 28/08/1990 | 6 bands: R-G-B-NIR-SWIR1-SWIR2 | 30 m | 8 bit |
Table 2. Spatial accuracy test of the geometrically corrected images (measurement units: meters).

The estimated indices are

$\hat{\Delta}_x = \frac{\sum_{i=1}^{n} \delta x_i}{n} = \frac{\sum_{i=1}^{n} \left| x_{ORTHO,i} - x_{CP,i} \right|}{n}, \qquad \hat{\Delta}_y = \frac{\sum_{i=1}^{n} \delta y_i}{n} = \frac{\sum_{i=1}^{n} \left| y_{ORTHO,i} - y_{CP,i} \right|}{n}$

$\hat{\sigma}_X = \sqrt{\frac{1}{n-1} \sum_{i=1}^{n} \left( \delta x_i - \hat{\Delta}_x \right)^2}, \qquad \hat{\sigma}_Y = \sqrt{\frac{1}{n-1} \sum_{i=1}^{n} \left( \delta y_i - \hat{\Delta}_y \right)^2}$

where $\delta x_i$ is the difference of CP $i$ along the X axis between the orthoimage and the actual value, $x_{ORTHO,i}$ the X coordinate of CP $i$ in the orthoimage, $x_{CP,i}$ its actual X coordinate, and $n$ the number of observations (= 5); the Y-axis quantities are defined analogously.

| Estimated Index (m) | Orthophoto Mosaic (Sparta) | Orthophoto Mosaic (Pyrgos) | Satellite Orthoimagery (Sparta) | Satellite Orthoimagery (Pyrgos) |
|---|---|---|---|---|
| $\hat{\Delta}_x$ | 2.1 | 1.0 | 9.5 | 7.2 |
| $\hat{\Delta}_y$ | 2.4 | 1.0 | 9.2 | 8.3 |
| $\hat{\sigma}_X$ | 1.5 | 0.4 | 3.6 | 5.7 |
| $\hat{\sigma}_Y$ | 1.8 | 0.3 | 4.2 | 4.8 |
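The two indices, mean absolute CP residual and its sample standard deviation along one axis, can be reproduced with a short script. The coordinate lists below are illustrative placeholders, not the check-point values used in the study:

```python
import math

def accuracy_indices(ortho, actual):
    """Mean absolute difference and sample standard deviation
    of CP coordinate residuals along one axis (units: m)."""
    n = len(ortho)
    deltas = [abs(o - a) for o, a in zip(ortho, actual)]
    mean = sum(deltas) / n                                   # Delta-hat
    std = math.sqrt(sum((d - mean) ** 2 for d in deltas) / (n - 1))  # sigma-hat
    return mean, std

# Illustrative CP X coordinates (m); n = 5 as in the study
x_ortho  = [100.0, 250.5, 400.2, 560.9, 710.3]
x_actual = [ 98.1, 248.9, 402.0, 558.7, 712.4]
dx, sx = accuracy_indices(x_ortho, x_actual)   # Delta-hat_x, sigma-hat_X
```

The same function applied to the Y coordinates yields $\hat{\Delta}_y$ and $\hat{\sigma}_Y$.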
Table 3. Correlation table of the multispectral satellite orthoimage bands with the spatially degraded composite image bands in the Sparta city area (L5 = Landsat 5; DF = data fusion, i.e., the composite image).

| Bands | L5 Blue | L5 Green | L5 Red | L5 NIR | L5 SWIR1 | L5 SWIR2 | DF Blue | DF Green | DF Red | DF NIR | DF SWIR1 | DF SWIR2 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| L5 Blue | 1 | 0.979 | 0.927 | 0.203 | 0.751 | 0.883 | 0.697 | 0.750 | 0.774 | 0.193 | 0.584 | 0.775 |
| L5 Green | 0.979 | 1 | 0.969 | 0.239 | 0.827 | 0.925 | 0.693 | 0.770 | 0.815 | 0.240 | 0.656 | 0.815 |
| L5 Red | 0.927 | 0.969 | 1 | 0.204 | 0.894 | 0.943 | 0.658 | 0.751 | 0.843 | 0.239 | 0.721 | 0.833 |
| L5 NIR | 0.203 | 0.239 | 0.204 | 1 | 0.373 | 0.154 | −0.010 | 0.048 | 0.068 | 0.528 | 0.146 | 0.028 |
| L5 SWIR1 | 0.751 | 0.827 | 0.894 | 0.373 | 1 | 0.914 | 0.467 | 0.581 | 0.705 | 0.297 | 0.735 | 0.742 |
| L5 SWIR2 | 0.883 | 0.925 | 0.943 | 0.154 | 0.914 | 1 | 0.612 | 0.701 | 0.781 | 0.179 | 0.707 | 0.849 |
| DF Blue | 0.697 | 0.693 | 0.658 | −0.010 | 0.467 | 0.612 | 1 | 0.978 | 0.909 | 0.564 | 0.803 | 0.878 |
| DF Green | 0.750 | 0.770 | 0.751 | 0.048 | 0.581 | 0.701 | 0.978 | 1 | 0.964 | 0.565 | 0.866 | 0.934 |
| DF Red | 0.774 | 0.815 | 0.843 | 0.068 | 0.705 | 0.781 | 0.909 | 0.964 | 1 | 0.505 | 0.914 | 0.963 |
| DF NIR | 0.193 | 0.240 | 0.239 | 0.528 | 0.297 | 0.179 | 0.564 | 0.565 | 0.505 | 1 | 0.656 | 0.442 |
| DF SWIR1 | 0.584 | 0.656 | 0.721 | 0.146 | 0.735 | 0.707 | 0.803 | 0.866 | 0.914 | 0.656 | 1 | 0.913 |
| DF SWIR2 | 0.775 | 0.815 | 0.833 | 0.028 | 0.742 | 0.849 | 0.878 | 0.934 | 0.963 | 0.442 | 0.913 | 1 |
Table 4. Correlation table of the multispectral satellite orthoimage bands with the spatially degraded composite image bands in the Pyrgos city area (L5 = Landsat 5; DF = data fusion, i.e., the composite image).

| Bands | L5 Blue | L5 Green | L5 Red | L5 NIR | L5 SWIR1 | L5 SWIR2 | DF Blue | DF Green | DF Red | DF NIR | DF SWIR1 | DF SWIR2 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| L5 Blue | 1 | 0.967 | 0.947 | −0.153 | 0.707 | 0.832 | 0.811 | 0.773 | 0.726 | −0.290 | 0.553 | 0.595 |
| L5 Green | 0.967 | 1 | 0.963 | −0.092 | 0.738 | 0.855 | 0.800 | 0.818 | 0.756 | −0.255 | 0.582 | 0.622 |
| L5 Red | 0.947 | 0.963 | 1 | −0.244 | 0.818 | 0.921 | 0.792 | 0.786 | 0.789 | −0.399 | 0.668 | 0.702 |
| L5 NIR | −0.153 | −0.092 | −0.244 | 1 | −0.106 | −0.260 | −0.179 | −0.110 | −0.225 | 0.870 | −0.228 | −0.288 |
| L5 SWIR1 | 0.707 | 0.738 | 0.818 | −0.106 | 1 | 0.926 | 0.568 | 0.580 | 0.639 | −0.279 | 0.737 | 0.681 |
| L5 SWIR2 | 0.832 | 0.855 | 0.921 | −0.260 | 0.926 | 1 | 0.676 | 0.681 | 0.716 | −0.402 | 0.715 | 0.736 |
| DF Blue | 0.811 | 0.800 | 0.792 | −0.179 | 0.568 | 0.676 | 1 | 0.952 | 0.947 | −0.421 | 0.757 | 0.809 |
| DF Green | 0.773 | 0.818 | 0.786 | −0.110 | 0.580 | 0.681 | 0.952 | 1 | 0.948 | −0.348 | 0.743 | 0.793 |
| DF Red | 0.726 | 0.756 | 0.789 | −0.225 | 0.639 | 0.716 | 0.947 | 0.948 | 1 | −0.500 | 0.872 | 0.910 |
| DF NIR | −0.290 | −0.255 | −0.399 | 0.870 | −0.279 | −0.402 | −0.421 | −0.348 | −0.500 | 1 | −0.524 | −0.569 |
| DF SWIR1 | 0.553 | 0.582 | 0.668 | −0.228 | 0.737 | 0.715 | 0.757 | 0.743 | 0.872 | −0.524 | 1 | 0.938 |
| DF SWIR2 | 0.595 | 0.622 | 0.702 | −0.288 | 0.681 | 0.736 | 0.809 | 0.793 | 0.910 | −0.569 | 0.938 | 1 |
Table 5. Comparison of the built and open surface areas derived from the GIS digitization and from the classification of the images.

| Area | Surface | Digitization in GIS, Area (sqm) | Classification Landsat 5, Area (sqm) | Difference % | Classification Datafusion, Area (sqm) | Difference % |
|---|---|---|---|---|---|---|
| Sparta | Built surface | 349,332.31 | 453,600.00 | 29.80 | 274,119.00 | −21.5 |
| Sparta | Open surface | 1,022,910.195 | 918,642.50 | −10.19 | 1,098,123.50 | 7.40 |
| Pyrgos | Built surface | 159,466.43 | 211,500.00 | 32.60 | 129,493.75 | −18.80 |
| Pyrgos | Open surface | 676,364.56 | 624,330.99 | −7.70 | 706,337.24 | 4.43 |
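The percentage differences in Table 5 follow from comparing each classified area against the GIS-digitized reference. A minimal sketch, using the Sparta built-surface row as input:

```python
def pct_difference(classified_area, reference_area):
    """Signed percentage difference of a classified area
    relative to the GIS-digitized reference (in %)."""
    return 100.0 * (classified_area - reference_area) / reference_area

# Sparta, built surface (sqm), values from Table 5
reference = 349_332.31
landsat   = 453_600.00
fused     = 274_119.00
d_landsat = pct_difference(landsat, reference)   # roughly +29.8 %
d_fused   = pct_difference(fused, reference)     # roughly -21.5 %
```

A positive value means the classification overestimates the surface relative to the digitized reference, a negative value that it underestimates it.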

Kaimaris, D.; Patias, P.; Mallinis, G.; Georgiadis, C. Data Fusion of Scanned Black and White Aerial Photographs with Multispectral Satellite Images. Sci 2020, 2, 29. https://doi.org/10.3390/sci2020029