Dense Local Azimuth–Elevation Map for the Integration of GIS Data and Camera Images
Abstract
1. Introduction
2. Related Works
- Orthophoto or georeferenced aerial images;
- Georeferenced 3D point cloud;
- Digital elevation model (DEM).
2.1. Georegistration/Localisation with Orthophotos or Georeferenced Aerial Images
2.2. Georegistration/Localisation with Georeferenced 3D Point Clouds
2.3. DEM-Based Georegistration/Localisation
2.4. Final Comments
3. Local Azimuth–Elevation Map at Camera Position
3.1. Scenario
3.2. The ENU and AER Local Coordinate Systems
- Up is the direction of the vector , denoted by in Figure 3.
- North is the direction at of the meridian, i.e., the ellipse through the geodetic north, and is taken to be positive towards the geodetic north. In Figure 3, the north axis is represented in the plane parallel to , and denoted by .
- East is the direction orthogonal to the other two, making ENU a right-handed system. It is denoted by  in Figure 3.
- The azimuth is the clockwise angle from the axis to the direction .
- The elevation (or altitude) is the angle between the plane and direction ; positive up, negative down.
- The (slant) range r is the Euclidean distance between points and P.
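The AER quantities defined above follow directly from the ENU coordinates. The sketch below illustrates the conventions (azimuth clockwise from north, elevation from the horizontal plane); the function name is mine, and libraries such as pymap3d provide equivalent `enu2aer` routines:

```python
import math

def enu_to_aer(e, n, u):
    """Illustrative ENU -> azimuth/elevation/range conversion.

    Azimuth: clockwise angle from the north axis, in degrees [0, 360).
    Elevation: angle from the east-north plane, positive up.
    Range: Euclidean (slant) distance from the origin to the point.
    """
    r = math.hypot(e, n, u)                        # slant range
    az = math.degrees(math.atan2(e, n)) % 360.0    # clockwise from north
    el = math.degrees(math.asin(u / r))            # positive above the horizon
    return az, el, r

# a point 1 km east and 1 km up: azimuth 90 deg, elevation 45 deg
print(enu_to_aer(1000.0, 0.0, 1000.0))
```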
3.3. Local Azimuth–Elevation Map (LAEM)
3.3.1. Mapping to the Azimuth–Elevation Space
- The formula used to derive the image coordinates from the AER coordinates of the 3D point is different from that for deriving the normalised camera image coordinates from the camera-centric Cartesian coordinates .
- A line segment in 3D space, the endpoints of which map to different points, does not map to a line segment but to another curve.
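The second point can be checked numerically: map the endpoints and the midpoint of a 3D segment to azimuth–elevation space and compare the image of the midpoint with the midpoint of the chord joining the endpoint images. A small sketch under the conventions of Section 3.2 (all names and coordinates are hypothetical):

```python
import math

def aer_angles(e, n, u):
    # azimuth clockwise from north, elevation from the horizontal plane (degrees)
    az = math.degrees(math.atan2(e, n)) % 360.0
    el = math.degrees(math.asin(u / math.hypot(e, n, u)))
    return az, el

# endpoints of a 3D segment and its midpoint, in ENU coordinates (metres)
p0, p1 = (1000.0, 500.0, 100.0), (200.0, 900.0, 400.0)
mid = tuple((a + b) / 2 for a, b in zip(p0, p1))

a0, a1, am = aer_angles(*p0), aer_angles(*p1), aer_angles(*mid)
chord_mid = ((a0[0] + a1[0]) / 2, (a0[1] + a1[1]) / 2)

# the image of the 3D midpoint is NOT the midpoint of the chord:
print(am, chord_mid)
```

The two printed points differ by more than a degree in azimuth, which is exactly why a straight segment maps to a curve rather than to a line segment.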
3.3.2. LAEM of a Surface
3.3.3. Discrete LAEM
3.4. Computing a Discrete LAEM of a Gridded DEM
3.4.1. Methods from Surface Rendering
3.4.2. Principles of the Proposed Method
3.4.3. Computing DEM Vertical Lines
Planar Horizontal Datum
Spheric Horizontal Datum
Maximal Errors in the Vertical Direction
3.4.4. Detailed Method Description
Listing 1. LAEM computation.

```
 1  % compute coordinates of DEM points
 2  for each :
 3      compute
 4  % add coordinates of interpolation points
 5  for each :
 6      for each  and each :
 7          compute  making use of (14a), (13b), (13c) and (13d)
 8  % fill the LAEM
 9  
10  for each :
11      if :
12          for each :
13              % spheric approximation        % planar approximation
14              compute  with Equation (19)    nop
15              if                             if
16              
17              else                           else
18                  break                          break
```
- Line 3:
- The values , , and  result directly from the representation of  in the AER coordinate system.  represents either the horizontal distance (planar approximation) or the slant range (spheric approximation) of .  and  determine the position of  in the LAEM. They are derived from the azimuth and elevation of  according to the choices made in Section 3.3.3 for the discretisation of the space. The equations are as follows, where  is the unwrapped azimuth angle, related to the azimuth angle by
- Line 7:
- The interpolation is explained in Section 3.4.2 and illustrated in Figure 8. j determines the discrete unwrapped azimuth angle . The explicit Equations (14a), (13b), (13c) and (13d) allow us to compute from the ENU coordinates of the interpolation point between and and between and . From the ENU coordinates of the interpolation point , the corresponding and values are derived with the ENU-to-AER coordinate transform. Finally, the elevation index is computed from using Equation (20b). This new coordinate triplet is added to the set (or list) of coordinates that will be considered to fill the LAEM.
- Lines 9 and 11:
- According to Section 3.3.2, from the points having the same LAEM position, we keep only the “visible” one, i.e., the one with the smallest d value. The LAEM is hence initialised at line 9 with a specific value, which is larger than any possible d value; here, the choice is . Furthermore, at line 11, the position of the currently evaluated point is only updated if the value is smaller than the value stored in at this position.
- Lines 12 to 18:
- together with  determine the position of the currently evaluated DEM point. According to Section 3.4.3, points on the vertical line from a DEM point have the same azimuth, hence here the same  value. i values larger than , each together with , hence determine the LAEM positions of 3D points located on the vertical line down from the DEM point. In the case of the planar approximation, points on the vertical line have the same horizontal distance value as the DEM point, i.e., . In the case of the spheric approximation, points on the vertical line have a slant range computed from  using Equation (19). The LAEM is updated at positions  with the values  and , respectively, but only as long as they are smaller than the existing value .
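Lines 8 to 18 amount to a z-buffer over the discrete (elevation, azimuth) grid, with the distance d playing the role of depth: the LAEM is initialised "farther than anything", and a cell is overwritten only by a nearer, i.e. visible, point. A minimal sketch of this fill step (grid sizes, the coordinate list and the vertical-line handling are simplified; all names are mine):

```python
import math

def fill_laem(points, n_el, n_az):
    """Z-buffer-style fill: keep, per (elevation, azimuth) cell, the smallest d.

    points: iterable of (i, j, d) triplets, where i is the elevation index,
    j the azimuth index, and d the horizontal distance (planar approximation)
    or slant range (spheric approximation) of the 3D point.
    """
    # initialise every cell with a value larger than any possible d
    laem = [[math.inf] * n_az for _ in range(n_el)]
    for i, j, d in points:
        if d < laem[i][j]:       # update only if the point is nearer, i.e. visible
            laem[i][j] = d
    return laem

# two points fall in cell (0, 0); only the nearer one (d = 10.0) survives
laem = fill_laem([(0, 0, 25.0), (0, 0, 10.0), (1, 2, 7.5)], n_el=2, n_az=3)
print(laem[0][0], laem[1][2])
```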
3.4.5. DEM Downsampling as Preprocessing
4. Implementation and Experimental Results
4.1. The swissALTI3D DEM
4.2. Camera Set-Up and Images
4.3. Determining the Camera Position
4.4. Preselection and Download of the DEM Tiles
4.5. Transformation to Geodetic Coordinates
4.6. LAEM Computation
4.6.1. Tile-Level Processing
Listing 2. Tile-level processing.

```
1  for each tile :
2      determine
3      if , current tile is
4  for each tile  in :
5      determine
6      build B according to Equation (23)
7      compute  for each
8      compute  with Equation (24)
```
- Line 2:
- and are easy to determine. If the tile format is ASCII XYZ, they can be determined directly from the file name. If the tile format is COG, they can also be found in the file metadata.
- Line 3:
- If the LV95 coordinates of the camera satisfy this condition, the camera position is within the tile currently considered. This tile is denoted by . As explained before, adjacent tiles do not have common extremal coordinates, so the camera position cannot be in more than one tile. However, the camera position may be between two adjacent tiles. In this case, is simply the empty set ∅.
- Line 5:
- Extremal height values and can be easily determined by scanning the file once.
- Line 6:
- With the extremal coordinate values , and , we build a tile bounding box, which is a rectangular cuboid in the coordinate system LV95/height. For reasons that will become clear, we use not only the eight vertices of the rectangular cuboid but also the edges. The set of coordinates is hence
- Line 7:
- The coordinates of the points belonging to B are transformed into the values , , and d of the local coordinate system, where d is the horizontal distance in the case of the planar approximation of the DEM horizontal datum and the slant range r in the case of the spheric approximation.
- Line 8:
- The following extremal , , and d values are computed for the points belonging to B: We use T to denote a tile so as not to confuse these  and  values with the extreme LAEM azimuth and elevation values introduced in Section 3.3.3. Without demonstration, we claim that the values , , and  of any tile point P satisfy the following: Note that this statement is not true if B is reduced to the eight vertices of the rectangular cuboid.
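Line 6 of Listing 2 can be sketched as follows: the set B contains the eight cuboid vertices plus points sampled along the twelve edges, and the tile extremes are then simple min/max values over B. The sampling step is my own choice, azimuth wrap-around (a tile spanning north) is ignored here, and all names are hypothetical:

```python
import math
from itertools import product

def cuboid_boundary_points(e_min, e_max, n_min, n_max, h_min, h_max, step=100.0):
    """Vertices plus edge samples of an axis-aligned tile bounding box."""
    es, ns, hs = (e_min, e_max), (n_min, n_max), (h_min, h_max)
    pts = set(product(es, ns, hs))  # the 8 vertices

    def samples(lo, hi):
        k = max(1, int((hi - lo) / step))
        return [lo + (hi - lo) * t / k for t in range(k + 1)]

    # the 12 edges: vary one coordinate, keep the other two at their extremes
    for n, h in product(ns, hs):
        pts.update((e, n, h) for e in samples(e_min, e_max))
    for e, h in product(es, hs):
        pts.update((e, n, h) for n in samples(n_min, n_max))
    for e, n in product(es, ns):
        pts.update((e, n, h) for h in samples(h_min, h_max))
    return pts

def tile_extremes(pts, cam):
    """Extremal azimuth, elevation and distance of B as seen from the camera."""
    azs, els, ds = [], [], []
    for e, n, h in pts:
        de, dn, du = e - cam[0], n - cam[1], h - cam[2]
        r = math.hypot(de, dn, du)
        azs.append(math.degrees(math.atan2(de, dn)) % 360.0)
        els.append(math.degrees(math.asin(du / r)))
        ds.append(r)
    return (min(azs), max(azs)), (min(els), max(els)), (min(ds), max(ds))

# a 1 km x 1 km tile with heights 0-100 m, camera 50 m above the origin
pts = cuboid_boundary_points(1000.0, 2000.0, 1000.0, 2000.0, 0.0, 100.0)
print(tile_extremes(pts, (0.0, 0.0, 50.0)))
```

Using only the eight vertices would miss extremes attained in the interior of an edge, which mirrors the claim above that the statement fails when B is reduced to the vertices.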
4.6.2. LAEM Computation with Tile Sorting and Filtering
Listing 3. LAEM computation with tile sorting and filtering.

```
1  % initialise  and 
2  , 
3  update  and  with
4  for each tile  in  sorted ascending according to :
5      compute
6      compute
7      if :
8          update  and  with
```
- Lines 3 and 8:
- is updated as described in Listing 1, and  by inserting the following instruction after line 11 of that listing: if .
- Line 4:
- Processing the tiles without sorting would yield the same final LAEM, but more tiles would need to be processed.
- Line 5:
- From and , the indexes of the corresponding discrete azimuth values and are computed using Equations (21) and (20a). From , the index of the corresponding discrete elevation value is computed using Equation (20b).
- Line 6:
- These are extremal values in the section of the apparent horizon  going from  to .  is the minimal elevation in the section. Formally, this can be expressed as follows:  is the maximal LAEM value in the section, which can be formally expressed as
- Line 7:
- This is the formal expression of the condition expressed with words at the beginning of the section.
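Listing 3 can be condensed to: sort the tiles near to far, and process a tile only if some part of it could still rise above the apparent horizon accumulated so far. A simplified sketch, in which the per-azimuth horizon profile is reduced to a single scalar and all names are mine:

```python
def process_tiles(tiles):
    """Sort tiles near-to-far; skip those entirely below the horizon so far.

    Each tile is a dict with:
      d_min   -- minimal camera distance of the tile's bounding box
      el_max  -- maximal elevation of the bounding box, seen from the camera
      horizon -- skyline elevation this tile produces (simplification: one
                 scalar instead of a per-azimuth profile)
    """
    horizon_el = float("-inf")
    processed = []
    for tile in sorted(tiles, key=lambda t: t["d_min"]):   # near to far
        if tile["el_max"] <= horizon_el:
            continue                  # hidden behind nearer terrain: skip
        processed.append(tile["name"])
        horizon_el = max(horizon_el, tile["horizon"])
    return processed

tiles = [
    {"name": "far-low",  "d_min": 30.0, "el_max":  2.0, "horizon": 1.0},
    {"name": "near",     "d_min": 10.0, "el_max": 10.0, "horizon": 5.0},
    {"name": "far-high", "d_min": 20.0, "el_max":  8.0, "horizon": 6.0},
]
print(process_tiles(tiles))
```

Here "far-low" is skipped because its maximal elevation (2.0) lies below the horizon (5.0) built up by the nearer tile, while "far-high" still pokes above it; processing in an unsorted order would have touched all three tiles.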
4.7. Results
4.8. Result Evaluation
5. Discussion
- It contains only the portion of a DEM that is visible from a given position.
- It is not an unordered subset: the DEM points are arranged with spatial consistency.
- The dimensionality is reduced from 3D to 2D. The LAEM is comparable to an image of the DEM, but with distance information (horizontal distance or slant range r) instead of apparent colour, and with a different 2D coordinate system.
- In the case of the spheric approximation, the scalar value in the 2D space, the slant range r, is rotation-invariant.
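The last property can be illustrated numerically: rotating the scene about the up axis shifts every point by a constant azimuth offset while leaving its slant range unchanged, so a spheric-approximation LAEM is merely translated along the azimuth axis. A small check (all names are mine):

```python
import math

def to_aer(e, n, u):
    # azimuth clockwise from north (degrees) and slant range
    r = math.hypot(e, n, u)
    az = math.degrees(math.atan2(e, n)) % 360.0
    return az, r

def rotate_about_up(e, n, u, deg):
    """Rotate a point about the up axis, clockwise when seen from above."""
    a = math.radians(deg)
    return (e * math.cos(a) + n * math.sin(a),
            -e * math.sin(a) + n * math.cos(a),
            u)

pts = [(1000.0, 200.0, 50.0), (-300.0, 700.0, 120.0)]
for p in pts:
    az0, r0 = to_aer(*p)
    az1, r1 = to_aer(*rotate_about_up(*p, 30.0))
    # slant range unchanged; azimuth shifted by exactly the rotation angle
    print(round(r1 - r0, 9), round((az1 - az0) % 360.0, 6))
```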
Supplementary Materials
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| 2D | two-dimensional |
| 3D | three-dimensional |
| AER | azimuth–elevation–range |
| ASCII | American Standard Code for Information Interchange |
| COG | Cloud Optimized GeoTIFF |
| DEM | Digital Elevation Model |
| ECEF | Earth-centred, Earth-fixed |
| ESRI | Environmental Systems Research Institute, Inc. |
| FOV | field of view |
| GIS | geographic information system |
| GRS80 | Geodetic Reference System 1980 |
| LGH | local geodetic horizon |
| LGHS | local geodetic horizon system |
| LAEM | local azimuth–elevation map |
| LN02 | Swiss national levelling network 1902 |
| LV95 | Swiss national survey coordinate system 1995 |
| PTZ | pan–tilt–zoom |
| ROI | region of interest |
| UTC | Coordinated Universal Time |
| d | Planar Approximation αρ(0°) [°] | Spheric Approximation (αρ(0°) − αν(0°)) [°] |
|---|---|---|
| 100 m | 0.000904 | 6.054193 × |
| 1 km | 0.009044 | 6.054193 × |
| 10 km | 0.090437 | 0.000605 |
| 100 km | 0.904294 | 0.006053 |
| | | Min. | Max. | Mean | Total |
|---|---|---|---|---|---|
| Tile-level processing | | 0.011555 | 0.020217 | 0.014606 | 81.239482 |
| LAEM construction | Selected | 0.286338 | 7.930825 | 0.662097 | 536.960388 |
| | Not selected | 0.000002 | 0.001180 | 0.000012 | 0.055992 |
| Tile ID | [m] | N pts | N trunc. | 0.5% Min. [m] | 0.5% Max. [m] | 1% Mean [m] |
|---|---|---|---|---|---|---|
| 2593-1120 | 521.9 | 2,057,832 | 20,578 | −3.88 | 1.20 | −0.35 |
| 2594-1120 | 522.3 | 2,762,000 | 27,620 | −28.74 | 1.47 | −3.31 |
| 2594-1118 | 630.2 | 6,283,020 | 62,830 | −3.50 | 0.97 | −0.67 |
| 2593-1118 | 631.5 | 1,523,272 | 15,232 | −7.91 | 1.52 | −0.72 |
| 2595-1119 | 838.2 | 251,398 | 2512 | −1.48 | 0.49 | −0.28 |
| 2595-1118 | 909.5 | 3,553,016 | 35,530 | −3.49 | 1.19 | −0.47 |
| 2592-1119 | 1358.2 | 657,121 | 6570 | −2.31 | 1.81 | −0.29 |
| 2592-1120 | 1358.9 | 797,997 | 7978 | −2.78 | 1.60 | −0.21 |
| 2592-1118 | 1403.8 | 13,677 | 136 | −3.62 | 1.46 | −0.36 |
| 2593-1121 | 1464.3 | 861,586 | 8614 | −2.99 | 0.89 | −0.42 |
| ⋯ | ⋯ | ⋯ | ⋯ | ⋯ | ⋯ | ⋯ |
| 2626-1134 | 34,854.5 | 10,361 | 102 | −5.86 | 36.28 | 7.05 |
| 2562-1103 | 34,947.9 | 3991 | 38 | −4.44 | 19.13 | 4.23 |
| 2561-1105 | 35,027.7 | 3267 | 32 | −1.07 | 11.61 | 4.45 |
| 2626-1135 | 35,281.4 | 5416 | 54 | −1.59 | 26.18 | 6.59 |
| 2561-1103 | 35,848.9 | 10,000 | 100 | −8.61 | 17.63 | 5.13 |
| 2628-1132 | 35,942 | 3260 | 32 | −0.07 | 24.50 | 6.35 |
| 2627-1135 | 36,185.7 | 2885 | 28 | −1.23 | 18.56 | 3.90 |
| 2631-1133 | 39,105.2 | 3967 | 38 | −3.01 | 24.85 | 8.21 |
| 2631-1137 | 40,659.7 | 2876 | 28 | −11.44 | 50.99 | 8.96 |
| 2631-1138 | 41,098.6 | 3157 | 30 | −2.15 | 25.06 | 5.78 |
© 2026 by the author. Published by MDPI on behalf of the International Society for Photogrammetry and Remote Sensing. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Maître, G. Dense Local Azimuth–Elevation Map for the Integration of GIS Data and Camera Images. ISPRS Int. J. Geo-Inf. 2026, 15, 131. https://doi.org/10.3390/ijgi15030131