UAV Mapping and 3D Modeling as a Tool for Promotion and Management of the Urban Space

School of Spatial Planning and Development, Faculty of Engineering, Aristotle University of Thessaloniki (AUTh), University Campus, 54124 Thessaloniki, Greece
Architetto Giovanni Bouvet, Corso Bramante 76, 10126 Torino, Italy
Authors to whom correspondence should be addressed.
Drones 2022, 6(5), 115;
Received: 15 March 2022 / Revised: 29 April 2022 / Accepted: 30 April 2022 / Published: 3 May 2022
(This article belongs to the Special Issue UAV Photogrammetry for 3D Modeling)


In the past few decades, the management of urban spaces with appropriate tools has been in constant discussion due to the plethora of new technologies that have emerged for participatory planning, drone mapping, photogrammetry and 3D modeling. In a multitude of situations, considerable progress has been made regarding the strategic impact of the successful use of technology for the development of urban spaces. The current era provides us with important digital tools and the opportunity to test new perspectives in the sustainable development of cities. This paper explores the contribution of UAVs to the spatial mapping of urban space, with the goal of collecting quantitative and qualitative information for 3D modeling that can enable a more comprehensive understanding of the urban environment, thus facilitating urban regeneration processes. High-accuracy three-dimensional models are not required for this research. The location of the selected research area is particularly interesting due to its boundaries, urban voids and public space that can evolve through public participation. The results can be used for crowdsourcing in participatory decision-making processes and for exploring the consequences these have on the built environment, offering a new means of involving citizens in local decision-making processes.

1. Introduction

Urbanization is rapidly increasing; by the year 2050, the current proportion of 55% of the world’s population which already lives in urban areas is expected to increase to 68% [1]. This phenomenon has shaped every demographic and socioeconomic process of urban growth, creating an urgent need for new international policies and practices that focus on the development of urban resilience and sustainability [2]. An improved anticipation and analysis of the complexity of urban dynamics can safeguard its benefits and improve our way of life.
The urban environment consists of bordering areas of anthropogenic artificial surfaces used for transport, trade, production, administration and housing, and vegetation surfaces that are intensively managed and directly influenced by the artificial cover [3]. For a better understanding of the urban system, extensive mapping and monitoring of the built environment are required. Nevertheless, achieving high accuracy in these tasks is challenging due to the impervious and complex nature of constructed surfaces and buildings. Photogrammetry and remote sensing (RS) with the use of unmanned aerial vehicles (UAVs) is a fast-developing approach that aims to confront this problem.
The potential of UAV-sourced imagery has played a significant role in modern photogrammetry and remote sensing. Combining these two disciplines with the UAV platform has yielded state-of-the-art products of high accuracy and resolution, generated with technology that costs a fraction of the value of the results produced [4].
Fixed-wing and rotary UAVs can gather photogrammetric data with digital SLR (single-lens reflex) cameras or cameras attached to the UAV, perform automated or semi-automated flight modes and usually carry a GNSS/INS sensor (global navigation satellite system/inertial navigation system) for the direct georeferencing of the captured images [5], with more reliable and precise results. This method is widely used when images of lower metric quality are acceptable, such as for fast data acquisition during natural disasters [6].
UAVs can be used for three-dimensional (3D) city modeling and road inspection, but also for vegetation monitoring and forestry, agriculture [7], environmental analysis (such as the monitoring of circulation and density of animals) [8] and the documentation of cultural heritage [9].

1.1. Mapping in the Urban Space

UAVs can be classified according to their weight, but also according to their flying altitude, range and endurance. Their usage is subject to several limitations, since this is a newly explored field and the legislation governing their operation is ambiguous or scarce in many countries. Meteorological conditions also constrain their usage, and they are heavily restricted by their limited energy autonomy. However, they possess an excellent capability of sensing and perceiving the environment, and they can analyze, communicate, plan and reach fast decisions by employing on-board equipment. All these attributes are further complemented by their flexibility, as they can be programmed to adapt to various tasks and locations accordingly.
Remote sensing systems are of two types: active and passive [10]. In active remote sensing, the energy required for object detection is provided by the sensors, which transmit radiation towards the target and then measure the reflected radiation. Active systems operate in the microwave region of the electromagnetic spectrum. They include laser altimeters, light detection and ranging (LiDAR), radar, ranging instruments, scatterometers and sounders [11]. Passive remote sensing systems operate between the visible and the microwave portions of the electromagnetic spectrum [11] and include instruments such as accelerometers, radiometers and spectrometers, among others.
In recent years, the extensive usage of UAVs in academic and commercial applications has encouraged their expansion, putting forward the challenge of identifying common procedures and feasibilities of UAVs, along with benchmarks for researchers [12]. Questions such as which methods and frameworks are most frequently used for obtaining applicable information from remote sensing data are being raised [13]. As far as 3D modeling is concerned, the available methods can be classified by level of automation (automatic, semi-automatic, manual), by source data type (photogrammetric results from aerial, satellite or close-range imagery) and by laser scanning results (airborne or terrestrial) [14]. These methods can be combined into a hybrid if needed.
Currently, there is no single approach to automated 3D reconstruction, since it depends on each project’s unique characteristics [15]. UAVs aided by GNSS or real-time kinematic (RTK) positioning and autopilot systems are very commonly used for gathering 3D-modeling data, since they can capture high-resolution photogrammetric images with ease [12]; such images are available from several types of cameras, ranging from visible-band, multi-spectral and near-infrared (NIR) to hyperspectral and thermal infrared (TIR) cameras [4]. These images can be processed to obtain an orthophoto, a point cloud or a 3D model of the area, while an on-board GNSS device allows the georeferencing of the collected data [16].
Unfortunately, every rose has its thorn. UAVs are not a panacea, and their infrastructure limits their capabilities, especially when their weight is of concern. Their limited payload capacities restrict their equipment, restraining the number and types of sensors they can carry and, subsequently, the produced results [11].
Direct georeferencing techniques cannot be used proficiently in situations such as environments where visibility of satellites is impossible or limited due to the strong degradation of the GNSS signal and the difficulty of automated flight modes [5]. Sequential processing is sometimes problematic as it is feasible only for ordered image datasets, and random flight paths can deliver unordered datasets that can be difficult to use for surface reconstructions [17].
While the disadvantages have been pointed out, the simplicity and direct utility of UAV photogrammetry and remote sensing products do not come at the cost of quality; the results can even be more favorable than traditional airborne products, as geoimaging and navigation technologies benefit from the miniaturization of smaller and more lightweight data acquisition systems [4].

1.2. A Methodological Framework

This paper aims to explore the contribution of UAVs to the spatial mapping of urban space, with the goal of collecting quantitative and qualitative information for 3D modeling, enabling a more comprehensive understanding of the urban environment and thus facilitating urban regeneration processes. However, this research does not target 3D models of high accuracy; the 3D models serve as a vehicle for public space management, so high fidelity is not mandatory. The paper starts with a review of the current state of the art of UAV technologies and systems. The second section reviews the mapping of urban space with UAVs, along with the advantages and disadvantages of UAV technology and the limitations and restrictions on its usage. The workflow of the field research is then analyzed: it starts by examining the background of the area and the methodology of image creation, covering the type of drone used, the flight planning, the area considerations and the camera calibration. It continues with the image acquisition for the surface reconstruction and the software used for dense point cloud generation, digital surface model (DSM) creation, digitization and the quality assessment of the accuracy of the feature extraction. The final section recapitulates the results, considering the accuracy of the mapping and the role of UAV mapping and 3D modeling as a tool for the promotion and management of the urban space; an outlook on the dissemination of the findings; and future perspectives, developments and steps that can be put into action from the results.

2. Materials and Methods

2.1. Study Area

Pylaia is a suburb of the city of Thessaloniki in the prefecture of Central Macedonia in Greece. Its origin has its roots in ancient Greece, as it was the place where the eastern city gates were located.
Within the urban area of the city, this suburb occupies an interesting position: it is a sparsely populated area connecting the center of Thessaloniki with the outskirts, so it contains dense residential zones but also numerous urban voids caused by faulty urban development, relatively small roads and poorly defined boundaries. It was therefore chosen as a case study: the area is suitable for aerial imaging (due to its medium building density, extensive green areas and relatively low building heights), and the material gathered could be of paramount importance for the strategic infrastructural planning of the city.
In Greece, there are specific zones in which it is illegal to fly a drone, such as natural protected areas, airports or zones of military interest. The case study area was neither located in a no-fly zone nor had any particular restriction, and as a result, all the flights could be legally taken [18].
The data processing workflow was structured as follows: project planning and camera calibration, path planning determination, acquisition of aerial images, pre-processing of images, image 3D modeling using Pix4D Mapper (photogrammetric software) and rendering and texture mapping using Blender (3D rendering and visualization suite) (Figure 1).

2.2. The UAV System

For the case study presented in this paper, the aerial images were recorded using the UAV system shown in Figure 2, with the technical specifications of the platform, the sensors and the flight control data presented in Table 1 [19].
The UAV system was developed by the Chinese technology company DJI. It is equipped with a navigation system that includes a GNSS/INS system which allows the UAV to perform position control and altitude stabilization and, together with a third-party program (Pix4D Capture), the autonomous navigation of the drone. It also includes a magnetometer, altimeter, built-in data logger, hardware that is connected with a remote controller which can be connected with a cell phone application for visual control and monitoring image acquisition, and a digital camera that is connected through a built-in 3 axis gimbal.

2.3. Planning and Acquisition of Aerial Images

Aerial imagery was the method chosen to achieve coverage of the area. The data were acquired by mapping a predefined geometrical area of interest with the Pix4D Capture software, at a flying height of 93.34 m above ground. The selected area spanned a total of 81,000 m². The images were acquired with the drone camera (the main parameters are presented in Table 2).
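The reported ground sampling distance can be cross-checked from the flight height and the camera geometry with the standard nadir GSD formula. The sketch below assumes the nominal specifications of a 1″-sensor camera on this UAV model (sensor width 13.2 mm, focal length 8.8 mm, image width 5472 px); these values are taken from the manufacturer's published specifications, not from the paper.

```python
def ground_sampling_distance(altitude_m, sensor_width_mm, focal_length_mm, image_width_px):
    """Nadir ground sampling distance in cm/px.

    GSD = (flight height * sensor width) / (focal length * image width);
    the millimetres cancel, leaving metres per pixel, converted to cm.
    """
    gsd_m_per_px = altitude_m * sensor_width_mm / (focal_length_mm * image_width_px)
    return gsd_m_per_px * 100.0

# Assumed nominal camera values, at the 93.34 m flight height used in the study.
gsd = ground_sampling_distance(93.34, 13.2, 8.8, 5472)  # ≈ 2.56 cm/px
```

With these assumed values the formula reproduces the 2.56 cm/px GSD reported in Table 2.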
The calibration procedure was performed before the actual flight. It is important to point out that equipment such as UAVs, remote controllers and computers must be checked to ensure that they are working properly, in order to avoid crashes and system failures due to malfunction. This phase was therefore carried out carefully to verify that the UAV was functioning and ready for take-off.
The UAV was connected to the software via a radio transmitter so that the automated flight could be accomplished. Take-off and landing were performed manually via the remote controller for safety reasons. Once the UAV was airborne, the software took over in automated flight mode. The UAV pilot could visualize the position of the aircraft in the software at any time, together with data such as acceleration, speed and navigation. The predefined route (Figure 3) was flown twice to ensure the collection of a high number of quality images (Figure 4). The entire data acquisition procedure lasted around two hours; the flight duration was around 20 min per flight due to technical restrictions of the drone model, and batteries were changed for the second flight. In total, 231 geotagged images were obtained, with 80% overlap.
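As a rough plausibility check on the number of images, one can estimate the per-image ground footprint from the GSD and divide the surveyed area by the new ground covered per exposure at 80% front and side overlap. This is an illustrative sketch under assumed image dimensions (5472 × 3648 px, a nominal value for this camera class); it ignores turn margins and the fact that the route was flown twice.

```python
import math

def image_footprint_m(gsd_cm_per_px, width_px, height_px):
    """Ground footprint (width, height) of one image in metres."""
    return (gsd_cm_per_px / 100.0 * width_px, gsd_cm_per_px / 100.0 * height_px)

def photos_needed(area_m2, footprint_m, front_overlap, side_overlap):
    """Lower-bound photo count: area divided by the new ground per exposure."""
    w, h = footprint_m
    new_area_per_photo = w * (1.0 - side_overlap) * h * (1.0 - front_overlap)
    return math.ceil(area_m2 / new_area_per_photo)

footprint = image_footprint_m(2.56, 5472, 3648)     # ≈ 140 m × 93 m on the ground
count = photos_needed(81_000, footprint, 0.8, 0.8)  # ≈ 155 exposures minimum
```

The estimate (around 155 exposures) is of the same order as the 231 geotagged images actually collected over the two flights.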

2.4. Data Extraction and Surface Reconstruction

The obtained images were processed with the Pix4Dmapper photogrammetric software, which transforms UAV-collected images via computer vision and photogrammetry into a range of outputs, such as 3D point clouds, orthomosaics and digital surface models (DSMs) [20].
Before the modeling procedure, the geotagged images were selected for processing. During an initial processing step, key points and photo matches were computed and optimized, together with the analysis of geoinformation and tie points. After the initial processing, the dense point cloud and 3D mesh were generated: points were produced for the surface reconstruction, the mesh was decimated, texture atlas tiles were generated and the 3D textured mesh was exported (Figure 5 and Figure 6).
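The tie-point step that Pix4Dmapper performs internally can be illustrated with a minimal nearest-neighbour descriptor matcher using Lowe's ratio test; this is a generic sketch of the technique, not the software's actual implementation, and the descriptors here are plain NumPy vectors.

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.8):
    """Match feature descriptors between two overlapping images.

    For each descriptor in image A, find its two nearest neighbours in
    image B and accept the match only if the best distance is clearly
    smaller than the second best (Lowe's ratio test), which suppresses
    ambiguous tie points.
    """
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches
```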
After the point cloud and mesh were generated, the point cloud points were interpolated in order to generate the DSM. From the acquired digital surface model, the process of orthorectification was performed in order to produce the final orthomosaic (Figure 7).
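The interpolation of point-cloud points into a DSM can be sketched as a simple rasterization that keeps the highest elevation per grid cell; production software uses more sophisticated interpolation, so this is only a minimal stand-in for the step described above.

```python
import numpy as np

def rasterize_dsm(points, cell_size):
    """Grid an (N, 3) point cloud of (x, y, z) into a DSM array.

    Each cell stores the maximum z of the points falling inside it
    (a surface model keeps the tops of buildings and vegetation);
    empty cells are left as NaN.
    """
    xy = points[:, :2]
    mins = xy.min(axis=0)
    cols, rows = np.ceil((xy.max(axis=0) - mins) / cell_size).astype(int) + 1
    dsm = np.full((rows, cols), np.nan)
    cell_idx = ((xy - mins) / cell_size).astype(int)
    for (c, r), z in zip(cell_idx, points[:, 2]):
        if np.isnan(dsm[r, c]) or z > dsm[r, c]:
            dsm[r, c] = z
    return dsm
```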

2.5. Texture Mapping and Rendering

As the models generated by photogrammetric software have flaws in most cases, a 3D render engine can be used to refine their geometry and appearance. For a photo-realistic rendering of the 3D model, the open-source 3D suite Blender was used. Blender, which is scriptable in Python, can produce high-quality renders and texture mapping of 3D and BIM (building information modeling) models and can be used for architecture, urban planning and construction [21].
The 3D file (FBX format) exported from Pix4Dmapper was imported into Blender, where the Cycles render engine was used. Correct light exposure and camera angles were set, the model was scaled, and a denoising filter was applied together with additional contrast and saturation of the texture. For texture mapping and rendering, Blender projects subsets of the source images onto the corresponding faces of the 3D model, a process known as UV mapping. As the 3D model contained a huge number of mesh faces, the triangles of the model were tied to the source imagery. Where this was not possible due to physical obstacles such as trees, plants and lighting poles, or due to a lack of overlapping UAV images, a similar-looking position and a nearby set of points were used instead (Figure 8).
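Geometrically, UV mapping rests on projecting the model's 3D points through the camera that took the photo and normalizing the result to [0, 1] texture coordinates. The sketch below assumes an ideal pinhole camera with points already expressed in camera coordinates; lens distortion and visibility testing are omitted.

```python
import numpy as np

def project_to_uv(points_cam, fx, fy, cx, cy, width, height):
    """Project (N, 3) camera-frame points to normalized UV coordinates.

    Pinhole model: u = fx * x / z + cx, v = fy * y / z + cy, then the
    pixel coordinates are divided by the image size to get UVs in [0, 1].
    """
    z = points_cam[:, 2]
    u = fx * points_cam[:, 0] / z + cx
    v = fy * points_cam[:, 1] / z + cy
    return np.stack([u / width, v / height], axis=1)
```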

3. Results

In this work, we demonstrate the production of a geometrically and photorealistically accurate 3D model of the area of Pylaia in the city of Thessaloniki from UAV-captured images. Following the texture mapping with the aerial images and the subsequent reconstructed 3D mesh, the area of interest could be visualized with a low percentage of footprint deviation. The level of detail achieved by the production workflow was LOD1 (level of detail 1) (Figure 9) [22]. The mean reprojection error yielded by bundle adjustment was below 1 pixel (0.238 pixels), which can be regarded as adequate for the measurements taken. The model achieved an RMSE of 0.42 m in X, 0.40 m in Y and 1.35 m in Z; although this indicates moderate geolocation accuracy, it is in line with the requirements of the present research. This accuracy level is primarily due to the lack of ground control points (GCPs), image orientation based only on geotags and the insufficient accuracy of the GNSS system at the time of image acquisition; mitigating these problems was not a concern, as the product was used for participatory planning research and not for precision drone imagery.
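The per-axis RMSE figures reported above are obtained by comparing model coordinates against reference coordinates; with check points this is a few lines of NumPy. The arrays in the sketch below are illustrative placeholders, not the study's data.

```python
import numpy as np

def per_axis_rmse(estimated, reference):
    """Root-mean-square error per coordinate axis (X, Y, Z).

    Both inputs are (N, 3) arrays of point coordinates; the result is a
    length-3 vector, matching how accuracy is reported per axis.
    """
    err = np.asarray(estimated, float) - np.asarray(reference, float)
    return np.sqrt((err ** 2).mean(axis=0))
```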
Using the 3D texture mesh and the orthomosaic generated from the data collected, the built environment, urban voids and the boundaries of the area could be accurately detected (Figure 10). With regard to the spatial planning and management of the area of interest, the generated model can enable a more comprehensive understanding of the urban environment, thus facilitating urban regeneration processes. It can be used for visualization and presentation purposes, or in a variety of subject integrations and measurements, such as smart cities, environmental applications, strategic urban planning and participatory planning (Figure 11 and Figure 12). The results can be used for crowdsourcing in participatory decision-making processes and for exploring the consequences these have on the built environment, and they offer a new way to involve citizens in local decision-making processes. The potential of 3D modeling of the urban space makes it possible for untrained parties to better understand the ramifications of every strategic decision on their community. The process followed was to measure and describe spatial problems and conflicts and to prioritize the problems of the area in order to scope and appraise a potential range of results; to identify and seek alternative solutions; and to monitor, implement and re-evaluate different potentials of strategic design processes and detailed design solutions.
Although the action phase of the design implementation of the research is still speculative and the research does not integrate the practical demonstration and usage of the models obtained in the process (as the scope of the research limits providing new means for information needs in spatial participatory planning), the geo-spatial information processed in the research can provide both impact and counter models that can address location, resources and conflicts of events; potential models to plan operation on the area; and spatial visualization and possible intervention scenarios.

4. Discussion and Conclusions

As a result of the high number of aerial images taken, fine coverage of the area could be acquired. The 3D model was also well-structured, owing to the precise measurement of tie points and the many aerial overlaps, which made the facades and roofs of the buildings visible in the acquired images.
It must be pointed out that, when comparing airplane or helicopter imagery to UAV-acquired aerial images, the latter method has many advantages. A wide range of cameras can be mounted on UAVs, so every required image can be taken with the same camera, camera parameters and calibration. Moreover, the quadcopter needs only a relatively small area for take-off and landing, and can thus operate in confined areas (such as the area of interest) where airborne vehicles such as helicopters or airplanes cannot. Furthermore, it enables higher-resolution images to be captured at lower altitudes, and it is helpful in areas that need constant incremental updating of their maps, such as areas of interest crowded with urban voids and questionable boundaries.
In the field of photogrammetry, models obtained through photogrammetric workflows using UAVs can complement models obtained by terrestrial laser scanning. With aerial images, a photorealistic 3D model can be produced efficiently, with reduced time and cost for any project. This is also linked to the simplicity of collecting data from nadir points of view that would not otherwise be visible from the ground; ground-based acquisition would be subject to regulatory requirements and private ownership and would require time-consuming and costly procedures.
In the field of participatory planning, this mapping method can integrate data collected from the built environment with local spatial knowledge and the production of geo-referenced and scaled models. The model can also be given to the community to facilitate future administration of the area and subsequent interventions in the evolution of public space, such as land use, community gardening, pocket parks and ephemeral cultural spaces. The information provided for participatory spatial planning can help stakeholders identify and analyze participatory problems, support the prioritization and systematic exploration of potential solutions and, through joint monitoring, feedback and implementation of the activities, foster collaborative design between stakeholders and participants.
The application of UAV mapping for photogrammetry and 3D modeling in the urban space will significantly increase in the near future, as UAVs will become more affordable and their use will have major impacts on numerous areas in the field. As the surveying and urban and participatory planning research community is set to embrace these new technologies, their use in the urban space can become the standard procedure of work in the respective sector.

Author Contributions

All authors contributed equally to this work. E.S. and E.V. conceived and designed the UAV flights and the acquisition of images; I.T., N.T. and G.A.B. analyzed the data; and A.S., E.K. and I.S. wrote the paper. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

This paper was supported by the company Hellenic Petroleum through the scholarship of Alexandros Skondras.

Conflicts of Interest

The authors declare no conflict of interest.

References
  1. United Nations. World Urbanization Prospects: The 2018 Revision; United Nations: New York, NY, USA, 2019. [Google Scholar]
  2. Prakash, M.; Ramage, S.; Kavvada, A.; Goodman, S. Open earth observations for Sustainable Urban Development. Remote Sens. 2020, 12, 1646. [Google Scholar] [CrossRef]
  3. Van der Linden, S.; Okujeni, A.; Canters, F.; Degericx, J.; Heiden, U.; Hostert, P.; Priem, F.; Somers, B.; Thiel, F. Imaging spectroscopy of urban environments. Surv. Geophys. 2018, 40, 471–488. [Google Scholar] [CrossRef][Green Version]
  4. Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97. [Google Scholar] [CrossRef][Green Version]
  5. Barazzetti, L.; Remondino, F.; Scaioni, M.; Brumana, R. Fully automatic UAV image-based sensor orientation. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 38, 6. [Google Scholar]
  6. Calantropio, A.; Chiabrando, F.; Sammartano, G.; Spanò, A.; Teppati Losè, L. UAV strategies validation and remote sensing data for damage assessment in post-disaster scenarios. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, XLII-3/W4, 121–128. [Google Scholar] [CrossRef][Green Version]
  7. Remondino, F.; Barazzetti, L.; Nex, F.; Scaioni, M.; Sarazzi, D. UAV photogrammetry for mapping and 3D modeling—Current status and future perspectives. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2011, XXXVIII-1/C22, 25–31. [Google Scholar] [CrossRef][Green Version]
  8. Wich, S.; Koh, L. Conservation drones: The use of unmanned aerial vehicles by ecologists. GIM Int. 2012, 26, 29–33. [Google Scholar]
  9. Everaerts, J. The use of unmanned aerial vehicles (UAVs) for remote sensing and mapping. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2008, XXXVII Pt B1, 1187–1192. [Google Scholar]
  10. NASA. Remote Sensors; NASA: Washington, DC, USA, 2022. Available online: (accessed on 17 April 2022).
  11. Shakhatreh, H.; Sawalmeh, A.H.; Al-Fuqaha, A.; Dou, Z.; Almaita, E.; Khalil, I.; Othman, N.S.; Khreishah, A.; Guizani, M. Unmanned Aerial Vehicles (UAVs): A survey on civil applications and key research challenges. IEEE Access 2019, 7, 48572–48634. [Google Scholar] [CrossRef]
  12. Yao, H.; Qin, R.; Chen, X. Unmanned aerial vehicle for remote sensing applications: A review. Remote Sens. 2019, 11, 1443. [Google Scholar] [CrossRef][Green Version]
  13. Wellmann, T.; Lausch, A.; Andersson, E.; Knapp, S.; Cortinovis, C.; Jache, J.; Scheuer, S.; Kremer, P.; Mascarenhas, A.; Kraemer, R.; et al. Remote sensing in urban planning: Contributions towards ecologically sound policies? Landsc. Urban Plan. 2020, 204, 103921. [Google Scholar] [CrossRef]
  14. Li, M.; Nan, L.; Smith, N.; Wonka, P. Reconstructing building mass models from UAV images. Comput. Graph. 2016, 54, 84–93. [Google Scholar] [CrossRef][Green Version]
  15. He, M.; Petoukhov, S.; Hu, Z. Advances in Intelligent Systems, Computer Science and Digital Economics; Springer International Publishing: Berlin/Heidelberg, Germany, 2020; ISBN 978-3-030-39216-1. [Google Scholar]
  16. Koeva, M.; Muneza, M.; Gevaert, C.; Gerke, M.; Nex, F. Using UAVs for map creation and updating. A case study in Rwanda. Surv. Rev. 2016, 50, 312–325. [Google Scholar] [CrossRef][Green Version]
  17. Irschara, A.; Kaufmann, V.; Klopschitz, M.; Bischof, H.; Leberl, F. Towards fully automatic photogrammetric reconstruction using digital images taken from UAVs. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2010, 38 Pt 7A, 65–70. [Google Scholar]
  18. Drone Aware. GR. HCAA. Available online: (accessed on 17 April 2022).
  19. Phantom 4 Pro-DJI. DJI Official. Available online: (accessed on 10 February 2022).
  20. PIX4Dmapper: Professional Photogrammetry Software for Drone Mapping. Pix4D. Available online: (accessed on 10 February 2022).
  21. Blender Foundation. Home of the Blender Project—Free and Open 3D Creation Software. Available online: (accessed on 10 February 2022).
  22. Biljecki, F.; Ledoux, H.; Stoter, J. An improved LOD specification for 3D building models. Comput. Environ. Urban Syst. 2016, 59, 25–37. [Google Scholar] [CrossRef][Green Version]
Figure 1. Data processing workflow.
Figure 2. UAV system used in the research (source: [19]).
Figure 3. Drone path used for mapping the area (source: Pix4Dmapper).
Figure 4. Example of aerial images from the matrix route conducted.
Figure 5. The point cloud generated from the data collected (source: Pix4Dmapper).
Figure 6. The 3D texture mesh generated from the data collected (source: Pix4Dmapper).
Figure 7. The orthomosaic generated from the data collected (source: Pix4Dmapper).
Figure 8. The rendered image of the mapped area of research from a nadir view (source: Blender).
Figure 9. The five levels of detail (LODs) of a 3D model (source: [22]).
Figure 10. Urban analysis of the research area using the data collected.
Figure 11. The rendered image of the mapped area of research from an oblique view, with the addition of new buildings to improve the urban space (source: Blender).
Figure 12. The rendered image of the mapped area of research from an oblique view, with the addition of new buildings to improve the urban space (source: Blender).
Table 1. Characteristics of the UAV system used.
UAV model: DJI Phantom 4 Pro
Take-off weight: 1388 g
Max. flight time: 30 min
Camera sensor: 1″ CMOS
Resolution: 20 MP
Lens: FOV 84°, 24 mm (35 mm equivalent), f/2.8–f/11
Max. video recording resolution: 4K at 60 FPS
Operating frequency: 2.4 GHz/5.8 GHz
Geolocation: on-board GPS
Table 2. Camera parameters.
Camera model: DJI FC300C
F-stop: f/2.8
Focal length: 4 mm
Image dimensions: 4000 × 3000 px
GSD (ground sampling distance): 2.56 cm/px
Horizontal and vertical resolution: 72 dpi
Bit depth: 24
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Skondras, A.; Karachaliou, E.; Tavantzis, I.; Tokas, N.; Valari, E.; Skalidi, I.; Bouvet, G.A.; Stylianidis, E. UAV Mapping and 3D Modeling as a Tool for Promotion and Management of the Urban Space. Drones 2022, 6, 115.
