Article

UAV Photogrammetry under Poor Lighting Conditions—Accuracy Considerations

by Pawel Burdziakowski * and Katarzyna Bobkowska
Department of Geodesy, Faculty of Civil and Environmental Engineering, Gdansk University of Technology, Narutowicza 11-12, 80-233 Gdansk, Poland
* Author to whom correspondence should be addressed.
Sensors 2021, 21(10), 3531; https://doi.org/10.3390/s21103531
Submission received: 22 April 2021 / Revised: 15 May 2021 / Accepted: 17 May 2021 / Published: 19 May 2021
(This article belongs to the Special Issue Unmanned Aerial Systems and Remote Sensing)

Abstract: The use of low-level photogrammetry is very broad, and studies in this field are conducted in many aspects. Most research and applications are based on image data acquired during the day, which seems natural and obvious. However, the authors of this paper draw attention to the potential and possible use of UAV photogrammetry during the darker time of the day. The potential of night-time images has not yet been widely recognized, since correct scenery lighting, or the lack of scenery light sources, is an obvious issue. The authors have developed typical day- and night-time photogrammetric models. They have also presented an extensive analysis of the geometry, indicated which process element had the greatest impact on degrading the night-time photogrammetric product, as well as which measurable factor directly correlated with image accuracy. The reduction in geometric quality during the night-time tests was greatly impacted by the non-uniform distribution of ground control points (GCPs) within the study area. The calibration of non-metric cameras is sensitive to poor lighting conditions, which leads to a higher determination error for each intrinsic orientation and distortion parameter. As evidenced, uniformly illuminated photos can be used to construct a model with a lower reprojection error, and each tie point exhibits greater precision. Furthermore, the authors have evaluated whether commercial photogrammetric software enables reaching acceptable image quality and whether the digital camera type impacts interpretative quality. The paper concludes with an extended discussion, conclusions, and recommendations on night-time studies.

1. Introduction

Low-level aerial photogrammetry using unmanned aerial vehicles (UAVs) has attracted huge interest from numerous fields over the last ten years. Photogrammetric products are used in various economic sectors, and thus intensively contribute to their growth. This situation primarily results from the development and widespread availability of UAVs equipped with good-quality non-metric cameras, the development of software and easy-to-use photogrammetric tools, as well as the increased computing power of personal computers. Despite the already widespread use of the aforementioned techniques, there is still a large number of issues associated with the processing of low-level photogrammetry products. This is mainly due to the relatively young age of this technology, the dynamic development of sensor design technology, and modern computing methods. It can be stated without doubt that the potential of photogrammetry has not yet been fully discovered and unleashed, which is why scientists and engineers are constantly working on improving and developing the broadly understood UAV photogrammetry.
Work on developing UAV measurement technologies is conducted concurrently on many levels. Scientists quite rightly focus on selected elements of the entire photogrammetric product process, studying particular relationships while suggesting new and more effective solutions. Research in the field of UAV photogrammetry can be divided into several mainstreams, the main ones including:
  • Carrier system technology and techniques [1,2,3,4]: Works in this group focus on improving the navigation-wise aspects of flight execution, georeference accuracy, or sensor quality in order to achieve even better in-flight performance, flight time and stability [5,6,7], as well as the accuracy of navigation systems feeding their data to measuring modules [8,9].
  • Optimization of photogrammetric product processes [10,11,12,13]: Scientists look at processes ongoing at each stage of image processing and suggest optimal recommendations in terms of acquisition settings, image recording format [14], flight planning, or application of specific software settings [15].
  • Evaluating the quality of results obtained using UAV photogrammetry [16,17]: These analyses address the issues of errors obtained for photogrammetric images and products, based on applied measuring technologies (from the acquisition moment, through data processing using specialized software).
  • Focusing on the development of new tools improving the quality of low-level images [18,19,20]: Studies in this group involve a thorough analysis of the procedure of acquiring and processing UAV images and suggest new, mainly numerical, methods for eliminating identified issues [21,22,23]. As it turns out, photos taken from a dynamically moving UAV under various weather and lighting conditions exhibit a number of flaws. These flaws result directly from the acquisition method and impact photogrammetric product quality.
  • Showing new applications and possibilities for extracting information from spatial orthophotoimages based on photos taken in the visible light range [24,25,26,27] and by multi-spectral cameras [28,29,30]: Unmanned aerial vehicles are able to reach places inaccessible to traditional measurement techniques [31,32]. However, they carry an incomplete spectrum of sensors onboard, due to their restricted maximum take-off mass (MTOM). Therefore, scientists focus on methods that enable extracting significant information from this limited number of sensors and data, e.g., only from a single camera, but used at various time intervals [33] or a limited number of spectral channels in cameras used on the UAV.
  • Presenting new photogrammetric software and tools [34,35,36]: The increasing demand for easy-to-use software obviously results in the supply of new products, both typically commercial and non-commercial products. New technologies and methods are developed in parallel.
  • Using sensory data fusion [37,38,39]: The issue in this group is the appropriate harmonization of data obtained from a dynamically moving UAV and other stationary sensors. Very often, these data have a slightly different structure, density, or accuracy, e.g., integration of point cloud data obtained during a photogrammetric flight, terrestrial laser scanning, and bathymetric data [40,41].
The above research, focusing strictly on certain narrow aspects of developing a photogrammetric product based on data acquired from an unmanned aerial vehicle, translates into further application studies, case studies, and new practical applications. Naturally, the largest group of application studies comprises works addressing the issue of analysing the natural environment and urban areas. Popular application-related research subjects include analysing the wood stand in forestry [42,43,44,45], supporting precision agriculture and analysing the crop stand [46,47,48], and geoengineering analyses for the purposes of landform change and landslide analyses [49,50,51,52].
As evidenced above, the application of low-level photogrammetry is very broad, and studies in this field are conducted in many aspects. It should be noted that all the aforementioned research is based on image data acquired during the day, which seems natural and obvious. However, the authors of this paper draw attention to the potential and possible use of UAV photogrammetry during the darker time of the day. Previously, the potential of night-time images has not yet been widely recognized, since correct scenery lighting, or a lack of scenery light sources, remains an obvious issue [53].
Studies dealing with night-time photogrammetry that point to the potential of such photos are still a niche topic [54]. A good example is the case of images taken inside religious buildings, obtained during the night and supported by artificial lighting. Such methods are used in order to improve the geometric model quality of the studied buildings by avoiding reflections and colour changes caused by sunlight penetrating into the interior through colourful stained-glass windows [55]. Similar conclusions were drawn by the authors of [56], who studied the issues associated with modelling building facades. The dark time of the day favours background elimination and extracting light sources. This property was utilized by the authors of [57], who used light markers built from LEDs to monitor the dynamic behaviour of wind turbines. This made it possible to achieve a high contrast of reference light points on the night-time photos, which improved their location and identification. The same applied to analysing landslide dynamics [58]. Santise et al. [59] focused on analysing the reliability of stereopairs taken at night, with various exposure parameters, for the purposes of geostructural mapping of rock walls. These examples show that terrestrial night-time photogrammetry has been functioning for years, albeit with a small and narrow scope of applications.
Night-time photogrammetry using UAVs has been developing for several years. The appearance of very sensitive digital camera sensors with low specific noise and small cameras working in the thermal infrared band made it quite easy to use them on UAVs [60,61]. So far, the main target of interest for UAV night-time photogrammetry has been urban lighting analyses.
The problem of artificial lighting analysis comprises numerous aspects. The first one is safety [62]. This topic covers analysing the intensity of lighting, which is important from the perspective of local safety, and enables optimizing lamppost arrangement and designing illumination direction, power, and angle, and thus selecting lighting fixtures. A properly illuminated spot on the road allows danger to be seen in time, is cost-efficient, and does not dazzle drivers [63,64,65]. On the other hand, artificial lighting also has a negative impact on humans and animals [66]. It has been a factor in the evolution of nature and humans for only a short time, and its presence, especially when accidental, is defined as artificial light pollution [67]. The issue associated with this phenomenon is increasingly noticed at the societal level [68]. Night-time spatial data will be an excellent tool for analysing light pollution. Data from digital cameras mounted on UAVs and data from terrestrial laser scanning (TLS) are used for this purpose when analysing street lighting [69]. The authors of [70] showed that UAV data enable capturing urban lighting dynamics in both the spatial and temporal domains. In this respect, the scientists utilized the relationship between the observed quality and terrestrial observations recorded using a sky quality meter (SQM) light intensity tool. For the purposes of determining luminance, the authors of [71] present methods for calibrating digital cameras mounted on UAVs and review several measuring scenarios.
UAVs equipped with digital cameras are becoming a tool that can play an important role in analysing artificial light pollution [72,73]. The possibility of obtaining high-resolution point clouds and orthophotoimages is an important aspect in this respect. The aforementioned research did not involve an in-depth analysis of the photogrammetric process and the geometric accuracy of typical models based on night-time photos (with existing street lighting). All analyses based on spatial data obtained through digital imaging should be supported by an analysis of point cloud geometric errors and an evaluation of the quality of the developed orthomosaic. Night-time acquisition, as in the above cases, should not be treated the same as day-time measurements. These studies assumed, a priori, that typical photogrammetric software was equally good at developing photos taken in good lighting and night-time photographs. Night-time image quality can also be affected by scenery lighting discontinuity or insufficient scene lighting, as well as by the increased levels of noise and blur appearing on digital images obtained from UAVs.
As a result of the above considerations, the authors of this paper put forward the thesis that a photogrammetric product based on night-time photos will exhibit lower geometric accuracy and reduced interpretative quality. This thesis may seem rather obvious. However, after a deeper analysis of the photogrammetric imaging process, starting with data acquisition, through processing, and on to spatial analyses, it is impossible to clearly state which element of this process has the greatest influence on the quality of a photogrammetric product. This leads to questions concerning which process element has the greatest impact on degrading a night-time photogrammetric product, which measurable factor directly correlates with image accuracy, whether commercial photogrammetric software enables reaching acceptable image quality, and whether the digital camera type impacts interpretative quality. Therefore, the authors of this study set a research objective of determining the impact of photogrammetric process elements on the quality of a photogrammetric product, for the purposes of identifying artificial lighting at night.

2. Materials and Methods

2.1. Research Plan

In order to verify the thesis and research assumptions of this paper, a test schedule and computation process were developed. This process, in graphic form, together with the tools used, is shown in Figure 1. The test data were acquired using two commercial UAVs. The study involved four research flights: two during the day, around noon, and two at night. The study data were processed in popular photogrammetric software, and the typical photogrammetric products were then subjected to comparative analysis. Day-time photos were taken in automatic mode, and night-time ones manually. This enabled correct exposure of the photos at night, without visible blur. The details of these operations are presented in the following sections of this paper.

2.2. Data Acquisition

Research flights were conducted using two UAV types: DJI Mavic Pro (MP1) and DJI Mavic 2 Pro (MP2) (Shenzhen DJI Sciences and Technologies Ltd., Shenzhen, China). These unmanned aerial vehicles are representative of commercial aerial systems, designed and intended primarily for recreational flying and for amateur movie-makers. The versatility and reliability of these devices were quickly appreciated by the photogrammetric community. Their popularity results mainly from their operational simplicity and very intuitive ground station software.
Both UAVs are equipped with an integrated digital camera with a CMOS (complementary metal-oxide-semiconductor) sensor (Table 1). The FC220 digital camera installed onboard the MP1 is a compact device with a very small 1/2.3″ (6.2 × 4.6 mm) sensor and a low maximum ISO sensitivity (1600). The more recent Hasselblad L1D-20c, installed onboard the MP2, is characterized by a roughly four times larger 1″ sensor (13.2 × 8.8 mm) and a maximum ISO of 12800. In light of the technical specification of the L1D-20c camera, it can be presumed that it will provide greater flexibility at night and will allow images of better quality to be obtained.
The dimensions of the test area are 290 × 630 m, with all the flights conducted at an altitude of 100 m AGL (above ground level) and a forward and lateral overlap of 75%. Depending on the case, a total of 114 to 136 photos were taken, with metadata including the actual UAV position. The data are saved in the EXIF (exchangeable image file format). Day-time flights were programmed and conducted automatically, according to a single grid plan, over the designated terrain. Night-time flights followed the same single grid plan, although in manual mode. In order to minimize blur induced by the long exposure time at night, each time the photo position was reached, the operator would hover for 2–3 s and manually release the shutter after the vehicle had stabilized. The aim of this procedure was to obtain images that were as sharp as possible, without blur. All images acquired at night were visually assessed as sharp and without blur.
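For orientation, the ground sampling distance, photo footprint, and exposure spacing implied by such flight parameters can be estimated from the sensor geometry. The sketch below is illustrative only; the numeric values in the commented usage are not the exact specifications of the cameras used in this study.

```python
def ground_sampling_distance(sensor_width_mm: float, focal_length_mm: float,
                             image_width_px: int, altitude_m: float) -> float:
    """Ground sampling distance (m/px) of a nadir photo at the given altitude."""
    return (sensor_width_mm / 1000.0) * altitude_m / ((focal_length_mm / 1000.0) * image_width_px)


def exposure_spacing(footprint_m: float, overlap: float) -> float:
    """Distance between consecutive exposures (or flight lines) for a given overlap fraction."""
    return footprint_m * (1.0 - overlap)


# Illustrative values only (approximate 1-inch sensor, not the exact camera specification):
# gsd = ground_sampling_distance(sensor_width_mm=13.2, focal_length_mm=10.3,
#                                image_width_px=5472, altitude_m=100.0)
# footprint = gsd * 5472
# print(f"GSD ~ {gsd * 100:.1f} cm/px, spacing at 75% overlap ~ {exposure_spacing(footprint, 0.75):.0f} m")
```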
The sensor sensitivity was set manually at a constant value of ISO 400, which extended the exposure time but enabled minimizing noise. As investigated in depth by Roncella et al. [74], the parameter that most affects the overall performance of a photogrammetric system in extreme low light is the ISO setting. A higher ISO always increases image noise, making the matching process less accurate and reliable. The peak signal-to-noise ratio (PSNR), calculated as in [74] for test images (Figure 2) with reference to the ISO 100 test image, shows that image quality decreases and noise increases at higher ISO settings for the cameras used. With the procedure used to take the images (while hovering), increasing the ISO above 400 would increase the noise without significantly affecting the blur. Reducing the ISO below 400 resulted in a longer exposure time and increased blur. The right ISO setting is a balance between the noise and the shutter speed for the particular lighting conditions.
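A generic PSNR comparison of this kind can be computed as in the minimal sketch below; the exact procedure of [74] may differ in detail, and the file names in the commented usage are hypothetical.

```python
import numpy as np


def psnr(reference: np.ndarray, test: np.ndarray, max_value: float = 255.0) -> float:
    """Peak signal-to-noise ratio (dB) between two aligned images of equal shape."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_value ** 2 / mse)


# Hypothetical usage: compare a higher-ISO test frame against the ISO 100 reference frame.
# import cv2
# iso100 = cv2.imread("iso100.png", cv2.IMREAD_GRAYSCALE)
# iso1600 = cv2.imread("iso1600.png", cv2.IMREAD_GRAYSCALE)
# print(f"PSNR vs ISO 100 reference: {psnr(iso100, iso1600):.2f} dB")
```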
The white balance was also set manually to a fixed value. It should be noted that the digital cameras used within this research can effectively take photos automatically both during the day and at night. In automatic mode, the processor adapts all adjustable exposure parameters. In the case of the white balance, its value is also adjusted for each shot. As far as night-time measurement photos are concerned, exposure automation can be tricky. During the night, the processor attempts to maximize sensor sensitivity in order to reduce the exposure time and minimize blur. This situation is exacerbated when flying over a study area without artificial lighting. The white balance can change even from photo to photo, especially when, as in the presented case, the urban lighting colour is variable. A summary of the research flights is shown in Table 2.
A photogrammetric network was developed that comprised 23 ground control points (GCPs), non-uniformly arranged throughout the study area; their positions were measured with the accurate GNSS RTK satellite positioning method. The non-uniform distribution of points was forced directly by the lack of accessibility to the south-eastern and central parts of the area. The south-eastern area is occupied by the closed part of a container terminal, and placing points in the central part, where the viaduct passes, was unsafe due to heavy car traffic. GCP positions were determined relative to the PL-2000 Polish state grid coordinate system and their altitudes relative to the PL-EVRF2007-NH quasigeoid. All control points were positioned in characteristic, natural locations. These are mainly terrain fragments with variable structure, road lane boundaries, and manhole covers, which are easy to identify in both day-time and night-time photos. When locating the control points, a priority rule was adopted that GCPs had to be located at spots illuminated by streetlamp lighting (Figure 3a,b). Furthermore, for analytical purposes, several points were located within a convenient area that was, however, not illuminated with artificial lighting (Figure 3c,d).

2.3. Data Processing

The flights yielded images that were used, without any modifications, to generate photogrammetric products. Standard products in the form of a point cloud, digital terrain model (DTM), and orthophotoimages (Figure 4) were developed in Agisoft Metashape v1.7.1 (Agisoft LLC, St. Petersburg, Russia).
Products were developed smoothly. The processing for each data set followed the same sequence, with the same settings. The operator tagged all visible GCPs both at night and day. Some GCPs at night were not readily identifiable on the photos due to insufficient lighting. Table 3 shows a brief summary of the basic image processing parameters.
All cases contain a similar number of photos, with the flights conducted at the same altitude of 100 m AGL (above ground level). As can be seen, the estimated MP1-N flight altitude is significantly underestimated, which is typical for photos of reduced quality, as demonstrated in [18]. Such an incorrect calculation is a symptom of errors that translate into the geometric quality of a product, which will be proven later in the paper.
Table 4 shows a summary of tie point data. As can be seen, day-time products have a similar number of identified tie points, while night-time ones exhibit significantly fewer.
For the analysed area, the night-time data enabled identifying only approximately 30–40% of the tie points identifiable during the day. In addition, the mean key point size in the night-time photos is 165% to 312% larger than in the respective day-time cases.
Table 5 and Table 6 show the mean camera position error and the root mean square error (RMSE) calculated for the ground control point positions, respectively. The mean camera position error is the difference between the camera position determined by the on-board navigation and positioning system and that resulting from aerotriangulation. The values of the errors in the horizontal plane (x, y) fall within the limits typical for GPS receivers used in commercial UAVs. The absolute vertical error (z) is significantly higher (from 13 to 45 m) and arises directly from the different altitude reference systems. Table 7 presents the root mean square error (RMSE) calculated for the check point positions.
A total of 26 ground control points were located in the field. All points visible on the photos were tagged by the operator and included in the software calculations. In the day-time cases, 13 and 14 points, respectively, were located in the processing area, while at night 10 and 9 were visible. The points located in spots without lighting were not visible on the photos, or their visibility was so low that it was impossible to mark them properly in the processing software with sufficient certainty. In order to control the quality of the process, three check points (CPs) were established and distributed over the study area. The GCP and CP distribution over the research area is presented in Figure 5.
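For reference, RMSE values of the kind reported in Tables 5 to 7 can be reproduced from coordinate residuals with a computation of the following form; this is a sketch, assuming the estimated and reference coordinates are available as (N, 3) arrays.

```python
import numpy as np


def rmse_per_axis(estimated: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Component-wise (X, Y, Z) root mean square error between estimated and
    reference coordinates, both given as (N, 3) arrays in metres."""
    residuals = estimated - reference
    return np.sqrt(np.mean(residuals ** 2, axis=0))


# Hypothetical usage for GCPs or check points:
# print(rmse_per_axis(gcp_estimated_xyz, gcp_reference_xyz))  # -> array([rmse_x, rmse_y, rmse_z])
```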

2.4. Camera Calibration

The Agisoft Metashape software uses a parametric lens distortion model developed by Brown [75,76]. For a perspective pinhole camera, the transformation of points $(X_c, Y_c, Z_c)$ in space $\mathbb{R}^3$ to coordinates $(x, y)$ in the image plane in space $\mathbb{R}^2$, omitting the final units of the image coordinate system, can be expressed as [77]:

$x = f \frac{X_c}{Z_c}$, (1)

$y = f \frac{Y_c}{Z_c}$, (2)

where $f$ denotes the image plane distance from the projection centre.

The camera's local system has its origin in the centre of the camera system projections ($O_c$); axis $Z_c$ is oriented towards the observation direction, axis $X_c$ is oriented to the right, and axis $Y_c$ is oriented downwards, while the origin of the image coordinate system (background system) is located in the centre of the image plane.

The camera's optical system introduces certain deformations, called distortions, that cause the displacement of a point $(x, y)$ within the background plane into a different position $(x', y')$. The new position, taking into account radial and decentring distortion, can be expressed as:

$x' = x\left(1 + K_1 r^2 + K_2 r^4 + K_3 r^6 + K_4 r^8\right) + \left(P_1\left(r^2 + 2x^2\right) + 2 P_2 x y\right)$, (3)

$y' = y\left(1 + K_1 r^2 + K_2 r^4 + K_3 r^6 + K_4 r^8\right) + \left(P_2\left(r^2 + 2y^2\right) + 2 P_1 x y\right)$, (4)

where $K_1, K_2, K_3, K_4$ are the radial symmetric distortion coefficients, $P_1, P_2$ are the decentring distortion coefficients (including both radial asymmetric and tangential distortions), and $r$ is the radial radius, defined as:

$r = \sqrt{x^2 + y^2}$. (5)

Because in the discussed case the image plane takes the form of a CMOS sensor, it is necessary to convert $(x', y')$ into image coordinates expressed in pixels. In the case of digital images, coordinates are usually given in accordance with the sensor indexation system adopted in the digital data processing software. For this image, axis $x$ is oriented to the right, axis $y$ downwards, and the origin is located at the centre of pixel $(1, 1)$; coordinates in this system are expressed in pixels. Therefore, taking into account the image matrix size, the physical size of the image pixel, affinity and non-orthogonality, the principal point offset, and the projected point coordinates in the image coordinate system, we can write:

$u = 0.5\,w + c_x + x' f + x' B_1 + y' B_2$, (6)

$v = 0.5\,h + c_y + y' f$, (7)

where $u, v$ are expressed in pixels (px), $f$ denotes the focal length (px), $c_x, c_y$ the principal point offset (px), $B_1, B_2$ the affinity and non-orthogonality (skew) coefficients (px), and $w, h$ the image width and height (image resolution, px); together these are defined as the intrinsic orientation parameters (IOP).
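For clarity, the forward projection chain of Equations (1) to (7) can be written compactly as a single function. The sketch below follows Metashape's convention, in which the normalized coordinates $x = X_c/Z_c$ and $y = Y_c/Z_c$ enter the distortion terms and the focal length $f$ (in pixels) is applied only in Equations (6) and (7); it is an illustrative re-statement of the model, not the software's internal implementation.

```python
def project_to_pixels(Xc, Yc, Zc, f, cx, cy, w, h,
                      K=(0.0, 0.0, 0.0, 0.0), P=(0.0, 0.0), B1=0.0, B2=0.0):
    """Project a camera-frame point to pixel coordinates (u, v) with the pinhole
    model and Brown radial/decentring distortion of Equations (1) to (7).
    f, cx, cy, B1 and B2 are given in pixels; K = (K1..K4), P = (P1, P2)."""
    # Normalized image-plane coordinates (f is applied later, in Equations (6)-(7)).
    x, y = Xc / Zc, Yc / Zc
    # Equation (5): squared radial radius.
    r2 = x * x + y * y
    # Equations (3) and (4): radial and decentring distortion.
    radial = 1.0 + K[0] * r2 + K[1] * r2**2 + K[2] * r2**3 + K[3] * r2**4
    xp = x * radial + (P[0] * (r2 + 2 * x * x) + 2 * P[1] * x * y)
    yp = y * radial + (P[1] * (r2 + 2 * y * y) + 2 * P[0] * x * y)
    # Equations (6) and (7): conversion to pixel coordinates with principal point offset and skew.
    u = 0.5 * w + cx + xp * f + xp * B1 + yp * B2
    v = 0.5 * h + cy + yp * f
    return u, v
```

Perturbing $f$, $c_x$, $c_y$, or the distortion coefficients in such a function by their reported errors is the kind of sensitivity check discussed later in Section 3.2.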
Correct determination of the IOPs (intrinsic orientation parameters) and distortion parameters is very important in terms of photogrammetry. In the traditional approach, these parameters are determined in a laboratory and are provided with a metric camera. UAV photogrammetry usually uses small, non-metric cameras, whose intrinsic orientation and distortion parameters are unknown and unstable (not constant, and able to change slightly under the influence of external factors such as temperature and vibrations). Since knowledge of the current IOPs is required, the photogrammetric software used in this study calibrates the camera in each case, based on the measurement photos, and calculates the current IOPs for the given conditions and camera. Such a process ensures result repeatability. However, it functions correctly mainly in the case of photos taken under good conditions, i.e., during the day, sharp, and well illuminated. Table 8 shows the IOPs and distortion parameters determined by the software for each case.

3. Results

3.1. Geometry Analysis

The geometric quality of the developed relief models was evaluated using the methods described in [78]. An M3C2 (multiscale model to model cloud comparison) distance map was developed for each point cloud. The M3C2 distance map computation process utilized 3-D point precision estimates stored in scalar fields. Appropriate scalar fields were selected for both point clouds (reference and tested) to describe the measurement precision in X, Y, and Z ($\sigma_X$, $\sigma_Y$, $\sigma_Z$). The results for the cases are shown in Figure 6.
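The full M3C2 algorithm estimates a local normal for each core point and compares the clouds within a projection cylinder. As a greatly simplified illustration of a cloud-to-cloud comparison (not the method of [78]), a plain nearest-neighbour distance can be computed as follows, assuming the dense clouds are available as (N, 3) arrays:

```python
import numpy as np
from scipy.spatial import cKDTree


def nearest_neighbour_distances(reference: np.ndarray, compared: np.ndarray) -> np.ndarray:
    """Distance from every point of the compared cloud to its nearest neighbour
    in the reference cloud; both clouds are (N, 3) arrays in metres."""
    tree = cKDTree(reference)
    distances, _ = tree.query(compared)
    return distances


# Hypothetical usage with day-time and night-time dense clouds:
# d = nearest_neighbour_distances(day_cloud, night_cloud)
# print(f"mean = {d.mean():.2f} m, sigma = {d.std():.2f} m")
```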
The statistical distribution of the M3C2 distances is close to normal, which means that a significant part of the observations is concentrated around the mean (Table 9). The mean (µ) for the comparison of the night-time and day-time cases, both for MP1 and MP2, shows that the models are 0.4 to 1.6 m apart. Furthermore, the standard deviation (σ) (above 4 m) indicates that there are significant differences between the models. A visual analysis of Figure 6 clearly shows that the highest distance differences can be seen in the area of the flyover, which is a structure located much higher than the average elevation of the surrounding terrain, in the case of the flight with UAV MP1 (Figure 6a). The same comparison of the night-time and day-time models for UAV MP2 does not exhibit such significant differences in this area (Figure 6b). It should be noted that the M3C2 values are clearly higher in the area remote from the GCPs (south-eastern section of Figure 6a,b, values in yellow). It follows directly that, within the model development process, with a significantly reduced tie point number, as is the case for night-time photos, aerotriangulation introduces a significant error. This error is greater the farther the tie points are from the identified GCPs. This phenomenon does not occur to such an extent for the day-time cases (day MP1/day MP2) shown in Figure 6c. It is confirmed by the visual and parametric assessment of the statistical data (Table 9).
The D-MP1/MP2 case exhibits values of µ = 0.11 m and σ = 2.52 m, and a clear elevation of the deviation occurs only in the flyover area. The analysis of the night/night case clearly emphasizes the issue of reconstructing higher structures (flyover, building), which relies particularly on correct aerotriangulation. The error of reconstructing objects located higher than the average terrain elevation, where the elevation increases rapidly, is significant. This confirms that a reduction in the tie point number caused by low light intensity has a great impact on reconstruction errors. It is clearly demonstrated in the area of the flyover highway, where the software does not identify as many tie points.

3.2. Autocalibration Process Analysis

Intrinsic orientation parameters (IOPs), including lens distortion parameters, have a significant impact on the geometry of a photogrammetric product. The automatic calibration process executed by Agisoft Metashape is based primarily on correctly identified tie points. Figure 7 shows a graphical comparison of the parameters previously listed in Table 8. The graphical presentation of the grouped results with their corresponding errors, for the day/night cases, shows a noticeable trend and relationships that translate into the reduced image quality described above.
Theoretically, and in line with previous experience [18,19,79], the calibration parameters for a single camera should be the same or very similar. Typically, especially for non-metric cameras, the recovered IOPs are only valid for that particular project, and they can be slightly different for another image block under different conditions. In the case of night-time calibration, as can be seen in Figure 7, the IOPs are significantly different than in the case of day-time calibration. This difference is particularly high for the camera with lower quality and sensitivity (UAV MP1). The focal length (F) for MP1 changes its value by 50% during the night, and only by 6% in the case of MP2. The principal point location ($C_x$, $C_y$) for MP1 was displaced by more than 60 pixels along the x-axis during the night. The B coefficients for the night-time cases tend towards zero. We can also observe significant differences in terms of radial and tangential distortion for UAV MP1. Whereas the distortion parameters for the MP2 camera remain similar, they exhibit a higher error at night. The intrinsic orientation and distortion parameter determination error is significantly higher at night in all cases.
For an additional comparison of the calibration parameters, their distribution within the image plane and their profiles as a function of the radial radius were determined. Figure 8 and Figure 9 show the corresponding visualisations for MP1 and MP2, respectively.
As noted, the IOP and distortion determination error is significantly higher at night in each case. This means that these values are determined less precisely than during the day, which directly translates into reduced model development precision. In order to verify this hypothesis, the authors conducted additional analyses and calculated the maximum possible ground point displacement for a single aerial photo at an altitude of 100 m, taking into account the intrinsic orientation and distortion parameters increased by the maximum error value reported by the photogrammetric software. In other words, formulas (1) to (7) were used to convert u and v to the spatial position of the points $(X_c, Y_c, Z_c)$ for a flight altitude of 100 m, considering the IOPs and distortion parameters increased by the maximum error. The result of this operation is shown in Figure 10.
The statistical values of the maximum error in metres, for a flight altitude of 100 m, are shown in Table 10.
As shown in Table 10, the maximum displacement of ground points can occur in the MP1-N case and can even exceed 3 metres, with a mean displacement of approximately 60 cm. This displacement, resulting from the occurrence of the maximum error, is only informative and, given the Gaussian distribution of errors, such a situation is very unlikely to occur in real life. Nonetheless, the increased error analysis conducted for night-time calibration proves that methods functioning correctly during the day are not able to accurately determine the IOPs at night.
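A simplified version of this check, considering only the focal length and principal point inflated by their maximum reported errors and omitting the distortion terms, could look as follows; the function and its parameters are illustrative and are not the exact computation used for Table 10.

```python
import numpy as np


def ground_displacement(u, v, f, cx, cy, w, h, df, dcx, dcy, altitude=100.0):
    """Approximate ground displacement (m) of the point imaged at pixel (u, v)
    when the focal length and principal point are inflated by their maximum
    reported errors, for a nadir photo at the given altitude (all camera
    parameters and errors in pixels). Distortion terms are omitted for brevity."""
    # Back-projection with nominal parameters (inverse of Equations (6)-(7) without distortion).
    x = (u - 0.5 * w - cx) / f
    y = (v - 0.5 * h - cy) / f
    # Back-projection with error-inflated parameters.
    xe = (u - 0.5 * w - (cx + dcx)) / (f + df)
    ye = (v - 0.5 * h - (cy + dcy)) / (f + df)
    # Scale the normalized offsets by the flight altitude to obtain metres on the ground.
    return altitude * np.hypot(x - xe, y - ye)
```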

3.3. Relationships

Spatial relationships between point intensity, reprojection error, and tie point determination precision are shown in Figure 11. The intensity map was calculated for each tie point ($I_i$) according to the formula [80]:

$I_i = 0.21 R + 0.72 G + 0.07 B$,

where $i$ denotes the tie point number, and $R$, $G$, and $B$ represent the spectral response values recorded by the camera sensor for a given tie point in the red, green, and blue spectra, respectively.
In order to evaluate the tie point precision distribution, the authors introduced the value of the precision vector (vPrec, $d\sigma$), such that:

$d\sigma_i = \sqrt{(\sigma_{X_i})^2 + (\sigma_{Y_i})^2 + (\sigma_{Z_i})^2}$,

where $\sigma_{X_i}$, $\sigma_{Y_i}$, $\sigma_{Z_i}$ denote the measurement precision of the i-th tie point, obtained through the method described in [78]. These values are expressed in millimetres (mm).
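Both quantities are straightforward to compute once the per-point RGB responses and precision estimates have been exported; a minimal sketch, assuming (N, 3) NumPy arrays as input:

```python
import numpy as np


def tie_point_intensity(rgb: np.ndarray) -> np.ndarray:
    """Intensity I_i per tie point from its RGB spectral response, rgb given as an (N, 3) array."""
    return 0.21 * rgb[:, 0] + 0.72 * rgb[:, 1] + 0.07 * rgb[:, 2]


def precision_vector(sigma_xyz: np.ndarray) -> np.ndarray:
    """Length of the precision vector d-sigma_i (mm) from the (N, 3) array of
    sigma_X, sigma_Y, sigma_Z precision estimates."""
    return np.sqrt(np.sum(sigma_xyz ** 2, axis=1))
```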
Figure 12 shows the relationships between the $d\sigma_i$ precision vector (vPrec), the reprojection error (RE), and the intensity, calculated for each tie point. The blue line represents the actual obtained values, while the red line averages them in order to visualise the trend and relationships. All data were sorted in ascending order of intensity.
The mean intensity for the models based on day-time photos is 113 and 124, with a standard deviation of 51 and 44, respectively (Table 11). The mean RE value for the day-time models is 0.6656 px ($s_{RE(MP1-D)}$ = 0.5253) and 0.3130 px ($s_{RE(MP2-D)}$ = 0.3659), respectively. The maximum RE value for the day-time models does not exceed 21 pixels. The $s_{RE}$ analysis for the day-time cases indicates that the values are relatively constant for all points, which is also confirmed by the graph (Figure 12). The mean intensity for the models based on night-time photos is 25 and 32, with a standard deviation of 23 and 27, respectively. The mean RE value for the night-time models is 1.3311 px ($s_{RE(MP1-N)}$ = 1.407) and 1.168 px ($s_{RE(MP2-N)}$ = 1.26), respectively, and the maximum RE value is significantly higher, at approximately 37 pixels. Conversely, for the night-time cases, $s_{RE}$ takes higher values, which indicates high result variability. This can easily be seen in the graphs (Figure 12), where the blue plots for RE and vPrec in the night-time cases show high-amplitude changes.
An analysis of the relationships between intensity, RE, and vPrec shows a strong correlation between the accuracy parameters in the night-time photos, i.e., those where the mean intensity is in the range of 25 to 35. On the other hand, the same relationships are not exhibited by the models based on day-time photos. The results show that RE stabilizes towards the mean only above an intensity of 30 and 50, respectively.
The decrease in the number of tie points with decreasing intensity results directly from the algorithm applied at the first stage of the photogrammetric process, namely feature matching. The software developer uses a modified version of the popular SIFT (scale-invariant feature transform) algorithm [81]. As shown by the studies in [82,83,84,85,86,87], the SIFT algorithm is not robust to low intensity and low contrast. This leads to the generation of a certain number of false key points, which are later not found on subsequent photos and thus do not become tie points.
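The effect can be reproduced outside the photogrammetric suite with a plain SIFT implementation (Metashape uses its own modified variant). The sketch below counts Lowe-ratio-filtered matches for a stereopair; running it on a day-time and a night-time pair of the same area illustrates the drop in valid correspondences. OpenCV is assumed, and the file paths in the usage comment are hypothetical.

```python
import cv2


def count_stereo_matches(left_path: str, right_path: str, ratio: float = 0.75) -> tuple:
    """Detect SIFT keypoints in an overlapping image pair and count matches that
    pass Lowe's ratio test; returns (keypoints_left, keypoints_right, valid_matches)."""
    sift = cv2.SIFT_create()
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    img_l = cv2.imread(left_path, cv2.IMREAD_GRAYSCALE)
    img_r = cv2.imread(right_path, cv2.IMREAD_GRAYSCALE)
    kp_l, des_l = sift.detectAndCompute(img_l, None)
    kp_r, des_r = sift.detectAndCompute(img_r, None)
    matches = matcher.knnMatch(des_l, des_r, k=2)
    valid = 0
    for pair in matches:
        # Keep only matches whose best candidate is clearly better than the second best.
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            valid += 1
    return len(kp_l), len(kp_r), valid


# Hypothetical usage:
# print(count_stereo_matches("day_left.jpg", "day_right.jpg"))
# print(count_stereo_matches("night_left.jpg", "night_right.jpg"))
```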
The maximum total number of matches in the MP1 night-time photos is 1080, including 1018 valid and 62 invalid matches. Similarly, the maximum total number of matches for the day-time photos is 3733, including 3493 valid and 240 invalid matches. A decrease in the number of matches by more than 70% greatly impacts image quality. This phenomenon is demonstrated for a pair of photos of the same region in Figure 13. The figure shows a day-time stereopair with 1238 total matches and a night-time stereopair with 208 total matches.

4. Discussion

The results and analyses conducted within this study show the sensitivity of the photogrammetric process applied in modern software to photos taken in extreme lighting conditions. The issues analysed by the authors can be divided, as per the division in [18], into procedural, numerical, and technical factors. Procedural factors that impact the final quality of a photogrammetric product include the GCP distribution and the flight plan, whereas in terms of numerical factors, it is the issue of night-time calibration. Thirdly, the technical factors presented include the difference between the cameras on the two different UAVs.
The reduction in geometric quality during the night-time tests was greatly impacted by the non-uniform distribution of GCPs within the study area. As was noted, wherever no GCPs were placed, the distance between the reference model and the night-time model increased significantly. In the case of day-time photos, these differences are not as important, since this phenomenon is compensated by the high tie point density and their low mean diameter. As recommended by most photogrammetric software developers, 5–10 ground control points should be arranged uniformly in such a case. This ensures acceptable product quality. The situation is quite the opposite at night, when the tie point density decreases by up to 70%, which results in the formation of aerotriangulation errors. A night-time study will lead to the reprojection error increasing with growing distance from the GCPs and with changes in object height. In the case in question, it was physically impossible to arrange the GCPs uniformly, due to safety (traffic on the flyover, an expressway) and in the south-eastern area, which is a closed seaport section. The location of the research in such a challenging area was not a coincidence. The authors chose this location because there is indeed a problem with light pollution in this area and there is a wide variety of urban lighting of different types. The location of the study is closest to the real problems that can be encountered during measurements and can induce new and interesting challenges for science. After all, it is not always physically possible to distribute GCPs evenly, which during the day does not cause significant problems but, as shown, at night can be a rather more significant factor.
GCP arrangement at night should be properly planned. Objects intended to serve as natural or signalled points should be illuminated. This is a rather obvious statement. However, it requires a certain degree of planning when developing the photogrammetric matrix and, more importantly, such a selection of control points so that they are sufficiently contrasting. Low contrast of natural control points results in a significant reduction of identifiability at low lighting intensity. An active (glowing marker), new type of control point might be a good solution in insufficiently illuminated spots, where it is impossible to place or locate control points.
The night-time flight plan that was developed and executed involved manual flight along a single grid. The UAV had to be stopped each time for 1–3 s, until the hover was stabilized and the shutter released. The time required to stabilise the hover depends, of course, on the inertia of the UAV itself and its flight dynamics. Such a procedure enables taking a photo without clear blur, allows the exposure time to be optimally extended, and reduces the ISO sensitivity in order to avoid excessive sensor noise. It seems impossible to take sharp, unblurred photos without stopping in flight, as is done in a traditional day-time flight. Commercially available software for planning and executing UAV flights (Pix4D Capture, Android version) does not offer a function for automatically stopping during exposure and, furthermore, does not enable full control over the exposure parameters. The same software offers the possibility to stop the UAV while taking a picture only on iOS (the mobile operating system created by Apple Inc., Cupertino, CA, USA). Full control is also possible via the UAV manufacturer's dedicated software; however, it does not enable automatic flight plan execution.
Commercial aerial systems are equipped with built-in non-metric digital cameras, the intrinsic orientation and distortion parameters of which are not constant. Such cameras require frequent calibration, which in practice is performed every time, with each photogrammetric process. This process, as demonstrated, is sensitive to poor lighting conditions, which leads to the generation of a higher determination error for each intrinsic orientation and distortion parameter. Since it is these parameters that significantly influence the geometric quality of a product, their correct determination is extremely important. As indicated by the studies, the brighter, generally better-quality camera used onboard the Mavic 2 Pro exhibited a clearly lower deviation of the night-time calibration parameters relative to the same settings during the day. This enables clearly more stable results to be obtained.
As evidenced, uniformly illuminated photos can be used to construct a model with lower RE, and each tie point exhibits greater precision. The issue of decreasing precision at low photo brightness results from the type of algorithm used to detect tie points, which means that, potentially, improving this element within the software would improve the geometric quality of the model generated based on night-time photos.

5. Conclusions

As shown in the introduction, a number of publications on night-time photogrammetric products did not analyse the issue of model geometry. These studies assumed, a priori, that the geometry would be similar to that obtained during the day, and focused on the application-related aspects of night-time models. As indicated in this research, popular photogrammetric software does generate night-time models with apparently acceptable geometry; however, their absolute quality is poor. It can be concluded that the geometry of night-time models cannot simply be used for surveying or cartographic projects. This is greatly influenced by the image processing itself and the algorithms used.
A more in-depth analysis of the photogrammetric process indicated that a decrease in quality occurs on many levels, starting with procedural factors (data acquisition, the photogrammetric flight), through numerical factors (the calibration process and the algorithms applied to detect key points), to technical factors (camera type and brightness). These factors blend and mutually impact image quality. It cannot be clearly stated which of the factors has the greatest influence on the geometric quality of an image, although it seems obvious that a precise camera and a stable UAV flight will be the best combination. Therefore, it can be concluded that technical factors are the most decisive in terms of night-time image quality. They are followed by procedural factors. The very procedure of night-time photo acquisition must ensure that a sharp photo is taken, namely through a stable hover during the exposure time. Such a solution will ensure a sharp photo without blur. Another element of the procedure is uniform GCP distribution in illuminated locations. Distribution uniformity has a significant impact on image geometry, especially with low tie point density. The last element is the numerical factor, on which the user has little influence in this case. The software does not allow the applied algorithms or the calibration procedure to be changed, and correct operating conditions for these algorithms can be guaranteed only through good practice. The biggest problem was noted in terms of the calibration algorithm, where even a slight error increase resulted in significant geometric changes and ground point displacement of up to several metres.

Author Contributions

Conceptualization, P.B. and K.B.; methodology, P.B.; software, P.B. and K.B.; validation, P.B. and K.B.; formal analysis, K.B.; investigation, P.B.; resources, P.B.; data curation, P.B.; writing—original draft preparation, P.B. and K.B.; writing—review and editing, K.B. and P.B.; visualization, P.B.; supervision, P.B.; project administration, K.B.; funding acquisition, K.B. All authors have read and agreed to the published version of the manuscript.

Funding

Financial support of these studies from Gdańsk University of Technology by the DEC-42/2020/IDUB/I.3.3 grant under the ARGENTUM—‘Excellence Initiative-Research University’ program is gratefully acknowledged.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Goel, S. A Distributed Cooperative UAV Swarm Localization System: Development and Analysis. In Proceedings of the 30th International Technical Meeting of The Satellite Division of the Institute of Navigation (ION GNSS), Portland, OR, USA, 25–29 September 2017; pp. 2501–2518. [Google Scholar]
  2. Goel, S.; Kealy, A.; Lohani, B. Development and Experimental Evaluation of a Low-Cost Cooperative UAV Localization Network Prototype. J. Sens. Actuator Netw. 2018, 7, 42. [Google Scholar] [CrossRef] [Green Version]
  3. Zhang, X.; Lu, X.; Jia, S.; Li, X. A novel phase angle-encoded fruit fly optimization algorithm with mutation adaptation mechanism applied to UAV path planning. Appl. Soft Comput. J. 2018, 70, 371–388. [Google Scholar] [CrossRef]
  4. Kyristsis, S.; Antonopoulos, A.; Chanialakis, T.; Stefanakis, E.; Linardos, C.; Tripolitsiotis, A.; Partsinevelos, P. Towards Autonomous Modular UAV Missions: The Detection, Geo-Location and Landing Paradigm. Sensors 2016, 16, 1844. [Google Scholar] [CrossRef] [Green Version]
  5. Oktay, T.; Kanat, Ö.Ö. A Review of Aerodynamic Active Flow Control. In Proceedings of the International Advanced Technologies Symposium, Elazig, Turkey, 19–22 October 2017. [Google Scholar]
  6. Oktay, T.; Celik, H.; Turkmen, I. Maximizing autonomous performance of fixed-wing unmanned aerial vehicle to reduce motion blur in taken images. Proc. Inst. Mech. Eng. Part I J. Syst. Control. Eng. 2018, 232, 857–868. [Google Scholar] [CrossRef]
  7. Oktay, T.; Coban, S. Simultaneous Longitudinal and Lateral Flight Control Systems Design for Both Passive and Active Morphing TUAVs. Elektron. Elektrotechnika 2017, 23, 15–20. [Google Scholar] [CrossRef] [Green Version]
  8. Kedzierski, M.; Fryskowska, A.; Wierzbicki, D.; Nerc, P. Chosen Aspects of the Production of the Basic Map Using UAV Imagery. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. ISPRS Arch. 2016, 2016, 873–877. [Google Scholar]
  9. Kedzierski, M.; Fryskowska, A.; Wierzbicki, D.; Grochala, A.; Nerc, P. Detection of Gross Errors in the Elements of Exterior Orientation of Low-Cost UAV Images. Baltic Geod. Congr. (BGC Geomat.) 2016, 95–100. [Google Scholar] [CrossRef]
  10. Di Franco, C.; Buttazzo, G. Coverage Path Planning for UAVs Photogrammetry with Energy and Resolution Constraints. J. Intell. Robot. Syst. Theory Appl. 2016, 83, 445–462. [Google Scholar] [CrossRef]
  11. Ribeiro-Gomes, K.; Hernández-López, D.; Ortega, J.F.; Ballesteros, R.; Poblete, T.; Moreno, M.A. Uncooled Thermal Camera Calibration and Optimization of the Photogrammetry Process for UAV Applications in Agriculture. Sensors 2017, 17, 2173. [Google Scholar] [CrossRef] [PubMed]
  12. Kromer, R.; Walton, G.; Gray, B.; Lato, M. Robert Group Development and Optimization of an Automated Fixed-Location Time Lapse Photogrammetric Rock Slope Monitoring System. Remote Sens. 2019, 11, 1890. [Google Scholar] [CrossRef] [Green Version]
  13. Rau, J.-Y.; Jhan, J.-P.; Li, Y.-T. Development of a Large-Format UAS Imaging System with the Construction of a One Sensor Geometry from a Multicamera Array. IEEE Trans. Geosci. Remote Sens. 2016, 54, 5925–5934. [Google Scholar] [CrossRef]
  14. Alfio, V.S.; Costantino, D.; Pepe, M. Influence of Image TIFF Format and JPEG Compression Level in the Accuracy of the 3D Model and Quality of the Orthophoto in UAV Photogrammetry. J. Imaging 2020, 6, 30. [Google Scholar] [CrossRef]
  15. Zhou, Y.; Daakir, M.; Rupnik, E.; Pierrot-Deseilligny, M. A two-step approach for the correction of rolling shutter distortion in UAV photogrammetry. ISPRS J. Photogramm. Remote Sens. 2020, 160, 51–66. [Google Scholar] [CrossRef]
  16. Koutalakis, P.; Tzoraki, O.; Zaimes, G. UAVs for Hydrologic Scopes: Application of a Low-Cost UAV to Estimate Surface Water Velocity by Using Three Different Image-Based Methods. Drones 2019, 3, 14. [Google Scholar] [CrossRef] [Green Version]
  17. Zhou, Y.; Rupnik, E.; Meynard, C.; Thom, C.; Pierrot-Deseilligny, M. Simulation and analysis of photogrammetric uav image blocks: Influence of camera calibration error. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, IV-2/W5, 195–200. [Google Scholar] [CrossRef] [Green Version]
  18. Burdziakowski, P. A Novel Method for the Deblurring of Photogrammetric Images Using Conditional Generative Adversarial Networks. Remote Sens. 2020, 12, 2586. [Google Scholar] [CrossRef]
  19. Burdziakowski, P. Increasing the Geometrical and Interpretation Quality of Unmanned Aerial Vehicle Photogrammetry Products using Super-Resolution Algorithms. Remote Sens. 2020, 12, 810. [Google Scholar] [CrossRef] [Green Version]
  20. Wierzbicki, D.; Kedzierski, M.; Fryskowska, A. Assesment of the influence of UAV image quality on the orthophoto production. In Proceedings of the International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Toronto, ON, Canada, 30 August–2 September 2015. [Google Scholar] [CrossRef] [Green Version]
  21. Wierzbicki, D.; Fryskowska, A.; Kedzierski, M.; Wojtkowska, M.; Delis, P. Method of radiometric quality assessment of NIR images acquired with a custom sensor mounted on an unmanned aerial vehicle. J. Appl. Remote Sens. 2018, 12, 015008. [Google Scholar] [CrossRef]
  22. Sekrecka, A.; Kedzierski, M.; Wierzbicki, D. Pre-Processing of Panchromatic Images to Improve Object Detection in Pansharpened Images. Sensors 2019, 19, 5146. [Google Scholar] [CrossRef] [Green Version]
  23. Wierzbicki, D.; Kedzierski, M.; Sekrecka, A. A Method for Dehazing Images Obtained from Low Altitudes during High-Pressure Fronts. Remote Sens. 2019, 12, 25. [Google Scholar] [CrossRef] [Green Version]
  24. Specht, C.; Lewicka, O.; Specht, M.; Dąbrowski, P.; Burdziakowski, P. Methodology for Carrying out Measurements of the Tombolo Geomorphic Landform Using Unmanned Aerial and Surface Vehicles near Sopot Pier, Poland. J. Mar. Sci. Eng. 2020, 8, 384. [Google Scholar] [CrossRef]
  25. Specht, M.; Specht, C.; Lewicka, O.; Makar, A.; Burdziakowski, P.; Dąbrowski, P. Study on the Coastline Evolution in Sopot (2008–2018) Based on Landsat Satellite Imagery. J. Mar. Sci. Eng. 2020, 8, 464. [Google Scholar] [CrossRef]
  26. Yan, G.; Li, L.; Coy, A.; Mu, X.; Chen, S.; Xie, D.; Zhang, W.; Shen, Q.; Zhou, H. Improving the estimation of fractional vegetation cover from UAV RGB imagery by colour unmixing. ISPRS J. Photogramm. Remote Sens. 2019, 158, 23–34. [Google Scholar] [CrossRef]
  27. Wilkowski, W.; Lisowski, M.; Wyszyński, M.; Wierzbicki, D. The use of unmanned aerial vehicles (drones) to determine the shoreline of natural watercourses. J. Water Land Dev. 2017, 35, 259–264. [Google Scholar] [CrossRef]
  28. Jhan, J.-P.; Rau, J.-Y.; Haala, N. Robust and adaptive band-to-band image transform of UAS miniature multi-lens multispectral camera. ISPRS J. Photogramm. Remote Sens. 2018, 137, 47–60. [Google Scholar] [CrossRef]
  29. Ampatzidis, Y.; Partel, V. UAV-Based High Throughput Phenotyping in Citrus Utilizing Multispectral Imaging and Artificial Intelligence. Remote Sens. 2019, 11, 410. [Google Scholar] [CrossRef] [Green Version]
  30. Bobkowska, K.; Inglot, A.; Przyborski, M.; Sieniakowski, J.; Tysiąc, P. Low-Level Aerial Photogrammetry as a Source of Supplementary Data for ALS Measurements. In Proceedings of the 10th International Conference “Environmental Engineering”, Vilnius, Lithuania, 27–28 April 2017. [Google Scholar]
  31. Przyborski, M.; Szczechowski, B.; Szubiak, W. Photogrammetric Development of the Threshold Water at the Dam on the Vistula River in Wloclawek from Unmanned Aerial. In Proceedings of the SGEM2015, Albena, Bulgaria, 18–24 June 2015; pp. 18–24. [Google Scholar]
  32. Kovanič, L.; Blistan, P.; Urban, R.; Štroner, M.; Blišťanová, M.; Bartoš, K.; Pukanská, K. Analysis of the Suitability of High-Resolution DEM Obtained Using ALS and UAS (SfM) for the Identification of Changes and Monitoring the Development of Selected Geohazards in the Alpine Environment—A Case Study in High Tatras, Slovakia. Remote Sens. 2020, 12, 3901. [Google Scholar] [CrossRef]
  33. Hejmanowska, B.; Mikrut, S.; Strus, A.; Glowienka, E.; Michalowska, K. 4D Models in World Wide Web. Balt. Geod. Congr. 2018, 1–6. [Google Scholar] [CrossRef]
Figure 1. Test and result processing diagram.
Figure 2. The PSNR calculated for (a) MP1 and (b) MP2 at different shutter speeds.
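Figure 2 reports PSNR as a function of shutter speed. For readers who want to reproduce this kind of image-quality measure, the sketch below computes PSNR between a reference frame and a test frame with NumPy and OpenCV. The file names are placeholders, and the 8-bit data range (peak of 255) is an assumption; the paper does not state how its PSNR values were obtained.

```python
import cv2
import numpy as np

def psnr(reference: np.ndarray, test: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two equally sized 8-bit images."""
    diff = reference.astype(np.float64) - test.astype(np.float64)
    mse = np.mean(diff ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

ref = cv2.imread("reference_exposure.jpg")   # hypothetical well-exposed reference frame
img = cv2.imread("test_shutter_1_60.jpg")    # hypothetical frame at the tested shutter speed
print(f"PSNR = {psnr(ref, img):.2f} dB")
```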
Figure 3. Day- and night-time control point image (a) MP1-D GCP No. 21, (b) MP1-N GCP No. 21, (c) MP1-D GCP No. 17, (d) MP1-N GCP No. 17.
Figure 4. Developed orthophotoimages (a) MP1-D, (b) MP2-D, (c) MP1-N, (d) MP2-N.
Figure 5. Ground Control Points (GCPs) and Check Points (CPs) distribution map.
Figure 6. M3C2 Distances: (a) MP1-D/MP1-N, (b) MP2-D/MP2-N, (c) MP1-D/MP2-D, (d) MP1-N/MP2-N.
Figure 7. Graphical comparison of calibration parameters (a) F, (b) Cx and Cy, (c) B1 and B2, (d) K1-4, (e) P1 and P2.
Figure 8. Distortion visualizations and profiles for MP1 day (a,c) and night cases (b,d).
Figure 9. Distortion visualizations and profiles for MP2 day (a,c) and night cases (b,d).
Figure 10. Simulated maximum ground point displacement for a flight altitude of 100 m, taking into account IOP and distortion parameter determination errors (a) MP1-D, (b) MP1-N, (c) MP2-D, (d) MP2-N.
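Figure 10 shows how the interior-orientation and distortion determination errors translate into ground displacement at a 100 m flying altitude. One simplified way to set up such a simulation is to perturb the calibrated parameters by their reported errors, re-project a grid of image points, and scale the resulting pixel displacement by the nominal GSD. The sketch below does this for the radial terms and focal length only, using the MP1-N values from Table 8 and the MP1 pixel pitch; it illustrates the idea under these assumptions and is not the authors' exact procedure.

```python
import numpy as np

def pixel_shift(f, df, k, dk, width=4000, height=3000):
    """Maximum pixel displacement over the frame when F and K1..K3 are perturbed
    by their determination errors (simplified radial-only model)."""
    u, v = np.meshgrid(np.linspace(-width / 2, width / 2, 41),
                       np.linspace(-height / 2, height / 2, 31))
    x, y = u / f, v / f                               # normalised image coordinates

    def distort(fx, kx):
        r2 = x * x + y * y
        radial = 1 + kx[0] * r2 + kx[1] * r2**2 + kx[2] * r2**3
        return x * radial * fx, y * radial * fx

    u0, v0 = distort(f, k)
    u1, v1 = distort(f + df, [k[i] + dk[i] for i in range(3)])
    return np.hypot(u1 - u0, v1 - v0).max()

# MP1-N values and errors from Table 8; nominal GSD at 100 m for the 1.57 um / 4.73 mm camera.
shift_px = pixel_shift(1532.46, 6.8,
                       [0.047936, -0.068125, 0.043968],
                       [0.00051, 0.0013, 0.0012])
gsd_m = 100.0 * 1.57e-6 / 4.73e-3
print(f"max shift ~ {shift_px:.1f} px ~ {shift_px * gsd_m:.2f} m on the ground")
```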
Figure 11. Tie point quality measurements—night cases: (a) MP1 intensity, (b) MP1 RE, (c) MP1 vPrec, (d) MP2 intensity, (e) MP2 RE, (f) MP2 vPrec.
Figure 12. Correlations between accuracy parameters relative to point intensity: (a) MP1 Day (MP1-D), (b) MP1 Night (MP1-N), (c) MP2 Day (MP2-D), (d) MP2 Night (MP2-N).
Figure 13. Decrease in the number of matches for day- and night-time cases (a) 1238 total matches, (b) 208 total matches.
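Figure 13 illustrates the sharp drop in the number of image matches at night. The matcher inside the commercial photogrammetric software is proprietary, but the effect can be reproduced with a standard SIFT plus ratio-test pipeline in OpenCV, as sketched below. The file names are placeholders and the ratio threshold is the commonly used default, not a value taken from the paper.

```python
import cv2

def count_matches(path_a: str, path_b: str, ratio: float = 0.75) -> int:
    """Count SIFT matches between two images that pass Lowe's ratio test."""
    img_a = cv2.imread(path_a, cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread(path_b, cv2.IMREAD_GRAYSCALE)
    sift = cv2.SIFT_create()
    _, des_a = sift.detectAndCompute(img_a, None)
    _, des_b = sift.detectAndCompute(img_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = 0
    for pair in matcher.knnMatch(des_a, des_b, k=2):
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good += 1
    return good

# Hypothetical day-time and night-time image pairs.
print("day:  ", count_matches("day_001.jpg", "day_002.jpg"))
print("night:", count_matches("night_001.jpg", "night_002.jpg"))
```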
Table 1. Technical data of the cameras installed onboard the UAVs.
Technical Data | FC220 | Hasselblad L1D-20c
Sensor size | 1/2.3″ CMOS, 12.35 MP | 1″ CMOS, 20 MP
Pixel Size | 1.57 × 1.57 μm | 2.41 × 2.41 μm
Focal Length | 4.73 mm | 10.26 mm
Lens | FOV * 78.8° (28 mm **) f/2.2 | FOV * 77° (28 mm **) f/2.8
Focus | from 0.5 m to ∞ | from 1 m to ∞
ISO Range (photo) | 100–1600 | 100–12,800
Shutter Speed, type | 8 s–1/8000 s, electronic | 8 s–1/8000 s, electronic
Image Size (pixels) | 4000 × 3000 | 5472 × 3648
Photo file format | JPEG, DNG | JPEG, DNG
* FOV—field of view, ** 35 mm equivalent focal length.
Table 2. Research flight parameters.
Flight Symbol | UAV | Time of Day | Flight Plan | Coverage Area
MP1-D | DJI Mavic Pro | day | single grid auto | 0.192 km²
MP2-D | DJI Mavic Pro 2 | day | single grid auto | 0.193 km²
MP1-N | DJI Mavic Pro | night | single grid manual | 0.114 km²
MP2-N | DJI Mavic Pro 2 | night | single grid manual | 0.147 km²
Table 3. Flight plan data—read from reports.
Flight | Flying Altitude (Reported) | Ground Resolution | Number of Photos | Camera Stations
MP1-D | 105 m | 3.05 cm/px | 129 | 129
MP2-D | 127 m | 2.26 cm/px | 126 | 126
MP1-N | 69.7 m | 3.11 cm/px | 136 | 136
MP2-N | 118 m | 2.24 cm/px | 114 | 114
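The ground resolutions in Table 3 are the values reported by the processing software, which averages over the reconstructed surface. As a rough cross-check, the nominal flat-terrain ground sampling distance (GSD) follows from the flying height and the pixel pitch and focal length in Table 1; the sketch below is illustrative only and will not reproduce the reported figures exactly.

```python
# Nominal ground sampling distance (GSD) for a nadir frame camera.
# GSD = flying height * pixel pitch / focal length (flat-terrain approximation).
def gsd_cm(height_m: float, pixel_um: float, focal_mm: float) -> float:
    return height_m * (pixel_um * 1e-6) / (focal_mm * 1e-3) * 100.0

# Example with the MP1 (FC220) camera from Table 1 at a 100 m flying height.
print(f"{gsd_cm(100.0, 1.57, 4.73):.2f} cm/px")  # ~3.3 cm/px
```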
Table 4. Tie points and reprojection error data.
Flight | Tie Points | Mean Key Point Size | Projections | Reprojection Error | Max Reprojection Error
MP1-D | 134.894 | 5.08186 px | 450.874 | 0.952 px | 20.844 px
MP2-D | 141.035 | 3.26046 px | 414.585 | 0.543 px | 20.5249 px
MP1-N | 46.870 | 8.41085 px | 131.625 | 2.04 px | 37.4667 px
MP2-N | 57.392 | 9.15704 px | 154.054 | 1.85 px | 36.7523 px
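Table 4 summarises tie-point counts and reprojection errors (the tie-point and projection counts appear to use a period as the thousands separator). One common way to compute a mean (RMS) and maximum reprojection error from observed and back-projected image coordinates is sketched below; it is a generic illustration, not necessarily the exact statistic definition used by the processing software, and the coordinates are synthetic.

```python
import numpy as np

# observed:  (N, 2) measured image coordinates of tie-point projections [px]
# projected: (N, 2) coordinates back-projected from the adjusted 3D points [px]
def reprojection_stats(observed: np.ndarray, projected: np.ndarray):
    residuals = np.linalg.norm(observed - projected, axis=1)  # per-projection error [px]
    rms = float(np.sqrt(np.mean(residuals ** 2)))
    return rms, float(residuals.max())

obs = np.array([[1010.2, 540.7], [2033.9, 1501.3], [355.0, 2890.4]])
prj = np.array([[1010.9, 541.1], [2032.8, 1502.0], [356.2, 2889.0]])
rms, worst = reprojection_stats(obs, prj)
print(f"RMS = {rms:.3f} px, max = {worst:.3f} px")
```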
Table 5. Average camera location error. X—Easting, Y—Northing, Z—Altitude.
Flight | X Error (m) | Y Error (m) | Z Error (m) | XY Error (m) | Total Error (m)
MP1-D | 2.60381 | 2.36838 | 44.9193 | 3.51981 | 45.057
MP2-D | 2.00514 | 2.68327 | 22.9008 | 3.34971 | 23.1445
MP1-N | 1.09268 | 3.65444 | 25.9414 | 3.8143 | 26.2203
MP2-N | 2.49058 | 1.48535 | 13.4276 | 2.89987 | 13.7372
Table 6. Control point root mean square error (RMSE). X—Easting, Y—Northing, Z—Altitude.
Flight | GCP Count | X Error (cm) | Y Error (cm) | Z Error (cm) | XY Error (cm) | Total (cm)
MP1-D | 13 | 9.53984 | 11.439 | 2.57716 | 14.895 | 15.1163
MP2-D | 14 | 11.8025 | 15.6751 | 3.22474 | 19.6216 | 19.8848
MP1-N | 10 | 3.52971 | 2.80579 | 2.06805 | 4.50902 | 4.96066
MP2-N | 9 | 8.96118 | 8.146 | 0.531357 | 12.1103 | 12.122
Table 7. Check point root mean square error (RMSE). X—Easting, Y—Northing, Z—Altitude.
Flight | CPs Count | X Error (cm) | Y Error (cm) | Z Error (cm) | XY Error (cm) | Total (cm)
MP1-D | 3 | 8.48112 | 16.8594 | 15.6909 | 18.8725 | 24.5433
MP2-D | 3 | 21.8543 | 34.1211 | 25.9101 | 40.5199 | 48.0957
MP1-N | 3 | 2.36402 | 4.10461 | 2.97748 | 4.73671 | 5.5948
MP2-N | 3 | 2.03436 | 8.85104 | 5.07036 | 9.08182 | 10.4013
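In Tables 5–7 the planimetric (XY) and total errors are the quadratic sums of the per-axis components, i.e. XY = sqrt(X² + Y²) and Total = sqrt(X² + Y² + Z²); for instance, the MP1-D row of Table 5 (2.60381, 2.36838 and 44.9193 m) gives 3.52 m and 45.06 m. A short sketch of this combination:

```python
import numpy as np

def combined_errors(x: float, y: float, z: float):
    """Combine per-axis error components into planimetric and total errors."""
    xy = np.hypot(x, y)                   # sqrt(x^2 + y^2)
    total = np.sqrt(x**2 + y**2 + z**2)
    return xy, total

xy, total = combined_errors(2.60381, 2.36838, 44.9193)  # MP1-D row of Table 5
print(f"XY = {xy:.4f} m, total = {total:.3f} m")        # ~3.5198 m and ~45.06 m
```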
Table 8. Camera internal orientation element values for each flight (value ± determination error).
Parameter | MP1-D | MP2-D | MP1-N | MP2-N
F | 3127.86 ± 5.2 | 5206.03 ± 10.0 | 1532.46 ± 6.8 | 4882.83 ± 12.0
Cx | −28.1667 ± 0.14 | −82.6838 ± 0.49 | 31.5201 ± 0.21 | −75.9967 ± 0.82
Cy | 0.162081 ± 0.12 | −6.18214 ± 0.20 | −33.1754 ± 0.18 | −28.0098 ± 0.37
B1 | −10.0432 ± 0.076 | −22.2775 ± 0.11 | 0.457534 ± 0.063 | –
B2 | −6.30833 ± 0.058 | −6.28988 ± 0.088 | 0.629884 ± 0.057 | 14.1126 ± 0.17
K1 | 0.098579 ± 0.00049 | −0.029047 ± 0.00017 | 0.047936 ± 0.00051 | −0.021496 ± 0.00061
K2 | −0.585459 ± 0.0045 | 0.026580 ± 0.00078 | −0.068125 ± 0.0013 | −0.002758 ± 0.0035
K3 | 1.420050 ± 0.015 | −0.051838 ± 0.0014 | 0.043968 ± 0.0012 | −0.011242 ± 0.006
K4 | −1.180670 ± 0.017 | – | −0.009856 ± 0.00037 | –
P1 | −0.000286 ± 0.000010 | −0.003291 ± 0.000008 | −0.000265 ± 0.000021 | −0.003015 ± 0.000031
P2 | 0.000362 ± 0.000011 | −0.001647 ± 0.000009 | −0.000030 ± 0.000018 | −0.001895 ± 0.000023
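The parameter set in Table 8 (F, Cx, Cy, B1, B2, K1–K4, P1, P2) corresponds to the Brown-type frame camera model used by common SfM packages such as Agisoft Metashape. Assuming that convention (an assumption, since the paper refers only to commercial photogrammetric software), a camera-frame point is projected roughly as sketched below, which is why errors in these parameters propagate directly into image and, in turn, ground coordinates.

```python
import numpy as np

def project(Xc, f, cx, cy, k, p1, p2, b1, b2, width, height):
    """Project a camera-frame 3D point with a Brown-type model.
    Follows the Metashape-style frame-camera convention (assumed here):
    radial terms K1..K4, tangential P1/P2, affinity/skew B1/B2; f, cx, cy, b1, b2 in pixels."""
    x, y = Xc[0] / Xc[2], Xc[1] / Xc[2]              # normalised image coordinates
    r2 = x * x + y * y
    radial = 1 + k[0]*r2 + k[1]*r2**2 + k[2]*r2**3 + k[3]*r2**4
    xd = x * radial + p1 * (r2 + 2*x*x) + 2*p2*x*y   # distorted normalised x
    yd = y * radial + p2 * (r2 + 2*y*y) + 2*p1*x*y   # distorted normalised y
    u = width * 0.5 + cx + xd * f + xd * b1 + yd * b2
    v = height * 0.5 + cy + yd * f
    return u, v

# Example with the MP1-D parameters from Table 8 and a synthetic 3D point.
u, v = project(np.array([5.0, -3.0, 100.0]),
               f=3127.86, cx=-28.1667, cy=0.162081,
               k=[0.098579, -0.585459, 1.42005, -1.18067],
               p1=-0.000286, p2=0.000362, b1=-10.0432, b2=-6.30833,
               width=4000, height=3000)
print(f"u = {u:.1f} px, v = {v:.1f} px")
```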
Table 9. Mean distance value and standard deviation for M3C2.
M3C2 Case | Mean (m) | Std. Dev. (m)
MP1-D/N | −0.39 | 4.14
MP2-D/N | 1.63 | 4.59
D-MP1/MP2 | 0.11 | 2.52
N-MP1/MP2 | −3.17 | 5.04
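The M3C2 statistics in Table 9 can be recomputed from the distance field attached to the compared point clouds. The sketch below assumes the distances were exported to a plain-text file (for example from CloudCompare) in which the M3C2 distance is one of the columns; the file name and column index are placeholders, and points for which no distance could be computed (NaN) are skipped.

```python
import numpy as np

# Placeholder export: one M3C2 distance per point; the column index (3) is an assumption.
distances = np.genfromtxt("mp1_day_night_m3c2.txt", usecols=(3,), skip_header=1)

valid = distances[~np.isnan(distances)]
print(f"mean = {np.mean(valid):.2f} m, std = {np.std(valid):.2f} m, "
      f"{distances.size - valid.size} points without a distance")
```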
Table 10. Statistical values of the maximum error, in metres.
Statistic | MP1-D E_X^C | MP1-D E_Y^C | MP1-D E_Z^C | MP2-D E_X^C | MP2-D E_Y^C | MP2-D E_Z^C | MP1-N E_X^C | MP1-N E_Y^C | MP1-N E_Z^C | MP2-N E_X^C | MP2-N E_Y^C | MP2-N E_Z^C
Max | 0.4697 | 0.3570 | 0.0001 | 0.1019 | 0.0724 | 0.0000 | 3.0835 | 1.7038 | 0.0003 | 0.1768 | 0.1017 | 0.0001
Min | −0.1712 | −0.1351 | 0.0001 | 0.0781 | 0.0568 | 0.0000 | −1.2506 | −0.9880 | 0.0003 | 0.0664 | 0.0375 | 0.0001
Mean | 0.1073 | 0.0804 | 0.0001 | 0.0897 | 0.0642 | 0.0000 | 0.6104 | 0.3072 | 0.0003 | 0.1187 | 0.0689 | 0.0001
Range | 0.6409 | 0.4921 | 0.0000 | 0.0238 | 0.0156 | 0.0000 | 4.3340 | 2.6917 | 0.0000 | 0.1105 | 0.0642 | 0.0000
Std | 0.0691 | 0.0458 | 0.0000 | 0.0032 | 0.0018 | 0.0000 | 0.5651 | 0.2492 | 0.0000 | 0.0160 | 0.0071 | 0.0000
Table 11. List of statistical data (m = mean, s = standard deviation).
Flight | m_I | s_I | m_RE (px) | s_RE (px) | m_vPrec (cm) | s_vPrec (cm)
MP1-D | 113.24 | 51.78 | 0.6656 | 0.5253 | 401.22 | 1458.20
MP2-D | 124.76 | 44.90 | 0.3130 | 0.3659 | 290.63 | 1686.36
MP1-N | 25.76 | 23.49 | 1.3311 | 1.4070 | 365.21 | 994.63
MP2-N | 32.839 | 27.02 | 1.1680 | 1.2680 | 3.42 | 2471.73
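Figure 12 and Table 11 relate tie-point intensity to reprojection error and vertical precision. Given per-point arrays of these quantities (for instance exported from the sparse point cloud), the correlations can be computed as sketched below; the arrays here are synthetic placeholders, not the authors' data.

```python
import numpy as np

rng = np.random.default_rng(0)
intensity = rng.uniform(0, 255, 1000)                              # placeholder tie-point intensity
reproj_err = 2.0 - 0.005 * intensity + rng.normal(0, 0.3, 1000)    # synthetic reprojection error [px]
v_prec = 50.0 - 0.1 * intensity + rng.normal(0, 5.0, 1000)         # synthetic vertical precision [cm]

print("corr(I, RE)    =", round(float(np.corrcoef(intensity, reproj_err)[0, 1]), 3))
print("corr(I, vPrec) =", round(float(np.corrcoef(intensity, v_prec)[0, 1]), 3))
```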
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
