Article

Application Possibilities of Orthophoto Data Based on Spectral Fractal Structure Containing Boundary Conditions

Department of Drone Technology and Image Processing, Dennis Gabor University, Fejér Lipót Street 70, 1119 Budapest, Hungary
Remote Sens. 2025, 17(7), 1249; https://doi.org/10.3390/rs17071249
Submission received: 12 February 2025 / Revised: 12 March 2025 / Accepted: 24 March 2025 / Published: 1 April 2025
(This article belongs to the Special Issue Image Processing from Aerial and Satellite Imagery)

Abstract

The self-similar structure-based analysis of digital images offers many new practical possibilities. The fractal dimension is one of the most frequently measured parameters when image data are used in measurable analyses in metric spaces. In practice, the fractal dimension can be measured well in simple files containing only image data. In the case of complex image data structures defined in different metric spaces, measurement in metric space encounters many difficulties. In this work, we provide a practical solution for the measurement of orthophotos, as complex image data structures, based on the spectral fractal structure with boundary conditions (height, time, and temperature), presenting a further development of the related theoretical foundations. We discuss optimal flight altitude determination in detail through practical examples. For this, in addition to structural measurements on the images, we also use the image entropy well known in information theory. The data obtained in this way can facilitate the optimal UAS operation execution that best suits further image processing tasks (e.g., classification, segmentation, and index analysis).

1. Introduction

The spread of UAV-based image sensing devices has significantly accelerated the application of high geometric resolution image data taken near the Earth's surface at low altitudes. Precise processing and analysis of these data can complement the comparative analysis of image data previously taken by aerial or space vehicles. This enables the reliable examination of global processes based on remote sensing data. The spectral sensors of UAV camera systems are currently undergoing significant and rapid technical development. In addition to multispectral cameras, hyperspectral UAV-based image sensors have also become widespread. This has brought to the fore data processing methods that are heterogeneous in spectral characteristics, in addition to different geometric resolutions and sensor types. Here, the self-similar (fractal-based) analysis of the spectral characteristics of image data before processing is of particular importance.
An excellent summary textbook presents the application possibilities of the fractal dimension in network systems [1]. Multifractals can be considered a further development [2,3]. The spectral fractal dimension relations according to (3), (5), and (6) have been applied in many practical fields, without a detailed development of the mathematical foundations for metric spaces. The role of the spectral fractal structure and spectral fractal dimension in image processing, together with its first possible applications (reed mushroom pathogen investigation, examination of sawn wood board patterns, crop seeds, herbicide-treated sweet corn, leaf damage, psychovisual test images, and 3D simulation), was first presented in 2004 [4,5,6,7]. A more detailed description of the spectral fractal structure and its practical application in the psychovisual laboratory testing of the efficiency of image compression methods were presented in [8,9]. SFD has also been used as a measure of the spectral properties of space images [10], to classify potato tubers and chips [11], and to study the damage of plant parts over time [12]. It has also been used to measure the characteristics of multimedia systems [13], to classify multi- and hyperspectral images [14,15,16], and to separate crops based on color [17]. During the aerial survey of the red mud disaster [18,19,20], it was used to locally identify laminar and turbulent flow areas when processing visible (VIS), near-infrared (NIR), and far-infrared (FIR) images, and it was successfully applied to images in three different spectral ranges [21]. In the impact assessment of traffic-related pollutants, SFD and Shannon entropy were compared.
Based on images taken in the second half of the vegetation period, it can be clearly stated that, in the case of maize, the average information content (entropy) gives the highest value in images taken in the near-infrared range, regardless of whether the detector was optimized for the visible range or what treatment the plant was exposed to. However, the information content analysis did not show any difference between the individual treatments in any spectral range. To detect these, the spectral fractal dimension (SFD)-based analysis gave significant results [22,23]. A fractal dimension calculation method has also been published that attempts to determine the dimension in an image-independent manner [24], but its practical significance has yet to be seen. It is mentioned in the single-scale-based classification of Earth observation systems [25]. It has also been used in the prognosis of multiple sclerosis [26] and in the analysis of CT images [27]. An application in a different scientific field is indicated in [28]. It was used to verify the reliability of measurement data in [29], where laboratory measurements aimed at wheat quality estimation were evaluated using image processing methods. In the investigation of the impact of imaging algorithms on image classification procedures, results were presented based on data depth [30] and image structure-based analyses [31,32]. Recently, it has played a key role in the joint application of medicine, biology, physics, and computer science: the Shannon entropy [33,34]-weighted version of SFD (EW-SFD) [35] was applied to the first detection of ultra-weak photon emission from mouse embryos, where it demonstrated a significant difference between live, degenerated, and background samples in digital images with an extremely low signal-to-noise ratio [35,36].
Further research options have suggested the analysis of RGB and structured similarity (SSIM) indices on UAV images [37], their application in vegetation mapping [38], and their use in a comparative analysis of NDVI indices calculated from multitemporal and Bayer-type RGB images [39]. The fractal structure-based analysis of digital images offers many possibilities. If we want to supplement the image and other (e.g., terrestrial) data related to the image with structural data that can be measured in metric spaces, one of the most frequently measured parameters may be the fractal dimension.
Mandelbrot defined the concept of a fractal as follows [40]: “A fractal is by definition a set for which the Hausdorff–Besicovitch dimension strictly exceeds the topological dimension”. In practice, in the case of digitally recorded data (e.g., images, sounds, and videos), the above definition is met with a good approximation; however, in the case of complex image data structures defined in different metric spaces and with different content, their measurement encounters numerous difficulties.
Theoretical description of the fractal dimension [41]: Let $(X, d)$ be a metric space, and let $A \in H(X)$. Let $N(\varepsilon)$ be the number of spheres of minimal radius $\varepsilon$ that cover the set $A$.
If
$FD = \lim_{\varepsilon \to 0} \sup \left\{ \frac{\ln N(\bar{\varepsilon})}{\ln(1/\bar{\varepsilon})} : \bar{\varepsilon} \in (0, \varepsilon) \right\}$ (1)
exists, then FD is called the fractal dimension of the set.
The general measurable definition of fractal dimension (FD) is as follows:
$FD = \frac{\log(L_2/L_1)}{\log(S_1/S_2)}$ (2)
where $L_1$ and $L_2$ are the lengths measured on the (fractal) curve, and $S_1$ and $S_2$ are the sizes of the (arbitrary) measure used (e.g., the resolution in the case of digital images). The calculation of the fractal dimension is simple and can be performed with several well-known algorithms (Table 1) [4].
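To make the box-counting idea behind Equation (2) concrete, the following is a minimal Python sketch (NumPy assumed; the function name is illustrative, and this is a generic box-counting estimator rather than any of the exact algorithms of Table 1). It counts the occupied boxes N(s) at several box sizes s and fits the slope of log N(s) versus log(1/s):

```python
import numpy as np

def box_counting_fd(img: np.ndarray) -> float:
    """Estimate the fractal dimension of a 2D binary image by box counting."""
    side = min(img.shape)
    sizes = [2 ** k for k in range(1, int(np.log2(side)))]
    counts = []
    for s in sizes:
        h = (img.shape[0] // s) * s
        w = (img.shape[1] // s) * s
        # tile the image into s x s boxes and count boxes with any foreground
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(max(int(blocks.any(axis=(1, 3)).sum()), 1))
    # slope of log N(s) against log(1/s) estimates the dimension
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return float(slope)
```

For a completely filled binary image, the estimate is 2, matching the FD = 2 result quoted below for all three images of Figure 1.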
The fundamental problem when applying the fractal dimension defined based on Equation (2) is that it is insensitive to the shades and colors of the image. All of this is well illustrated in Figure 1, where we can see BW, Grayscale, and 3 × 8-bit color (RGB) images. Applying Equation (2) with the box counting method, the measured FD values are the same for all three images (FD = 2). The solution to this problem is the introduction of SFD according to [49] and its application during measurements.
The most common product of the alignment of drone images is an orthophoto. An orthophoto contains height and spectral data. The dimension defined by (2) is suitable for measuring the height data, which can be calculated most simply using [42]. At the same time, the following spectral fractal dimension (SFD) relationship (3), defined in [49], can be used to measure the spectral structure:
$SFD_{measured} = n \times \frac{\sum_{j=1}^{S-1} \frac{\log(BM_j)}{\log(BT_j)}}{S-1}$ (3)
where
  • n: number of image layers or bands;
  • S: spectral resolution of the layer, in bits;
  • BMj: number of spectral boxes containing valuable pixels in the case of j bits;
  • BTj: total number of possible spectral boxes in the case of j bits.
The number of possible spectral boxes (BTj) in case of j-bits is as follows:
$BT_j = (2^S)^n$ (4)
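As a sketch, Equations (3) and (4) can be implemented directly in Python (NumPy assumed). Here BM_j is counted as the number of occupied spectral boxes after quantizing every band to j bits, and BT_j follows Equation (4) as printed; the function name and this reading of the box counting are our assumptions, not the authors' reference implementation:

```python
import numpy as np

def sfd_measured(img: np.ndarray, S: int = 8) -> float:
    """Spectral fractal dimension of an (H, W, n) S-bit image, per Eq. (3)."""
    n = img.shape[2]
    terms = []
    for j in range(1, S):
        # quantizing to j bits merges each spectral box into one value tuple
        q = (img >> (S - j)).reshape(-1, n)
        bm_j = np.unique(q, axis=0).shape[0]     # occupied spectral boxes
        bt_j = float((2 ** S) ** n)              # total possible boxes, Eq. (4)
        terms.append(np.log(bm_j) / np.log(bt_j))
    return n * float(np.mean(terms))
```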
Metric (3) is thus suitable for measuring the self-similar structure of images [49]. With Equations (3) and (4), the general measurable definition of the spectral fractal dimension, if the spectral resolution is equal across all bands (SFD with Equal Spectral Resolution, SFD_ESR), is as follows (5):
$SFD_{ESR} = n \times \frac{\sum_{j=1}^{S-1} \frac{\log(BM_j)}{\log\left((2^S)^n\right)}}{S-1}$ (5)
If the spectral resolution differs between bands/layers, the general measurable definition of the spectral fractal dimension (SFD with Different Spectral Resolution, SFD_DSR) is as follows [49]:
$SFD_{DSR} = n \times \frac{\sum_{j=1}^{\min(S_i)-1} \frac{\log(BM_j)}{\log\left(2^{\sum_{k=1}^{n} S_k}\right)}}{\min(S_i)-1}$ (6)
where
  • Si—spectral resolution of the layer i, in bits.
As another possibility, we mention the relation presented in [35], which is the weighting of the self-similar spectral image structure with entropy, as follows:
$EW\text{-}SFD_{measured} = n \times \frac{\sum_{j=1}^{S-1} \frac{\log\left(\left(\sum_{k=1}^{2^S} p_k \,\mathrm{ld}\frac{1}{p_k}\right) \times (SBM_j)\right)}{\log(SBT_j)}}{S-1}$ (7)
The introduction of EW-SFD was based on energetic, physical, and biological laws, the details of which are contained in [35].
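The weight in Equation (7) is the Shannon entropy [33,34] of the pixel-value distribution, $H = \sum_k p_k \,\mathrm{ld}(1/p_k)$. A minimal sketch for one S-bit band (Python, NumPy assumed; the function name is illustrative):

```python
import numpy as np

def shannon_entropy(band: np.ndarray, bits: int = 8) -> float:
    """Shannon entropy (in bits) of a single image band from its histogram."""
    hist = np.bincount(band.ravel().astype(np.int64), minlength=2 ** bits)
    p = hist / hist.sum()
    p = p[p > 0]                  # terms with p = 0 contribute nothing
    return float(-(p * np.log2(p)).sum())
```

A constant band gives H = 0, while a band using all 2^S values equally often reaches the maximum H = S.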
Below, we show that, just as relations (3), (5), and (6) are metrics according to [14], relation (7) is also a metric.
For metrics, all of the following conditions must be true [14,49]:
1. Non-negative definite, that is,
$\rho(P_1, P_2) \ge 0$
$\rho(P_1, P_2) = 0 \ \text{if} \ P_1 = P_2$ (8)
2. Symmetric, that is,
$\rho(P_1, P_2) = \rho(P_2, P_1)$ (9)
3. Satisfies the triangle inequality, that is,
$\rho(P_1, P_3) \le \rho(P_1, P_2) + \rho(P_2, P_3)$ (10)
4. Regularity, meaning that the points of a discrete image plane are evenly dense.
Let
$\rho_{SFD} := SFD(A + P) - SFD(A)$ (11)
where A is an arbitrary subset of an N-dimensional image, P is an arbitrary pixel of the N-dimensional image, and in (11) above, SFD means the metrics according to (3), (5)–(7).
The Shannon entropy [33,34] interpreted as a weight factor for the independent pixels in the denominator of Equation (7) is positive and finite. It cannot be zero, since in the case of metric spaces the zero-set (empty) image is not part of the space. The finite condition is also fulfilled since in practice, every digital image consists of a finite number of pixels. Based on the above, the proof presented earlier in Chapter 4 of [14] (SFD as metrics) is also true for Equations (8)–(10).
When the topology X is given by a metric, the closure A ¯ of A in X is the union of A and the set of all limits of sequences of elements in A (its limit pixels) [14,40,50,51,52],
$\bar{A} = A \cup \left\{ \lim_{n \to \infty} a_n : a_n \in A \ \text{for all} \ n \in \mathbb{N} \right\}$ (12)
Then, A is dense in X if
$\bar{A} = X$ (13)
The fulfillment of relations (12) and (13) in the case of digital images implies several practical conditions for the following:
  • The image sensor chip;
  • Readout;
  • File formats.
In the case of image sensor chips, the condition primarily means the same geometric size of the pixel sensor elements in the x, y direction, and the same cell size within the entire chip. All of this is usually fulfilled in industrial devices in the case of space, air, and UAV image sensors.
During readout, omitting the data of several pixel rows/columns at the sensor edges from the final image significantly improves the signal-to-noise ratio of the edge pixels. However, distortions and inadequate geometric mapping at the image corners worsen this. In the case of professional UAV camera systems, the distortion caused by the mapping can be significantly reduced with an appropriate optical system.
In the case of the file format, the condition is the exact storage of the image sensor chip data in a lossless, raw image format, based on standards. However, it is advisable to check the fulfillment of the conditions outlined above based on the technical data of the UAV devices.

2. Materials and Methods

In the case of orthophotos, the data in spectral space represent image layers (n—number of image layers or bands) made in different spectral ranges. Table 2 contains the spectral characteristics of some discrete spectral range sensors and camera arrays made for Unmanned Aerial Vehicle (UAV) systems (MicaSense Dual, MicaSense Altum-PT: AgEagle Aerial Systems Inc. 8201 E 34th Street N Suite 1307 Wichita, KS, USA; Sentera 6x Multispectral: Sentera Sensors and Drones 767 N. Eustis St., Ste 120, St. Paul, MN 55114, USA; Parrot Sequoia+ Multispectral: Parrot 38 avenue John F. Kennedy, L-1855 Luxembourg; DJI P4 Multispectral, DJI M3 Multispectral: DJI Sky City, No.55 Xianyuan Road, Nanshan District, Shenzhen, China).
There are also data produced as images that are not spectral (e.g., height, time, and temperature data). Height data are image data related to pixels interpreted in a given projection system (e.g., WGS 84 (EPSG:4326)). We can also consider temporal changes as image layers (multi-, hypertemporal recordings). We can also treat the temperature data of UAV FIR cameras as image layers. Let us consider these non-spectral data as separate image layers in addition to the spectral ones. These data can generally serve as boundary or initial conditions during spectral analyses. Regions (ROI—Region of Interest) can be specified on the image plane for delimitation, a height value or range can be indicated on the layer containing height data, while the temperature data/range produced as an image can indicate the emitted energy. We will interpret these as boundary conditions in the following. In addition to the conditions interpreted as image layers above, Equations (3), (5), and (6) can be applied. At the same time, we can also perform analyses based on (3), (5), and (6) on boundary conditions as independent layers.
Another possibility is that Equations (3), (5), and (6) can also be interpreted for the combination of spectral and boundary conditions. If we want to quantify the fractal dimension of a height, time, or temperature distribution, we need the full spectrum of fractal dimensions, which, however, is not considered multifractal.

3. Results

The distribution of the dimensions defined in Equations (3), (5), (6), and (7), with respect to height, time, or temperature, defined by the boundary conditions described above, interpreted for a given object (natural or artificial objects), can be considered as the spectral height, time, or temperature fingerprint characteristic of the given object, as follows:
$SFD_{measured}(h, t, T) = n \times \frac{\sum_{j=1}^{S-1} \frac{\log BM_j(h, t, T)}{\log BT_j(h, t, T)}}{S-1}$ (14)
where
  • n: number of image layers or bands, excluding the (h, t, T) layers;
  • S: spectral resolution of the layer (excluding h, t, T), in bits;
  • BMj(h, t, T): number of spectral boxes containing valuable pixels in the case of j bits for the (h, t, T) distributions;
  • BTj(h, t, T): total number of possible spectral boxes in the case of j bits for the (h, t, T) distributions.
The number of possible spectral boxes (BTj) in case of j-bits (h, t, T) distributions is as follows:
$BT_j = (2^S)^n$ (15)
The images shown in Figure 2 were taken with the DJI M3 Multispectral camera array specified in Table 2, at a flight altitude of 100 m relative to the take-off point (GND), on 29 May 2024. During the creation of the orthophoto, we separately aligned the RGB images containing the Bayer sensor and the 4 discrete spectral band images, by aligning 35 images per sensor. We separately calculated the Digital Elevation Model (DEM) (labeled RGB-DEM and MS-DEM in Figure 2) height data based on the RGB and MS images.
Table 3 summarizes real SFD values and Shannon entropy calculated according to (3) for the images in Figure 2. We have shown the data measured for the RGB, MS, and 9 channels together. Regardless of the image format (RGB-JPEG, MS-TIFF), the S (bit) value clearly shows the real data depth occurring in the image. It is worth mentioning that the average information content of the four independent MS channels is the same as the total entropy of the entire nine bands (22.0394), but the SFD values differ significantly. All this clearly illustrates the more efficient possibilities of separating the measurements and bands inherent in the image structure.
Table 4 summarizes the results of the sorting according to (14) and the entropy measurements above 10 m (relative to the lowest point in the image). Based on the RGB-DEM, the distance between the lowest and highest points is 21.6 m. It can be stated that while the average information content decreased significantly, the self-similar structure of the image remained almost unchanged, close to the maximum value. However, we note that since we are dealing with Bayer-type image data, the error is significant (min. ±1/2^8) due to the 8-bit resolution per band. In Figure 3, the result of the sorting is illustrated on the 3D model of the orthophoto taken of the area.
A further application of the spectral fractal structure can be the determination of the optimal flight altitude for later processing. During the operation, the drone (DJI M30) ascended to an altitude of 1500 m and then descended to the surface after 10 min. During both operations, it took images at 2 s intervals with a vertical optical axis. During the flight, the weather and light conditions were suitable (Temperature: 23 °C, Wind: 13 km/h, Gust: 27 km/h, Wind direction: N, and Cloud cover: 54%, Kp 1).
During the ascent, with unchanged weather conditions, the maximum entropy was measured at 48 m (Hmax = 15.70), while the minimum entropy was measured at 1440 m (Hmin = 12.88). Figure 4 illustrates the images taken by the visible range sensor during the ascent.
During the descent, the image with the maximum information content was taken at an altitude of 56 m, with a value of Hmax = 15.71, (Figure 5). The minimum entropy was measured at 1072 m, with a value of Hmin = 12.97.
During the ascent, based on the measurement of the self-similar image structure according to (3) and (7), the maximum value was obtained at a height of 28 m (SFDmax = 2.58, EW-SFDmax = 2.30). In the case of SFD, the minimum value was at 920 m (SFDmin = 2.21). In the case of EW-SFD, the minimum value was measured at 976 m (EW-SFDmin = 1.67).
During the descent, the maximum of SFD was at 31 m (SFDmax = 2.57), and the minimum at 925 m (SFDmin = 2.20). The maximum of EW-SFD was at 31 m (EW-SFDmax = 2.29), and the minimum at 1295 m (EW-SFDmin = 2.02), (Figure 6).
The data obtained during the measurement described in detail above are summarized in Table 5.
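The altitude selection described above reduces to finding the peak of a measured series. A minimal sketch in plain Python follows; the endpoint values repeat the ascent figures quoted above, while the intermediate samples are illustrative only:

```python
def optimal_altitude(samples):
    """samples: (altitude_m, measure) pairs, where the measure can be
    entropy, SFD, or EW-SFD; returns the altitude at the peak value."""
    return max(samples, key=lambda p: p[1])[0]

# Ascent entropy profile: endpoints from the text, midpoints illustrative
ascent = [(48, 15.70), (500, 14.20), (1000, 13.40), (1440, 12.88)]
```

Running the same selection with min instead of max yields the altitude of minimum information content.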
Optimal data acquisition may depend on several environmental conditions and some technical parameters of the image sensors. For a more precise determination, we have planned and implemented an additional operation. Among the environmental conditions, we deal with cloud cover. Among the technical characteristics, we deal with camera arrays that detect in different spectral bands. The entire data acquisition was implemented as follows between 10 June 2023 and 28 February 2024:
  • 47 operations;
  • 33 were carried out at different times;
  • Each operation took place within the area of the Kis-Balaton I or II watershed;
  • Each recording was made in the range of GND—1500 m.
Regarding the camera arrays, we used Bayer sensors, multispectral and FIR global shutter sensor arrays, 35 mm full-frame, and Oblique cameras.
Below, we present the results of measurements from an operation with 88% cloud cover (Temperature: 23 °C, Wind: 11 km/h, Gust: 23 km/h, Wind direction: N, Cloud cover: 88%, Cloud base: 500 m, Kp 2). The operation was performed with a DJI M30T device, so we could evaluate data from cameras operating in the visible and far-infrared ranges. Based on the entropy values of the visible range images taken during the ascent, three significantly different stages can be distinguished:
  • Below-cloud (5–658 m);
  • Transitional (668–788 m);
  • Above-cloud (798–1500 m).
In determining the individual sections, the endpoint criterion in our case was a deviation larger than a single standard deviation. The calculations were performed on the standard deviation of the entropy values measured from the two endpoints, moving toward the middle points.
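Our reading of this endpoint rule can be sketched as follows (Python, NumPy assumed); it walks the entropy profile from one end and stops at the first sample deviating from the running statistics by more than one standard deviation. This is an interpretation of the described procedure, not the authors' exact code:

```python
import numpy as np

def find_section_end(values, k=1.0):
    """Index of the first sample whose deviation from the running mean
    exceeds k running standard deviations; len(values) - 1 if none."""
    seq = np.asarray(values, dtype=float)
    for i in range(2, len(seq)):
        mean, std = seq[:i].mean(), seq[:i].std()
        dev = abs(seq[i] - mean)
        if dev > k * std and dev > 1e-9:
            return i
    return len(seq) - 1
```

Applying the same function to the reversed profile gives the other boundary; the samples between the two indices form the transitional stage.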
Table 6 shows the average and standard deviation of the measurements of the visible and thermal images characteristic of each section based on sections A, B, and C. The individual sections show a clearly distinguishable, significant difference (Figure 7). The images taken at the beginning and end points of the sections are shown in Figure 8.
If we want to create an orthophoto from images taken from the surface during the operation, images taken in the range of 5–668 m are suitable for this. However, VIS images above this range are no longer suitable.
Figure 9 clearly shows that the EW-SFD allows for a more detailed assessment within each section. If we define each section based on the VIS images (based on the previously described condition), we get the following values:
  • Below-cloud (5–708 m);
  • Transitional (718–1078 m);
  • Above-cloud (1088–1500 m).
The starting and ending points of the resulting segment definition differ from the data of the definition based on the visible range images. Figure 10 shows the FIR images taken at the endpoints of each section, while Table 7 shows the measurement data obtained for each section. If the post-processing is known, the operation execution at a given height can be optimized for each camera—based on average information content or image structure.
When taking images in the visible range, the phenomenon of Mie scattering (0.1–1 microns) is most noticeable [53]. However, when taking FIR images, the effect of non-selective scattering (5–100 microns) appears [54]. The VIS (Figure 8) and FIR (Figure 10) images clearly show the difference.
We also performed measurements of the global shutter cameras of the DJI M3 Multispectral camera array (Table 2 shows the spectral peak and half-value width). During the operation, the Temperature was 23 °C, Wind: 21 km/h, Gust: 35 km/h, Wind direction: N, Cloud cover: 93%, and Cloud base: 600 m, Kp 3.33. The total climb lasted 4 min 14 s and during the operation the UAV continuously climbed to an altitude of 1500 m. The results of the measurements are summarized in Table 8 and Figure 11.
The maximum values of the measured parameters are found in the same height range (14–24 m). The individual height maximum values are also the same in many cases. If the four different bands are measured as a single image, the maximum values of the three measured parameters occur at different heights or in different height ranges: Hmax at 84–96 m, SFDmax at 36 m, and EW-SFDmax at 14–24 m. Based on the values given in the table, whether the individual image layers are processed separately or together, we can obtain the flight height that best suits our subsequent processing.
If we measure the information content (H) or image structure-based data of high-altitude flight in real time, we can determine the optimal flight altitude in each environment under these conditions, adapting it to subsequent processing.

4. Discussion

In this work, we proposed a practical solution for measuring orthophotos, as complex image data structures, based on the spectral fractal structure, using DEM-based height data as boundary conditions. In practice, UAV-based orthophotos constitute the input data for higher-level image processing procedures (index analysis, segmentation, image classification, and image recognition). Since the application of orthophotos in such procedures is highly sensitive to the data depth of the pixels and, in the case of Bayer-type RGB images, to the image creation algorithms, preliminary analyses based on image structure can shed light on possible errors. Classification methods based on the spectral image structure may be appropriate for sorted images, while image classification methods based on image content are less so.
By measuring the fractal structure or the information content of the image data in real time, we can determine the shooting or flight height optimized for later classification and index analysis algorithms. All this was presented based on measurements on images taken with several camera systems (VIS, VIS-FIR, and MS). In this work, we also discussed the relationship between cloud cover and optimized flight altitude. It can be stated that the three parameters (H, SFD, and EW-SFD) remain applicable, but the predicted optimal altitude values move closer to each other as cloud cover increases.
The practical application of EW-SFD may be justified in many cases, since the information content and the self-similar spectral fractal structure appear together in the formula. As a result, it can characterize measurement areas in more detail. This is well illustrated in Figure 11, where the SFD R and G curves are almost identical, but the EW-SFD R and G curves are clearly separated.

5. Conclusions

The dependence of entropy and image structure on several classification algorithms is currently being investigated. The results will enable the further practical application of optimized operations. We would also like to precisely determine the computational demands that real-time processing of individual elevation images may place on the controller. Preliminary results show that the controller of industrial drones is capable of measuring entropy in near real time. Measuring SFD and EW-SFD in real time, however, requires today's modern laptops, which also means the real-time transmission of the entire camera array's data to ground devices during operations.

Funding

Project no. TKP2021-NVA-05 has been implemented with the support provided by the Ministry of Innovation and Technology of Hungary from the National Research, Development and Innovation Fund, financed under the TKP 2021 funding scheme.

Data Availability Statement

The original contributions presented in this study are included in the article; further inquiries can be directed to the corresponding author.

Acknowledgments

The author would like to thank the researchers at the Department of Drone Technology and Image Processing, Dennis Gabor University, for their useful advice and suggestions that significantly helped in the preparation of this study. I would like to thank my daughter and her lovely partner for their advice and corrections during the creation of the images and illustrations in this work. I am grateful to Hévíz-Balaton Airport for its air control during the execution of the operation and for its meteorological support for the practical implementation. I am especially grateful to the staff of the Balaton Upland National Park, the West Transdanubian Water Management Directorate, and the Kis-Balaton Engineering District for their helpful and useful advice.

Conflicts of Interest

The author declares no conflicts of interest.

Nomenclature

BW: Black and White (1 bit)
CT: Computed Tomography
DEM: Digital Elevation Model
EW-SFD: Entropy-Weighted Spectral Fractal Dimension
FD: Fractal Dimension
FIR: Far InfraRed
GND: Ground (altitude relative to take-off point)
H: Entropy
MS: Multispectral
MS-G: Multispectral camera array G band
MS-NIR: Multispectral camera array Near-InfraRed band
MS-R: Multispectral camera array R band
MS-RE: Multispectral camera array Red-Edge band
NIR: Near-InfraRed
RE: Red-Edge
RGB: Red, Green, Blue (as color space)
RGB-B: B band of RGB image of Bayer sensor
RGB-G: G band of RGB image of Bayer sensor
RGB-R: R band of RGB image of Bayer sensor
SFD: Spectral Fractal Dimension
UAV: Unmanned Aerial Vehicle
UAS: Unmanned Aerial System
VIS: Visible

References

  1. Rosenberg, E. Fractal Dimensions of Networks; Springer: Cham, Switzerland, 2020. [Google Scholar]
  2. Mandelbrot, B.B. Fractals: Forms, Chance and Dimensions; W.H. Freeman and Company: San Francisco, CA, USA, 1977. [Google Scholar]
  3. Hentschel, H.G.E.; Procaccia, I. The Infinite Number of Generalized Dimensions of Fractals and Strange Attractors. Phys. D 1983, 8, 435–444. [Google Scholar]
4. Berke, J. Spectral fractal dimension. In Proceedings of the 7th WSEAS Telecommunications and Informatics (TELE-INFO ’05), Prague, Czech Republic, 12–14 March 2005; pp. 23–26.
5. Berke, J. Fractal dimension on image processing. In Proceedings of the 4th KEPAF Conference on Image Analysis and Pattern Recognition, Miskolc-Tapolca, Hungary, 28–30 January 2004; Volume 4, p. 20.
6. Berke, J. The structure of dimensions: A revolution of dimensions (classical and fractal) in education and science. In Proceedings of the 5th International Conference for History of Science in Science Education, Keszthely, Hungary, 12–16 July 2004.
7. Berke, J. Real 3D terrain simulation in agriculture. In Proceedings of the 1st Central European International Multimedia and Virtual Reality Conference, Veszprém, Hungary, 6–8 May 2004; Volume 1, pp. 195–201.
8. Busznyák, J.; Berke, J. Psychovisual comparison of image compression methods under laboratory conditions. In Proceedings of the 4th KEPAF Conference on Image Analysis and Pattern Recognition, Miskolc-Tapolca, Hungary, 28–30 January 2004; Volume 4, pp. 21–28.
9. Berke, J.; Busznyák, J. Psychovisual Comparison of Image Compressing Methods for Multifunctional Development under Laboratory Circumstances. WSEAS Trans. Commun. 2004, 3, 161–166.
10. Berke, J. Applied spectral fractal dimension. In Proceedings of the Joint Hungarian-Austrian Conference on Image Processing and Pattern Recognition, Veszprém, Hungary, 11–13 May 2005; pp. 163–170.
11. Berke, J.; Polgár, Z.; Horváth, Z.; Nagy, T. Developing on exact quality and classification system for plant improvement. J. Univers. Comput. Sci. 2006, 12, 1154–1164.
12. Berke, J. Measuring of Spectral Fractal Dimension. In Proceedings of the International Conferences on Systems, Computing Sciences and Software Engineering (SCSS 05), Virtual, 10–20 December 2005. Paper No. 62.
13. Kozma-Bognár, V. The application of Apple systems. J. Appl. Multimed. 2007, 2, 61–70.
14. Berke, J. Using spectral fractal dimension in image classification. In Innovations and Advances in Computer Sciences and Engineering; Sobh, T., Ed.; Springer: Berlin/Heidelberg, Germany, 2010; pp. 237–242.
15. Horváth, Z.; Kozma-Bognár, V.; Hegedűs, G.; Berke, J. Fractal texture test in lawn combination classification with hyperspectral images. In Proceedings of the 12th European Conference on Information Systems in Agriculture and Forestry (ISAF), Bled, Slovenia, 27–29 June 2006.
16. Kozma-Bognar, V.; Berke, J. New Evaluation Techniques of Hyperspectral Data. J. Syst. Cybern. Inform. 2009, 8, 49–53.
17. Horváth, Z. Separation plant cultures with color mapping. In Proceedings of the 7th International Conference on Computing and Convergence Technology (ICCIT, ICEI and ICACT), ICCCT, Seoul, Republic of Korea, 3–5 December 2012; pp. 363–366.
18. Berke, J.; Bíró, T.; Burai, P.; Kováts, L.D.; Kozma-Bognar, V.; Nagy, T.; Tomor, T.; Németh, T. Application of remote sensing in the red mud environmental disaster in Hungary. Carpathian J. Earth Environ. Sci. 2013, 8, 49–54.
19. Bíró, T.; Tomor, T.; Lénárt, C.S.; Burai, P.; Berke, J. Application of remote sensing in the red sludge environmental disaster in Hungary. Acta Phytopathol. Entomol. Hung. 2012, 47, 223–231.
20. Burai, P.; Smailbegovic, A.; Lénárt, C.; Berke, J.; Milics, G.; Tomor, T.; Bíró, T. Preliminary Analysis of Red Mud Spill Based on Aerial Imagery. Acta Geogr. Debrecina Landsc. Environ. 2011, 5, 47–57.
21. Berke, J.; Kozma-Bognár, V. Investigation possibilities of turbulent flows based on geometric and spectral structural properties of aerial images. In Proceedings of the 10th National Conference of the Hungarian Society of Image Processing and Pattern Recognition, Kecskemét, Hungary, 27–30 January 2015; pp. 295–304.
22. Kozma-Bognár, V.; Berke, J. Entropy and fractal structure based analysis in impact assessment of black carbon pollutions. Georg. Agric. 2013, 17, 53–68.
23. Kozma-Bognar, V.; Berke, J. Determination of optimal hyper- and multispectral image channels by spectral fractal structure. In Innovations and Advances in Computing, Informatics, Systems Sciences, Networking, and Engineering; Lecture Notes in Electrical Engineering (LNEE); Sobh, T., Elleithy, K., Eds.; Springer International Publishing: Cham, Switzerland, 2015; Volume 313, pp. 255–262.
24. Chamorro-Posada, P. A simple method for estimating the fractal dimension from digital images: The compression dimension. Chaos Solitons Fractals 2016, 91, 562–572.
25. Karydas, C.G. Unified scale theorem: A mathematical formulation of scale in the frame of Earth observation image classification. Fractal Fract. 2021, 5, 127.
26. Dachraoui, C.; Mouelhi, A.; Drissi, C.; Labidi, S. Chaos theory for prognostic purposes in multiple sclerosis. Trans. Inst. Meas. Control. 2021, 0, 1–12.
27. Abdul-Adheem, W. Image Processing Techniques for COVID-19 Detection in Chest CT. J. Al-Rafidain Univ. Coll. Sci. 2022, 52, 218–226.
28. Frantík, P. An Approach for Accurate Measurement of Fractal Dimension Distribution on Fracture Surfaces. Trans. VSB–Tech. Univ. Ostrav. Civ. Eng. Ser. 2022, 22, 7–12.
29. Csákvári, E.; Halassy, M.; Enyedi, A.; Gyulai, F.; Berke, J. Is Einkorn Wheat (Triticum monococcum L.) a Better Choice than Winter Wheat (Triticum aestivum L.)? Wheat Quality Estimation for Sustainable Agriculture Using Vision-Based Digital Image Analysis. Sustainability 2021, 13, 12005.
30. Kevi, A.; Berke, J.; Kozma-Bognár, V. Comparative analysis and methodological application of image classification algorithms in higher education. J. Appl. Multimed. 2023, XVIII/1, 13–16.
31. Vastag, V.K.; Óbermayer, T.; Enyedi, A.; Berke, J. Comparative study of Bayer-based imaging algorithms with student participation. J. Appl. Multimed. 2019, XIV/1, 7–12.
32. Berke, J.; Kozma-Bognár, V. Measurement and comparative analysis of gain noise on data from Bayer sensors of unmanned aerial vehicle systems. In X. Hungarian Computer Graphics and Geometry Conference; SZTAKI: Budapest, Hungary, 2022; pp. 136–142.
33. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
34. Shannon, C.E. Prediction and entropy of printed English. Bell Syst. Tech. J. 1951, 30, 50–64.
35. Berke, J.; Gulyás, I.; Bognár, Z.; Berke, D.; Enyedi, A.; Kozma-Bognár, V.; Mauchart, P.; Nagy, B.; Várnagy, Á.; Kovács, K.; et al. Unique algorithm for the evaluation of embryo photon emission and viability. Sci. Rep. 2024, 14, 15066.
36. Bodis, J.; Nagy, B.; Bognar, Z.; Csabai, T.; Berke, J.; Gulyas, I.; Mauchart, P.; Tinneberg, H.R.; Farkas, B.; Varnagy, A.; et al. Detection of Ultra-Weak Photon Emissions from Mouse Embryos with Implications for Assisted Reproduction. J. Health Care Commun. 2024, 9, 9041.
37. Biró, L.; Kozma-Bognár, V.; Berke, J. Comparison of RGB Indices used for Vegetation Studies based on Structured Similarity Index (SSIM). J. Plant Sci. Phytopathol. 2024, 8, 7–12.
38. Kozma-Bognár, K.; Berke, J.; Anda, A.; Kozma-Bognár, V. Vegetation mapping based on visual data. J. Cent. Eur. Agric. 2024, 25, 807–818.
39. Berzéki, M.; Kozma-Bognár, V.; Berke, J. Examination of vegetation indices based on multitemporal drone images. Gradus 2023, 10, 1–6.
40. Mandelbrot, B.B. The Fractal Geometry of Nature; W.H. Freeman and Company: New York, NY, USA, 1983; p. 15.
41. Barnsley, M.F. Fractals Everywhere; Academic Press: Cambridge, MA, USA, 1998; pp. 182–183.
42. Voss, R. Random fractals: Characterisation and measurement. In Scaling Phenomena in Disordered Systems; Pynn, R., Skjeltorp, A., Eds.; Plenum: New York, NY, USA, 1985.
43. Peleg, S.; Naor, J.; Hartley, R.; Avnir, D. Multiple Resolution Texture Analysis and Classification. IEEE Trans. Pattern Anal. Mach. Intell. 1984, 4, 518–523.
44. Turner, M.T.; Blackledge, J.M.; Andrews, P.R. Fractal Geometry in Digital Imaging; Academic Press: Cambridge, MA, USA, 1998; pp. 45–46, 113–119.
45. Goodchild, M. Fractals and the accuracy of geographical measure. Math. Geol. 1980, 12, 85–98.
46. DeCola, L. Fractal analysis of a classified Landsat scene. Photogramm. Eng. Remote Sens. 1989, 55, 601–610.
47. Clarke, K. Scale based simulation of topographic relief. Am. Cartogr. 1988, 12, 85–98.
48. Shelberg, M. The Development of a Curve and Surface Algorithm to Measure Fractal Dimension. Master’s Thesis, Ohio State University, Columbus, OH, USA, 1982.
49. Berke, J. Measuring of spectral fractal dimension. New Math. Nat. Comput. 2007, 3, 409–418.
50. Fréchet, M.M. Sur quelques points du calcul fonctionnel. Rend. Circ. Matem. Palermo 1906, 22, 1–72.
51. Baire, R. Sur les fonctions de variables réelles. Ann. Mat. 1899, 3, 1–123.
52. Serra, J.C. Image Analysis and Mathematical Morphology; Academic Press: London, UK, 1982.
53. Mie, G. Beiträge zur Optik trüber Medien, speziell kolloidaler Metallösungen. Ann. Phys. 1908, 25, 377–445.
54. Sabins, F.F. Remote Sensing. In Principles and Interpretation; W.H. Freeman and Company: Los Angeles, CA, USA, 1987.
Figure 1. The classical fractal dimension, when applied to digital images, is not sensitive to shades and colors. Measurements on Black and White (BW), Grayscale, and Color (RGB) images give the same value, while measurements based on (3) clearly show the differences between images.
Figure 2. Multispectral camera array aligned orthophoto images by band (RGB—orthophoto made with 3-band Bayer sensor, MS-R—R band of 4-band orthophoto, MS-G—G band of 4-band orthophoto, MS-RE—Red-Edge band of 4-band orthophoto, MS-NIR—NIR band of 4-band orthophoto, RGB-R—R band of 3-band Bayer sensor orthophoto, RGB-G—G band of 3-band Bayer sensor orthophoto, RGB-B—B band of 3-band Bayer sensor orthophoto, RGB-DEM—DEM made based on alignment of RGB images, and MS-DEM—DEM made based on alignment of MS images).
Figure 3. Data model illustrating the selection of vegetation above 10 m (colored blue) based on the 3D orthophoto.
Figure 4. Aerial photographs with the minimum and maximum entropy and SFD values, taken during the high-altitude flight.
Figure 5. Altitude dependence of entropy values of visible range images taken by DJI M30.
Figure 6. Altitude dependence of SFD and EW-SFD values of visible range images taken by DJI M30.
Figure 7. Altitude dependence of entropy values based on M30T images taken in the visible range, with 88% cloud cover (coloring according to Table 6).
Figure 8. Visible-range images taken at the beginning and end of the altitude sections below, between, and above the cloud during the ascent. The upper-left image was taken at an altitude of 5 m, the image to its right at 658 m, and the remaining images at altitudes of 668, 788, 798, and 1500 m.
Figure 9. Altitude dependence of SFD and EW-SFD values based on M30T images taken in the thermal range, with 88% cloud cover. EW-SFD allows for a more accurate assessment of the details within each section.
Figure 10. Far-infrared images taken at the beginning and end of the altitude sections below, between, and above the cloud during the ascent. The images with a light background, starting from the upper-left corner, were taken at altitudes of 5, 708, 718, 1288, 1298, and 1500 m, respectively, measured from ground level (GND), while the shots with a dark background were taken at altitudes of 5, 658, 668, 788, 798, and 1500 m.
Figure 11. Height dependence of entropy and self-similar image structure parameters of images taken by the global shutter camera of the DJI M3 Multispectral camera array.
Table 1. Methods suitable for calculating fractal dimension.

| Method | Main Facts |
|---|---|
| Box Counting [42] | most popular; easily algorithmized for images |
| Epsilon-Blanket [43] | for curves |
| Fractional Brownian Motion [44] | similar to box counting |
| Power Spectrum [44] | for digital fractal signals |
| Hybrid Methods [45] | calculate the 2D fractal dimension using 1D methods |
| Perimeter–Area Relationship [46] | for classifying different types of images |
| Prism Counting [47] | for a one-dimensional signal |
| Walking-Divider [48] | practical for length measurement |
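Of the methods in Table 1, box counting [42] is the one most readily applied to raster images: cover the image with boxes of edge length s, count the boxes N(s) that intersect the structure, and take the slope of log N(s) versus log s. The sketch below is a generic illustration of that idea (the function name and details are mine, not the paper's implementation, and it handles only a single binary layer rather than the spectral, multi-band case the article addresses):

```python
import numpy as np

def box_counting_dimension(mask: np.ndarray) -> float:
    """Estimate the fractal dimension of a 2D binary image by box counting.

    mask: 2D boolean array, True where the structure is present.
    Returns -slope of the least-squares fit of log N(s) vs. log s.
    """
    assert mask.ndim == 2
    n = min(mask.shape)
    sizes = [2 ** k for k in range(1, int(np.log2(n)))]  # box edge lengths
    counts = []
    for s in sizes:
        # crop to a multiple of s, then tile the image into s-by-s boxes
        h, w = mask.shape[0] // s * s, mask.shape[1] // s * s
        blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
        # count boxes containing at least one foreground pixel
        counts.append(blocks.any(axis=(1, 3)).sum())
    # N(s) ~ s^(-D), so D is minus the slope in log-log space
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope
```

A filled square yields a dimension of 2 and a straight line yields 1, which is exactly the insensitivity to shades and colors that Figure 1 illustrates and that the spectral fractal dimension (SFD) was introduced to overcome.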
Table 2. Spectral characteristics of global shutter-based, discrete multispectral camera arrays designed for industrial UAVs (peak sensitivity, with half-width data). Values are center wavelength ± bandwidth (nm) unless noted otherwise.

| Band Name | MicaSense Dual | MicaSense Altum-PT | Sentera 6x Multispectral | Parrot Sequoia+ Multispectral | DJI P4 Multispectral | DJI M3 Multispectral |
|---|---|---|---|---|---|---|
| Coastal Blue | 444 ± 28 | – | – | – | 450 ± 16 | – |
| Blue | 475 ± 32 | 475 ± 32 | 475 ± 30 | – | – | – |
| Green | 531 ± 14 | – | – | – | – | – |
| Green | 560 ± 27 | 560 ± 27 | 550 ± 20 | 550 ± 40 | 560 ± 16 | 560 ± 16 |
| Red | 650 ± 16 | – | – | – | 650 ± 16 | 650 ± 16 |
| Red | 668 ± 14 | 668 ± 14 | 670 ± 30 | 660 ± 40 | – | – |
| Red Edge | 705 ± 10 | – | – | – | – | – |
| Red Edge | 717 ± 12 | 717 ± 12 | 715 ± 10 | – | – | – |
| Red Edge | 740 ± 18 | – | – | 735 ± 10 | 730 ± 16 | 730 ± 16 |
| Near Infrared | 842 ± 57 | 842 ± 57 | 840 ± 30 | 790 ± 40 | 860 ± 26 | 860 ± 26 |
| LWIR | – | 10.5 ± 6 μm | – | – | – | – |
Table 3. The table shows the channel numbers of the images in Figure 2, their spectral resolution in bits, their values calculated based on the real image data bits according to (3), and the Shannon entropy values of each image.

| Image | n | S (Bit) | SFDmeasured | Entropy |
|---|---|---|---|---|
| RGB | 3 | 8 | 2.4592 | 18.1197 |
| MS | 4 | 16 | 2.3465 | 22.0394 |
| RGB-DEM | 1 | 8 | 0.9883 | 7.5330 |
| MS-DEM | 1 | 16 | 0.9962 | 14.1626 |
| RGB-R | 1 | 8 | 1.0000 | 7.3043 |
| RGB-G | 1 | 8 | 1.0000 | 7.2245 |
| RGB-B | 1 | 8 | 1.0000 | 7.1399 |
| MS-R | 1 | 16 | 0.9976 | 12.7551 |
| MS-G | 1 | 16 | 0.9963 | 13.1113 |
| MS-RE | 1 | 16 | 0.9746 | 13.3359 |
| MS-NIR | 1 | 16 | 0.9637 | 13.0143 |
| ALL | 9 | 16 | 4.1281 | 22.0394 |
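The entropy column in Table 3 is the Shannon entropy [33] of the gray-level histogram; for a single 8-bit channel it is bounded by 8 bits, which is why the RGB-R/G/B rows sit just above 7. A minimal per-channel sketch (my own helper, not the study's code):

```python
import numpy as np

def channel_entropy(channel: np.ndarray, levels: int = 256) -> float:
    """Shannon entropy (bits) of one image channel, from its gray-level histogram."""
    hist, _ = np.histogram(channel, bins=levels, range=(0, levels))
    p = hist[hist > 0] / channel.size  # probability of each occupied gray level
    return float(-(p * np.log2(p)).sum())
```

A channel in which every gray level occurs equally often reaches the 8-bit maximum, while a constant channel has zero entropy; real aerial imagery falls between the two, as the table's values show.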
Table 4. We used 10 m as a boundary condition when measuring the RGB images shown in Figure 2 according to (7) and compared them with the SFD and entropy values without boundary conditions.

| | RGB-DEM | RGB-R | RGB-G | RGB-B | Maximum Values |
|---|---|---|---|---|---|
| SFD values measured without boundary conditions | 0.9883 | 1 | 1 | 1 | 1 |
| SFD values measured based on 10 m height as boundary condition | – | 0.9996 | 0.9996 | 0.9996 | – |
| Entropy values measured without boundary conditions | 7.5330 | 7.3043 | 7.2245 | 7.1399 | 8 |
| Entropy values measured based on 10 m height as boundary condition | – | 4.3499 | 4.3061 | 4.2527 | – |
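The boundary condition in Table 4 restricts the measurement to pixels selected by an auxiliary layer, here DEM height above 10 m (the vegetation colored blue in Figure 3); fewer and more homogeneous pixels remain, so both SFD and entropy drop. The masking step can be sketched as follows (function and parameter names such as `dem` and `height_limit` are my own illustrative choices, and the entropy is computed as in the unmasked case rather than by the paper's exact Formula (7)):

```python
import numpy as np

def entropy_bits(values: np.ndarray, levels: int = 256) -> float:
    """Shannon entropy (bits) of a sample of gray levels."""
    hist, _ = np.histogram(values, bins=levels, range=(0, levels))
    p = hist[hist > 0] / values.size
    return float(-(p * np.log2(p)).sum())

def boundary_entropy(channel: np.ndarray, dem: np.ndarray,
                     height_limit: float = 10.0) -> float:
    """Entropy of only those pixels whose DEM height exceeds the limit.

    channel and dem must be co-registered arrays of the same shape,
    as in an aligned orthophoto/DEM pair.
    """
    return entropy_bits(channel[dem > height_limit])
```

Because the mask discards the pixels below the height threshold, the boundary-conditioned value can never use more data than the unmasked one, matching the drop from ~7.3 to ~4.3 bits in the table.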
Table 5. Ascent and descent to an altitude of 1500 m, information content and self-similar image structure parameters.

| | Hmax | Hmin | SFDmax | SFDmin | EW-SFDmax | EW-SFDmin |
|---|---|---|---|---|---|---|
| Numerical values (ascent) | 15.70 | 12.88 | 2.58 | 2.21 | 2.30 | 1.67 |
| Ascent height (m) | 48 | 1440 | 28 | 920 | 28 | 976 |
| Numerical values (descent) | 15.71 | 12.97 | 2.57 | 2.20 | 2.29 | 2.02 |
| Descent height (m) | 56 | 1072 | 31 | 925 | 31 | 1295 |
Table 6. Measurement values of the separated sections (A, B, and C) based on measurements on visible range images; the corresponding FIR values for the same sections are listed in the bottom three rows.

| Section | H average | H stdev | SFD average | SFD stdev | EW-SFD average | EW-SFD stdev |
|---|---|---|---|---|---|---|
| Visible A | 14.47 | 0.54 | 2.37 | 0.10 | 2.02 | 0.13 |
| Visible B | 10.14 | 1.09 | 1.71 | 0.32 | 0.54 | 1.04 |
| Visible C | 8.63 | 0.37 | 1.37 | 0.15 | 0.18 | 0.75 |
| FIR A | 14.06 | 0.65 | 2.56 | 0.04 | 2.32 | 0.06 |
| FIR B | 11.59 | 1.16 | 2.45 | 0.05 | 2.10 | 0.10 |
| FIR C | 6.95 | 0.37 | 1.56 | 0.22 | 0.91 | 0.26 |
Table 7. Measurement values of the separated sections (I, II, and III) based on measurements on FIR range images. For comparability, the table also includes data from sections A, B, and C, in the bottom three rows.

| Section | H average | H stdev | SFD average | SFD stdev | EW-SFD average | EW-SFD stdev |
|---|---|---|---|---|---|---|
| FIR I | 13.97 | 0.71 | 2.56 | 0.04 | 2.31 | 0.06 |
| FIR II | 7.96 | 0.76 | 1.78 | 0.05 | 1.21 | 0.08 |
| FIR III | 6.85 | 0.11 | 1.54 | 0.19 | 0.88 | 0.20 |
| FIR A | 14.06 | 0.65 | 2.56 | 0.04 | 2.32 | 0.06 |
| FIR B | 11.59 | 1.16 | 2.45 | 0.05 | 2.10 | 0.10 |
| FIR C | 6.95 | 0.37 | 1.56 | 0.22 | 0.91 | 0.26 |
Table 8. Entropy and self-similar image structure extreme values, with the corresponding altitudes, for images taken by the global shutter camera of the M3 Multispectral camera array.

| Band | Hmax | Hmin | SFDmax | SFDmin | EW-SFDmax | EW-SFDmin |
|---|---|---|---|---|---|---|
| Red | 9.76 | 8.85 | 0.99 | 0.93 | 0.32 | −0.21 |
| Red altitude (m) | 24 | 1176 | 14 | 1428 | 14 | 1044 |
| Green | 10.18 | 9.30 | 0.99 | 0.92 | 0.36 | −0.09 |
| Green altitude (m) | 24 | 1296 | 24 | 1452 | 13 | 1056 |
| Red Edge | 10.35 | 9.92 | 0.97 | 0.89 | 0.31 | 0.06 |
| Red Edge altitude (m) | 24 | 1164 | 24 | 684 | 24 | 1164 |
| Near Infrared | 9.94 | 9.56 | 0.90 | 0.86 | −0.01 | −0.38 |
| Near Infrared altitude (m) | 14 | 1176 | 14 | 720 | 14 | 96 |
| All layers together | 22.2646 | 22.2633 | 2.8364 | 2.6248 | 2.4206 | 1.9846 |
| All layers altitude (m) | 84–96 | 1236 | – | 1368 | 14–24 | 1368 |