Article

4D Pointwise Terrestrial Laser Scanning Calibration: Radiometric Calibration of Point Clouds

by
Mansoor Sabzali
* and
Lloyd Pilgrim
Surveying Discipline, School of Engineering, University of Newcastle, Newcastle, NSW 2308, Australia
*
Author to whom correspondence should be addressed.
Sensors 2025, 25(22), 7035; https://doi.org/10.3390/s25227035
Submission received: 30 September 2025 / Revised: 2 November 2025 / Accepted: 14 November 2025 / Published: 18 November 2025
(This article belongs to the Section Radar Sensors)

Highlights

What are the main findings?
  • A novel framework for the pointwise radiometric calibration of terrestrial laser scanning (TLS) is presented, combining the LiDAR range equation, texture-dependent LiDAR cross-section determination, and a neural network technique.
  • This method significantly enhances the radiometric resolution of TLS on color targets, with accuracy improvements of 31–49% across different color patches and precision improvements of approximately 97% within the same color patch for four TLS devices.
What are the implications of the main findings?
  • TLS intensity attributes can be identified as a standardized fourth dimension in addition to the 3D spatial point clouds for more reliable reflectivity-based analysis.
  • This framework demonstrates the potential path towards more robust 4D TLS calibration, where standard radiometric values from various target geometries (target materials, roughness, albedo, and edgy and tilted surfaces) are strictly required.

Abstract

Terrestrial Laser Scanners (TLS), as monostatic LiDAR systems, emit and receive laser pulses through a single aperture, which ensures the simultaneous measurement of signal geometry and intensity. The relative intensity of a signal, defined as the ratio of received to transmitted power, directly describes the strength and quality of the reflected signal and the corresponding radiometric uncertainty of individual points. The LiDAR range equation provides the physical connection for characterizing signal strength as a function of reflectivity and other spatial parameters. In this research, theoretical developments of the texture-dependent LiDAR range equation, in conjunction with a neural network method, are presented. The two-step approach aims to improve the accuracy of signal intensities by enhancing signal reflectivity estimation and the precision of signal intensities by reducing their sensitivity to variations in spatial characteristics—range and incidence angle. This establishes the intensity as the standard fourth dimension of the 3D point cloud based on the inherent target quality. For validation, four terrestrial laser scanners—Leica ScanStation P50, Leica ScanStation C10, Leica RTC360, and Trimble X9—are evaluated. Results demonstrate significant improvements of at least 40% in accuracy and 97% in precision for the color intensities of individual points across the devices. This research enables a 4D TLS point cloud calibration framework for further investigations on other internal and external geometries of targets (target materials, roughness, albedo, and edgy and tilted surfaces), which allows the standardization of radiometric values.

1. Introduction

1.1. Problem Background

With the advent of the first generation of LiDAR (Light Detection and Ranging), data acquisition approaches in photogrammetry and engineering geodesy have been entirely revolutionized. LiDAR has different classifications, but in terms of aperture types, it is categorized as monostatic or bistatic LiDAR, having the same or separate apertures for emitting and receiving light, respectively [1,2]. A terrestrial laser scanner (TLS) is classified as a monostatic LiDAR due to its single-aperture design.
TLS is fundamentally an active sensor that uses visible or infrared waves of the electromagnetic spectrum (EM) to capture spatial and radiometric data through the availability of the reflected signal from a scene (i.e., each point in a point cloud provides both 3D spatial coordinates and radiometric information). One of the systematic error sources degrading the point clouds’ quality—spatially and radiometrically—is the unknown impact of the target geometry and surface properties [3]. The evaluation of the radiometric uncertainty of TLS point clouds—signal strength or intensity of the returned pulse from sample scanned targets—is the focus of this research (with the main emphasis on color targets).
It was reported that points lying on identical color patches (with similar target quality) do not scatter reflectance uniformly [4,5]. To comprehend the radiometric influences received from individual points, it is crucial to investigate the amount of received energy via the energy link equation by experimenting with various sample targets under multiple scanning arrangements. This energy link budget equation is satisfied through the implementation of the LiDAR range equation, which simultaneously accounts for the reflectivity and geometric characteristics—range and incidence angle—of a single point. Here, various samples of color targets (Macbeth color chart) are assessed for accuracy under two scanning conditions—orthogonal and inclined—using four scanner devices.

1.2. Significance and Purposes

The current work addresses signal intensity as a pointwise quantity that varies with range, incidence angle, and reflectivity. Conventional intensity-modeling methods consider only limited spatial characteristics of a single point and neglect the influence of surface reflection. Since points sharing the same target quality do not scatter the signal uniformly, a detailed study of the individual points on the target is important.
To reinforce this, for the first time, the LiDAR range equation is introduced as a well-established framework for assessing point-to-point radiometric uncertainty in terrestrial laser scanning. The innovative pointwise radiometric calibration of TLS is achieved through an appropriate determination of the LiDAR cross-section, accounting for the surface reflection of the target properties, and by developing neural network techniques to incorporate weightings of the spatial dimensions of a single point—range and incidence angle. The two-step approach expressed here addresses both challenges together to enhance the accuracy and precision of the intensity values for a single pulse. The proposed framework ultimately aims to standardize the intensity values within each color patch.
For data collection, a laboratory calibration setup was established under controlled environmental conditions using four terrestrial laser scanners—Leica ScanStation P50, Leica ScanStation C10, Leica RTC360 (Leica Geosystems, Heerbrugg, Switzerland), and Trimble X9 (Trimble, Westminster, CO, USA)—under orthogonal and inclined scanning conditions. Because the selected devices differ in wavelength and beam characteristics, they provide noteworthy validation under different scanning environments. Note that the calibration conditions were also isolated in terms of reflectivity from any unexpected interfering light sources during scanning. It is worth clarifying that the effect of refractivity along the laser line is insignificant within the short scanning range [6]. A significant aspect of the data experiments is that scanning was conducted at the maximum default resolution and in “Scan Only” mode (with no attached RGB camera data). The study reveals that the RGB values derived from single laser pulses cannot replicate true color equivalents, unlike those derived from cameras. They are simply alternative representations of signal intensity (intensity-based color reproduction values).
The investigation of the results indicates the success of the proposed two-step approach in determining the color-dependent LiDAR cross-section and data-driven neural network techniques across different color intensities. Accuracy improves by between 31% and 47%, while precision improves by 97% or more. Accuracy refers to the accurate reflectivity modeling of the surface corresponding to the inherent color targets, whereas precision encompasses the pointwise intensity modeling by addressing the range and incidence angle of the returned pulse. The methodologies presented here provide a radiometric calibration framework that is independent of scanning and target geometry—range and incidence angle—but dependent on reflectivity. The findings underscore that the 3D point coordinates of TLS are complemented by a standard 1D intensity coordinate with respect to signal strength. This work aims to provide a future pathway toward fully standardized 4D point cloud calibration that accounts for additional target characteristics (material, roughness, and edged or tilted geometries).

2. Related Works

The uncertainty of TLS deliverables is highly dependent on the reflected signal’s strength. Several factors affect the signal strength, including the reflectivity of the surface, and the range and incidence angle of a single-point observation [7,8,9,10]. Not only do these factors affect the geometric uncertainty of 3D point coordinates, but they also create a deviation in radiometric uncertainty. To clarify these contributing parameters, the range r is a physical value and can be measured (Figure 1). However, the incidence angle α is a non-physical value and is described theoretically as the angle between the transmitted laser beam vector L and the surface-normal vector N.
The ideal incidence angle is zero, in which case the laser spot footprint is circular rather than elliptical and the reflected signal angle is likewise zero. For a single point measurement, this occurs only when the orthogonal scanning condition is satisfied. Studies [11,12] reported that an incidence angle larger than 60° significantly affects the overall 3D point cloud precision, and that the noise level increases by 20% as the incidence angle rises. Nevertheless, at smaller incidence angles, range becomes the dominant factor influencing signal strength [7,10]. Therefore, the behavior of the reflected signal, primarily influenced by range and subsequently by the incidence angle, plays a critical role in determining the strength of the backscattered signal.
The investigation of both effects—range and incidence angle—on the intensity was initially evaluated by using the laser radar range equation [13]. In this experiment, six different laboratory setups at varying ranges and incidence angles were proposed to test the reflected intensities against the known reflectivity of the targets. These findings confirm that the range has a greater impact on intensity values than the incidence angle. Accordingly, the intensity function was logarithmically approximated with respect to range, following the inverse-square law, with the impact of the incidence angle considered negligible.
Color targets have additionally been examined in previous studies for processing geometric range displacements through the availability of the signal strength. The experiment considered varying illumination conditions [12]. In the experimental setups, a Macbeth color checker chart was scanned, and the results were compared to a reference plane. The Macbeth color checker chart is a test pattern scientifically designed to help determine the true color balance of any color rendition system (https://hollynorth.com/product/macbeth-chart/) (accessed on 15 September 2025). It was concluded that the ambient light behind any color leads to meaningful errors in the range measurements, since it influences the noise level of the reflected signal. Finally, the corresponding geometric corrections in the measured range for different colors were determined. Moreover, the research classified the colors into two groups: lighter colors, which show high reflectance, better point density, and minimal noise with smaller geometric range distortions; and darker colors, which show low reflectance, poorer point density, and higher noise with larger geometric range distortions [14]. J. Clark et al. [15] also proposed three range setups—close (less than 4 m), near (4 m to 6 m), and far (7 m to 8 m)—to assess the impacts of color variations on range measurements. It was highlighted that ideal conditions occur at arbitrary range observations when geometric compensation is internally applied within the scanners. Additional studies reported the effects of color targets on range distortions under various experimental setups [16,17,18].
Former studies have also compared point intensities with radiometric imagery data, considering both range and incidence angle. D. Wujanz [19] derived a mathematical relationship between three variables—range, incidence angle, and reflectivity—and the intensities. In the computational principle, intensity was formulated through a polynomial function and calculated by interpreting the projection matrix of the scanner as the image and using the intensity values derived from the 2D image. The function eventually allowed the measured intensities to be transformed into pseudo-quotient values of reflection. A reasonable reduction in the standard deviation of intensity values was achieved after the correction of range, incidence angle, and reflectivity.
Building on former findings, recent research on the colorization of 3D point clouds aimed to utilize 2D images extracted from scanned data. The experiment evaluated various image quality methods with respect to color accuracy, sharpness, information capacity, and noise [20]. The study noted that basic challenges in color reproduction (e.g., measured differences in color, white balance, and exposure) could be mitigated during the data processing stages; however, the detailed aspects of image production (e.g., sharpness and noise) were less controllable, reflecting the inherent limitations in scanner construction [20]. A comparative analysis between TLS intensity data and RGB camera data in [21]—based on identifying the coordinates of target centers—indicated that the existing systematic errors are sometimes greater than the beam divergence of the scanner.
Balaguer-Puig et al. [22] established the connection between TLS intensity and color properties on the Macbeth color chart. Through this comparison, a reasonable relationship between surface reflectivity and color intensity values can be observed. According to these outcomes, the current study proposes a generic framework for intensity standardization based on color reflectivity, formulated through a color-dependent LiDAR range equation. This framework is further supported by a data-driven neural network approach that accounts for variations in range and incidence angle. The two-step technique introduces a pointwise correction procedure for 4D radiometric calibration in TLS. This methodology not only investigates color properties but also systematically enables its extension to other target geometries. Ultimately, the study offers a unique contribution to radiometric TLS error modeling in order to guarantee consistent intensity values corresponding to the reflected backscattered signal.

3. TLS Background

3.1. TLS Deliverables

TLS is a terrestrial laser-based instrument that delivers point coordinates in 3D spherical coordinates. In principle, TLS is a very high-speed, movable total station that can capture millions of points per second by measuring three spherical coordinates—range r, horizontal angle h, and vertical angle v—from the returned signals reflected from individual points [23]. The mathematical conversion from 3D spherical coordinates to 3D Cartesian coordinates is as follows:
x = r cos v cos h,  y = r cos v sin h,  z = r sin v, (1)
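As a minimal numerical sketch (function and variable names are illustrative, not from the paper), Equation (1) can be applied directly:

```python
import math

def spherical_to_cartesian(r, h, v):
    """Convert TLS spherical observations (range r in meters, horizontal
    angle h and vertical angle v in radians) to Cartesian coordinates."""
    x = r * math.cos(v) * math.cos(h)
    y = r * math.cos(v) * math.sin(h)
    z = r * math.sin(v)
    return x, y, z

# A point at 10 m range with h = v = 0 lies on the x-axis:
# spherical_to_cartesian(10.0, 0.0, 0.0) -> (10.0, 0.0, 0.0)
```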
Apart from the 3D spatial coordinates, every measured point includes the following four additional radiometric values (Figure 2): a scalar field and the red, green, and blue (RGB) color components.
The scalar field (i.e., intensity or grayscale values) is a numerical attribute assigned to each point, describing the strength of the reflected signal. These values depend on numerous factors such as surface roughness, material properties, incidence angle, etc. In contrast, the red, green, and blue (RGB) color components are always expressed on a scale between 0 and 255, where 0 indicates no reflected intensity and 255 represents the maximum reflected intensity. To relate these perceptual color estimates to physical measurements, a conversion from RGB values to intensity values is required (Appendix A). However, the radiometric products of terrestrial laser scanning are generally uncalibrated deliverables and cannot be directly applied to reflectivity-related tasks. Accordingly, the uncertainty assessment of these non-spatial values within TLS calibration studies is referred to as radiometric calibration. The current study focuses on the fourth dimension of TLS observations—the intensity value—alongside the conventional 3D spatial coordinates.
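The exact RGB-to-intensity conversion is given in Appendix A and is not reproduced here; the sketch below uses the ITU-R BT.601 luma weighting purely as a stand-in assumption for mapping 8-bit color components to a normalized intensity:

```python
def rgb_to_intensity(r, g, b):
    """Map 8-bit RGB components (0-255) to a normalized intensity in [0, 1].
    The BT.601 weights below are an assumption, not the paper's Appendix A."""
    if not all(0 <= c <= 255 for c in (r, g, b)):
        raise ValueError("RGB components must lie in [0, 255]")
    return (0.299 * r + 0.587 * g + 0.114 * b) / 255.0
```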
One of the major systematic error sources impacting those radiometric values is the external geometry of the scanned targets (edged and tilted surfaces) and/or their inherent characteristics (e.g., roughness, material, color, albedo, etc.)—commonly referred to as object and surface-related issues. Although other geometries concerning the instrument and scanning configuration must be simultaneously involved in comprehensive TLS calibration studies, the key focus of the current research is on one of the inherent features of the object-related issues—the color of targets—regardless of the exterior and interior topographies of the targets.

3.2. Object- and Surface-Related Issues

A comprehensive calibration study for terrestrial laser scanning embraces the incorporation of the following four geometries: instrument, laser, scanning, and target geometries. One of the important geometries among the four relates to the characteristics and properties of the scanned targets. When the laser strikes the surface of the targeted object, it experiences several optical phenomena. Reflectivity is one of the dominating factors leading to the varying power of reflected signals. Several elements play integral roles in the varying reflectance behavior of sample targets, namely, material [24], roughness [25,26,27,28], color (the discussion point in the current work), albedo [29], and the geometric configuration of the surface, such as the edged and tilted properties [30]. The major emphasis, however, is on the laser striking the target surface, with reduced influence from other optical interferences such as atmospheric effects along the propagation path [6].
Reflectivity in optics is a measure of a surface’s ability to reflect radiation (here, the radiation is the laser illumination, typically within the visible or infrared domain of the electromagnetic (EM) spectrum). Hence, reflectiveness is the quality or capability of a surface to be reflective. Generally, reflectivity occurs in two general patterns [31,32] (Figure 3).
When the beam generates specular (mirror or regular) reflectivity, the returned signal is uniformly illuminated like the emitted signal and is reflected equally in a single mirror-like direction. Specular returns are frequently referred to as a glint in the research literature, but this type of reflection is unlikely to occur in nature. In the alternative pattern, the reflected signal is scattered non-uniformly over a large volume in different directions; this is called diffuse reflection. Perfect diffuse reflection is referred to as Lambertian reflectance. Although perfect diffuse reflection is also unlikely in nature, Lambertian reflectance is assumed in most reflectance-related LiDAR research [33,34]. Thus, to better quantify the radiometric quantities of terrestrial laser scanning, the geometry of the laser striking the target surface and the geometry of the target itself must be treated simultaneously. Therefore, a laboratory test arrangement must be conducted to address various sample targets.
In summary, the points with the same target characteristics (e.g., color) can exhibit varying reflectivity, meaning that neighboring points on the same surface do not necessarily scatter the laser signal identically and uniformly. Variations in range, incidence angle, reflectivity, and other degradation parameters influence the strength of individual point returns, and they lead to different radiometric qualities. These factors directly affect the received power and the corresponding signal-to-noise ratio (SNR), resulting in inconsistent intensity values across the identical color patch (Figure 2). To address these challenges, this study proposes an amended methodology that integrates pointwise intensity modeling—based on the LiDAR range equation—in conjunction with a neural network, resulting in a rigorous 4D radiometric TLS calibration. This framework can further account for various factors influencing individual point intensity measurements, including the other object- or surface-related issues.

4. Methods

A pointwise study of reflected power is proposed for intensity modeling in terrestrial laser scanning (TLS). This approach enhances the accuracy and precision of the fourth dimension of point clouds, namely, the intensity values associated with each point observation. To address this challenge, both radiometric variation (reflectivity) and geometric variation (range and incidence angle) must be considered simultaneously, since even though points share the same target quality (e.g., color), they can produce inconsistent reflected laser power. A two-step methodology, based on a modified theoretical development of the LiDAR range equation, is presented to standardize intensity values with respect to color targets. In the first step, the color-dependent LiDAR cross-section is accurately determined to distinguish between intensities according to their corresponding reflectivity. In the second step, neural network algorithms are incorporated to minimize the sensitivity of the derived pointwise intensity to spatial resolution effects.
The LiDAR range equation is an energy link budget interpretation which specifies the attenuation of the signal due to its propagation and the other possible deteriorating factors (i.e., it relates the received to the transmitted power of the signal, considering additional factors that might degrade signal strength). The comprehensive LiDAR range equation is written as follows [1,33,34]:
P_R = P_T · (Ω / A_illum) · (A_rec / (π r²)) · η_atm² · η_sys, (2)
where the received laser power P_R is reduced in magnitude compared to the transmitted power P_T (both in watts). The relationship is established through two ratios: first, the LiDAR cross-section Ω divided by the illuminated area at the target location, A_illum; and second, the receiver aperture area A_rec divided by the effective average area illuminated by the reflection from the target, π r², which depends on the measured range r (all areas in m²). There are two efficiency terms: η_atm is the transmission efficiency through the atmosphere (atmospheric loss), and η_sys is the optical efficiency of the receiver system (system loss); both are dimensionless. The efficiency of a system or process is typically expressed as the ratio of useful output to total input. Notably, under ideal efficiency, both terms equal one and the equation simplifies.
The receiver area A_rec (m²) is determined as a function of the receiver diameter D_rec (m). For a circular aperture of a single receiver, it is generally given as follows:
A_rec = π (D_rec / 2)², (3)
The receiver diameter depends on the beam divergence ϑ (rad) and the wavelength λ (m):
D_rec ≈ λ / ϑ, (4)
Equation (4) can be expressed in different formats depending on the beam profile. Several beam profiles have been defined in the literature. The most common one encountered in laser-based sensors is the Gaussian beam profile, with multiple definitions of beamwidth ω. Although there is no fixed definition of beamwidth, full width at half maximum (FWHM) is one of the customary conventions used for laser scanner devices. It represents the full beamwidth at which the intensity falls to half its maximum. For example, for the Gaussian FWHM beam profile, Equation (4) can be rewritten as D_rec = 1.22 λ / ϑ [34].
The second area of the two-way energy budget is A_illum (m²), the illuminated area of the actual ray beam. This area is not smaller than the diffraction limit, which is theoretically the smallest size at which optical systems can resolve targets. It is assumed that the transmitted beam uniformly illuminates a circular output aperture [34]. The area is based on the receiver diameter D_rec (m) and can be computed as follows [31,34]:
A_illum ≈ π λ² r² / D_rec², (5)
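Equations (3)–(5) can be sketched numerically as follows; the function names are illustrative, the 1.22 diffraction factor for the Gaussian FWHM profile follows the text, and all example values are assumptions rather than values from the paper:

```python
import math

def receiver_area(d_rec):
    """Equation (3): area (m^2) of a circular receiver aperture of diameter d_rec (m)."""
    return math.pi * (d_rec / 2.0) ** 2

def receiver_diameter(wavelength, divergence, k=1.22):
    """Equation (4): receiver diameter (m); k = 1.22 for a Gaussian FWHM beam."""
    return k * wavelength / divergence

def illuminated_area(wavelength, r, d_rec):
    """Equation (5): diffraction-limited illuminated area (m^2) at range r (m)."""
    return math.pi * (wavelength * r) ** 2 / d_rec ** 2
```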
The most critical parameter is the LiDAR cross-section for a given target. The computation of this non-physical area at the target location is essential for determining reflected power and intensity, as it is theoretically defined as a perfectly reflecting spherical area, dependent on the illuminated area A_illum (m²) and the surface reflectance γ (dimensionless). The more closely the measured object resembles a circle, the stronger the relationship between the LiDAR cross-section and the physical dimensions of the object. The ideal condition occurs when the illuminated area is identical to the target dimensions (Ω = A_illum); this does not occur in nature. Therefore, two underlying assumptions are made: the first applies when the target area is smaller than the illuminated area (Ω < A_illum) (i.e., smaller than the beamwidth)—point targets (Figure 4, condition a; the central focus of pointwise radiometric calibration)—and the second applies to extended targets whose area is larger than the illuminated area (Ω > A_illum) (i.e., larger than the beamwidth) (Figure 4, condition b).
The other parameter involved in the LiDAR cross-section is the surface reflection. In determining reflectivity, the dominant backscattering component is diffuse reflectance γ_d rather than specular reflectance γ_s (γ_s ≪ γ_d), as previously discussed in Figure 3. The bidirectional reflectance distribution function (BRDF) plots of real surfaces typically display a combination of the two components that is equal to or lower than one, γ = γ_s + γ_d ≤ 1. These plots are explained via the Rayleigh condition, where the diffuse term (the random case) is reduced by an error-free specular term e^(−Λ²):
γ_d = 1 − e^(−Λ²), (6)
where Λ = 4√2 π RMS / λ is an empirical factor, in which the root mean square (RMS) is a measure of surface roughness. For specular reflection, the RMS is significantly smaller than the laser wavelength, or generally negligible, whereas for Lambertian (diffuse) reflection, the RMS is on the order of the laser wavelength or larger.
When the RMS of surface roughness is similar across the target (on the order of the laser wavelength or larger), the reflectance term for all points lying on the same target properties (here, color) is effectively equal to one, making the reflectivity nearly identical for all different colors. It is difficult to distinguish surface roughness solely on the basis of target reflectivity in terrestrial laser scanning calibration. To account for this, the diffuse reflectance term γ_d must be scaled by an intrinsic reflectance coefficient k specific to the target texture. The coefficient k is dimensionless and can be interpreted as the proportion of incident energy reflected by a perfect Lambertian surface of the given color target under the propagating laser wavelength, k_colour:
γ_d* = k_colour (1 − e^(−Λ²)), (7)
The coefficient can be determined experimentally or, alternatively, a standard intensity value for the given target texture at a specific wavelength can be substituted. Point-to-point reflectivity variations γ_d* (dimensionless) are then governed by k_colour and surface roughness (i.e., the intrinsic texture- (color-)dependent reflectivity). This methodology is designed to distinguish reflectivity differences based on the intrinsic surface reflectance properties of points that exhibit only slight variations in geometric parameters (Section 6.2). Therefore,
Ω_d* = 4 γ_d* π ω² cos α, (8)
Given all the parameters of the LiDAR range equation, the relative intensity I (dimensionless) is computed as follows:
I = P_R / P_T = (Ω_d* / A_illum) · (A_rec / (π r²)), (9)
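A hedged numerical sketch of Equations (6)–(9) follows; the function names, default arguments, and the form of the Λ factor are illustrative assumptions rather than the paper's implementation:

```python
import math

def diffuse_reflectance(rms, wavelength, k_colour=1.0):
    """Equations (6)-(7): texture-scaled diffuse reflectance (dimensionless).
    lam follows the Rayleigh-condition factor; k_colour is the intrinsic
    reflectance coefficient of the color patch (assumed value)."""
    lam = 4.0 * math.sqrt(2.0) * math.pi * rms / wavelength
    return k_colour * (1.0 - math.exp(-lam ** 2))

def lidar_cross_section(gamma_d_star, beamwidth, incidence):
    """Equation (8): color-dependent LiDAR cross-section (m^2);
    incidence angle in radians."""
    return 4.0 * gamma_d_star * math.pi * beamwidth ** 2 * math.cos(incidence)

def relative_intensity(cross_section, a_illum, a_rec, r):
    """Equation (9): relative intensity I = P_R / P_T under ideal efficiencies."""
    return (cross_section / a_illum) * (a_rec / (math.pi * r ** 2))
```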
According to Equation (9), the variations in geometric factors at the single-point level, namely, range and incidence angle, also contribute to fluctuations in signal strength, which are addressed in the second step of the presented method. To minimize the sensitivity of the intensity to these spatial factors, the LiDAR range equation is reformulated in terms of the weighted combination of range and incidence angle as follows:
I = w_1 f_1(r) + w_2 f_2(α), (10)
Given Equations (8) and (9),
f_1(r) = 1 / r², and f_2(α) = cos α, (11)
Here, w_1 and w_2 are the weightings on the individual measurements of range and incidence angle, which estimate the relative contribution of each geometric parameter to the measured intensity. These weightings vary with target and scanner geometry. A data-driven neural network algorithm is introduced to model this complex relationship between intensity and the geometric factors, in order to mitigate the sensitivity of pointwise intensity to spatial variations. Therefore, the intensity values I can be reparametrized as follows (Section 6.3):
I = γ_d* (w_1 / r² + w_2 cos α), (12)
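Before the neural network step, the role of the weightings w_1 and w_2 in Equation (10) can be illustrated with an ordinary least-squares fit on the two regressors 1/r² and cos α. This linear stand-in is only a didactic simplification of the data-driven approach, using synthetic data:

```python
import numpy as np

def fit_geometric_weights(ranges, angles, intensities):
    """Estimate w1, w2 in I = w1/r^2 + w2*cos(alpha) (Equation (10)) by
    ordinary least squares; a linear stand-in for the neural network step."""
    A = np.column_stack([1.0 / ranges ** 2, np.cos(angles)])
    w, *_ = np.linalg.lstsq(A, intensities, rcond=None)
    return w

# Synthetic check: noiseless data generated with w1 = 2.0 and w2 = 0.5.
r = np.linspace(2.0, 10.0, 50)
alpha = np.linspace(0.0, 1.0, 50)
I = 2.0 / r ** 2 + 0.5 * np.cos(alpha)
w1, w2 = fit_geometric_weights(r, alpha, I)
```

On noiseless synthetic data, the least-squares solution recovers the generating weights, which makes the sensitivity interpretation of w_1 and w_2 concrete.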
Neural networks are machine learning models inspired by the structure of the human brain. They consist of multiple layers of interconnected neurons, where each neuron applies a nonlinear transformation to its inputs [35]. During training, the algorithm iteratively adjusts its internal weights to minimize the residuals between predicted and measured intensities. In principle, the proposed data-driven neural network operates on color-dependent reflectivity, range, and incidence angle as inputs, learning to estimate intensity values relative to the corresponding surface reflectivity as the output. The following four steps summarize the procedure of the pointwise radiometric calibration of TLS using the neural network approach:
  • The parameters, such as range, incidence angle, and color-dependent reflectivity obtained from the LiDAR range equation, are integrated into a feature matrix, and the output variable is determined as the intensity values.
  • The dataset is randomly divided into training (80%) and testing (20%) subsets to enable the independent evaluation of neural network performance (i.e., weightings on spatial parameters for a single point observation).
  • A feed-forward neural network with two hidden layers (each containing 10 neurons) is trained using the Levenberg–Marquardt (trainlm) optimization algorithm. The hidden layers use the hyperbolic tangent sigmoid (tansig) activation function, while the output layer employs a linear (purelin) function suitable for regression. As an example, training is performed with a learning rate of 0.001, a maximum of 1000 epochs, and an early stopping criterion based on validation error. Finally, the objective function minimizes the mean squared error between the predicted intensity and the color-dependent intensity from the LiDAR range equation (i.e., values formerly validated through the intrinsic reflectance coefficient).
  • During the validation process, the point reflectivity (i.e., intensity) from a presumed color patch is compared against the reference reflectivity (i.e., intensity derived from neutral colors) within each dataset. This comparison provides a quantitative assessment of the improvements achieved by both the data-driven method and the physical, laser-based approach (Section 7).
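The four steps above can be sketched as follows. This is a minimal illustration on synthetic data, using scikit-learn's `MLPRegressor` as a stand-in for the MATLAB `trainlm`/`tansig`/`purelin` setup described in the text (scikit-learn does not offer Levenberg–Marquardt, so the Adam optimizer with the stated learning rate and early stopping is used instead); the toy intensity model and all numeric values are assumptions, not the paper's data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Illustrative synthetic inputs: range r (m), incidence angle alpha (rad),
# and color-dependent reflectivity gamma from the LiDAR range equation.
n = 2000
r = rng.uniform(2.0, 6.0, n)
alpha = rng.uniform(0.0, np.deg2rad(70.0), n)
gamma = rng.uniform(0.2, 0.9, n)

# Step 1: feature matrix, with intensity as the output variable
# (toy weighted model with measurement noise, for illustration only).
X = np.column_stack([r, alpha, gamma])
I = gamma * (0.6 / r**2 + 0.4 * np.cos(alpha)) + rng.normal(0, 0.005, n)

# Step 2: random 80/20 train/test split.
X_tr, X_te, I_tr, I_te = train_test_split(X, I, test_size=0.2, random_state=0)

# Step 3: feed-forward network with two hidden layers of 10 tanh neurons,
# linear output, learning rate 0.001, up to 1000 epochs, early stopping
# on a held-out validation split.
net = MLPRegressor(hidden_layer_sizes=(10, 10), activation="tanh",
                   solver="adam", learning_rate_init=0.001, max_iter=1000,
                   early_stopping=True, random_state=0)
net.fit(X_tr, I_tr)

# Step 4: evaluate predicted vs. measured intensity on held-out points.
mse = mean_squared_error(I_te, net.predict(X_te))
print(f"test MSE: {mse:.6f}")
```

The network thus learns the spatial weightings implicitly from data rather than fixing them analytically.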
For the quantitative assessment of the accuracy improvements, accuracy was estimated from the variability of repeated observations, expressed by the Bessel-corrected sample standard deviation [36]:

$$\sigma = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(I_i - \bar{I}\right)^{2}},$$

where $\bar{I} = \frac{1}{n}\sum_{i=1}^{n} I_i$ is the average intensity, $I_i$ is the intensity of the $i$-th point, and $n$ is the total number of points within each color patch.
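As a concrete check of this formula, the Bessel-corrected standard deviation can be computed for a hypothetical color patch (the intensity values below are illustrative only; note that NumPy's `np.std` divides by $n$ by default, so `ddof=1` is required for the $n-1$ denominator):

```python
import numpy as np

# Illustrative intensities for one color patch (values are made up).
I = np.array([0.412, 0.398, 0.405, 0.420, 0.401, 0.409])

I_bar = I.mean()                                     # average intensity
n = I.size
sigma = np.sqrt(((I - I_bar) ** 2).sum() / (n - 1))  # Bessel-corrected

# Equivalent one-liner: ddof=1 applies the n-1 denominator.
assert np.isclose(sigma, np.std(I, ddof=1))
print(f"mean={I_bar:.4f}, sigma={sigma:.4f}")
```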
The accuracy assessment evaluates the intensity values before and after applying the color-dependent LiDAR range equation, in comparison with the corresponding standard intensity of each color. Precision is evaluated by analyzing the repeatability of the measured intensity values with respect to a neutral reference color patch selected for each dataset (i.e., the neutral grey patch of the Macbeth chart that exhibits the minimal deviation from the standard intensity in each dataset). Accordingly, reductions in standard deviation are compared to quantify the improvement in measurement precision.
This formulation connects the radiometric LiDAR range equation with the spatially driven neural network, allowing both the geometric and radiometric effects of a single point to be incorporated for rigorous 4D radiometric standardization. This two-step approach yields calibrated intensity values that are independent of the scanning geometry and target orientation and instead depend on the intrinsic target reflectance (e.g., color). This results in a consistent (standard) fourth radiometric dimension of 3D point clouds.

5. Data Experiment

5.1. Laser Study

Terrestrial laser scanners are active sensors that capture data with a laser, and overlooking the geometry of that laser leads to unreliable calibration results. In the laboratory experiment proposed for this research, four scanners—Leica ScanStation P50, Leica ScanStation C10, Leica RTC360 (Leica Geosystems, Heerbrugg, Switzerland), and Trimble X9 (Trimble, Westminster, CO, USA)—are used (Figure 5). The objective is to examine how multiple wavelengths and different physical beam characteristics affect the reflected signal strength from various sample targets.
Table 1 provides a summary of the technical radiometric specifications reported by the manufacturers of the employed scanners (with further explanations provided below for clarification).
Figure 6 depicts the relationship between the maximum allowed power of a continuous wave in milliwatts ( m W ) and different classifications of the laser with respect to the corresponding wavelengths ( n m ). For example, Leica ScanStation P50, with laser Class 1 and a long wavelength, potentially emits a signal with higher power than Leica ScanStation C10, with laser Class 3R and a short wavelength. Further classifications of the laser are provided here (https://en.wikipedia.org/wiki/Laser_safety) (accessed on 15 September 2025).
In addition, wavelength is a contributing factor in signal power. Among all scanners, the Leica ScanStation C10 is equipped with a visible green laser within the visible light spectrum (532 nm). This wavelength is suitable for bathymetric LiDAR installations since it can penetrate water and capture the sea floor. The Leica ScanStation P50, Leica RTC360, and Trimble X9 emit infrared waves (1550 nm), which behave quite differently: they typically interact weakly with color variations (i.e., they are not sensitive enough to distinguish colors, especially visible ones), but they are suitable for distinguishing material properties and rough surfaces and are ideal for long-range scanning. As outlined above, differences in wavelengths and beam characteristics produce varying radiometric responses, which enables a more rigorous basis for validating the proposed 4D TLS calibration practices.

5.2. Data Collection Steps

In the following experiment, a series of sample (color) targets were tested, regardless of their inherent qualities and external formation. For data acquisition, the laboratory test design was established on 15 December 2024 at the University of Newcastle, Callaghan campus. The planar dimensions of the designated laboratory room are 5.265 m × 2.89 m. As discussed earlier, four terrestrial laser scanners were employed under identical data acquisition conditions, and a Macbeth color chart containing 24 colors was used as the color reference (Figure 7).
Given the Macbeth color chart, Table 2 compares the normalized standard intensity values obtained from the standard RGB values of each color using the perceptual intensity equation (Appendix A).
The first scanner setup was established at the maximum room width, with the scanner oriented orthogonally to the center of each color patch. This is considered the ideal case for minimizing both the incidence angle and the circular spot size of the laser pulse. For the second setup, an inclined scanning condition was proposed with a large incidence angle (greater than 60°) to the center of the Macbeth color chart at the maximum room length. This setup maximizes the incidence angle and the elliptical distortion of the laser spot (i.e., an incidence angle larger than 60° significantly affects the precision of the 3D point cloud, resulting in approximately 20% more noise [4]). Another important consideration in the data collection operation is that the experiment was executed in a dark room with no interference from any external light. Every window, door, and seam of the room was completely sealed. Furthermore, the illuminated areas on the scanners, including their screens, were covered during data capture to avoid any external illumination. This prevents any unexpected interference with the emitted and reflected laser pulses. Thereafter, during scanning, the highest default resolution within each scanner was selected (Table 3).
All acquired scans were finalized under “Scans Only” conditions, meaning that scans were captured with no attached images to colorize the 3D point clouds. This ensures that each signal pulse carries only its own intensity from a single point measurement. Since data collection took place over a very short range and in an isolated room, the effect of other optical occurrences, such as atmospheric effects along the laser path, is negligible [6].
However, achieving standard intensity values under controlled laboratory conditions is not straightforward, as results are influenced by variations in scanner setups, scanning configurations, ambient lighting, target properties, etc. (Appendix A). It is important to note that the RGB values listed in Table 2 represent the true color values of the Macbeth color chart, whereas under “Scan Only” conditions, the strength of the laser response does not exactly correspond to the true RGB values (an empirical comparison supporting this is elaborated in Section 6.1). Thus, to determine the appropriate reference for each acquired dataset, the residuals for each color target under the ideal (orthogonal) scanning condition were compared against the normalized intensity values. The smallest residuals consistently corresponded to the neutral color patches. Accordingly, the neutral colors were selected as the ground-truth reference for the datasets, as they are generally less affected by color variation and ambient lighting and therefore yield the most stable results under experimental conditions. For example, in the case of the Leica ScanStation C10, Neutral 5 demonstrated the least variation compared to all other colors.

6. Results

6.1. Pre-Processing Stages

6.1.1. Radiometric Comparison: Intensity vs. RGB Across Scanners

The datasets were captured by the four scanners under identical experimental instructions, as explained. At the first stage, the individual color patches were extracted using the corresponding software: Leica Register 360 (Leica Geosystems, Heerbrugg, Switzerland) (https://leica-geosystems.com/products/laser-scanners/software/leica-cyclone/leica-cyclone-register-360) (accessed on 15 September 2025) for the Leica devices and Trimble Business Center (TBC) (Trimble, Westminster, CO, USA) (https://geospatial.trimble.com/de/products/software/trimble-business-center) (accessed on 15 September 2025) for the Trimble device. To reduce the influence of outliers introduced during manual selection, intensity values deviating by more than three standard deviations (±3σ) from the average intensity of each color patch were filtered out. This filtering retains the 99.7% confidence interval, ensuring that the dataset represents the true intensity values for each patch.
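The ±3σ outlier filter described above might be sketched as follows; the patch values are synthetic and `filter_3sigma` is an illustrative helper, not software from the paper's workflow:

```python
import numpy as np

def filter_3sigma(intensities):
    """Keep only intensities within +/-3 sigma of the patch mean (~99.7%)."""
    mu = intensities.mean()
    sigma = intensities.std(ddof=1)
    mask = np.abs(intensities - mu) <= 3.0 * sigma
    return intensities[mask]

# 100 plausible patch intensities around 0.40, plus one obvious outlier.
patch = np.concatenate([np.full(100, 0.40) + np.linspace(-0.01, 0.01, 100),
                        [0.90]])
clean = filter_3sigma(patch)
print(len(patch), "->", len(clean))  # the outlier is removed
```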
Under the “Scan Only” mode used in this study, the analysis is based solely on the laser signal pulse. Thus, the outputs specified as RGB values for a single point do not necessarily represent true color values. Instead, they are the result of pseudo-color mapping, where the recorded laser intensity is artificially mapped into the red R, green G, and blue B channels for visualization purposes. Consequently, these RGB values are alternative representations of intensity rather than physically meaningful color measures, and they cannot be considered reliable indicators of graphical colorization.
Table 4 compares the standard deviations of the averaged intensity values—from both the measured intensity values and those converted from RGB values—against standard intensity values for each color patch (as shown in Table 2) under both scanning conditions. The conversion equation from RGB values to the normalized standard intensity is provided in Appendix A.
The variations in intensities between the scanners are primarily associated with the scanners’ inherent operating systems, the laser wavelength, physical signal characteristics, transmitted pulse power, and noise levels—all of which are relevant to the radiometric behavior of surface reflection. The Leica ScanStation C10 has a shorter wavelength (532 nm), lower transmitted laser power (Class 3R corresponds to 8 mW), and smaller beam divergence (0.09 mrad). This scanner is therefore very sensitive to color intensity variations in comparison to the remaining scanners, which operate at a longer (infrared) wavelength and transmit higher laser power (10 mW).
Results from the orthogonal scanning condition (presumably the ideal condition), as demonstrated in Figure 8 and Figure A1, vary with respect to the normalized standard intensities. As mentioned earlier, the standard intensities presented in Table 2 might not be optimally replicable even under the ideal scanning condition: they are derived from true color values and are not identical to those converted from the pseudo-RGB measures. Moreover, the insignificant changes in range and incidence angle at the single-point level pose an additional challenge for consistent intensity modeling. To overcome this difficulty for quantitative precision assessment, one neutral color is selected as the reference color patch within each dataset. For instance, Neutral 6.5 was selected for the Leica ScanStation C10 and Trimble X9 (Figure 8 and Figure A1c, respectively), Neutral 5 for the Leica ScanStation P50 (Figure A1a), and Neutral 8 for the Leica RTC360 (Figure A1b) (i.e., neutral colors are approximately invariant against variations in spatial and radiometric characteristics and record the minimal residuals with respect to their standard intensities).
In summary, the intensity value—the fourth dimension attached to the conventional 3D spatial point coordinates—is an uncalibrated numerical attribute of each point. Given the signal strength capacity, RGB values are an alternate representation of the intensity. In addition, the combination of a visible wavelength and a smaller beam divergence in the Leica ScanStation C10 requires a larger receiver diameter to effectively adjust the illuminated area of the reflected signal, as demonstrated in Equation (10). In contrast, the other scanners deliver less precise color intensities (Table 4) but more consistent intensities (a larger area of illumination) (Section 6.1.2).

6.1.2. Reflectivity and Geometric Effects

In this section, the impact of three major factors, namely surface reflectivity and the individual geometric characteristics (range and incidence angle) of the point, on the resulting intensity values is evaluated prior to implementing the proposed two-step method. The analyses here highlight a general consistency in intensity among different color patches with similar reflectivity patterns, but an inconsistency within the same patch due to minor variations in range and incidence angle among individual point measurements.
Figure 9 and Table 5 illustrate the distribution of intensity values with respect to the number of point observations for each color patch on the Macbeth color chart, for the Leica ScanStation P50 and the other scanners, respectively. These findings indicate that this consistency results from the surface reflection, irrespective of the target quality (i.e., color). The same behavior occurs for the other scanners (Figure A2, Figure A3 and Figure A4 in Appendix B).
The distribution shown in Figure 9 under the orthogonal scanning condition exhibits a symmetric pattern, indicating that the signal strength is least affected by variations in range and incidence angle when measurements are taken orthogonally (i.e., a normal Gaussian distribution). In contrast, the distribution under the inclined condition (at incidence angles greater than 60°) shows a non-symmetric pattern, where a larger number of points with weaker intensity produce a longer left tail, while fewer observations with stronger signal strength form a shorter right tail (i.e., a skewed Gaussian distribution). Consequently, the impact of the geometric effects (range and incidence angle) appears unpredictable. Similar behavior occurred for the other scanners (Figure A2, Figure A3 and Figure A4 in Appendix B).
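The symmetric versus skewed behavior described above can be quantified with the sample skewness. The sketch below uses synthetic stand-ins for the orthogonal and inclined intensity distributions (a normal distribution and a left-skewed one), not the actual scan data:

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(1)

# Synthetic stand-ins: a symmetric patch (orthogonal scan) and a patch with
# a long left tail of weaker returns (inclined scan, incidence angle > 60 deg).
orthogonal = rng.normal(0.40, 0.02, 5000)
inclined = 0.45 - rng.gamma(shape=2.0, scale=0.02, size=5000)  # left-skewed

print(f"orthogonal skew: {skew(orthogonal):+.2f}")  # near zero
print(f"inclined skew:   {skew(inclined):+.2f}")    # clearly negative
```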
The variation in point-to-point intensity behavior within the same color patch in Figure 9 is largely attributable to differences in geometric range and incidence angle, despite identical reflectivity patterns. This discrepancy is evident not only for this scanner but also for all other colors scanned by the different scanners (Table 6).
In summary, the arguments above emphasize the complexity of the radiometric calibration of TLS—the consistency of intensities across different color qualities and the inconsistency of intensities among individual points within the same color quality. To address these challenges, the behavior of individual points must be considered, first to enhance accuracy through a clearer distinction between color-dependent reflectivities (intensities) (Section 6.2), and second to reduce the sensitivity of point reflectivity to range and incidence angle (precision), leading to consistent (standardized) intensity values (Section 6.3).

6.2. Pointwise Radiometric Calibration Using LiDAR Range Equation

The pre-processing analyses emphasized the importance of pointwise intensity modeling for 4D TLS radiometric calibration, as intensity values of individual points are influenced by variations in three factors—reflectivity, range, and incidence angle. This implies that reflectivity, with respect to the target texture (i.e., color dependency), must be integrated into the pointwise geometric calibration practice. To achieve this objective, the elaborated methodologies were developed using the LiDAR range equation through a refined determination of the LiDAR cross-section for the specific target quality (i.e., color). This ultimately guarantees the highest accuracy in reflectivity estimation and facilitates a clearer distinction of intensities among different color targets (Equations (7) and (8)).
The LiDAR range equation establishes the relative connection between received and transmitted power, incorporating the following two areal ratios to determine the intensity of the signal strength:
(1) At the receiver (i.e., TLS) location: the area of the receiver relative to the effective average area illuminated by the reflection from the target, $\pi r^{2}$.
(2) At the target location: the area of the LiDAR cross-section relative to the illumination area (i.e., particular attention must be drawn to determining the non-physical area of the cross-section).
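For reference, these two areal ratios are commonly combined into the standard form of the LiDAR range equation for an extended Lambertian target. The paper's own Equations (6)–(8) are not reproduced in this excerpt, so the symbols below follow the widely used formulation and should be read as a sketch, not the authors' exact parameterization:

```latex
% P_r: received power, P_t: transmitted power, D_r: receiver aperture
% diameter, r: range, beta_t: transmitter beam width, eta: system and
% atmospheric efficiency, sigma: LiDAR cross-section.
P_r = \frac{P_t\, D_r^{2}}{4 \pi r^{4} \beta_t^{2}}\, \eta\, \sigma ,
\qquad
\sigma = \frac{4\pi}{\Omega}\, \gamma_d\, A_s \cos\alpha ,
```

where $\Omega$ is the scattering solid angle ($\pi$ sr for a Lambertian scatterer), $\gamma_d$ the diffuse reflectance, $A_s$ the illuminated area, and $\alpha$ the incidence angle.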
As an example, Figure 10 shows the variation of reflectivity (the major influential radiometric factor for individual point intensity observations) for a randomly selected color (yellow) under the orthogonal scanning condition for the Leica ScanStation P50.
A further analysis of the reflectivity patterns for the individual points across different colors on the Macbeth color chart under both scanning conditions demonstrated that the returned signal power is consistent across all colors, resulting in approximately identical reflectivity behavior. In other words, the intensities obtained from the scanners are insensitive and invariant to the color properties of the target. This uniformity can be attributed to surfaces exhibiting predominantly diffuse reflectance (100% diffuse reflectance, $\gamma_d \approx 1$) (Equation (6)), producing similar responses for points with comparable textures. These findings support the adoption of a texture-dependent LiDAR cross-section technique to more effectively differentiate intensities across different surface targets (Equation (7)).
Considering the beam divergence and wavelength of the Leica ScanStation P50 (Table 1), combined with the standard intensity values of specific colors (Table 2), the color-dependent LiDAR cross-section area was determined exclusively for the points within the yellow patch (Figure 11). Given the comparison between the single-point LiDAR cross-section (Figure 11a) and the two other geometric components of the points (Figure 11b,c), the pointwise cross-section is more influenced by the range and incidence angle after the implementation of the approach. Importantly, the approach enables a better reproduction of color variations (Figure 12).
The uncertainties in each dataset before and after applying the implemented technique are listed in Table 7. The results deliver an acceptable level of improvement, between 31% and 49%, across the different terrestrial laser scanners (Equation (13)). Interestingly, the results confirm the consistency of the proposed algorithm for the scanner operating at a shorter wavelength (Leica ScanStation C10) and those at longer wavelengths (the remaining scanners), which demonstrate approximately identical accuracy afterward (0.09 and 0.10 for the orthogonal and inclined conditions, respectively). This outcome confirms that the effect of geometric characteristics at the single-point level between the two scanning conditions has been addressed through the careful formulation of the LiDAR range equation.
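The reported percentage improvements refer to the paper's Equation (13), which is not reproduced in this excerpt; the sketch below assumes the usual form, the relative reduction in standard deviation, with purely illustrative values:

```python
def improvement_pct(sigma_before, sigma_after):
    """Relative reduction in standard deviation, in percent
    (assumed form of the paper's Equation (13))."""
    return 100.0 * (sigma_before - sigma_after) / sigma_before

# Illustrative values only: a sigma reduced from 0.20 to 0.10 is a 50% gain.
print(f"{improvement_pct(0.20, 0.10):.0f}%")
```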
Here, to separate the color intensities in terms of the radiometric resolution in the TLS error model, the LiDAR range equation was proposed to guarantee the required level of precision in reflectivity through the careful determination of the LiDAR cross-section. Thereafter, pointwise intensity modeling plays a critical role in achieving accurate intensity values for any sample target—not only for color properties, but also for other external formations and internal structures of targets. The following section elaborates on the standard radiometric practices through the weightings on the spatial range and incidence angle of a single point observation. These optimized values aim to attach real-world radiometric attributes to every pulse as the fourth dimension of the 3D spatial point cloud.

6.3. Pointwise Radiometric Calibration Using a Neural Network

The radiometric calibration practice based on the amended LiDAR range equation—through the effective application of the color-dependent LiDAR cross-section—shows a substantial accuracy improvement for pointwise intensity modeling, between 31% and 49% compared with the standard intensities of each color across the four devices. These results assist in distinguishing intensity values according to the target characteristics (e.g., color) to better reflect individual color reprojection. However, more accurate intensity values do not necessarily generate consistent (precise) radiometric values within an identical color quality. Additionally, the identical a posteriori accuracy presented in Table 7 verifies that the color-dependent LiDAR range equation only partially improves precision by reducing the dependency between intensity and target–scanner geometry.
For further validation, the data-driven neural network, developed in four steps as described in the Methods section, was applied to each acquired dataset, and a precision evaluation was performed for each color relative to the neutral color patches selected as references. As an example, a comparison between the point-to-point measured intensities and those derived from the neural network prediction is illustrated for a randomly selected color patch (yellow) from the Leica ScanStation P50 under both orthogonal and inclined conditions (Figure 13).
The findings reveal that modifying the reflectivity pattern of the surface according to the intrinsic color target, and further correcting (or validating) the resulting values with the neural network, mitigates the impact of the single-point geometric range and incidence angle within the same color patch. The framework ultimately leads to precise (standard) intensity values for the single-color patch, incorporating all geometric and radiometric characteristics of the points. Second, the near-identical intensities under both scanning conditions demonstrate enhanced consistency for the selected color regardless of the scanning condition and target geometry.
Subsequently, the histograms for all colors are depicted in Figure 14. Compared to Figure 9, the results exhibit narrower standard deviations (Table 8). The pointwise calibration approaches a normal Gaussian distribution for both scanning conditions, rather than the previously skewed distribution under the inclined scanning condition, which is no longer dominated by the weaker responses at steep incidence angles. Table 8 summarizes the precision evaluation using Equation (13) for the intensities obtained through the data-driven radiometric calibration. The improvements in precision are approximately 97% or higher in comparison with the measured intensities.
In summary, the calibration strategies for the fourth dimension of TLS (radiometric attribution) exhibit improvements of at least 31% in the accuracy of reflectivity estimation—before and after applying the LiDAR range equation—against the standard intensities of each color, and at least 97% in precision—before and after the validation process using the data-driven neural network—against the neutral reference plane within the same dataset. These benefits are achieved by first addressing color-dependent reflectivity and subsequently minimizing the sensitivity of intensity values to variations in spatial properties. This approach standardizes intensity values for points sharing identical radiometric characteristics but differing spatial attributes. It further underlines a conceptual calibration advancement in TLS error analysis: the 3D point coordinates of TLS can be complemented with a 1D intensity coordinate, providing a pathway towards fully standardized 4D point cloud calibration. This improvement carries future implications for TLS applications where radiometric consistency is essential, such as characterizing the internal properties (e.g., material and roughness) and external topographies (e.g., edges and tilts) of targets.

7. Discussions on Reflectivity (Intensity)

In this section, the two proposed methodologies—the color-dependent LiDAR range equation and the data-driven neural network—are compared in terms of validating the pointwise intensity results. As discussed, the results of the color-focused LiDAR range equation only partially enhance the precision in Table 7, which indicates that the reflectivity pattern is the dominant factor. The similarity in the reflectivity patterns shown in Figure 15 demonstrates that the data-driven approach can reliably reproduce the physically characterized parameterization of the LiDAR range equation within reasonable standard variations (referred to here as color reproduction from the signal pulse). Nevertheless, some points recorded a very irregular pattern of reflection, even after applying a 99.7% confidence level for outlier detection. This is attributed, first, to the irregular impact of range and incidence angle on single-point reflectivity, as expected; for this reason, the data-driven method is applied to create homogeneity by imposing spatial weightings. Second, the irregularities might stem from other optical effects, such as scattering and refraction, which could be included as additional parameters in future 4D calibration studies. In general, comparable trends occurred for the remaining colors across the different scanner systems and scanning conditions.
The proximity of the average intensities under the two scanning conditions indicates that the spatial weights applied to the geometric range and incidence angle of an individual point, combined with the corresponding surface-dependent reflectivity of that point, bring positive outcomes (Figure 16). In particular, these findings reduce the variabilities that were recorded as significantly high at larger incidence angles (i.e., equally consistent intensities for each color regardless of the geometric target/scanning conditions).
Overall, the results from the proposed calibration framework address the following two key scenarios: (1) improved real-world reflectivity characterization, and (2) more precise (consistent) intensity values across different target texture qualities. For the first aspect, since reflectivity is often the primary indicator for identifying the internal characteristics of a target, a thorough understanding of the laser cross-section for any given target can significantly enhance the accurate determination of the received power amount. For example, in applications such as soil moisture detection, vegetation health assessment, surface material classification, and monitoring of painted or coated infrastructures (e.g., bridges, buildings, or road markings), texture-dependent reflectivity according to the signal pulse plays a critical role in distinguishing between radiometric variations. For the second aspect, the framework supports calibrated intensity as a reliable fourth dimension of the point cloud (i.e., the robust foundation for standardized 4D point cloud analysis across various LiDAR devices with multiple measurement configurations obtained from different platforms).

8. Conclusions and Future Investigations

This study aimed to investigate accurate radiometric values for 3D point clouds. The fourth dimension of the point cloud is assumed to be the reliable representation of signal intensity. Multiple factors, such as range, incidence angle, and reflectivity, contribute to the signal intensity. Note that not all points belonging to the same target quality reflect the signal uniformly. Under the proposed methodology, the innovative pointwise radiometric calibration addresses the spatial and radiometric conditions of individual points in order to enhance the accuracy and precision of intensity values for the points that often have the same target characteristics, accounting for the inherent physical properties of the laser and target geometry for better color reproduction of the signal pulse.
As already expressed, previous studies focused on applying mean intensity values to targets with the same texture (i.e., using fitted planes through the laser radar range equation), which entirely ignores the significance of LiDAR cross-section determination and the spatial conditions of single point measurements. The current work, for the first time, concentrates on the individual point observation to obtain more accurate and precise intensities using the LiDAR range equation through the appropriate resolution of the color-dependent LiDAR cross-section. As the initial step, reflectivity estimation is pursued to establish standard and accurate reflectivity with respect to the color variations on the Macbeth color chart. Furthermore, by relying on the developed neural network algorithm, the precision of each optimized value is verified, providing meaningful radiometric values attached to each color while taking the single-pulse geometric conditions into account.
For the data collection steps, four terrestrial laser scanners—Leica ScanStation P50, Leica ScanStation C10, Leica RTC360 (Leica Geosystems, Heerbrugg, Switzerland), and Trimble X9 (Trimble, Westminster, CO, USA)—were used. The highest default resolution of each scanner under “Scan Only” was chosen during scanning. In addition, the calibration room was isolated from any external illumination interference, and observations were acquired under similar procedures at two fixed locations: orthogonal (presumed as the ideal condition) and inclined (incidence angle larger than 60°) scanning conditions with respect to the Macbeth color chart. Reasonable improvements were observed in the accuracy and precision of the intensity values for each color patch: accuracy improvements between 31% and 49% against the standard intensities of each color, and precision improvements of 97% against the neutral reference plane within the same dataset, across the different scanner devices and scanning conditions. The implemented techniques introduce a novel standardization of intensity values for points sharing equivalent color textures obtained under varying geometric range and incidence angle.
For future work, the entire proposed algorithm has the potential to be applied to a wide range of target geometries (e.g., internal object- and external surface-related features), predominantly leading to visible distinctions associated with identical target geometries. To advance this in a more sophisticated manner, further theoretical parameterization of the texture-dependent LiDAR cross-section is strongly recommended, particularly with reference to the proportion of incident energy returned by an ideal Lambertian surface. The Lambertian assumption theoretically simplifies the research; however, it must be complemented by the other optical phenomena that might be reflected as the noise shown in Figure 15.

Author Contributions

Conceptualization, M.S. and L.P.; methodology, M.S.; software, M.S.; validation, M.S. and L.P.; formal analysis, M.S.; investigation, M.S. and L.P.; resources, M.S. and L.P.; data curation, M.S.; writing—original draft preparation, M.S.; writing—review and editing, L.P.; visualization, M.S.; supervision, L.P.; project administration, M.S.; funding acquisition, L.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by an Australian Government Research Training Program (RTP) Scholarship.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author(s).

Acknowledgments

The authors would like to sincerely thank Colin Draper, owner of the Survey Instrument Company in Newcastle, NSW, who provided access to some equipment. The University of Newcastle has established an effective relationship with Seam Spatial, located at Bongaree, Queensland, and the authors would also like to express their sincere appreciation to Peter Sergeant, survey manager (NSW region) of Seam Spatial, for his effort. We thank the Civil Surveying Lab team of the University, particularly Todd Wills, who efficiently supported the authors.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A. Perceptual Intensity Equation

The RGB values can be converted to perceptual intensity I using the following weighted, normalized equation:
I = (0.2989 R + 0.587 G + 0.114 B) / 255
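As a minimal sketch (ours; the function name and the 8-bit input range are assumptions), the conversion reads:

```python
def perceptual_intensity(r: int, g: int, b: int) -> float:
    """Convert 8-bit RGB values to a normalized perceptual intensity in [0, 1],
    using the weighted sum (0.2989 R + 0.587 G + 0.114 B) / 255."""
    return (0.2989 * r + 0.587 * g + 0.114 * b) / 255.0

# Pure white maps to ~1.0 and pure black to 0.0.
print(perceptual_intensity(255, 255, 255))  # ~0.9999
print(perceptual_intensity(0, 0, 0))        # 0.0
```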

Appendix B. Pre-Processing Steps

Figure A1. Average intensity values (dimensionless) for each color on the Macbeth color chart against the normalized standard values (dataset: (a) Leica ScanStation P50, (b) Leica RTC360, and (c) Trimble X9).
Figure A2. Histograms of intensities versus the number of observations for each color on the Macbeth color chart (dataset: Leica ScanStation C10; scanning condition: (a) orthogonal and (b) inclined; and reference color patch: Neutral 6.5).
Figure A3. Histograms of intensities versus the number of observations for each color on the Macbeth color chart (dataset: Leica RTC360; scanning condition: (a) orthogonal and (b) inclined; and reference color patch: Neutral 8).
Figure A4. Histograms of intensities versus the number of observations for each color on the Macbeth color chart (dataset: Trimble X9; scanning condition: (a) orthogonal and (b) inclined; and reference color patch: Neutral 6.5).

Appendix C. Pointwise Radiometric Calibration Using the LiDAR Range Equation

Figure A5. Average color-dependent LiDAR cross-section (mm²) (scanning condition: (a) orthogonal and (b) inclined).

Appendix D. Pointwise Radiometric Calibration Using a Neural Network

Figure A6. Histograms of intensities versus the number of observations for each color on the Macbeth color chart (dataset: Leica ScanStation C10; scanning condition: (a) orthogonal and (b) inclined; and reference color patch: Neutral 6.5).
Figure A7. Histograms of intensities versus the number of observations for each color on the Macbeth color chart (dataset: Leica RTC360; scanning condition: (a) orthogonal and (b) inclined; and reference color patch: Neutral 8).
Figure A8. Histograms of intensities versus the number of observations for each color on the Macbeth color chart (dataset: Trimble X9; scanning condition: (a) orthogonal and (b) inclined; and reference color patch: Neutral 6.5).

References

  1. McManamon, P.F. Field Guide to Lidar; SPIE Press: Bellingham, WA, USA, 2015. [Google Scholar]
  2. National Research Council (NRC). Laser Radar: Progress and Opportunities in Active Electro-Optical Sensing; The National Academies Press: Washington, DC, USA, 2014. [Google Scholar]
  3. Reshetyuk, Y. Self-Calibration and Direct Georeferencing in Terrestrial Laser Scanning. Ph.D. Thesis, KTH University, Stockholm, Sweden, 2009. [Google Scholar]
  4. Soudarissanane, S.; Ree, J.V.; Bucksch, A.; Lindenbergh, R. Error Budget of Terrestrial Laser Scanning: Influence of the Incidence Angle on the Scan Quality. In Proc. of the 10. Anwendungsbezogener Workshop zur Erfassung, Modellierung, Verarbeitung und; Gesellschaft zur Forderung Angewandter Informatik: Berlin, Germany, 2006. [Google Scholar]
  5. Soudarissanane, S.; Lindenbergh, R.; Gorte, B. Reducing the errors in terrestrial laser scanning by optimizing the measurement set-up. In Proceedings of the International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Beijing, China, 3–11 July 2008. [Google Scholar]
  6. Sabzali, M.; Pilgrim, L. Hybrid Atmospheric Modeling of Refractive Index Gradients in Long-Range TLS-Based Deformation Monitoring. Remote Sens. 2025, 17, 3513. [Google Scholar] [CrossRef]
  7. Tan, K.; Cheng, X. Intensity data correction based on incidence angle and distance for terrestrial laser scanner. J. Appl. Remote Sens. 2015, 9, 094094. [Google Scholar] [CrossRef]
  8. Kaasalainen, S.; Jaakkola, A.; Kaasalainen, M.; Krooks, A.; Kukko, A. Analysis of Incidence Angle and Distance Effects on Terrestrial Laser Scanner Intensity: Search for Correction Methods. Remote Sens. 2011, 3, 2207–2221. [Google Scholar] [CrossRef]
  9. Soudarissanane, S.; Lindenbergh, R.; Menenti, M.; Teunissen, P. Incidence Angle Influence on the Quality of Terrestrial Laser Scanning Points. In Proceedings of the Laser Scanning 2009, Paris, France, 1–2 September 2009; Bretar, F., Pierrot-Deseilligny, M., Vosselman, G., Eds. [Google Scholar]
  10. Krooks, A.; Kaasalainen, S.; Hakala, T.K.H. Correction of intensity incidence angle effect in terrestrial laser scanning. In Proceedings of the ISPRS Workshop Laser Scanning, Antalya, Turkey, 11–13 November 2013. [Google Scholar]
  11. Soudarissanane, S.; Lindenbergh, R.C. Optimizing terrestrial laser scanning measurement set-ups. In Proceedings of the International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Calgary, AB, Canada, 29–31 August 2011. [Google Scholar]
  12. Soudarissanane, S.; Lindenbergh, R.; Menenti, M.; Teunissen, P. Scanning geometry: Influencing factor on the quality of terrestrial laser scanning points. ISPRS J. Photogram. Remote Sens. 2011, 66, 389–399. [Google Scholar] [CrossRef]
  13. Pfeifer, N.; Dorininger, P.; Haring, A.; Fan, H. Investigating terrestrial laser scanning intensity data: Quality and functional relations. In Proceedings of the International Conference on Optical 3-D Measurement Techniques VIII, Zurich, Switzerland, 9–12 July 2007; pp. 328–337. Available online: http://hdl.handle.net/20.500.12708/42100 (accessed on 15 September 2025).
  14. Voisin, S.; Foufou, S.; Truchetet, F.; Page, D.; Abidi, M. Study of ambient light influence for three-dimensional scanners based on structured light. Opt. Eng. 2007, 46, 030502. [Google Scholar] [CrossRef]
  15. Clark, J.; Robson, S. Accuracy of measurements made with a Cyrax 2500 Laser Scanner against surfaces of known color. Surv. Rev. 2004, 37, 626–638. [Google Scholar] [CrossRef]
  16. Bolkas, D.; Martinez, A. Effect of target color and scanning geometry on terrestrial LiDAR point-cloud noise and plane fitting. J. Appl. Geod. 2017, 12, 109–127. [Google Scholar] [CrossRef]
  17. Yaman, A.; Yilmaz, H.M. The effects of object surface colors on terrestrial laser scanners. Int. J. Eng. Geosci. 2017, 2, 68–74. [Google Scholar] [CrossRef]
  18. Stal, C.; De Maeyer, P.; De Ryck, M.; De Wulf, A.; Goossens, R.; Nuttens, T. Comparison of Geometric and Radiometric Information from Photogrammetry and Color-Enriched Laser Scanning. In Proceedings of the FIG Working Week 2011: Bridging the Gap Between Cultures, Marrakech, Morocco, 18–22 May 2011. [Google Scholar]
  19. Wujanz, D. Intensity calibration method for 3D laser scanner. J. N. Z. Institute Surv. 2009, 299, 7–13. [Google Scholar]
  20. Julin, A.; Kurkela, M.; Rantanen, T.; Virtanen, J.P.; Maksimainen, M.; Kukko, A.; Kaartinen, H.; Vaaja, M.T.; Hyyppa, J.; Hyyppa, H. Evaluating the Quality of TLS Point Cloud Colorization. Remote Sens. 2020, 12, 2748. [Google Scholar] [CrossRef]
  21. Wang, Z.; Varga, M.; Medic, T.; Wieser, A. Assessing the alignment between geometry and colors in TLS colored point clouds. In Proceedings of the ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Science, Cairo, Egypt, 2–7 September 2023. [Google Scholar]
  22. Balaguer-Puig, M.; Molada-Tebar, A.; Marques-Mateu, A.; Lerma, J.L. Characterization of Intensity Values on Terrestrial Laser Scanning for Recording Enhancement. In Proceedings of the International Archives of the Photogrammetry, Remote Sensing and Spatial Information Science, Ottawa, ON, Canada, 28 August–1 September 2017. [Google Scholar]
  23. Sabzali, M.; Pilgrim, L. New Parametrization of Bundle Block Adjustment for Self-Calibration of Terrestrial Laser Scanner (TLS). Photogrammetric Rec. 2025, 40, 40009. [Google Scholar]
  24. Lichti, D.D.; Harvey, B.R. The effects of reflecting surface material properties on time-of-flight laser scanner measurements. In Proceedings of the Symposium on Geospatial theory, Processing and Applications, Ottawa, ON, Canada, 8–12 July 2002. [Google Scholar]
  25. Tonietto, L.; Gonzaga, L.J.; Veronez, M.R.; Kazmierczak, C.D.S.; Arnold, D.C.C.; Costa, C.A.D. New Method for Evaluating Surface Roughness Parameters Acquired by Laser Scanning. Sci. Rep. 2019, 9, 15038. [Google Scholar] [CrossRef] [PubMed]
  26. Berger, A.B. State of the art in surface reconstruction from point clouds. Eurographics Star Rep. 2014, 1, 161–185. [Google Scholar]
  27. Mah, J.; Samson, C.; McKinnon, S.D.; Tibodeau, D. 3D laser imaging for surface roughness analysis. Int. J. Rock Mech. Min. Sci. 2013, 58, 111–117. [Google Scholar] [CrossRef]
  28. Moreau, N.; Roudet, C.; Gentil, C. Study and Comparison of Surface Roughness Measurements. In Journées du Groupe de Travail en, Lyon, 2014. Available online: https://hal.science/file/index/docid/1068988/filename/articleGTMG2014_Moreau_09_03_soir.pdf (accessed on 15 September 2025).
  29. Silva, B.B.D.; Braga, A.C.; Braga, C.C.; Olivera, L.M.M.D.; Montenegro, S.M.G.L.; Junior, B.B. Procedure for calculation of the albedo with OLI-Landsat 8 images: Application to Brazilian semi-arid. Rev. Bras. Eng. Agric. Ambient. 2016, 20, 3–8. [Google Scholar] [CrossRef]
  30. Maar, H.; Zogg, H.M. WFD-Wave Form Digitizer Technology White Paper. Leica Geosystem, 2021. Available online: https://naic.nrao.edu/arecibo/phil/hardware/theodolites/leicaDoc/Leica%20Nova%20Documentation/White%20Paper/WFD%20Technology/Leica_Nova_MS50_WFD-Wave_Form_Digitizer_Technology_en.pdf (accessed on 15 September 2025).
  31. Tan, K.; Cheng, X.; Cheng, X. Modeling hemispherical reflectance for natural surfaces based on terrestrial laser scanning backscattered intensity data. Opt. Soc. Am. 2016, 24, 22971–22988. [Google Scholar] [CrossRef] [PubMed]
  32. Tan, K.; Zhang, W.; Shen, F.; Cheng, X. Investigation of TLS Intensity Data and Distance Measurement Errors from Target Specular Reflections. J. Remote Sens. 2018, 10, 1077. [Google Scholar] [CrossRef]
  33. McManamon, P.F. LiDAR Technologies and Systems; SPIE Press: Bellingham, WA, USA, 2019. [Google Scholar]
  34. Jenn, D. Radar and Laser Cross Section Engineering; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2019. [Google Scholar]
  35. Bishop, C. Pattern Recognition and Machine Learning; Springer: Berlin/Heidelberg, Germany, 2006. [Google Scholar]
  36. Rishi, S. The Guide to the Expression of Uncertainty in Measurement (GUM)—The New Approach; IOP Publishing Ltd.: Bristol, UK, 2024. [Google Scholar]
  37. Houghton, T. Harnessing Multiscale Non-imaging Optics for Automotive Flash LiDAR and Heterogenous Semiconductor Integration. Ph.D. Thesis, Arizona State University, Tempe, AZ, USA, 2020. [Google Scholar]
  38. Freeman, O.J.; Williamson, C.A. Visualizing the trade-offs between laser eye protection and laser eye dazzle. J. Laser Appl. 2020, 32, 012008. [Google Scholar] [CrossRef]
Figure 1. Incidence angle α and range r attached to the single measured point P [11].
Figure 2. The pts format of TLS deliverables (dataset: Leica ScanStation C10; color: yellow). A randomly selected point cloud is shown for demonstration purposes only, visualized in CloudCompare v2.7.0 (https://www.cloudcompare.org; accessed on 15 September 2025).
Figure 3. Different patterns of surface reflectivity (i.e., diffuse or specular reflection [32]).
Figure 4. (a) Point targets, and (b) extended targets [33].
Figure 5. (a) Leica ScanStation P50, (b) Leica ScanStation C10, (c) Leica RTC360, and (d) Trimble X9.
Figure 6. Different classifications of the laser and corresponding maximum output power (mW) [37].
Figure 7. The Macbeth color chart [38].
Figure 8. Average intensity values (dimensionless) for each color on the Macbeth color chart against the standard intensity values (dataset: Leica ScanStation C10). Plots for the other scanners are provided in Appendix B.
Figure 9. Histograms of intensity values versus the number of observations for each color on the Macbeth color chart (dataset: Leica ScanStation P50; scanning condition: (a) orthogonal and (b) inclined; and reference color patch: Neutral 5). Plots for the other scanners are provided in Appendix B.
Figure 10. Relationship between individual observed points and reflectivity (dimensionless) (dataset: Leica ScanStation P50; scanning condition: orthogonal; and color: yellow).
Figure 11. Relationship between individual observed points and (a) color-dependent LiDAR cross-section (mm²), (b) range (m), and (c) incidence angle (rad) (dataset: Leica ScanStation P50; scanning condition: orthogonal; and color: yellow). For further clarification, the average color-dependent LiDAR cross-sections for each scanner are plotted under both scanning conditions (Figure A5 in Appendix C).
Figure 12. Average intensity values (dimensionless) for each color on the Macbeth color chart (dataset: (a) Leica ScanStation C10, (b) Leica ScanStation P50, (c) Leica RTC360, and (d) Trimble X9).
Figure 13. Pointwise intensity modeling using a data-driven neural network (dataset: Leica ScanStation P50; color: yellow; and scanning condition: (a) orthogonal and (b) inclined).
Figure 14. Histograms of intensities versus the number of observations for each color on the Macbeth color target (dataset: Leica ScanStation P50; scanning condition: (a) orthogonal and (b) inclined; and reference color patch: Neutral 5). Plots for the other scanners are provided in Appendix D.
Figure 15. Pointwise reflectivity modeling (dataset: Leica ScanStation P50; color: yellow; and scanning condition: (a) orthogonal and (b) inclined).
Figure 16. Average intensity values (dimensionless) for each color on the Macbeth color target (dataset: (a) Leica ScanStation C10, (b) Leica ScanStation P50, (c) Leica RTC360, and (d) Trimble X9).
Table 1. Technical radiometric specifications of terrestrial laser scanners for this experiment.
| Specification | Leica ScanStation P50 1 | Leica ScanStation C10 2 | Leica RTC360 3 | Trimble X9 4 |
| --- | --- | --- | --- | --- |
| Laser class | 1 | 3R | 1 | 1 |
| Wavelength | from 685 nm to 1550 nm | 532 nm | 1550 nm | from 1530 nm to 1570 nm |
| Initial beam diameter | 3.5 mm (FWHM) | — | 6 mm | — |
| Spot size * | — | from 0–50 m: 4.5 mm (FWHH-based); 7 mm (Gaussian-based) | — | 7.95 mm at 10 m |
| Beam divergence | <0.23 mrad (FWHM, full angle) | 0.09 mrad | 0.5 mrad (full angle) | 0.8 mrad |
* The difference between spot size and initial beam diameter is that the spot size is the size of the laser pulse at the target location, while the initial beam diameter is the diameter of the beam at the transmitter location. The spot size can be converted to the beam divergence ϑ (in radians) at any assumed range r via ϑ ≈ spot size / r. 1 https://leica-geosystems.com/products/laser-scanners/scanners/leica-scanstation-p50 (accessed on 15 September 2025). 2 https://leica-geosystems.com/-/media/files/leicageosystems/products/brochures/leica_scanstation_c5_bro.ashx?la=en (accessed on 15 September 2025). 3 https://leica-geosystems.com/products/laser-scanners/scanners/leica-rtc360 (accessed on 15 September 2025). 4 https://geospatial.trimble.com/de/links?dcs=Collection-133328 (accessed on 15 September 2025).
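The footnote's small-angle relation can be sketched as follows (ours; the function name and the optional initial-diameter term are illustrative additions):

```python
def spot_size_mm(divergence_mrad: float, range_m: float,
                 initial_diameter_mm: float = 0.0) -> float:
    """Approximate laser spot size (mm) at a given range from the small-angle
    relation: spot ~ initial diameter + divergence x range (full-angle
    divergence; mrad x m conveniently yields mm)."""
    return initial_diameter_mm + divergence_mrad * range_m

# A 0.8 mrad full-angle beam (Trimble X9 column above) spreads to about 8 mm at 10 m.
print(spot_size_mm(0.8, 10.0))  # 8.0
```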
Table 2. Standard intensity values for each color on the Macbeth color chart (dimensionless).
| Dark skin (DS) | Light skin (LS) | Blue sky (BS) | Foliage (F) | Blue flower (BF) | Bluish green (BG) |
| --- | --- | --- | --- | --- | --- |
| 0.35 | 0.63 | 0.65 | 0.38 | 0.53 | 0.62 |
| Orange (O) | Purplish blue (PB) | Moderate red (MR) | Purple (P) | Yellow green (YG) | Orange yellow (OY) |
| 0.57 | 0.41 | 0.49 | 0.29 | 0.66 | 0.70 |
| Blue (B) | Green (G) | Red (R) | Yellow (Y) | Magenta (M) | Cyan (C) |
| 0.27 | 0.46 | 0.38 | 0.76 | 0.48 | 0.38 |
| White (W) | Neutral 8 (N8) | Neutral 6.5 (N6.5) | Neutral 5 (N5) | Neutral 3.5 (N3.5) | Black (Bl) |
| 0.95 | 0.79 | 0.63 | 0.48 | 0.33 | 0.20 |
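As a plausibility check (ours, not from the paper), applying the Appendix A equation to an assumed nominal sRGB triplet of roughly (243, 243, 242) for the White patch reproduces the 0.95 entry above; the RGB values are an assumption, not a source figure:

```python
# Appendix A equation applied to an assumed nominal sRGB triplet for the
# Macbeth "White" patch (approximately (243, 243, 242) in published charts).
i_white = (0.2989 * 243 + 0.587 * 243 + 0.114 * 242) / 255
print(round(i_white, 2))  # 0.95
```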
Table 3. The highest default resolution employed for the data experiments.
| Leica ScanStation P50 | Leica ScanStation C10 | Leica RTC360 | Trimble X9 |
| --- | --- | --- | --- |
| 0.8 mm at 10 m | 2 mm at 10 m | 3 mm at 10 m | 3 mm at 10 m |
Table 4. Comparison of standard deviations (±1σ) between the observed radiometric values.
| Scanning Conditions | Standard Deviation | Leica ScanStation P50 | Leica ScanStation C10 | Leica RTC360 | Trimble X9 |
| --- | --- | --- | --- | --- | --- |
| Orthogonal | Measured intensity | 0.179 | 0.134 | 0.168 | 0.176 |
| Orthogonal | Computed intensity from RGB | 0.178 | 0.132 | 0.167 | 0.176 |
| Inclined | Measured intensity | 0.181 | 0.179 | 0.187 | 0.189 |
| Inclined | Computed intensity from RGB | 0.182 | 0.173 | 0.189 | 0.189 |
Table 5. Average intensity values with the corresponding number of points (NP) for each color target on the Macbeth color chart.
TLSsLeica
ScanStation P50
Leica
ScanStation C10
Leica
RTC360
Trimble
X9
SC 1/NPONPINPONPINPONPINPONPINP
DS0.45816,3280.52311,2270.45925,5910.55412,4610.6300.1370.7246810.63218,9760.7915425
LS0.44615,8160.49910,7230.55825,9000.70011,6540.7260.1280.7477630.55018,9230.7095262
BS0.44716,1480.49210,3280.46726,0960.65610,1160.6090.1460.6689880.60719,1810.7454042
F0.44716,2850.43710,4190.45724,0210.62496560.7210.1450.6988400.69419,3810.7454457
BF0.45816,5480.50010,2580.47024,9690.64788950.5910.1320.7358140.66919,2980.7724049
BG0.45216,4220.55899070.57024,5570.59379430.6510.1280.7487960.68219,0830.7853386
O0.44416,2710.46810,9880.45125,1680.65513,0980.6770.1470.7477290.66918,8030.7685880
PB0.45116,5600.51310,9410.44826,2040.54111,4340.7240.1230.7325610.55918,8680.7255128
MR0.46416,8230.49010,2420.46126,2740.55497540.7730.1640.6167550.61519,3630.7613981
P0.44216,9520.45110,3230.45226,2330.49097530.6840.1520.7146280.56319,3620.7914632
YG0.47116,7500.53410,4540.71625,2530.55992590.6570.1550.7705930.59019,2250.7153992
OY0.45216,2880.56898990.51025,2300.69077820.7640.1290.7456790.61518,9890.8033222
B0.44616,4400.46310,9270.47524,3750.54312,8400.6950.1260.7226990.59218,7910.7675761
G0.48116,7020.47810,8260.48124,3540.61711,9260.7700.1320.7216410.61919,1310.7485275
R0.46216,6050.50810,6350.45725,5150.43095440.7030.1250.7288010.59519,2140.7423996
Y0.47216,8500.53910,8770.67324,9380.57110,0880.7230.1290.7446320.56419,1920.7954704
M0.47916,5960.48110,3610.45024,6590.50391510.7480.1450.6865850.59919,2520.7024008
C0.49116,5570.54297810.45024,7240.63079270.6230.1310.6977050.59718,9790.6963151
W0.46516,5950.46611,1760.58024,5890.57912,6680.6400.1480.7397570.59618,5390.6905571
N80.48016,7980.47910,9320.66724,9210.50911,8080.7030.1520.7215740.65419,0150.7195255
N6.50.47316,6590.49910,4670.47424,2640.64310,2980.7230.1360.7907560.60519,1920.7634155
N50.47716,7050.48110,4750.46624,5430.68910,0310.7620.1640.7246400.61219,1400.8254731
N3.50.47016,6820.51510,5940.44524,6740.65091300.7220.1380.7157420.61219,0490.7803870
Bl0.47616,8800.55910,3320.46724,5880.55779690.6550.1520.6886200.51618,8330.7233485
1 SC stands for scanning conditions, including two types, Orthogonal (O) and Inclined (I).
Table 6. Precision (±1σ) of intensity for each color target on the Macbeth color chart.
| Color patch | P50 Orth. | P50 Incl. | C10 Orth. | C10 Incl. | RTC360 Orth. | RTC360 Incl. | X9 Orth. | X9 Incl. |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Dark skin (DS) | 0.142 | 0.144 | 0.099 | 0.107 | 0.137 | 0.147 | 0.128 | 0.107 |
| Light skin (LS) | 0.158 | 0.137 | 0.143 | 0.161 | 0.128 | 0.172 | 0.151 | 0.132 |
| Blue sky (BS) | 0.147 | 0.141 | 0.127 | 0.154 | 0.146 | 0.168 | 0.139 | 0.129 |
| Foliage (F) | 0.155 | 0.133 | 0.152 | 0.138 | 0.145 | 0.159 | 0.109 | 0.128 |
| Blue flower (BF) | 0.149 | 0.140 | 0.141 | 0.146 | 0.132 | 0.177 | 0.119 | 0.165 |
| Bluish green (BG) | 0.144 | 0.156 | 0.121 | 0.150 | 0.128 | 0.169 | 0.119 | 0.143 |
| Orange (O) | 0.144 | 0.130 | 0.115 | 0.145 | 0.147 | 0.191 | 0.114 | 0.103 |
| Purplish blue (PB) | 0.144 | 0.143 | 0.108 | 0.108 | 0.123 | 0.172 | 0.134 | 0.130 |
| Moderate red (MR) | 0.144 | 0.139 | 0.103 | 0.113 | 0.164 | 0.161 | 0.120 | 0.117 |
| Purple (P) | 0.141 | 0.127 | 0.104 | 0.099 | 0.152 | 0.163 | 0.126 | 0.128 |
| Yellow green (YG) | 0.142 | 0.157 | 0.125 | 0.143 | 0.155 | 0.165 | 0.138 | 0.149 |
| Orange yellow (OY) | 0.145 | 0.152 | 0.135 | 0.166 | 0.129 | 0.172 | 0.135 | 0.129 |
| Blue (B) | 0.145 | 0.122 | 0.109 | 0.108 | 0.126 | 0.151 | 0.129 | 0.112 |
| Green (G) | 0.144 | 0.135 | 0.134 | 0.139 | 0.132 | 0.194 | 0.129 | 0.139 |
| Red (R) | 0.142 | 0.142 | 0.098 | 0.099 | 0.125 | 0.173 | 0.131 | 0.121 |
| Yellow (Y) | 0.143 | 0.143 | 0.116 | 0.135 | 0.129 | 0.170 | 0.146 | 0.115 |
| Magenta (M) | 0.136 | 0.140 | 0.096 | 0.114 | 0.145 | 0.179 | 0.138 | 0.120 |
| Cyan (C) | 0.137 | 0.148 | 0.137 | 0.146 | 0.131 | 0.170 | 0.141 | 0.132 |
| White (W) | 0.149 | 0.135 | 0.118 | 0.172 | 0.148 | 0.188 | 0.146 | 0.133 |
| Neutral 8 (N8) | 0.149 | 0.137 | 0.129 | 0.143 | 0.152 | 0.178 | 0.146 | 0.149 |
| Neutral 6.5 (N6.5) | 0.142 | 0.137 | 0.133 | 0.160 | 0.136 | 0.200 | 0.136 | 0.123 |
| Neutral 5 (N5) | 0.142 | 0.127 | 0.149 | 0.153 | 0.164 | 0.208 | 0.135 | 0.143 |
| Neutral 3.5 (N3.5) | 0.143 | 0.155 | 0.142 | 0.153 | 0.138 | 0.170 | 0.127 | 0.145 |
| Black (Bl) | 0.140 | 0.155 | 0.113 | 0.120 | 0.152 | 0.180 | 0.134 | 0.152 |

P50 = Leica ScanStation P50; C10 = Leica ScanStation C10; RTC360 = Leica RTC360; X9 = Trimble X9; Orth. = Orthogonal scanning condition; Incl. = Inclined scanning condition.
Table 7. Accuracy (±1σ) of intensity with respect to the standard intensities before and after the adoption of the LiDAR range equation.
| Scanning Conditions | Accuracy | Leica ScanStation P50 | Leica ScanStation C10 | Leica RTC360 | Trimble X9 |
| --- | --- | --- | --- | --- | --- |
| Orthogonal | Before | 0.178 | 0.134 | 0.187 | 0.176 |
| Orthogonal | After | 0.093 | 0.093 | 0.096 | 0.095 |
| Orthogonal | Improvement | 48% | 31% | 49% | 46% |
| Inclined | Before | 0.182 | 0.179 | 0.167 | 0.189 |
| Inclined | After | 0.097 | 0.103 | 0.104 | 0.116 |
| Inclined | Improvement | 47% | 42% | 38% | 39% |
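The improvement rows in Table 7 are the relative reductions of the accuracy standard deviation; a minimal sketch (ours; variable and function names are illustrative) of that arithmetic:

```python
def improvement_pct(sigma_before: float, sigma_after: float) -> int:
    """Relative reduction of the accuracy standard deviation, as a rounded percentage."""
    return round(100 * (sigma_before - sigma_after) / sigma_before)

# Leica ScanStation P50, orthogonal condition (Table 7): 0.178 -> 0.093.
print(improvement_pct(0.178, 0.093))  # 48
```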
Table 8. Precision (±1σ) of intensities with respect to neutral colors in each dataset after data-driven neural network adoption.
| Color patch | P50 Orth. | P50 Incl. | C10 Orth. | C10 Incl. | RTC360 Orth. | RTC360 Incl. | X9 Orth. | X9 Incl. |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Dark skin (DS) | 0.001 | 0.006 | 0.001 | 0.005 | 0.001 | 0.003 | 0.001 | 0.002 |
| Light skin (LS) | 0.001 | 0.009 | 0.002 | 0.011 | 0.002 | 0.005 | 0.002 | 0.003 |
| Blue sky (BS) | 0.001 | 0.009 | 0.001 | 0.010 | 0.001 | 0.006 | 0.001 | 0.003 |
| Foliage (F) | 0.000 | 0.004 | 0.001 | 0.005 | 0.000 | 0.003 | 0.000 | 0.002 |
| Blue flower (BF) | 0.001 | 0.006 | 0.001 | 0.006 | 0.000 | 0.004 | 0.001 | 0.002 |
| Bluish green (BG) | 0.001 | 0.007 | 0.002 | 0.006 | 0.001 | 0.004 | 0.002 | 0.003 |
| Orange (O) | 0.001 | 0.009 | 0.001 | 0.010 | 0.002 | 0.005 | 0.002 | 0.003 |
| Purplish blue (PB) | 0.001 | 0.006 | 0.001 | 0.005 | 0.001 | 0.003 | 0.001 | 0.002 |
| Moderate red (MR) | 0.000 | 0.006 | 0.001 | 0.006 | 0.001 | 0.003 | 0.001 | 0.003 |
| Purple (P) | 0.000 | 0.003 | 0.000 | 0.003 | 0.000 | 0.002 | 0.000 | 0.001 |
| Yellow green (YG) | 0.001 | 0.008 | 0.001 | 0.007 | 0.001 | 0.005 | 0.001 | 0.003 |
| Orange yellow (OY) | 0.002 | 0.008 | 0.002 | 0.008 | 0.002 | 0.005 | 0.002 | 0.003 |
| Blue (B) | 0.001 | 0.004 | 0.001 | 0.004 | 0.001 | 0.002 | 0.001 | 0.001 |
| Green (G) | 0.001 | 0.006 | 0.001 | 0.007 | 0.002 | 0.004 | 0.001 | 0.002 |
| Red (R) | 0.000 | 0.005 | 0.001 | 0.004 | 0.001 | 0.003 | 0.001 | 0.002 |
| Yellow (Y) | 0.001 | 0.010 | 0.001 | 0.009 | 0.002 | 0.005 | 0.001 | 0.004 |
| Magenta (M) | 0.001 | 0.005 | 0.001 | 0.004 | 0.002 | 0.003 | 0.001 | 0.002 |
| Cyan (C) | 0.001 | 0.004 | 0.001 | 0.004 | 0.001 | 0.003 | 0.001 | 0.002 |
| White (W) | 0.002 | 0.010 | 0.002 | 0.010 | 0.003 | 0.005 | 0.002 | 0.003 |
| Neutral 8 (N8) | 0.002 | 0.013 | 0.003 | 0.012 | 0.004 | 0.009 | 0.003 | 0.004 |
| Neutral 6.5 (N6.5) | 0.001 | 0.009 | 0.001 | 0.009 | 0.002 | 0.005 | 0.001 | 0.003 |
| Neutral 5 (N5) | 0.001 | 0.006 | 0.001 | 0.006 | 0.002 | 0.004 | 0.001 | 0.002 |
| Neutral 3.5 (N3.5) | 0.001 | 0.004 | 0.001 | 0.004 | 0.001 | 0.002 | 0.001 | 0.002 |
| Black (Bl) | 0.000 | 0.002 | 0.000 | 0.002 | 0.001 | 0.001 | 0.000 | 0.001 |

P50 = Leica ScanStation P50; C10 = Leica ScanStation C10; RTC360 = Leica RTC360; X9 = Trimble X9; Orth. = Orthogonal scanning condition; Incl. = Inclined scanning condition.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
