Article

Utilizing a Terrestrial Laser Scanner for 3D Luminance Measurement of Indoor Environments

1 Department of Built Environment, School of Engineering, Aalto University, FI-00076 Aalto, Finland
2 Finnish Geospatial Research Institute FGI, Geodeetinrinne 2, FI-02430 Masala, Finland
* Author to whom correspondence should be addressed.
J. Imaging 2021, 7(5), 85; https://doi.org/10.3390/jimaging7050085
Submission received: 26 March 2021 / Revised: 29 April 2021 / Accepted: 4 May 2021 / Published: 10 May 2021
(This article belongs to the Special Issue High Dynamic Range Imaging)

Abstract

We aim to present a method to measure 3D luminance point clouds by applying the integrated high dynamic range (HDR) panoramic camera system of a terrestrial laser scanning (TLS) instrument for performing luminance measurements simultaneously with laser scanning. We present the luminance calibration of a laser scanner and assess the accuracy, color measurement properties, and dynamic range of luminance measurement achieved in the laboratory environment. In addition, we demonstrate the 3D luminance measuring process through a case study with a luminance-calibrated laser scanner. The presented method can be utilized directly as the luminance data source. A terrestrial laser scanner can be prepared, characterized, and calibrated to apply it to the simultaneous measurement of both geometry and luminance. We discuss the state and limitations of contemporary TLS technology for luminance measuring.

1. Introduction

Laser scanning is a commonly applied 3D measuring technology for indoor measurement. It is based on measuring 3D coordinates of an environment with a laser beam, producing a 3D point cloud from a dense set of 3D measurements. Most contemporary laser scanners also contain one or more integrated cameras that capture a panoramic image for point cloud colorization. The R (red), G (green), and B (blue) values of the captured image are projected onto the point cloud to color the points. In addition to visualization, the color information has been applied to registration [1] and segmentation [2]. However, the point cloud colorization quality varies depending on the selected terrestrial laser scanning (TLS) instrument [3].
In the past, terrestrial laser scanning has been widely applied in archaeology [4], cultural heritage [5], forestry [6], industry [7], geology [8], surveying [9], and construction engineering [10]. Today, terrestrial laser scanners are also commonly used in the architecture, engineering, construction, owner, operator (AECOO) industry. In TLS, one path of development is automating the processing of raw measurements into more sophisticated 3D models [11,12,13]. Another is the integration of parallel data and sensors in laser scanning [14,15].
Two-dimensional luminance photometry is commonly applied to measure indoor surface luminances [16,17]. Luminance is the measure of light reflected or emitted from an area, commonly measured in candelas per square meter (cd·m⁻²). In lighting design, luminance distribution is an important aspect, as it affects the security, well-being, visual comfort [18], and aesthetics of the indoor environment. The luminance distribution is usually measured via imaging luminance photometry, where a calibrated digital camera is used to obtain an absolute luminance value for each pixel. Imaging luminance photometry has been applied in the assessment of light pollution [19,20]. High dynamic range (HDR) imaging is a key technology in imaging luminance photometry [21]. In HDR imaging, a set of images with different exposure times is combined to extend the dynamic range of a single exposure. This technique has been applied in architecture [22]. Moreover, the HDR technique is under constant development, for example through its application to 360° imaging [23] and through improved image fusion algorithms [24]. As a technology, imaging luminance photometry via HDR imaging has become well established. However, an innate problem in measurement relying on individual images is the loss of 3D data.
Via photogrammetric 3D reconstruction, 2D luminance images can also be used to obtain a 3D luminance measurement of an indoor environment [25]. Still, photogrammetry can perform poorly when measuring the 3D geometry of smooth, mono-colored, and uniform surfaces [26]. Luminance measurement applications require accurate radiometric data, and the use of 3D luminance measuring in design would benefit not only lighting designers but also architects [27]. However, indoor 3D luminance measurements made with a terrestrial laser scanner have not been extensively studied. Existing research has shown that luminance maps obtained via imaging luminance photometry can be combined with TLS [28] and MLS point clouds [29,30]. As stated, contemporary TLS instruments commonly contain imaging sensors for point cloud colorization. As these sensors are increasingly capable of HDR imaging [3], utilizing such TLS instruments to produce a 3D point cloud with luminance information is a topical development issue. While the use of TLS for lighting design via luminance measuring has been suggested in earlier research [27,28], a solution employing the TLS images for luminance measuring is missing: Rodrique et al. [27] utilized a separate imaging luminance photometer and did not register the luminance values into a 3D luminance point cloud; instead, they assessed the geometry and luminance measurements as separate entities. Vaaja et al. [28] manually combined images obtained with a conventional single-lens reflex camera with a point cloud produced by TLS. However, in that case, the images did not cover the full 360°, and the data integration relied on a manual methodology, limiting its efficiency.
In this study, we aim to present a method to measure 3D luminance point clouds. We apply the integrated high dynamic range (HDR) panoramic camera system of a TLS instrument for 3D HDR luminance measurements simultaneously with laser scanning. We present a method for utilizing the images captured with a TLS instrument as the luminance data source (Table 1). Firstly, we present the luminance calibration of a laser scanner, and we assess the accuracy, color measurement properties, and dynamic range of luminance measurement achieved in a laboratory environment. Secondly, we demonstrate the 3D luminance measuring process through a case study with a luminance-calibrated laser scanner. We analyze the results and discuss the effect of scanning angles on luminance measurements. In addition, we explore future research directions in 3D luminance measuring. The novelty of our study is that the method covers the full 360° of the 3D luminance measurements and increases the level of automation in the data integration. In addition, the luminance point cloud data is enriched with the angle between the surface normal and the measurement direction.

2. Materials and Methods

2.1. Terrestrial Laser Scanner

For terrestrial laser scanning, we used a time-of-flight scanner, the Leica RTC360 (Hexagon AB, Stockholm, Sweden) [31,32]. According to the manufacturer, the scanning field of view is 360° horizontal and 300° vertical, and the measured 3D point accuracy is 1.9 mm at 10 m. The scanner has three 4000 × 3000 pixel image sensors mounted to the scanner body (see Figure 1). Together, the sensors cover a vertical view of 300°. These sensors are used to create a panoramic image of 20,480 × 10,240 pixels with 5-bracket HDR imaging. The entire equirectangular panoramic image consists of 12 adjacent vertical images. The total scan time is 4 min 21 s, including HDR imaging, with a scan resolution setting of 3.0 mm at 10 m.
In the RTC360, HDR imaging is performed with a fixed exposure without any prior exposure measurements [3]. The imaging system of the RTC360 can therefore be calibrated to interpret the absolute luminance values of the measured environment. Furthermore, the panoramic image can be exported for editing as an EXR file without losing the high dynamic range of the images and registered into the point cloud without losing the dynamic information. These attributes make the Leica RTC360 a usable measurement device for luminance-calibrated terrestrial laser scanning.

2.2. Luminance Calibration of a Terrestrial Laser Scanner

2.2.1. Reference Color Target

A standardized color target, the X-Rite ColorChecker Classic chart (Grand Rapids, MI, USA) [33], was attached to the wall. The ColorChecker Classic chart is used in photography for creating camera profiles and correcting white balance and color. The chart is designed for color management in a variety of lighting conditions. Figure 2 shows the chart of 24 different colored patches with measured colorimetric reference data provided by X-Rite. The size of the ColorChecker Classic was 21.59 × 27.94 cm. In the X-Rite documents, the patches were labeled in a different order. Table 2 lists colorimetric reference data for the ColorChecker Classic manufactured after November 2014. The values were reported as CIE L*a*b* data.

2.2.2. Reference Luminance Measurements for the Color Target

The 16-bit sRGB values were measured and calculated for each patch in the reference color target. This was done for two reasons. Firstly, 16-bit sRGB values were not provided by the color target manufacturer. Secondly, by measuring and calculating the sRGB values for each patch ourselves, we were able to obtain the exact measurements in our laboratory environment, including especially the influence of lighting. Reference luminance values from the X-Rite ColorChecker Classic were measured with a Konica Minolta CS-2000 spectroradiometer (Teban Gardens Cres, Singapore). According to the manufacturer, the range of measurable luminances of the spectroradiometer is 0.003–500,000 cd·m⁻² with a luminance measurement accuracy of ±2%. For each measured patch of the color target, the average of five consecutive measurements was used. For each channel, every measured value was scaled to the maximum 16-bit sRGB value calculated from the CIELAB values provided by X-Rite [33].
A test environment was set up for measuring the radiometric capability of tripod-mounted TLS instruments (Aalto University, Espoo, Finland). The space was illuminated by luminaires fitted with D65 standard fluorescent tubes with a color rendering value Ra > 93. Figure 3 illustrates the spectrum of patch number 1 (Figure 2) of the ColorChecker measured with the spectroradiometer. The spikes of the D65 fluorescent illuminant are clearly visible in the spectrum. Figure 4 illustrates the CIE color matching functions $\bar{x}(\lambda)$, $\bar{y}(\lambda)$, $\bar{z}(\lambda)$ [34].
Each spectral power distribution $P(\lambda)$ of the measured patches was converted into X, Y, and Z color values applying the CIE color-matching functions [34] $\bar{x}(\lambda)$, $\bar{y}(\lambda)$, $\bar{z}(\lambda)$ (Equations (1)–(3)):

$$X = \int P(\lambda)\,\bar{x}(\lambda)\,d\lambda, \quad (1)$$
$$Y = \int P(\lambda)\,\bar{y}(\lambda)\,d\lambda, \quad (2)$$
$$Z = \int P(\lambda)\,\bar{z}(\lambda)\,d\lambda. \quad (3)$$
For each patch, the X, Y, and Z values were normalized and then converted into linear R, G, and B values in the sRGB (IEC 1999) color space, applying Equation (4):

$$\begin{bmatrix} R_{\mathrm{linear}} \\ G_{\mathrm{linear}} \\ B_{\mathrm{linear}} \end{bmatrix} = \begin{bmatrix} 3.2406 & -1.5372 & -0.4986 \\ -0.9689 & 1.8758 & 0.0415 \\ 0.0557 & -0.2040 & 1.0570 \end{bmatrix} \begin{bmatrix} X_{D65} \\ Y_{D65} \\ Z_{D65} \end{bmatrix} \quad (4)$$
The acquired linear R, G, and B values were then scaled to make them comparable with the reference values and applied to calculate the relative luminance values with Equation (5) from the sRGB standard [35]:

$$L_r = 0.2126R + 0.7152G + 0.0722B \quad (5)$$

2.2.3. Characterizing the Color and Luminance Capturing of the TLS

The HDR images (Figure 5) captured with the TLS were first exported as 32-bit EXR files which were then converted to linear 16-bit TIF format. From the linear images, the sRGB (standard Red Green Blue) R, G, and B values were obtained as a median pixel value for each patch of the color target and as the average of five images. The values were scaled in order to make them comparable with the measured values. For each channel, every value measured with the TLS was scaled to the maximum 16-bit sRGB calculated from the CIE L*a*b* values provided by X-Rite (Table 2). The 16-bit R, G, and B values were then converted into relative luminances applying Equation (5). A luminance calibration factor was obtained by comparing the relative luminance measured with the TLS to the absolute luminance measured with the spectroradiometer.
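The EXR-to-TIF step and the per-patch median extraction can be illustrated with a NumPy-only sketch (the paper's actual conversion used OpenEXR and OpenCV; the scale factor and function names below are illustrative assumptions):

```python
import numpy as np

def float_to_tif16(img32, scale=65535.0):
    """Map a linear 32-bit float HDR image into the linear 16-bit
    integer range of the TIF export, clipping out-of-range values.
    The scale factor here is an illustrative choice, not the paper's."""
    img = np.asarray(img32, dtype=np.float64) * scale
    return np.clip(img, 0, 65535).astype(np.uint16)

def patch_median(img16, mask):
    """Median pixel value over the pixels of one color-target patch."""
    return float(np.median(img16[mask]))
```

Keeping the conversion linear (scale and clip only, no gamma) preserves the proportionality between pixel values and luminance that the calibration relies on.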

2.3. Luminance Data Processing

As in Section 2.2.3, the HDR images were exported as 32-bit EXR files from the TLS measurement data, and the 32-bit EXR files were converted to 16-bit TIF files using Python 3.6.9 with the libraries OpenEXR (1.3.2) (San Francisco, CA, USA), NumPy (1.16.6) (Cambridge, MA, USA), and OpenCV-Python (4.2.0.32) (Willow Garage, Menlo Park, CA, USA). Relative luminance values were calculated for each pixel in the 16-bit TIF files applying Equation (5). The 16-bit relative monochromatic luminance values were then coded over the three 8-bit RGB channels of the respective pixel, and the new image was saved as an 8-bit TIF [25]. Hence, the new 8-bit relative luminance TIF image contains a wider dynamic range than a regular 8-bit RGB image, as all three channels carry the relative luminance data. The coded 8-bit file format allowed further processing of the data in software that does not support a wider dynamic range, e.g., 16-bit data. The 8-bit TIF images were projected and registered as the R, G, and B values in the point cloud. Point by point, the R, G, and B values were converted back to relative luminance values. Finally, the luminance calibration factor (see Section 2.2.3) was applied to interpret the relative luminance values as absolute luminance values, and the absolute luminance value was registered to each point in the point cloud. Figure 6 illustrates the luminance point cloud generating process.
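The idea of carrying 16-bit luminance through an 8-bit RGB pipeline can be sketched as follows. The exact coding scheme of reference [25] is not reproduced here; this is one possible byte-splitting layout, shown purely for illustration:

```python
import numpy as np

def encode_luminance16(lum16):
    """Code a 16-bit luminance value over three 8-bit channels:
    high byte in R, low byte in G, and a copy of the high byte in B
    so the image remains roughly viewable as grayscale.
    (Illustrative scheme, not necessarily the one used in [25].)"""
    lum16 = np.asarray(lum16, dtype=np.uint16)
    r = (lum16 >> 8).astype(np.uint8)    # high byte
    g = (lum16 & 0xFF).astype(np.uint8)  # low byte
    return np.stack([r, g, r], axis=-1)

def decode_luminance16(rgb8):
    """Recover the 16-bit luminance from the coded R and G channels."""
    rgb8 = np.asarray(rgb8)
    return (rgb8[..., 0].astype(np.uint16) << 8) | rgb8[..., 1].astype(np.uint16)
```

Any such coding is lossless as long as encode and decode are exact inverses, which is what allows software limited to 8-bit RGB to carry the full 16-bit dynamic range.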

2.4. Case Study

2.4.1. Study Area

Figure 7 shows the measured space, the B-Hall, a lecture hall at Aalto University, Espoo, Finland. The maximum capacity of B-Hall is 320 persons, and the floor area is 297 m². The lecture hall was illuminated only by interior lights.
Seven scans were taken from the hall, and the scanned point clouds were registered with the manufacturer’s Leica Cyclone REGISTER 360 version 1.6.2 (Hexagon AB, Stockholm, Sweden) software [36]. Each scan took 4 min and 21 s. Linear EXR images were exported as separate linear image files and converted to 16-bit TIF images. The scanned point clouds were colored with TIF images, and the color values of the point clouds were converted to absolute luminance values, as described in Section 2.3. Lighting analysis was performed with CloudCompare 2.10.2 software (EDF, Paris, France) with standard tools such as plane fitting, octree subsampling, and distribution fitting.
In laser scanning, the point densities of measured surfaces vary depending on the angle of incidence and the distance from the laser scanner. Hence, in order to balance the point density, all the point clouds from individual scan stations were subsampled in CloudCompare using octree-based subsampling with the octree level set to 12. The size of a single scan was about 160 million points, and subsampling reduced the point cloud to about 16–25% of the original. The densest point spacing of the subsampled cloud was about 5 mm. The subsampled point clouds were then merged into a single point cloud, and the merged point cloud was resubsampled at octree level 12 to avoid unnecessarily large file sizes.
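Octree subsampling at level 12 divides the cloud's bounding cube into $2^{12}$ cells per axis and keeps one point per occupied cell. A minimal NumPy sketch of the idea (a simplified stand-in for CloudCompare's implementation; attribute averaging and the exact cell-selection rule are omitted):

```python
import numpy as np

def octree_subsample(points, level=12):
    """Keep the first point falling into each occupied cell of a
    2**level grid spanning the cloud's bounding box."""
    pts = np.asarray(points, dtype=np.float64)
    mins = pts.min(axis=0)
    # Cubic cells sized from the largest bounding-box extent, like an octree
    cell = (pts.max(axis=0) - mins).max() / (2 ** level)
    idx = np.floor((pts - mins) / cell).astype(np.int64)
    # Unique occupied cells -> index of the first point in each
    _, keep = np.unique(idx, axis=0, return_index=True)
    return pts[np.sort(keep)]
```

Because the cell size is fixed over the whole cloud, dense regions near the scanner are thinned strongly while sparse distant regions are left nearly untouched, which is what balances the point density.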

2.4.2. Sample Areas

We chose two sample areas (horizontal and vertical) for detailed analysis. In addition, we present a concise analysis for seven sample areas A–G (Table 3). The sizes of the sample areas were 0.5 m × 0.5 m. Figure 8 presents the sample areas. We applied the CloudCompare 2.10.2 plane fitting tool in order to obtain the angles between the scan stations and surface normal. The angles between the scan stations and the surface normal of the sample areas ranged from 9 to 88 degrees. Detailed information on the vertical and the horizontal sample areas can be found in Appendix A.2.

3. Results

3.1. Reference Color Target Measurements

Table 4 presents the reference sRGB values measured from the X-Rite ColorChecker Classic with the spectroradiometer. The measured spectral power distributions were converted into sRGB values applying Equations (1)–(4).

3.2. Luminance Measurement Comparison

Table 5 presents the laser scanner luminance measurements compared to the spectroradiometer luminance measurements. Only the lowest row of grayscale patches (1–6) was used for luminance calibration (see Figure 2). The 16-bit values were converted into relative luminance values applying Equation (5). The laser scanner absolute luminance measurements were derived using a simple linear regression against the spectroradiometer values. We assume that sensor noise inflates the low-end luminance values captured by the camera of the laser scanner. Hence, an improved iteration of the laser scanner absolute luminance values was derived by reducing the original 16-bit value by the absolute difference at the smallest compared luminance value (18.2 cd·m⁻² − 13.9 cd·m⁻² = 4.3 cd·m⁻²) multiplied by the calibration factor (146.3) obtained with the linear regression.
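The calibration step can be sketched as follows. The arrays are made-up placeholders, not the paper's measurements, and the noise correction is expressed generically as an offset estimated at the darkest patch:

```python
import numpy as np

# Hypothetical relative TLS luminances and absolute reference
# luminances (cd/m^2) for the six grayscale patches.
tls_relative = np.array([0.124, 0.301, 0.562, 1.080, 1.950, 3.010])
reference_abs = np.array([13.9, 44.0, 81.5, 158.0, 286.0, 441.0])

# Calibration factor: least-squares slope of absolute vs. relative luminance
factor = np.sum(tls_relative * reference_abs) / np.sum(tls_relative ** 2)

# Noise correction: remove the offset observed at the darkest patch,
# then rescale the relative values into absolute luminances
offset = tls_relative[0] - reference_abs[0] / factor
tls_absolute = (tls_relative - offset) * factor
```

With this construction the darkest patch maps exactly onto its reference value, while the remaining patches are adjusted by the same fixed offset.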
Figure 9 presents the laser scanner luminance measurement as a function of the reference luminance measurement and its linear trendline.
Applying the linear regression and the noise removal, the minimum and maximum measurable luminance values are 4.3 cd·m⁻² and 443.6 cd·m⁻², respectively.
Table 6 presents the consistency of five consecutive images captured with the TLS. The relative standard deviation (RSD) in the 5 repetitions was less than 2% for every grayscale patch of the reference color target.
The channel-wise values can be found in Appendix A.1. For the measured patch number 12, the processing from the spectrum into sRGB values resulted in a negative value for the red channel. This is an expected outcome for certain colors. However, it obviously makes the comparison between the measured values and the image values questionable to a certain extent. Furthermore, some color patch values in the image deviated strongly from the measured values. However, the large relative differences in single channels did not carry through to the calculated luminance values and their relative differences. This may be explained by the fact that a large relative difference in a single channel often resulted from a comparison against values that were small in absolute terms.
Table 7 displays the relative luminance values calculated from the TLS 16-bit linear images compared to the reference values measured with a spectroradiometer.
Table 8 presents the adjusted absolute luminance values for each patch in the color target measured with the TLS compared to the reference luminance values measured with the spectroradiometer. Figure 10 presents the 3D luminance point cloud of the reference color target.

3.3. Case Study

3.3.1. Luminance Measurements of the Case Study

The chosen test site was measured with a luminance-calibrated TLS. Figure 11 shows the luminance measurement obtained from a single scan station projected onto 3D points, while Figure 12 illustrates seven merged luminance measurements subsampled to octree level 12 as described in Section 2.3. The range of measured luminances was 0–443.6 cd·m⁻². In the measured space, the measurement range covers most of the measurable surfaces. However, the luminances of the light sources and the surfaces around them were too high to be measured with the TLS used in this study.

3.3.2. Sample Area Analysis of 3D Luminance Measurements

Figure 13 illustrates the point clouds and their corresponding merged histograms for the vertical and horizontal sample areas (see Figure 8). Illustrations of each laser scan and their merged point clouds and corresponding histograms for both sample areas can be found in Appendix A.2.
Table 9 and Table 10 present the measured features and statistics for the vertical sample area and the horizontal sample area, respectively. The values presented are the median, Gaussian mean, minimum and maximum luminances, standard deviation, relative standard deviation, number of points, and angle between the surface normal and the measurement direction.
The sample areas show that, especially near the scanner, some scans are over-represented (Table 9 and Table 10). The number of points depends on the scanning angle and distance, so these features must be taken into account in the visual observation.
Table 11 presents the measured features and statistics for the sample areas A–G. The sample areas are from the merged luminance measurements subsampled to the octree level 12 as described in Section 2.3. The values presented are the median, Gaussian mean, minimum and maximum luminances, standard deviation, relative standard deviation, number of points, and angle between the surface normal and the measurement direction.
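The per-sample-area statistics reported in Tables 9–11 can be computed directly from the luminance values of the points inside a sample area. In this sketch the luminance values are made-up placeholders, and the plain arithmetic mean is used as a stand-in for CloudCompare's Gaussian (fitted-distribution) mean:

```python
import numpy as np

# Hypothetical luminances (cd/m^2) of the points inside one sample area
luminances = np.array([41.2, 43.5, 40.8, 44.1, 42.0, 39.7])

stats = {
    "median": float(np.median(luminances)),
    "mean": float(luminances.mean()),          # stand-in for the Gaussian mean
    "min": float(luminances.min()),
    "max": float(luminances.max()),
    "std": float(luminances.std(ddof=1)),      # sample standard deviation
    "rsd_percent": float(100 * luminances.std(ddof=1) / luminances.mean()),
    "n_points": int(luminances.size),
}
```

The relative standard deviation (RSD) normalizes the spread by the mean luminance, which makes bright and dark sample areas directly comparable.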

4. Discussion and Conclusions

4.1. Laboratory Measurements

We characterized the color and luminance measurement quality of a terrestrial laser scanner, and we presented a workflow where an HDR image captured by a TLS instrument was converted into absolute luminance values. Compared to the reference, the TLS captured luminance values with an average absolute difference of 2.0 cd·m⁻² and an average relative difference of 2.9% for the grayscale patches (No. 1–6). For all patches, the average absolute and relative differences were 5.7 cd·m⁻² and 7.5%, respectively. The relative difference between the TLS measurement and the reference measurement was notable for certain patches, such as blue (46.7%) and cyan (22.3%). This indicates that certain heavily weighted spectra translate suboptimally into luminance values when using standard sRGB conversion factors. However, as we can characterize the channel-wise values for each patch in the X-Rite ColorChecker, we would be able to obtain conversion factors more optimal for the camera in the TLS than the sRGB conversion factors. Optimized factors could possibly decrease the difference between the luminance values measured with the TLS and the reference values for the weighted spectra.

4.2. Field Measurements

We explored the possibilities of simultaneous laser scanning and luminance imaging through a case study. Thus, the level of automation increased in comparison with the previous luminance data and TLS point cloud integration, and the luminance data integrity and usability improved.
The dynamic range needed for luminance measurement depends on the application. The widest dynamic range is required when measuring nighttime outdoor environments, for example, road lighting. In order to measure from the lowest mesopic luminances on the road surface to the glaring light source, a measurement range of 0.01 cd·m⁻² to approximately 100,000 cd·m⁻² would be needed. This is a little more than 23 f-stops. The system used in this study had an effective dynamic range of 4.3–443.6 cd·m⁻², or approximately 6.7 f-stops. This dynamic range is almost sufficient to measure the luminance distribution of the surfaces in an indoor space but nowhere near wide enough for road lighting measurements. Moreover, it is a technologically difficult task to extend the dynamic range towards the low luminance levels. The sensors would have to be more sensitive yet have a better signal-to-noise ratio. Another solution is to apply HDR imaging with longer exposure times, which obviously makes measuring slower or less convenient.
For indoor applications, however, HDR imaging could be applied by adding images captured with shorter exposure times. This way, the dynamic range of a TLS could be extended to be sufficient for indoor measurement, from the low-end surface luminances (~1 cd·m⁻²) to the glaring light sources (~100,000 cd·m⁻²). This upward extension of the dynamic range would enable the measurements needed when calculating the unified glare rating (UGR). Furthermore, it would be possible to measure the luminance of the light sources themselves if the dynamic range of HDR imaging were wide enough.
To determine the location of the measuring device, terrestrial laser scanning allows the measurement angle to be defined for each measured point. The point can be assigned a location, color value, absolute luminance value, intensity, point normal, and angle between point normal and surface normal. This information can be used in the future to determine the properties of the scanned object, such as reflectivity and gloss. The angle between the scan station and the measured surface normal was not verified by any other method in this study. We considered the collected point cloud data accurate enough for angle measurement.

4.3. Limitations of TLS as a Luminance Photometer

Usually, a TLS instrument captures a panoramic image as a composite of several adjacent images that are overlapped and blended together, a technique often called image stitching. The quality of the stitching is difficult to quantify, and we did not assess the inaccuracies of image stitching. However, the TLS instrument (Leica RTC360) would be more suitable if the uncertainty of the panoramic image stitching process were known and there were a possibility to maintain the bit depth of the measurement in the RGB-registered point clouds. For now, registering the raw imaging bit depth into the point cloud requires manual effort.
Different TLS instruments employ various imaging sensor installations, such as completely separate camera systems mounted atop the TLS instrument (e.g., Riegl [37]), integrated imaging sensors utilizing the same rotating mirror as the laser ranging sensor, or sets of cameras mounted in the instrument's chassis, as in the applied Leica RTC360 scanner [3]. The realization of the imaging system affects the quality of the produced panoramic images, e.g., through differences in parallax.
Contemporary TLS instruments are capable of obtaining rather high point densities and measurement speeds. For example, for the instrument applied in our work, the manufacturer reports a measuring speed of 2 million points per second and a point spacing of 3 mm at 10 m [31]. As a result, a single point cloud obtained with this instrument may contain up to approx. 200 million points [38]. A mapping campaign in a complex indoor environment may therefore well exceed a billion points. These data amounts present a technical challenge and require suitable storage systems to be applied in processing and distribution. Understandably, point cloud storage [39], distribution [40], and application [41] have become topical development tasks.
For assessing the color measurement of the TLS instrument, the 24-patch X-Rite ColorChecker Classic was applied. In order to improve the color measurement assessment, a color chart with 99 patches could be used, as defined in ANSI/IES Method for Evaluating Light Source Color Rendition TM-30-20 [42].

4.4. Future Research Directions

In future studies, a method for determining the reflectivity of a surface can be developed as the locations of the measurements and the locations and luminances of the light sources are known. However, this method does not completely solve the reflectivity measurement. For more reliable reflectivity measurement, the light distribution of the light sources and the integration of light within the measurement space also need to be determined.
Simultaneous geometry and luminance measuring executed with a TLS can be applied in lighting design and lighting retrofitting. A 3D mesh model can be created from the measured point cloud. The mesh model can be converted into a CAD 3D model, which can be imported into lighting design software such as DIALux or Relux.
As of yet, a TLS cannot replace conventional imaging luminance photometry in terms of luminance measurement quality; since the scanner's imaging alone is not yet comparable in image quality, the best result is obtained by combining terrestrial laser scanning and photogrammetry. However, TLS-based luminance measurement does not fall far behind. If the measurable luminance range were widened, TLS luminance measurement would perform at a similar level to conventional imaging luminance photometry for indoor measurements and outdoor daytime measurements. Furthermore, both of these required improvements have been solved as individual technologies, but the advancements have not yet been implemented in a TLS. Hence, we are only a few steps away from luminance measurements being obtained as a side product of geometry measurement, or vice versa. In TLS luminance measurement, the luminance data is registered into the measured geometry, a feature that is completely unobtainable using only conventional imaging luminance photometry.
As TLS point clouds capture the surrounding environment from all directions, their study requires different user interfaces than those used for navigating 2D image data sets. 3D point clouds can of course be studied on conventional displays, either with freely navigable 3D environments or—akin to panoramic images—by fixing the viewpoint and jumping from one measuring position to another. In complex indoor environments, immersive display devices, such as virtual reality head-mounted displays, offer a potentially more intuitive alternative for navigating complex virtual 3D environments. By leveraging game-engine technology, laser scanning point clouds can be brought into VR [43]. Adapting the point cloud visualization to the study of luminance data represents an obvious task for future development.

Author Contributions

Conceptualization, M.K., M.M., and A.J.; methodology, M.K., M.M., and A.J.; validation, M.K., M.M., T.R., and J.-P.V.; formal analysis, M.K. and M.M.; investigation, M.K., M.M., and T.R.; resources, H.H. and M.T.V.; writing—original draft preparation, M.K., M.M., A.J., T.R., J.-P.V.; writing—review and editing, M.K., M.M., A.J., T.R., J.-P.V., J.H., M.T.V., and H.H.; visualization, M.K., M.M., and T.R.; project administration H.H. and M.T.V.; funding acquisition, M.T.V., J.-P.V., J.H., and H.H. All authors have read and agreed to the published version of the manuscript.

Funding

The Strategic Research Council of the Academy of Finland is acknowledged for financial support for the project “Competence Based Growth Through Integrated Disruptive Technologies of 3D Digitalization, Robotics, Geospatial Information and Image Processing/Computing—Point Cloud Ecosystem (No. 293389, 314312)”. Additionally, this study has been done under the Academy of Finland Flagship Ecosystem “UNITE” (projects 337656 and VN/3482/2021), the Academy of Finland project Profi5 “Autonomous systems” (No. 326246), the European Social Fund (S21997) and the City of Helsinki Innovation Fund project “Helsinki Smart Digital Twin 2025”.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are openly available in Zenodo at 10.5281/zenodo.4743890, reference number [44].

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Measurement Details

Appendix A.1. Color Target

Table A1 and Table A2 show the 16-bit values for each R, G, and B channel measured with the TLS and the spectroradiometer, respectively. The values are linear and scaled to be comparable. Table A3 presents the relative differences between the measurements.
Table A1. Linear TLS measurement scaled to be comparable with the X-Rite color target values for each channel R, G, and B.
Linear TLS Measurements
R
10,692.7 | 30,952.0 | 13,659.6 | 10,150.2 | 18,788.2 | 23,719.4
30,721.9 | 9517.4 | 24,409.8 | 7561.3 | 27,007.0 | 35,455.9
6262.7 | 13,528.1 | 18,500.5 | 43,229.6 | 23,587.9 | 12,114.5
62,099.9 | 41,784.1 | 26,448.1 | 14,719.9 | 7959.9 | 3515.6
G
7979.8 | 23,218.5 | 14,811.0 | 11,299.7 | 16,310.0 | 33,972.4
17,776.4 | 8586.8 | 10,183.6 | 4419.7 | 33,092.5 | 26,607.6
6721.2 | 20,359.0 | 7095.9 | 39,333.0 | 10,118.4 | 17,906.8
62,077.7 | 41,613.9 | 26,379.5 | 14,525.8 | 7776.2 | 3321.9
B
6069.3 | 17,956.2 | 22,728.0 | 6253.3 | 27,530.8 | 30,691.4
5933.8 | 25,516.7 | 9621.1 | 9938.7 | 9466.1 | 5623.9
19,675.9 | 9071.1 | 4934.5 | 6909.8 | 19,288.6 | 26,818.1
60,513.9 | 41,550.7 | 26,353.4 | 14,199.2 | 8091.1 | 3544.0
Table A2. Linear spectroradiometer measurement scaled to be comparable with the X-Rite color target values for each channel R, G, and B.
Linear Spectroradiometer Measurements
R
12,564.9 | 38,994.4 | 8588.3 | 7652.1 | 16,275.1 | 9578.0
50,372.5 | 4940.8 | 38,649.1 | 7630.9 | 24,366.3 | 53,555.9
2660.2 | 4944.7 | 29,753.2 | 57,323.2 | 35,391.7 | −446.7
62,781.8 | 41,236.4 | 25,391.0 | 13,471.6 | 6497.8 | 2699.1
G
6283.6 | 20,566.6 | 13,919.4 | 11,057.2 | 14,978.0 | 36,325.3
14,128.2 | 7353.7 | 6226.0 | 3249.3 | 36,004.7 | 26,190.2
3735.4 | 20,983.3 | 3093.5 | 40,883.9 | 6377.4 | 16,835.5
62,268.6 | 41,525.3 | 25,701.3 | 13,374.6 | 6632.7 | 2605.9
B
4059.6 | 14,321.4 | 22,313.1 | 3513.8 | 28,129.3 | 26,760.9
1888.1 | 26,861.4 | 8016.2 | 9593.0 | 2980.7 | 1327.7
20,103.2 | 4284.2 | 2760.3 | 285.3 | 20,272.9 | 24,888.5
56,751.4 | 38,894.6 | 23,993.5 | 12,369.5 | 6284.7 | 2413.8
Table A3. The absolute values of relative differences between the linear TLS and spectroradiometer measurements for each channel R, G, and B.
The Relative Differences
R
14.9% | 20.6% | 59.0% | 32.6% | 15.4% | 147.6%
39.0% | 92.6% | 36.8% | 0.9% | 10.8% | 33.8%
135.4% | 173.6% | 37.8% | 24.6% | 33.4% | 2811.9%
1.1% | 1.3% | 4.2% | 9.3% | 22.5% | 30.2%
Average: 76.4%
G
27.0% | 12.9% | 6.4% | 2.2% | 8.9% | 6.5%
25.8% | 16.8% | 63.6% | 36.0% | 8.1% | 1.6%
79.9% | 3.0% | 129.4% | 3.8% | 58.7% | 6.4%
0.3% | 0.2% | 2.6% | 8.6% | 17.2% | 27.5%
Average: 23.1%
B
49.5% | 25.4% | 1.9% | 78.0% | 2.1% | 14.7%
214.3% | 5.0% | 20.0% | 3.6% | 217.6% | 323.6%
2.1% | 111.7% | 78.8% | 2322.1% | 4.9% | 7.8%
6.6% | 6.8% | 9.8% | 14.8% | 28.7% | 46.8%
Average: 149.9%

Appendix A.2. Sample Areas

Table A4 and Table A5 show, for both samples, the luminance values of each sample area for each scan, together with visualizations of each laser scan and the merged point clouds, and the corresponding histograms. The deviations observed between the different scans were caused by differing viewing angles and by possible changes in luminance between the scans.
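The per-area statistics reported for these samples (and in Tables 9–11) can be reproduced from the per-point luminances of a sample area. The sketch below (Python, illustrative data only) uses the arithmetic mean and sample standard deviation; the paper reports a Gaussian mean, which for a near-normal luminance distribution is closely approximated by the arithmetic mean.

```python
import statistics

def luminance_stats(luminances):
    """Summary statistics for a sample area's per-point luminances (cd/m^2)."""
    mean = statistics.fmean(luminances)
    std = statistics.stdev(luminances)  # sample standard deviation
    return {
        "median": statistics.median(luminances),
        "mean": mean,
        "min": min(luminances),
        "max": max(luminances),
        "std": std,
        "rsd_percent": 100.0 * std / mean,  # relative standard deviation
    }

# Illustrative per-point luminances, not the measured point cloud data.
stats = luminance_stats([243.0, 240.1, 245.2, 238.9, 244.8])
```

In practice the input list would hold the luminance attribute of every point falling inside the sample area, extracted from the calibrated luminance point cloud.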
Table A4. The vertical sample area: included scans (single scan stations “1–7”, merged scans “All”, and the merged scan subsampled with octree level 12), sample area visualization, histogram of luminances, number of points, and angle between the surface normal and the measurement direction.
No. | Sample Area | Histogram | Points | Angle (°)
1 | (image) | (image) | 10,066 | 26
2 | (image) | (image) | 9460 | 17
3 | (image) | (image) | 10,184 | 66
4 | (image) | (image) | 8310 | 59
5 | (image) | (image) | 3537 | 69
6 | (image) | (image) | 6273 | 36
7 | (image) | (image) | 8805 | 18
All | (image) | (image) | 56,635 | 17–69
All, subsampled | (image) | (image) | 9295 | 17–69
Table A5. The horizontal sample area: included scans (single scan stations “1–7”, merged scans “All”, and the merged scan subsampled with octree level 12), sample area visualization, histogram of luminances, number of points, and angle between the surface normal and the measurement direction. Scans 2 and 3 were left with no observations due to the large measurement angle. Scan number 7 was partly overexposed and therefore omitted from the merged clouds.
No. | Sample Area | Histogram | Points | Angle (°)
1 | (image) | (image) | 7374 | 63
2 | - | - | - | 87
3 | - | - | - | 88
4 | (image) | (image) | 3617 | 81
5 | (image) | (image) | 2264 | 76
6 | (image) | (image) | 2620 | 76
7 | (image) | (image) | 6336 | 79
All | (image) | (image) | 15,875 | 63–88
All, subsampled | (image) | (image) | 10,367 | 63–88

References

1. Park, J.; Zhou, Q.Y.; Koltun, V. Colored Point Cloud Registration Revisited. In Proceedings of the IEEE International Conference on Computer Vision (ICCV), Venice, Italy, 22–29 October 2017.
2. Zhan, Q.; Liang, Y.; Xiao, Y. Color-based segmentation of point clouds. Laser Scanning 2009, 38, 155–161.
3. Julin, A.; Kurkela, M.; Rantanen, T.; Virtanen, J.P.; Maksimainen, M.; Kukko, A.; Kaartinen, H.; Vaaja, M.T.; Hyyppä, J.; Hyyppä, H. Evaluating the Quality of TLS Point Cloud Colorization. Remote Sens. 2020, 12, 2748.
4. Lerma, J.L.; Navarro, S.; Cabrelles, M.; Villaverde, V. Terrestrial laser scanning and close range photogrammetry for 3D archaeological documentation: The Upper Palaeolithic Cave of Parpalló as a case study. J. Archaeol. Sci. 2010, 37, 499–507.
5. Guarnieri, A.; Remondino, F.; Vettore, A. Digital photogrammetry and TLS data fusion applied to Cultural Heritage 3D modeling. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2006, 36, 1–6.
6. Bienert, A.; Scheller, S.; Keane, E.; Mullooly, G.; Mohan, F. Application of terrestrial laser scanners for the determination of forest inventory parameters. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2006, 36, 1–5.
7. Sternberg, H.; Kersten, T.P. Comparison of terrestrial laser scanning systems in industrial as-built-documentation applications. Opt. 3D Meas. Tech. VIII 2007, 1, 389–397.
8. Buckley, S.J.; Howell, J.; Enge, H.; Kurz, T. Terrestrial laser scanning in geology: Data acquisition, processing and accuracy considerations. J. Geol. Soc. 2008, 165, 625–638.
9. Pinkerton, M. Terrestrial laser scanning for mainstream land surveying. Surv. Q. 2011, 300, 7.
10. Yuan, L.; Guo, J.; Wang, Q. Automatic classification of common building materials from 3D terrestrial laser scan data. Autom. Constr. 2020, 110, 103017.
11. Sirmacek, B.; Lindenbergh, R. Active Shapes for Automatic 3D Modeling of Buildings. J. Imaging 2015, 1, 156–179.
12. Antón, D.; Medjdoub, B.; Shrahily, R.; Moyano, J. Accuracy evaluation of the semi-automatic 3D modeling for historical building information models. Int. J. Archit. Herit. 2018, 12, 790–805.
13. Yang, L.; Cheng, J.C.; Wang, Q. Semi-automated generation of parametric BIM for steel structures based on terrestrial laser scanning data. Autom. Constr. 2020, 112, 103037.
14. Stenz, U.; Hartmann, J.; Paffenholz, J.A.; Neumann, I. A Framework Based on Reference Data with Superordinate Accuracy for the Quality Analysis of Terrestrial Laser Scanning-Based Multi-Sensor-Systems. Sensors 2017, 17, 1886.
15. Ma, J.; Niu, X.; Liu, X.; Wang, Y.; Wen, T.; Zhang, J. Thermal Infrared Imagery Integrated with Terrestrial Laser Scanning and Particle Tracking Velocimetry for Characterization of Landslide Model Failure. Sensors 2020, 20, 219.
16. Hiscocks, P.D.; Eng, P. Measuring luminance with a digital camera. Syscomp Electron. Des. Ltd. 2011, 686, 1–25.
17. Wolska, A.; Sawicki, D. Practical application of HDRI for discomfort glare assessment at indoor workplaces. Measurement 2020, 151, 107179.
18. Chiou, Y.S.; Saputro, S.; Sari, D.P. Visual Comfort in Modern University Classrooms. Sustainability 2020, 12, 3930.
19. Jechow, A.; Kyba, C.C.; Hölker, F. Beyond All-Sky: Assessing Ecological Light Pollution Using Multi-Spectral Full-Sphere Fisheye Lens Imaging. J. Imaging 2019, 5, 46.
20. Wallner, S. Usage of Vertical Fisheye-Images to Quantify Urban Light Pollution on Small Scales and the Impact of LED Conversion. J. Imaging 2019, 5, 86.
21. Inanici, M.; Galvin, J. Evaluation of High Dynamic Range Photography as a Luminance Mapping Technique; OSTI: Oak Ridge, TN, USA, 2004; pp. 1–28.
22. Cauwerts, C.; Piderit, M.B. Application of High-Dynamic Range Imaging Techniques in Architecture: A Step toward High-Quality Daylit Interiors? J. Imaging 2018, 4, 19.
23. Hirai, K.; Osawa, N.; Hori, M.; Horiuchi, T.; Tominaga, S. High-Dynamic-Range Spectral Imaging System for Omnidirectional Scene Capture. J. Imaging 2018, 4, 53.
24. Merianos, I.; Mitianoudis, N. Multiple-Exposure Image Fusion for HDR Image Synthesis Using Learned Analysis Transformations. J. Imaging 2019, 5, 32.
25. Kurkela, M.; Maksimainen, M.; Julin, A.; Virtanen, J.P.; Männistö, I.; Vaaja, M.T.; Hyyppä, H. Applying photogrammetry to reconstruct 3D luminance point clouds of indoor environments. Archit. Eng. Des. Manag. 2020, 1–17.
26. Lehtola, V.V.; Kurkela, M.; Hyyppä, H. Automated image-based reconstruction of building interiors—A case study. Photogramm. J. Finl. 2014, 24, 1–13.
27. Rodrigue, M.; Demers, C.M.H.; Parsaee, M. Lighting in the third dimension: Laser scanning as an architectural survey and representation method. Intell. Build. Int. 2020, 1–17.
28. Vaaja, M.T.; Kurkela, M.; Virtanen, J.P.; Maksimainen, M.; Hyyppä, H.; Hyyppä, J.; Tetri, E. Luminance-Corrected 3D Point Clouds for Road and Street Environments. Remote Sens. 2015, 7, 11389–11402.
29. Vaaja, M.T.; Kurkela, M.; Maksimainen, M.; Virtanen, J.P.; Kukko, A.; Lehtola, V.V.; Hyyppä, J.; Hyyppä, H. Mobile mapping of night-time road environment lighting conditions. Photogramm. J. Finl. 2018, 26, 1–17.
30. Maksimainen, M.; Vaaja, M.T.; Kurkela, M.; Virtanen, J.P.; Julin, A.; Jaalama, K.; Hyyppä, H. Nighttime Mobile Laser Scanning and 3D Luminance Measurement: Verifying the Outcome of Roadside Tree Pruning with Mobile Measurement of the Road Environment. ISPRS Int. J. Geo-Inf. 2020, 9, 455.
31. Leica RTC360 3D Laser Scanner | Leica Geosystems. Available online: https://leica-geosystems.com/products/laser-scanners/scanners/leica-rtc360 (accessed on 24 March 2021).
32. Biasion, A.; Moerwald, T.; Walser, B.; Walsh, G. A new approach to the Terrestrial Laser Scanner workflow: The RTC360 Solution. In FIG Working Week 2019: Geospatial Information for a Smarter Life and Environmental Resilience; FIG: Hanoi, Vietnam, 2019.
33. X-Rite. New Color Specifications for ColorChecker SG and Classic Charts. 2016. Available online: https://zenodo.org/record/3245895#.YJiptqExVPY (accessed on 24 March 2021).
34. Smith, T.; Guild, J. The C.I.E. colorimetric standards and their use. Trans. Opt. Soc. 1931, 33, 73–134.
35. International Electrotechnical Commission. Multimedia Systems and Equipment—Colour Measurement and Management—Part 2-1: Colour Management—Default RGB Colour Space—sRGB; IEC 61966-2-1; 1999. Available online: https://webstore.iec.ch/publication/6169 (accessed on 24 March 2021).
36. Leica Cyclone REGISTER 360 | Leica Geosystems. Available online: https://leica-geosystems.com/products/laser-scanners/software/leica-cyclone/leica-cyclone-register-360 (accessed on 24 March 2021).
37. RIEGL-Produktdetail. Available online: http://www.riegl.com/nc/products/terrestrial-scanning/produktdetail/product/scanner/48/ (accessed on 24 March 2021).
38. White Paper: Leica RTC360 Image Resolution | Leica Geosystems. Available online: https://leica-geosystems.com/products/laser-scanners/scanners/rtc360-image-resolution-white-paper (accessed on 24 March 2021).
39. Van Oosterom, P.; Martinez-Rubi, O.; Ivanova, M.; Horhammer, M.; Geringer, D.; Ravada, S.; Tijssen, T.; Kodde, M.; Gonçalves, R. Massive point cloud data management: Design, implementation and execution of a point cloud benchmark. Comput. Graph. 2015, 49, 92–125.
40. El-Mahgary, S.; Virtanen, J.P.; Hyyppä, H. A Simple Semantic-Based Data Storage Layout for Querying Point Clouds. ISPRS Int. J. Geo-Inf. 2020, 9, 72.
41. Kulawiak, M.; Kulawiak, M.; Lubniewski, Z. Integration, Processing and Dissemination of LiDAR Data in a 3D Web-GIS. ISPRS Int. J. Geo-Inf. 2019, 8, 144.
42. David, A.; Fini, P.T.; Houser, K.W.; Ohno, Y.; Royer, M.P.; Smet, K.A.G.; Wei, M.; Whitehead, L. Development of the IES method for evaluating the color rendition of light sources. Opt. Express 2015, 23, 15888–15906.
43. Virtanen, J.P.; Daniel, S.; Turppa, T.; Zhu, L.; Julin, A.; Hyyppä, H.; Hyyppä, J. Interactive dense point clouds in a game engine. ISPRS J. Photogramm. Remote Sens. 2020, 163, 375–389.
44. Kurkela, M.; Maksimainen, M.; Julin, A.; Rantanen, T.; Virtanen, J.P. B-Hall Sample Areas for Lighting Analysis (Version 1.0.0); Zenodo, 2021. Available online: https://zenodo.org/record/4743890#.YJkVAdwRWhc (accessed on 24 March 2021).
Figure 1. Time-of-flight scanner Leica RTC360.
Figure 2. The measured target X-Rite ColorChecker Classic and the patch numbers used in this study.
Figure 3. The normalized spectral power function of patch No. 1.
Figure 4. The CIE color matching functions x̄(λ), ȳ(λ), and z̄(λ).
Figure 5. The color target cropped from the panoramic image captured with the TLS instrument.
Figure 6. The workflow for creating data for indoor 3D luminance maps.
Figure 7. The 360° panoramic image taken with the TLS instrument.
Figure 8. The intensity image from scanning station 6 shows the locations of the sample areas for luminance measurements. The green areas represent the sample areas A–G. The red area represents the vertical sample area. The yellow area represents the horizontal sample area. White points represent the six other scanning locations, with the seventh scanning location being the viewpoint of the image.
Figure 9. The TLS luminance measurement (y) presented as a function of the reference luminance measurement (x) and its linear trendline.
Figure 10. Luminances of the measured color target X-Rite ColorChecker Classic. The patches in the lowest row (1–6) are the grayscale patches used for luminance calibration.
Figure 11. The luminance point cloud of scanning station 2.
Figure 12. The subsampled luminance point cloud of all scanning stations.
Figure 13. The point cloud (a) and its corresponding histogram (b) for the vertical sample area, and the point cloud (c) and its corresponding histogram (d) for the horizontal sample area.
Table 1. Measurements and results performed in the study and their sections.
Laboratory measurements | Method: Reference color target measurements (Section 2.2, Luminance calibration of a terrestrial laser scanner; Section 2.3, Luminance data processing)
Laboratory measurements | Results: Luminance calibration factor for TLS (Section 3.1, Reference color target measurements; Section 3.2, Luminance measurement comparison; Appendix A.1, Color target)
Field measurements | Method: TLS data of the study area (Section 2.4, Case study) and the luminance calibration factor from the laboratory measurements
Field measurements | Results: Absolute luminance point clouds (Section 3.3, Case study; Appendix A.2, Sample areas)
Table 2. The colorimetric reference data for the ColorChecker Classic chart provided by X-Rite. X-Rite No. is the patch name used by X-Rite; L is the luminance value; a and b are color coordinates.
Patch No. | X-Rite No. | L | a | b
1 | A4 | 95.19 | −1.03 | 2.93
2 | B4 | 81.29 | −0.57 | 0.44
3 | C4 | 66.89 | −0.75 | −0.06
4 | D4 | 50.76 | −0.13 | 0.14
5 | E4 | 35.63 | −0.46 | −0.48
6 | F4 | 20.64 | 0.07 | −0.46
7 | A3 | 28.37 | 15.42 | −49.8
8 | B3 | 54.38 | −39.72 | 32.27
9 | C3 | 42.43 | 51.05 | 28.62
10 | D3 | 81.8 | 2.67 | 80.41
11 | E3 | 50.63 | 51.28 | −14.12
12 | F3 | 49.57 | −29.71 | −28.32
13 | A2 | 62.73 | 35.83 | 56.5
14 | B2 | 39.43 | 10.75 | −45.17
15 | C2 | 50.57 | 48.64 | 16.67
16 | D2 | 30.1 | 22.54 | −20.87
17 | E2 | 71.77 | −24.13 | 58.19
18 | F2 | 71.51 | 18.24 | 67.37
19 | A1 | 37.54 | 14.37 | 14.92
20 | B1 | 64.66 | 19.27 | 17.5
21 | C1 | 49.32 | −3.82 | −22.54
22 | D1 | 43.46 | −12.74 | 22.72
23 | E1 | 54.94 | 9.61 | −24.79
24 | F1 | 70.48 | −32.26 | −0.37
Table 3. The sample areas.
Sample Area | Material
A | wooden door
B | painted wood
C | painted concrete wall
D | painted vertical slatted timber
E | painted vertical slatted timber
F | painted horizontal slatted timber
G | painted concrete wall
Vertical | textile-covered acoustic board
Horizontal | wooden table
Table 4. The sRGB values (16-bit) of the X-Rite ColorChecker Classic board, calculated from the spectra measured with the spectroradiometer, and scaled to match the X-Rite nominal values.
Patch No. | R | G | B
1 | 62,782 | 62,269 | 56,751
2 | 41,236 | 41,525 | 38,895
3 | 25,391 | 25,701 | 23,993
4 | 13,472 | 13,375 | 12,370
5 | 6498 | 6633 | 6285
6 | 2699 | 2606 | 2414
7 | 2660 | 3735 | 20,103
8 | 4945 | 20,983 | 4284
9 | 29,753 | 3094 | 2760
10 | 57,323 | 40,884 | 285
11 | 35,392 | 6377 | 20,273
12 | −447 | 16,836 | 24,888
13 | 50,372 | 14,128 | 1888
14 | 4941 | 7354 | 26,861
15 | 38,649 | 6226 | 8016
16 | 7631 | 3249 | 9593
17 | 24,366 | 36,005 | 2981
18 | 53,556 | 26,190 | 1328
19 | 12,565 | 6284 | 4060
20 | 38,994 | 20,567 | 14,321
21 | 8588 | 13,919 | 22,313
22 | 7652 | 11,057 | 3514
23 | 16,275 | 14,978 | 28,129
24 | 9578 | 36,325 | 26,761
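The values in Table 4 stem from integrating the measured spectra against the CIE 1931 color matching functions (Figure 4). A hedged sketch of the luminance part of that computation, using a coarse 50 nm sampling of ȳ(λ) and an illustrative flat spectrum rather than the measured data:

```python
# CIE 1931 ybar color matching function at a coarse 50 nm sampling
# (a small subset of the tabulated standard-observer values).
YBAR = {500: 0.323, 550: 0.995, 600: 0.631, 650: 0.107}

def luminance_from_spectrum(spectrum, dlam=50.0):
    """Photopic luminance (cd/m^2) from a spectral radiance distribution.

    spectrum: {wavelength_nm: spectral radiance in W sr^-1 m^-2 nm^-1}
    Approximates L = 683 * integral(S(lambda) * ybar(lambda) dlambda)
    with a rectangle rule at step dlam.
    """
    return 683.0 * dlam * sum(
        s * YBAR.get(lam, 0.0) for lam, s in spectrum.items()
    )

# Illustrative flat spectrum of 0.001 W sr^-1 m^-2 nm^-1 at the sampled bands.
flat = {lam: 0.001 for lam in YBAR}
L = luminance_from_spectrum(flat)
```

The X and Z tristimulus values follow the same pattern with x̄(λ) and z̄(λ); a subsequent linear transform to sRGB primaries yields the (pre-gamma) channel values tabulated above. A real implementation would use the full 1–5 nm tabulation of the standard observer.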
Table 5. The laser scanner luminance measurements compared to a spectroradiometer. The table shows the differences and relative differences between the reference luminance measured with a spectroradiometer and the luminance measured with a TLS. The luminance measured with a TLS was calculated by linear regression, and by linear regression and noise removal.
Patch No. | A | B | C | Diff. (A,C) | Relative Diff. (A,C) | D | Diff. (A,D) | Relative Diff. (A,D)
1 | 329.8 | 48,753.6 | 333.2 | 3.4 | 1.0% | 330.8 | 1.0 | 0.3%
2 | 219.6 | 32,784.2 | 224.1 | 4.4 | 2.0% | 221.0 | 1.4 | 0.6%
3 | 135.8 | 20,786.2 | 142.1 | 6.3 | 4.7% | 138.5 | 2.8 | 2.0%
4 | 70.9 | 11,449.6 | 78.3 | 7.4 | 10.4% | 74.4 | 3.5 | 4.9%
5 | 35.0 | 6165.6 | 42.1 | 7.1 | 20.4% | 38.1 | 3.0 | 8.7%
6 | 13.9 | 2665.2 | 18.2 | 4.3 | 31.1% | 14.0 | 0.1 | 0.7%
A: Spectroradiometer value in cd·m⁻². B: Laser scanner 16-bit value, average of five scans. C: Laser scanner luminance value in cd·m⁻², obtained by linear regression. D: Laser scanner luminance value in cd·m⁻², obtained by linear regression and noise removal.
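Column C of Table 5 is obtained by regressing the reference luminances (column A) against the scanner's linear 16-bit values (column B). A minimal ordinary-least-squares sketch in Python; the paper's actual fit (Figure 9) and its noise-removal step may use more observations than the six grayscale patches, so the coefficients here are only indicative:

```python
def fit_line(x, y):
    """Ordinary least squares for y = a * x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

dn = [48753.6, 32784.2, 20786.2, 11449.6, 6165.6, 2665.2]  # column B (16-bit)
lum = [329.8, 219.6, 135.8, 70.9, 35.0, 13.9]              # column A (cd/m^2)

a, b = fit_line(dn, lum)
calibrated = [a * v + b for v in dn]  # compare with column C
```

Once fitted, the same affine mapping is applied to the 16-bit value of every point in the scan, turning the colorized point cloud into an absolute luminance point cloud.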
Table 6. The consistency of images in five consecutive TLS scans.
Patch No. | #1 | #2 | #3 | #4 | #5 | Avg | STD | RSD
1 | 48,403 | 48,073 | 48,437 | 49,867 | 48,988 | 48,753.6 | 703.7 | 1.44%
2 | 32,658 | 32,384 | 32,803 | 33,317 | 32,759 | 32,784.2 | 339.5 | 1.04%
3 | 20,718 | 20,572 | 20,987 | 21,028 | 20,626 | 20,786.2 | 209.2 | 1.01%
4 | 11,396 | 11,394 | 11,512 | 11,601 | 11,345 | 11,449.6 | 104.5 | 0.91%
5 | 6144 | 6112 | 6180 | 6292 | 6100 | 6165.6 | 77.2 | 1.25%
6 | 2638 | 2636 | 2680 | 2712 | 2660 | 2665.2 | 31.7 | 1.19%
Table 7. The relative luminance values calculated from the TLS 16-bit linear images compared to different sets of reference values. The table presents the values ordered according to the reference color target (Figure 2).
TLS 16-bit linear images compared to the reference values measured with a spectroradiometer
Relative luminance values calculated from the TLS 16-bit linear images
8418.6 | 24,482.7 | 15,137.8 | 10,691.0 | 17,647.0 | 31,555.7
19,673.6 | 10,007.0 | 13,167.5 | 5486.0 | 30,092.9 | 26,973.7
7559.0 | 18,091.8 | 37,820.5 | 37,820.5 | 13,644.1 | 17,318.7
61,969.5 | 41,645.5 | 14,543.5 | 14,543.5 | 7838.0 | 3379.1
Linear spectroradiometer values calculated and scaled from the measured spectra
7458.4 | 24,033.5 | 13,392.0 | 9788.6 | 16,203.3 | 29,948.3
20,950.0 | 8249.2 | 13,248.4 | 4638.9 | 31,146.1 | 30,213.0
4688.5 | 16,367.8 | 8737.3 | 41,447.7 | 13,549.1 | 3742.7
61,979.3 | 41,273.9 | 25,512.0 | 13,322.6 | 6578.9 | 2611.9
Relative difference between the values measured with a TLS and a spectroradiometer
12.9% | 1.9% | 13.0% | 9.2% | 8.9% | 5.4%
6.1% | 21.3% | 0.6% | 18.3% | 3.4% | 10.7%
61.2% | 10.5% | 7.2% | 8.8% | 0.7% | 26.0%
0.0% | 0.9% | 3.5% | 9.2% | 19.1% | 29.4%
Average: 12.0%
Table 8. The adjusted TLS luminance measurements compared to the reference values measured with a spectroradiometer. The table presents the values ordered according to the reference color target (Figure 2).
TLS luminance measurements compared to the reference values measured with a spectroradiometer
Absolute adjusted luminance values measured with a TLS
41.1 | 127.9 | 77.7 | 53.4 | 91.3 | 166.4
101.7 | 50.1 | 66.7 | 25.4 | 158.1 | 141.1
36.8 | 93.4 | 46.2 | 199.6 | 69.5 | 89.6
330.9 | 221.0 | 138.4 | 74.3 | 38.1 | 14.0
Absolute adjusted luminance values measured with a spectroradiometer
39.7 | 127.8 | 71.4 | 52.0 | 86.4 | 159.4
111.3 | 44.1 | 70.5 | 24.7 | 165.5 | 160.5
25.1 | 87.0 | 46.4 | 220.2 | 72.2 | 73.3
329.8 | 219.6 | 135.8 | 70.9 | 35.0 | 13.9
Absolute difference
1.5 | 0.1 | 6.4 | 1.4 | 5.0 | 7.1
9.6 | 6.0 | 3.7 | 0.7 | 7.4 | 19.4
11.7 | 6.4 | 0.3 | 20.6 | 2.7 | 16.3
1.1 | 1.3 | 2.7 | 3.4 | 3.1 | 0.1
Average: 5.7
Relative difference
3.7% | 0.1% | 8.9% | 2.7% | 5.8% | 4.4%
8.6% | 13.7% | 5.3% | 2.8% | 4.5% | 12.1%
46.7% | 7.4% | 0.6% | 9.3% | 3.7% | 22.3%
0.3% | 0.6% | 2.0% | 4.9% | 8.8% | 0.4%
Average: 7.5%
Table 9. The vertical sample area: median luminance, Gaussian mean luminance, minimum luminance, maximum luminance, standard deviation, relative standard deviation, number of points, and angle between the surface normal and the measurement direction.
No. | Lmedian | Lmean | Lmin | Lmax | LSTD | LRSD | Points | Angle (°)
1 | 243.0 | 242.2 | 198.1 | 279.7 | 11.0 | 4.5% | 10,066 | 26
2 | 346.6 | 345.7 | 278.3 | 402.0 | 13.8 | 4.0% | 9460 | 17
3 | 379.9 | 349.9 | 296.9 | 443.0 | 22.7 | 6.0% | 10,184 | 66
4 | 358.7 | 358.2 | 314.7 | 411.7 | 12.9 | 3.6% | 8310 | 59
5 | 334.7 | 334.3 | 283.8 | 388.9 | 15.6 | 4.7% | 3537 | 69
6 | 299.4 | 300.0 | 265.9 | 341.4 | 10.9 | 3.6% | 6273 | 36
7 | 353.8 | 353.4 | 300.9 | 407.9 | 13.9 | 3.9% | 8805 | 18
All | 346.3 | 330.7 | 198.1 | 443.0 | 48.9 | 14.8% | 56,635 | 17–69
All, subsampled | 346.5 | 331.1 | 198.1 | 442.9 | 49.7 | 15.0% | 9295 | 17–69
Table 10. The horizontal sample area: median luminance, Gaussian mean luminance, minimum luminance, maximum luminance, standard deviation, relative standard deviation, number of points, and angle between the surface normal and the measurement direction. Scans 2 and 3 were left with no observations due to the large measurement angle.
No. | Lmedian | Lmean | Lmin | Lmax | LSTD | LRSD | Points | Angle (°)
1 | 244.9 | 244.4 | 194.8 | 346.3 | 14.4 | 5.9% | 7374 | 63
2 | - | - | - | - | - | - | - | 87
3 | - | - | - | - | - | - | - | 88
4 | 369.5 | 369.7 | 311.6 | 410.0 | 12.4 | 3.4% | 3617 | 81
5 | 310.2 | 310.0 | 266.0 | 356.7 | 13.0 | 4.2% | 2264 | 76
6 | 367.5 | 367.9 | 315.2 | 419.5 | 14.4 | 3.9% | 2620 | 76
7 | 428.1 * | 427.9 * | 399.1 * | 443.5 * | 5.1 * | 1.2% * | 6336 | 79
All | 302.6 | 302.7 | 194.8 | 419.5 | 59.2 | 19.5% | 15,875 | 63–88
All, subsampled | 257.3 | 284.2 | 194.8 | 419.5 | 54.4 | 19.1% | 10,367 | 63–88
* Scan number 7 was partly overexposed and therefore omitted from the merged clouds.
Table 11. The sample areas A–G: median luminance, Gaussian mean luminance, minimum luminance, maximum luminance, standard deviation, relative standard deviation, number of points, and angle between the surface normal and the measurement direction.
Sample Area | Lmedian | Lmean | Lmin | Lmax | LSTD | LRSD | Points | Angle (°)
A | 90.7 | 89.3 | 44.0 | 128.8 | 14.0 | 15.7% | 8468 | 16–69
B | 194.3 | 192.5 | 98.0 | 255.0 | 26.2 | 13.6% | 13,247 | 11–51
C | 369.2 | 369.3 | 277.9 | 443.4 | 35.2 | 9.5% | 9376 | 18–63
D | 159.0 | 165.5 | 42.3 | 336.4 | 60.9 | 36.8% | 20,158 | 11–79
E | 268.3 | 268.1 | 117.0 | 430.9 | 56.3 | 21.0% | 19,172 | 9–65
F | 178.6 | 174.4 | 105.0 | 233.2 | 24.3 | 14.0% | 12,208 | 15–65
G | 231.2 | 223.9 | 150.6 | 276.9 | 29.3 | 13.1% | 9836 | 14–63
Vertical | 346.5 | 331.1 | 198.1 | 442.9 | 49.7 | 15.0% | 9295 | 17–69
Horizontal | 257.3 | 284.2 | 194.8 | 419.5 | 54.4 | 19.1% | 10,367 | 63–88
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

Kurkela, M.; Maksimainen, M.; Julin, A.; Rantanen, T.; Virtanen, J.-P.; Hyyppä, J.; Vaaja, M.T.; Hyyppä, H. Utilizing a Terrestrial Laser Scanner for 3D Luminance Measurement of Indoor Environments. J. Imaging 2021, 7, 85. https://doi.org/10.3390/jimaging7050085
