Article

Assessment of the Performance of a Portable, Low-Cost and Open-Source Device for Luminance Mapping through a DIY Approach for Massive Application from a Human-Centred Perspective

by
Francesco Salamone
1,2,*,
Sergio Sibilio
1,2 and
Massimiliano Masullo
2
1
Construction Technologies Institute, National Research Council of Italy (ITC-CNR), Via Lombardia, 49, 20098 San Giuliano Milanese, MI, Italy
2
Dipartimento di Architettura e Disegno Industriale, Università degli Studi Della Campania “Luigi Vanvitelli”, Via San Lorenzo, 81031 Aversa, CE, Italy
*
Author to whom correspondence should be addressed.
Sensors 2022, 22(20), 7706; https://doi.org/10.3390/s22207706
Submission received: 20 September 2022 / Revised: 3 October 2022 / Accepted: 4 October 2022 / Published: 11 October 2022

Abstract
Ubiquitous computing has enabled the proliferation of low-cost solutions for capturing information about the user's environment or biometric parameters. In this sense, the do-it-yourself (DIY) approach, used to build new low-cost systems or to verify how well low-cost systems correspond to professional devices, broadens the range of possible applications. Following this trend, the authors present a complete, replicable DIY procedure to evaluate the performance of a low-cost video luminance meter consisting of a Raspberry Pi and a camera module. The method initially consists of designing and developing an LED panel and a light cube that serve as reference luminance sources. The luminance distribution across the two reference light sources is determined using a Konica Minolta luminance meter. With this approach, it is possible to identify, for each light source, an area with an almost uniform luminance value. By applying a frame that covers part of the panel and exposes only the area with nearly homogeneous luminance values, and by placing the low-cost video luminance meter, mounted on a professional reference camera photometer (LMK mobile air), in front of the sources in a dark space, it is possible to check the discrepancy in luminance values between the low-cost and professional systems when they are pointed at different homogeneous light sources. In doing so, we primarily consider the peripheral shading effect, better known as the vignetting effect. We then differentiate the correction factor S of the Radiance pcomb function to better match the luminance values of the low-cost system to those of the professional device, and we introduce an algorithm to differentiate the S factor depending on the light source. In general, the DIY calibration process described in the paper is time-consuming. However, the subsequent applications in various real-life scenarios allow us to verify the satisfactory performance of the low-cost system in terms of luminance mapping and glare evaluation compared to a professional device.

1. Introduction

Glare is produced by daylight or electrical sources and is essentially characterised by an uneven luminance distribution in the field of view (FoV) [1]. Glare can impair people's visual performance or cause discomfort [2]. There are various indices for quantifying glare in different situations: the unified glare rating (UGR) used for artificial lighting, the daylight glare probability (DGP) for light entering through windows, and the contrast ratio (CR), defined by considering the contrast between certain luminance values and those of the surroundings [3]. In defining glare issues, the luminance of the glare source is, of course, the most important factor, but several other factors are involved in the perception of discomfort, mainly related to the subjective adaptation level, which depends on the ability of the subject's pupils to adapt to the light intensity [4].
Glare assessment can be based on the analysis of light distributions by luminance mapping, which allows rapid data collection over a large FoV. Low dynamic range (LDR) images are limited by the contrast ratio of the camera, i.e., the range between the lightest and darkest parts of the image that it can reproduce. To overcome this technical limitation, it is possible to use high dynamic range (HDR) images, which are created by taking and then combining several different exposures of the same scene. The main advantage of HDR is that it presents a range of luminance similar to that perceived by the human visual system. Although it is possible to create HDR images using an absolute calibration method, there is also the option of using a stepwise method, which is described in detail in Ref. [5].
Based on these premises, this paper aims to describe a do-it-yourself (DIY) approach to calibrating a low-cost wide-angle camera connected to a Raspberry Pi single-board computer. In more detail, the study, which follows the recent CIE 244:2021 technical report [6], intends to answer the following questions:
  • What is the response of a low-cost camera compared with a professional camera photometer in different controlled environments with different light sources?
  • Is there a considerable difference between the luminance values of the low-cost camera and the professional one, and is it possible to apply a correction factor differentiated for the different lighting systems?
  • Finally, is it possible to devise an even simpler algorithm that automatically adjusts the luminance distribution of the low-cost system to that of the professional camera, taking the different lighting systems into account?
The method described in Ref. [5] is time-consuming and cannot be performed automatically in a few seconds on a portable device. We would like to find out whether it is possible to limit the time for capturing the images to less than 3 s and how large the error in the luminance mapping is under this important constraint and with different light sources. For this purpose, we considered two cameras: a professional DSLR camera from Canon equipped with a Sigma fisheye lens, and a Raspberry Pi camera module controlled by a Raspberry Pi. These two devices were positioned in front of different lighting panels used as reference luminance sources (see the Materials and Methods section) to collect data and check the discrepancy between the two camera devices used for luminance mapping. The main results of the study are then applied to different everyday scenarios to confirm our findings. The idea is to verify whether it may be possible to attach the device to a helmet and capture information about the luminance level during the day from a human-centred perspective.
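As a minimal sketch, the bracketed acquisition on the Raspberry Pi could be scripted as follows; the shutter speeds, ISO, white-balance gains and file names are our illustrative assumptions, not the exact settings used in the study:

    #!/bin/bash
    # capture_bracket.sh - acquire three fixed-exposure JPEGs for later HDR fusion.
    # Gains and white balance are locked so the frames differ only in shutter speed.
    STAMP=$(date +%Y%m%d_%H%M)
    for SS in 500 2000 8000; do   # shutter speeds in microseconds (illustrative values)
      raspistill -t 1 -ISO 100 -ex off -awb off -awbg 1.5,1.5 \
                 -ss "$SS" -o "${STAMP}_ss${SS}.jpg"
    done

With a bracketed sequence of this kind, the total acquisition time is dominated by the longest shutter speed, which is what makes the sub-3 s target plausible.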

2. Materials and Methods

Two lighting panels were built, and different light sources (i.e., different light spectra) were considered on a small area with uniform luminance, as described in more detail in Section 2.1 below. Two luminance measurement systems were considered: one based on a low-cost approach and another on a professional reference instrument. For more details on the video luminance meters, see Section 2.2.

2.1. Lighting Panels Used as a Reference Luminance Source

Two different lighting systems were developed for the luminance analysis, following the principle of the DIY approach. They consist of an LED panel and a cube with a standard E27 socket (Figure 1).
The LED panel is composed of different layers, from the bottom:
  • An aluminium frame, with the LED strips located along its long sides;
  • An ethylene-vinyl acetate (EVA) layer;
  • A reflective paper;
  • A light guide panel;
  • A diffuser paper.
The strips consist of SMD2835 LEDs, both cool and warm white, spaced 1.6 cm apart (Figure 1a). A black frame, on which a Cartesian plane was drawn to define a mesh of points with a resolution of 3 × 3 cm, is attached to the panel (Figure 1c).
The cube, with external dimensions of 32 × 32 cm, is realised using laminated pieces of wood, with an inner cover made of white alveolar polypropylene and an E27 lamp socket positioned 6 cm from the bottom (Figure 1b), allowing different lighting sources to be considered (i.e., halogen, fluorescent, incandescent). A sheet of alveolar polypropylene was placed horizontally 15 cm from the floor to reduce the luminance discrepancy on the test surface. The upper surface consists of a white synthetic glass panel. The same Cartesian plane with a grid of 3 × 3 cm points was drawn over this test surface (Figure 1d).
A Konica Minolta LS-110 luminance meter is then used to evaluate the two panels’ luminance distribution, considering a template that follows the reference points across the x and y axes of the Cartesian orthogonal system (Figure 2).
The luminance values of the LED panel are determined in different configurations to allow for CCT and intensity changes. On the other hand, only one configuration is considered for each of the halogen, fluorescent and incandescent lamps in the cube.
This approach made it possible to identify an area of the two panels with small differences in luminance distribution (see the details in Section 3.2 and Appendix A). In this way, it was possible to install masks that limit the effective size of the lighting source to the 6 × 6 cm area characterised by almost constant luminance, which was useful for the subsequent analysis.

2.2. Equipment Used and Flowchart Used to Acquire the High Dynamic Range Images

This study considers a wide-angle camera based on the OV5647 sensor (the V1 camera series), with a focal length of 1.67 mm and an optical diagonal FoV of 160° (horizontal FoV 122°, vertical FoV 89.5°). It has a native resolution of 5 MP and dimensions of 22.5 mm × 24 mm × 9 mm, making it well suited for mobile or other applications. The camera is connected to a Raspberry Pi 3 A+ equipped with a 64-bit quad-core processor running at 1.4 GHz, dual-band 2.4 GHz and 5 GHz wireless LAN, and Bluetooth 4.2/BLE [7]. The data collected by this device are compared with those of the camera photometer based on the Canon EOS 70D digital single-lens reflex (DSLR) camera equipped with a Canon APS-C CMOS sensor and a Sigma Fisheye 4.5 mm F2.8 EX DC HSM lens [8]. Table 1 shows the most important lighting characteristics.
A 3D-printed adapter was designed to install the Raspberry Pi with the wide-angle camera on the fisheye lens of the DSLR camera (Figure 3a). The setup also uses an HD30.1 spectroradiometer data logger equipped with the HD30.S1 probe (Figure 3b) for spectral analysis of light in the visible range (380–780 nm). It enables the calculation of the following photocolorimetric quantities: illuminance (E) in [lx], correlated colour temperature (CCT) in [K], trichromatic coordinates [x,y] (CIE 1931) or [u′,v′] (CIE 1976), and the colour rendering index (CRI_Ra) [9].
Both cameras took three different pictures of the same subject with different exposure times, which were then combined to create an HDR image. The procedure for setting the shutter speed of the camera photometer corresponds to the A2 procedure described in Ref. [10] and is based on the use of the hand-held Konica Minolta luminance meter (Figure 3c), which makes it possible to determine the correct time of high dynamic range (THDR). The procedure allows the measurement of the highest luminance value. The three "CR2" files collected with the camera photometer are then processed with LMK LabSoft to create the HDR file and generate a false-colour image of the luminance.
On the other hand, the three jpg files taken with the low-cost device are processed with the hdrgen software [11] to create the HDR file. The resulting HDR file is processed with the freely available Aftab HDR False Colour Analysis tool (Figure 4).
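As a sketch, and assuming the -o (output) and -r (response curve) options of hdrgen, the fusion step for the low-cost device could look like the following; the file names are illustrative:

    # Fuse three bracketed JPEGs into one HDR file; camera.rsp stores the
    # recovered camera response curve and can be reused in later runs.
    hdrgen 20220705_2128_ss500.jpg 20220705_2128_ss2000.jpg 20220705_2128_ss8000.jpg \
           -o 20220705_2128.hdr -r camera.rsp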

2.3. Final Setup

The final setup of the low-cost camera calibration system is shown in Figure 5.
The illuminated panels face the cameras positioned on a tripod. The tripod was also used, alternatively, to position the spectroradiometer (Figure 5). The vertical position is defined so that the centre of the spectroradiometer, or the centre of the segment connecting the centre of the Canon camera lens to the centre of the Raspberry Pi camera lens, is placed at the same height as the centre of the lit panel. This configuration made it possible to collect luminance data in various configurations with both the professional and the low-cost camera, as well as data on the visible spectrum. The data are then processed to check the discrepancy in the luminance mapping captured by the low-cost camera compared to the professional sensor and to see if the differences can be corrected depending on the lighting source.
Before starting the acquisition, a uniform white target was positioned in front of the camera, and a script was launched to correct the lens shading, also known as the vignetting effect [12,13], in software, following a methodology often used for Raspberry Pi-based microscopes with different types of cameras and customised lenses. Then, we checked whether this software correction was performed correctly. For this reason, in line with Section 2.3.5 of Ref. [5], the setup described here was also used to verify the lens shading [13]. In this case, the tripod was positioned 60 cm from the LED panel, and the area illuminated by the LED panel was reduced to a surface of 2 × 2 cm (Figure 6).
The low-cost camera is rotated by 11.25° each time, covering the FoV of the lens, and three images at different exposures are acquired at each step.

3. Results

3.1. Vignetting Assessment

As reported in the previous paragraph, the setup allows us to acquire three images with different exposures for each rotation angle. By processing the derived HDR file for each rotation step with the Aftab HDR False Colour Analysis tool, we determined the luminance value for the illuminated area. We normalised those values by considering the value in the centre of the image as equal to 1 (relative luminance, y-axis, Figure 7). The same approach was used for the relative distance (x-axis in Figure 7), in line with Ref. [14], where a relative distance equal to 0 refers to the centre of the image and a relative distance of 1 refers to the corner of the image.
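Restating the normalisation just described as a formula, with $L(0)$ the luminance measured with the source at the centre of the frame and $d_{corner}$ the centre-to-corner distance:

\[ L_{rel}(r) = \frac{L(r)}{L(0)}, \qquad r = \frac{d}{d_{corner}}, \]

so that $L_{rel}(0) = 1$ by construction, and any departure from 1 at $r > 0$ measures the residual lens shading.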
Figure 7 allows us to make some useful considerations:
  • After applying the software correction of the low-cost camera as described above, the centre of the image records lower luminance values than the areas towards the corners of the image.
  • It is possible to confirm the symmetrical distribution of the values, in line with expectations.
As confirmed by Figure 7, assuming a symmetrical distribution of the relative luminance differences, it is possible to define a calibration curve that starts from the centre of the FoV and extends to the corner. In this case, the third-order polynomial used in the cal file for the pcomb function is composed of the coefficients reported in Figure 7b. By applying the -f option provided by pcomb with this cal file, it was possible to remove the spatial non-uniformity of luminance, as confirmed by Figure 8.
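The fitted coefficients themselves are given in Figure 7b; as a sketch of how such a correction is typically written for pcomb, a third-order radial cal file could look like the following, where a0...a3 are placeholders to be replaced by the fitted coefficients:

    { vignettingfilter.cal - sketch of a third-order radial correction.
      x, y, xmax, ymax and ri(1), gi(1), bi(1) are pcomb built-ins;
      a0..a3 must be replaced by the coefficients of the calibration fit. }
    a0 = 1; a1 = 0; a2 = 0; a3 = 0;
    xn = (x - xmax/2) / (xmax/2);
    yn = (y - ymax/2) / (ymax/2);
    rd = sqrt(xn*xn + yn*yn) / sqrt(2);   { 0 = centre of the image, 1 = corner }
    corr = 1 / (a0 + a1*rd + a2*rd*rd + a3*rd*rd*rd);
    ro = corr * ri(1);
    go = corr * gi(1);
    bo = corr * bi(1);

A file of this kind is what the -f option consumes, e.g., pcomb -f vignettingfilter.cal input.hdr > corrected.hdr.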
Figure 8 shows that, after the correction, the relative luminance distribution at the different relative distances is almost equal to 1. In the next paragraph, we focus on the difference between the luminance values of the low-cost camera and those monitored with the professional camera.

3.2. Panel and Cube Characterisation with Konica Minolta Luminance Reference Meter

Appendix A shows the details of the luminance analysis resulting from applying the Konica Minolta luminance meter over the two reference sources. The data are classified considering a string consisting of three parts (e.g., 100_C_1). The first part identifies the light intensity (100% or 50%), and the second identifies the white type among warm (W), cool (C) or neutral (N). The third is the distance from the lighting source: 1 = 55 cm, 2 = 30 cm, 3 = 15 cm. In the case of the cube, the letters H, F or I indicate, respectively, the halogen, fluorescent or incandescent lamp used in the test without changing the intensity, and the numbers 4 = 55 cm, 5 = 30 cm or 6 = 15 cm refer to the distance from the reference lighting source. In all cases, it is possible to check the luminance distribution over the reference surfaces and identify an area of 6 × 6 cm where the monitored values are almost constant. Even though we do not know the light distribution of the warm white and cool white LEDs, because the manufacturer's data (e.g., an .ies file) are unavailable, we verified experimentally that the selected area is the same for the different LED configurations. This is due to the geometric distribution of the LEDs (Figure 1a), which is practically the same for warm and cool white LEDs, supporting the idea that there is no relevant difference in light distribution between the two types. Table 2 summarises some details of the luminance of the areas marked in black in Appendix A.
Figure 9 summarises the spectrum profiles for the different configurations considered. For better comprehension of the light source colour rendition, see Ref. [15].

3.3. Camera Photometer and Raspberry Camera Comparison

Table 3 reports in the second and third columns the pairwise results of the luminance evaluation with the camera photometer and the Raspberry Pi camera. The table also reports the value of the dimensionless coefficient S, the ratio between the luminance value measured with the camera photometer and that measured with the Raspberry Pi camera. This is a factor used in the pcomb program [16], developed by Greg Ward, to edit the starting HDR image. The corrected luminance values are obtained by applying the S_pcomb factor. The last four columns report data acquired with the spectroradiometer.
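In formula form, for each configuration:

\[ S = \frac{L_{photometer}}{L_{Raspberry}}, \]

where $L_{photometer}$ and $L_{Raspberry}$ are the luminance values in the second and third columns of Table 3.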
Table 3 shows that, for all the considered LED lighting configurations, the S_pcomb factor is equal to 0.105 on average, with a minimum value of 0.08 and a maximum of 0.12. The average S_pcomb is 0.042 for the halogen lamp configurations, 0.116 for daylight, 0.136 for the fluorescent lamp and 0.045 for the incandescent lamp.
To answer the second question posed in the Introduction, we want to verify whether it is possible to classify S_pcomb as a function of some of the variables reported in Table 3. For this purpose, Figure 10 reports S_pcomb in two-dimensional plots as a function of different parameters characterising the different spectra.
S_pcomb does not seem to be clearly classifiable considering only one parameter among CRI_Ra (Figure 10a), CCT (Figure 10b), the integral of spectral irradiance (Figure 10c) and E (Figure 10d). It is possible to highlight that all LED configurations are characterised by a CRI_Ra of less than 81. While the daylight and halogen configurations are both characterised by a CRI_Ra higher than 90, the difference between them in terms of the integral of spectral irradiance is remarkable. For this reason, it is feasible to define a conditional statement that classifies the lighting source as LED, halogen, fluorescent, incandescent or daylight and consequently identifies the correct S factor:
  • IF CRI_Ra ≤ 81 => "LED" => S = 0.105;
  • ELSE IF 81 < CRI_Ra ≤ 90 & Integral of spectral irradiance < 300 => "Fluorescent" => S = 0.136;
  • ELSE IF CRI_Ra > 90 & Integral of spectral irradiance < 300 => "Incandescent" or "Halogen" => S = 0.043;
  • ELSE => "Daylight" => S = 0.116.
The fairly marginal difference between halogen and incandescent lamps and the minimal difference in terms of the S factor convinced us to consider an average value for S equal to 0.043 and not to distinguish between the two types of lamps.
In this way, we can apply the proper S_pcomb factor for each of the different lighting sources.
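A minimal sketch of this selection in shell syntax, consistent with the commands used later in the paper (the argument handling is our illustrative assumption):

    #!/bin/bash
    # pick_s.sh - select the S factor from CRI_Ra ($1) and the integral of
    # spectral irradiance in mW/m2 ($2), mirroring the conditional statement above.
    CRI=$1; INTEGRAL=$2
    if   awk "BEGIN { exit !($CRI <= 81) }";                    then S=0.105   # LED
    elif awk "BEGIN { exit !($CRI <= 90 && $INTEGRAL < 300) }"; then S=0.136   # fluorescent
    elif awk "BEGIN { exit !($CRI > 90  && $INTEGRAL < 300) }"; then S=0.043   # incandescent or halogen
    else                                                             S=0.116   # daylight
    fi
    echo "$S"   # e.g., pcomb -f vignettingfilter.cal -s "$S" input.hdr > output.hdr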

3.4. False-Colour Analysis in Real Cases

Different scenarios are considered:
  • indoor space, office with daylight only (lat: 45.40182, long: 9.24962; date: 07/04/2022; time: 13:02) (CRI_Ra > 90 (96.4) and Integral of spectral irradiance > 300 (1112.5) => “Daylight” => S = 0.116);
  • indoor space, office with daylight and fluorescent lamps (lat: 45.40182, long: 9.24962; date: 07/04/2022; time: 13:22) (CRI_Ra > 90 (94.3) and Integral of spectral irradiance > 300 (1226.1) => “Daylight” => S = 0.116);
  • indoor space, industrial building (lat: 45.40182, long: 9.24962; date: 07/04/2022; time: 14:02) (CRI_Ra > 90 (96.5) and Integral of spectral irradiance > 300 (507.87) => "Daylight" => S = 0.116);
  • outdoor space, Ponte Coperto (PV) (lat: 45.180681, long: 9.156303; date: 06/26/2022; time: 08:50) (CRI_Ra > 90 (96.5) and Integral of spectral irradiance > 300 (4680.44) => “Daylight” => S = 0.116);
  • indoor space, living room at dusk (lat: 45.163057, long: 9.135930; date: 07/05/2022; time: 21:28) (CRI_Ra ≤ 81 (80.2) => “LED” => S = 0.105).
Figure 11 shows the comparison of luminance mapping in false colour, considering the proper S factor, defined in accordance with the conditional statement used to classify the predominant light source.
It is possible to make the following considerations about the luminance distribution [cd/m2] with the HDRs acquired with the two systems:
  • The Raspberry Pi camera has a lower resolution and a smaller FoV, but this was known in advance;
  • Even in a very low-light scenario (living room at dusk), there is good agreement in terms of luminance distribution, demonstrating that the criterion for selecting the light source, and consequently the correct S factor to apply to the low-cost HDR image, works well.

3.5. Glare Index Analysis

To perform the glare analysis, different methods are considered, depending on the system used.
In the case of the low-cost instrument, two different methods are used:
  • The first one considers a task area, as recommended in Ref. [17], an approach that is useful especially in scenarios 1, 2 and 5, where users are expected to concentrate their gaze on a specific area. The average luminance is calculated, and each pixel exceeding this value multiplied by a default factor equal to 5 [17] is considered a glare source.
  • The second approach, which is especially useful in the case of walking, when the users' gaze is not concentrated on a specific area, does not consider a task area, in contrast to what is reported in Ref. [17]. This allows us to consider the entire captured area. In this case, a constant threshold luminance level equal to 1500 cd/m2 is used. This second method also considers the difference in glare assessment due to the different FoV of the acquired images. Depending on the derived HDR image, two different approaches are considered (Figure 12).
For the HDR file generated with the professional camera photometer, the value of UGR is defined in accordance with Section 17.1.5 of Ref. [18], as synthesised in Figure 12a. Among the different methods of glare calculation reported in Ref. [18], we considered the following three:
a. The first method, the most accurate, is based on the analysis of the overall luminance histogram and sets the first minimum after the first maximum as the luminance threshold level.
b. The second method is based on using a task area defined in LMK LabSoft; the average luminance of the task area, multiplied by a factor set to 5, is used as the threshold level.
c. The third method is based on manually setting a luminance threshold level, in this case equal to 1500 cd/m2 for the first four scenarios, while for the fifth a value equal to 1000 cd/m2 is considered.
The low-cost images are processed with ra_xyze to create the RGBE Radiance file with the following command:
  • ra_xyze -r -o 20220705_2128.hdr > 20220705_2128_EVinpixel.hdr
  • The pcomb function is then used to apply the S factor and the vignetting adjustment, as reported in the following example:
  • pcomb -f vignettingfilter.cal -s 0.105 -o 20220705_2128_EVinpixel.hdr > 20220705_2128_EVinpixel_0105corr.hdr
Then, a smaller image with the .pic extension is created using the pfilt program [19]:
  • pfilt -1 -e 1 -x 1120 -y 840 20220705_2128_EVinpixel_0105corr.hdr > 20220705_2128_EVinpixel_0105corr.pic
  • In general: pfilt -1 -e 1 -x 1120 -y 840 xxx.hdr > xxx.pic (where "xxx" expresses the name of the initial hdr file)
Then, the evalglare program [17] is used to calculate the glare metrics:
  • When the task area is considered, the following script is used, which first calculates the glare indices and then saves a pic file with the task area highlighted:
  • evalglare -T 580 350 0.7 -vth -vv 122 -vh 90 -c taskarea.pic 20220704_1302_EVinpixel_0116corr.pic
  • In the case of scenarios 3 and 4, which are typically walking scenarios, the y position of the task area is lowered slightly and set equal to 100, imagining that the user is focused on the area where they will place their feet. Then, the pic file is converted to a more convenient tif file as follows:
  • ra_tiff -z taskarea.pic taskarea.tif
  • Meanwhile, when the entire captured area is considered, the following script is used:
  • evalglare -vth -vv 122 -vh 90 -b 1500 xxx.pic > glare_xxx.txt
The -b option sets the threshold luminance value in line with the third method used for the professional glare calculation. In the case of scenario 5, this value is set to 1000 cd/m2.
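For reference, the UGR that both toolchains evaluate is the standard CIE formulation (reported here for completeness; LMK LabSoft and evalglare implement it internally):

\[ \mathrm{UGR} = 8 \log_{10} \left( \frac{0.25}{L_b} \sum_i \frac{L_{s,i}^2\, \omega_i}{p_i^2} \right), \]

where $L_b$ is the background luminance in cd/m2, $L_{s,i}$ the luminance of glare source $i$, $\omega_i$ its solid angle and $p_i$ its Guth position index.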
Table 4 reports the values of UGR for the different scenarios and the different methods considered above, together with the corresponding sensation on the 9-point Hopkinson glare scale [20,21].
Table 4 allows some useful comments to be made. Even if we consider only the professional device, the glare evaluation in relation to the sensation scale can be very different in cases that do not correspond to a "standard" office scenario. In particular, if we look at scenario 4 (outdoor assessment), we can see that the glare sensation calculated based on the professional device data could be "uncomfortable", "unacceptable" or "just uncomfortable", depending on the method used.
On the other hand, if we compare the results of methods 1 and 2 of the low-cost device for scenarios 1, 2 and 5 with the corresponding methods b and c of the professional device, we can see that there is a difference in terms of glare sensation when a task area is considered (methods 1 and b), while there is no difference when the whole area is evaluated (methods 2 and c). Additionally, when we compare method 2 with method a for the same scenarios, there are no differences in glare sensation.

4. Discussion and Future Improvements

The idea of performing luminance mapping with a low-cost camera is certainly not new [22,23]: the cost is more than an order of magnitude lower than that of professional equipment, and the automated procedure for determining the glare index is very fast compared to a classic manual procedure in which the photos have to be copied to a PC and then processed with dedicated software. The novelty of the proposed approach lies in the DIY procedure used to assess the performance of the low-cost camera, which makes the method practically replicable and applicable to light sources other than those considered in this study. With this in mind, Figure 13 shows the profiles of the light sources considered, at 100% light intensity and for the position closest to the reference light source, in the 16 hue bins circle, which allows us to identify the hue shift compared to a reference blackbody radiator (black line in the figure).
Some of the sources (incandescent lamp—I_6_100, halogen—H_6_100 and daylight—D_3) have a similar colour behaviour to the reference colour; others deviate by a maximum of 20% (fluorescent lamp—F_6_100, warm white LED—W_3_100, cold white LED—C_3_100, neutral white LED—N_3_100) and others still (blue LED light—NB_3_100, red LED light—NR_3_100, green LED light—NG_3_100) are intentionally very far from the black reference circle. A comprehensive overview of the colour rendering of all light sources can be found in Ref [15].
The approach described here can thus be replicated, which is why additional information is provided in Appendix A and in Ref. [15]: other researchers interested in the same aspect could rebuild the simple and inexpensive instrumentation to understand how the system behaves under sources different from those considered so far, or could study contrasting fields, with bright and dark areas side by side, which may also influence the final glare assessment due to the small size of the optical element of the Raspberry Pi camera.
Another consideration is the presence of multiple light sources. In this case, the algorithm considers the total spectrum and then applies the correction coefficient of the predominant source. For example, in the case of daylight at midday, which is predominant compared to the fluorescent spectrum, the algorithm classifies the total spectrum as "daylight" and assigns the corresponding S factor (S = 0.116, Figure 11b,c), while at dusk, when the daylight contribution is low and LED light is present, the algorithm classifies the total spectrum as "LED" and assigns the corresponding S factor (S = 0.105, Figure 11h,i). The approach designed in this way allows combinations of light sources to be handled by considering the total spectrum.
A future improvement could involve placing a surface orthogonal (or at a different angle) to the illuminated area, applying different surface finishes to it, and investigating how the reflection could affect the luminance mapping of the low-cost system. This aspect is not considered in this study but does not seem to significantly impact the overall luminance mapping and glare assessment. Another improvement could be the use of a camera with a wider FoV.
Another consideration regards the use of this low-cost solution for glare assessment. If we refer to the results of Section 3.4, in our opinion it would be possible to use the low-cost solution for indoor glare assessment in office spaces (scenarios 1 and 2) or home environments (scenario 5). Using the low-cost system for glare assessment in outdoor spaces (scenario 4) or in indoor spaces that differ from the classical office space (scenario 3) requires further investigation since, as shown, even the professional device can give different results depending on the method used.

5. Conclusions

A new calibration setup based on a DIY approach was proposed. The setup made it possible to calibrate a low-cost camera and compare the results in terms of luminance mapping with a professional DSLR camera photometer, both in a controlled environment and in real case studies.
According to the main questions formulated at the beginning of this study, we can conclude that:
  • Luminance mapping can be performed using a low-cost camera if it is subjected to a time-consuming but necessary calibration process;
  • The S factor of the pcomb function allows us to consider a correction factor that can be applied to the low-cost system to better match the luminance values of the professional device;
  • The S factor can be differentiated by considering different light sources, and in our study, we introduce a rough algorithm that performs this;
  • The calibration process could be replicated following a DIY approach to account for the different limitations/improvements, as described in the previous section.

Author Contributions

Conceptualisation, F.S., M.M. and S.S.; methodology, F.S., M.M. and S.S.; data acquisition and analysis, F.S.; writing—original draft preparation, F.S.; writing—review and editing, F.S., M.M. and S.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Luminance distribution for the different configurations. The selected 6 × 6 cm region is marked in black.
50_C
y/x [cm] | 0.5 | 3 | 6 | 9 | 12 | 15 | 18 | 21 | 24 | 27 | 30 | 33 | 36
0.5 | 440 | 704 | 655 | 731 | 678 | 631 | 568 | 584 | 636 | 640 | 613 | 620 | 479
3 | 691 | 1040 | 1066 | 1050 | 1030 | 1029 | 1024 | 1009 | 1011 | 1022 | 1074 | 1054 | 680
6 | 693 | 1009 | 1017 | 1033 | 1028 | 1037 | 1044 | 1052 | 1031 | 1035 | 1033 | 1019 | 706
9 | 692 | 983 | 1002 | 1009 | 1024 | 1058 | 1083 | 1089 | 1050 | 1027 | 1016 | 1012 | 665
12 | 654 | 980 | 1001 | 990 | 1001 | 1057 | 1072 | 1083 | 1030 | 1017 | 995 | 975 | 658
15 | 700 | 961 | 976 | 982 | 990 | 1017 | 1031 | 1036 | 1019 | 1012 | 1002 | 958 | 698
18 | 748 | 985 | 979 | 984 | 997 | 1018 | 1020 | 1022 | 1010 | 1016 | 1009 | 984 | 682
21 | 528 | 701 | 759 | 786 | 715 | 812 | 557 | 758 | 439 | 523 | 545 | 568 | 488
100_C
y/x [cm] | 0.5 | 3 | 6 | 9 | 12 | 15 | 18 | 21 | 24 | 27 | 30 | 33 | 36
0.5 | 787 | 1190 | 1231 | 1129 | 1209 | 1099 | 1071 | 1287 | 1284 | 1142 | 1161 | 1376 | 761
3 | 1527 | 1887 | 1882 | 1897 | 1866 | 1852 | 1846 | 1799 | 1811 | 1823 | 1918 | 1896 | 1269
6 | 1164 | 1799 | 1807 | 1858 | 1867 | 1874 | 1888 | 1870 | 1858 | 1849 | 1873 | 1792 | 1244
9 | 1126 | 1755 | 1758 | 1823 | 1855 | 1902 | 1950 | 1916 | 1853 | 1837 | 1820 | 1768 | 1141
12 | 1171 | 1766 | 1763 | 1797 | 1834 | 1887 | 1926 | 1890 | 1822 | 1805 | 1773 | 1717 | 1160
15 | 1200 | 1714 | 1725 | 1775 | 1787 | 1825 | 1848 | 1824 | 1809 | 1799 | 1767 | 1702 | 1101
18 | 1296 | 1752 | 1770 | 1766 | 1790 | 1821 | 1826 | 1802 | 1811 | 1807 | 1770 | 1748 | 1211
21 | 796 | 1358 | 1288 | 1343 | 1187 | 1671 | 1561 | 1288 | 1291 | 1423 | 1552 | 1278 | 802
50_W
y/x [cm] | 0.5 | 3 | 6 | 9 | 12 | 15 | 18 | 21 | 24 | 27 | 30 | 33 | 36
0.5 | 374 | 682 | 693 | 657 | 661 | 641 | 635 | 626 | 627 | 648 | 636 | 606 | 614
3 | 699 | 1028 | 1038 | 1060 | 1062 | 1023 | 999 | 989 | 985 | 999 | 1017 | 1048 | 654
6 | 643 | 989 | 1023 | 1043 | 1051 | 1032 | 1028 | 1019 | 1014 | 1009 | 1024 | 1020 | 666
9 | 641 | 974 | 1003 | 1017 | 1033 | 1056 | 1075 | 1055 | 1034 | 1021 | 1018 | 1011 | 627
12 | 654 | 960 | 986 | 988 | 1013 | 1043 | 1073 | 1046 | 1017 | 1008 | 1006 | 997 | 635
15 | 585 | 931 | 954 | 973 | 1001 | 1012 | 1014 | 1005 | 997 | 992 | 976 | 945 | 664
18 | 649 | 949 | 977 | 1001 | 1001 | 996 | 994 | 988 | 993 | 985 | 978 | 968 | 662
21 | 462 | 578 | 559 | 623 | 682 | 663 | 666 | 682 | 669 | 588 | 623 | 705 | 451
100_W
y/x [cm] | 0.5 | 3 | 6 | 9 | 12 | 15 | 18 | 21 | 24 | 27 | 30 | 33 | 36
0.5 | 719 | 1355 | 1356 | 1301 | 1287 | 1231 | 1212 | 1247 | 1314 | 1355 | 1387 | 1340 | 696
3 | 906 | 1865 | 1878 | 1908 | 1909 | 1852 | 1819 | 1796 | 1792 | 1826 | 1881 | 1916 | 830
6 | 1122 | 1775 | 1846 | 1878 | 1895 | 1881 | 1874 | 1848 | 1839 | 1843 | 1869 | 1847 | 1009
9 | 1037 | 1766 | 1822 | 1842 | 1883 | 1912 | 1944 | 1910 | 1871 | 1847 | 1860 | 1845 | 1092
12 | 1114 | 1771 | 1797 | 1803 | 1833 | 1899 | 1942 | 1878 | 1832 | 1812 | 1806 | 1794 | 1139
15 | 1006 | 1702 | 1754 | 1786 | 1814 | 1826 | 1821 | 1811 | 1805 | 1799 | 1779 | 1723 | 1095
18 | 1204 | 1735 | 1745 | 1797 | 1830 | 1820 | 1798 | 1800 | 1810 | 1803 | 1797 | 1808 | 970
21 | 723 | 1026 | 1062 | 1222 | 1219 | 1116 | 1085 | 1056 | 1099 | 1164 | 1028 | 1220 | 766
50_N
y/x [cm] | 0.5 | 3 | 6 | 9 | 12 | 15 | 18 | 21 | 24 | 27 | 30 | 33 | 36
0.5 | 497 | 642 | 705 | 682 | 665 | 649 | 697 | 654 | 664 | 678 | 718 | 753 | 544
3 | 573 | 1000 | 1011 | 1016 | 1012 | 990 | 978 | 959 | 964 | 972 | 1007 | 1021 | 596
6 | 661 | 952 | 989 | 1001 | 1011 | 1010 | 1010 | 996 | 990 | 989 | 1000 | 979 | 524
9 | 562 | 950 | 977 | 984 | 1008 | 1032 | 1048 | 1025 | 1007 | 990 | 986 | 969 | 534
12 | 576 | 949 | 960 | 969 | 992 | 1017 | 1038 | 1018 | 995 | 980 | 971 | 953 | 524
15 | 565 | 924 | 943 | 954 | 968 | 983 | 987 | 979 | 973 | 966 | 950 | 919 | 629
18 | 652 | 941 | 942 | 957 | 969 | 972 | 967 | 959 | 964 | 962 | 955 | 942 | 632
21 | 420 | 572 | 627 | 669 | 630 | 606 | 608 | 592 | 594 | 617 | 634 | 656 | 434
100_N
y/x [cm] | 0.5 | 3 | 6 | 9 | 12 | 15 | 18 | 21 | 24 | 27 | 30 | 33 | 36
0.5 | 570 | 996 | 982 | 1019 | 1003 | 886 | 910 | 913 | 1015 | 932 | 948 | 989 | 551
3 | 783 | 1786 | 1801 | 1820 | 1819 | 1780 | 1754 | 1730 | 1728 | 1750 | 1798 | 1817 | 1227
6 | 1055 | 1694 | 1769 | 1794 | 1810 | 1809 | 1810 | 1790 | 1771 | 1772 | 1794 | 1769 | 1108
9 | 1056 | 1695 | 1749 | 1770 | 1805 | 1845 | 1880 | 1841 | 1805 | 1778 | 1776 | 1742 | 1037
12 | 971 | 1686 | 1719 | 1737 | 1771 | 1825 | 1852 | 1826 | 1789 | 1761 | 1748 | 1710 | 926
15 | 899 | 1648 | 1690 | 1703 | 1729 | 1755 | 1765 | 1756 | 1749 | 1738 | 1708 | 1643 | 922
18 | 954 | 1673 | 1688 | 1710 | 1733 | 1737 | 1729 | 1718 | 1731 | 1726 | 1700 | 1666 | 964
21 | 656 | 999 | 1014 | 1042 | 1031 | 892 | 875 | 847 | 867 | 938 | 1005 | 1015 | 556
Cube_H
y/x [cm] | 0.5 | 3 | 6 | 9 | 12 | 15 | 18 | 21 | 24 | 27
0.5 | 164 | 180 | 185 | 197 | 198 | 193 | 201 | 193 | 208 | 177
3 | 251 | 274 | 282 | 292 | 295 | 296 | 295 | 294 | 288 | 218
6 | 244 | 283 | 291 | 301 | 309 | 312 | 312 | 311 | 301 | 228
9 | 259 | 297 | 310 | 320 | 324 | 329 | 327 | 324 | 312 | 228
12 | 278 | 308 | 323 | 331 | 336 | 338 | 342 | 338 | 328 | 231
15 | 304 | 315 | 328 | 341 | 351 | 355 | 355 | 350 | 342 | 317
18 | 311 | 325 | 343 | 359 | 366 | 369 | 370 | 370 | 357 | 324
21 | 292 | 329 | 343 | 358 | 368 | 375 | 376 | 375 | 365 | 290
24 | 288 | 318 | 334 | 332 | 339 | 347 | 353 | 352 | 343 | 286
27 | 247 | 286 | 303 | 315 | 320 | 327 | 331 | 325 | 311 | 225
Cube_F
y/x [cm] | 0.5 | 3 | 6 | 9 | 12 | 15 | 18 | 21 | 24 | 27
0.5 | 323 | 378 | 389 | 414 | 416 | 405 | 422 | 405 | 437 | 372
3 | 527 | 575 | 592 | 613 | 620 | 622 | 620 | 617 | 605 | 458
6 | 512 | 594 | 611 | 632 | 649 | 655 | 655 | 653 | 632 | 479
9 | 544 | 624 | 651 | 672 | 680 | 691 | 687 | 680 | 655 | 479
12 | 584 | 647 | 678 | 695 | 706 | 710 | 718 | 710 | 689 | 485
15 | 638 | 662 | 689 | 716 | 737 | 746 | 746 | 735 | 718 | 666
18 | 653 | 683 | 720 | 754 | 769 | 775 | 777 | 777 | 750 | 680
21 | 613 | 691 | 720 | 752 | 773 | 788 | 790 | 788 | 767 | 609
24 | 605 | 668 | 701 | 697 | 712 | 729 | 741 | 739 | 720 | 601
27 | 519 | 601 | 636 | 662 | 672 | 687 | 695 | 683 | 653 | 473
Cube_I
y/x [cm] | 0.5 | 3 | 6 | 9 | 12 | 15 | 18 | 21 | 24 | 27
0.5 | 307 | 359 | 369 | 393 | 395 | 385 | 401 | 385 | 415 | 353
3 | 501 | 547 | 563 | 583 | 589 | 591 | 589 | 587 | 575 | 435
6 | 487 | 565 | 581 | 600 | 616 | 622 | 622 | 620 | 600 | 455
9 | 517 | 593 | 618 | 638 | 646 | 656 | 652 | 646 | 622 | 455
12 | 555 | 614 | 644 | 660 | 670 | 674 | 682 | 674 | 654 | 461
15 | 606 | 628 | 654 | 680 | 700 | 708 | 708 | 698 | 682 | 632
18 | 620 | 648 | 684 | 716 | 730 | 736 | 738 | 738 | 712 | 646
21 | 583 | 656 | 684 | 714 | 734 | 748 | 750 | 748 | 728 | 579
24 | 575 | 634 | 666 | 662 | 676 | 692 | 704 | 702 | 684 | 571
27 | 493 | 571 | 604 | 628 | 638 | 652 | 660 | 648 | 620 | 449

References

  1. Hirning, M.; Coyne, S.; Cowling, I. The use of luminance mapping in developing discomfort glare research. J. Light Vis. Environ. 2010, 34, 101–104. [Google Scholar] [CrossRef] [Green Version]
  2. Scorpio, M.; Laffi, R.; Masullo, M.; Ciampi, G.; Rosato, A.; Maffei, L.; Sibilio, S. Virtual reality for smart urban lighting design: Review, applications and opportunities. Energies 2020, 13, 3809. [Google Scholar] [CrossRef]
  3. Bellazzi, A.; Danza, L.; Devitofrancesco, A.; Ghellere, M.; Salamone, F. An artificial skylight compared with daylighting and LED: Subjective and objective performance measures. J. Build. Eng. 2022, 45, 103407. [Google Scholar] [CrossRef]
  4. Pierson, C.; Wienold, J.; Bodart, M. Review of Factors Influencing Discomfort Glare Perception from Daylight. LEUKOS J. Illum. Eng. Soc. N. Am. 2018, 14, 111–148. [Google Scholar] [CrossRef]
  5. Pierson, C.; Cauwerts, C.; Bodart, M.; Wienold, J. Tutorial: Luminance Maps for Daylighting Studies from High Dynamic Range Photography. LEUKOS J. Illum. Eng. Soc. N. Am. 2021, 17, 140–169. [Google Scholar] [CrossRef] [Green Version]
  6. Krüger, U.; Blattner, P.; Bergen, T.; Bouroussis, C.; Campos Acosta, J.; Distl, R.; Heidel, G.; Ledig, J.; Rykowski, R.; Sauter, G.; et al. CIE 244: 2021 Characterization of Imaging Luminance Measurement Devices (ILMDs); International Commission on Illumination: Vienna, Austria, 2021; p. 59. [Google Scholar]
  7. Raspberry Pi 3 Model B. Available online: https://www.raspberrypi.com/products/raspberry-pi-3-model-b/ (accessed on 8 March 2022).
  8. LMK Camera Photometer Description. Available online: https://www.technoteam.de/apool/tnt/content/e5183/e5432/e5733/e6645/lmk_ma_web_en2016_eng.pdf (accessed on 16 July 2022).
  9. Deltaohm Spectroradiometer Description. Available online: https://www.deltaohm.com/product/hd30-1-spectroradiometer-data-logger/ (accessed on 16 July 2022).
  10. Wolska, A.; Sawicki, D. Practical application of HDRI for discomfort glare assessment at indoor workplaces. Meas. J. Int. Meas. Confed. 2020, 151, 107179. [Google Scholar] [CrossRef]
  11. Anyhere Software. Available online: anyhere.com (accessed on 16 July 2022).
  12. Lens Shading Correction for Raspberry Pi Cam. Available online: https://openflexure.discourse.group/t/lens-shading-correction-for-raspberry-pi-camera/682/2 (accessed on 16 July 2022).
  13. Bowman, R.W.; Vodenicharski, B.; Collins, J.T.; Stirling, J. Flat-Field and Colour Correction for the Raspberry Pi Camera Module. arXiv 2021, arXiv:1911.13295. [Google Scholar] [CrossRef] [Green Version]
  14. Jacobs, A.; Wilson, M. Determining Lens Vignetting with HDR Techniques. In Proceedings of the XII National Conference on Lighting, Varna, Bulgaria, 10–12 June 2007; pp. 10–12. [Google Scholar]
  15. ies-tm30 Files of All Considered Lighting Sources. Available online: https://cnrsc-my.sharepoint.com/personal/francesco_salamone_cnr_it/_layouts/15/onedrive.aspx?id=%2Fpersonal%2Ffrancesco_salamone_cnr_it%2FDocuments%2FDottorato_Vanvitelli%2FPaper_raspi_cam%2FPaper_Annex_ies-tm30&ga=1 (accessed on 1 September 2022).
  16. Pcomb-Radiance. Available online: https://floyd.lbl.gov/radiance/man_html/pcomb.1.html (accessed on 16 July 2022).
  17. Evalglare-Radiance. Available online: https://www.radiance-online.org/learning/documentation/manual-pages/pdfs/evalglare.pdf/at_download/file (accessed on 16 July 2022).
  18. Operation Manual LMK LabSoft. Available online: https://www.technoteam.de/apool/tnt/content/e5183/e5432/e5733/e5735/OperationmanualLMKLabSoft_eng.pdf (accessed on 16 July 2022).
  19. Pfilt-Radiance. Available online: https://floyd.lbl.gov/radiance/man_html/pfilt.1.html (accessed on 14 March 2022).
  20. Carlucci, S.; Causone, F.; De Rosa, F.; Pagliano, L. A review of indices for assessing visual comfort with a view to their use in optimization processes to support building integrated design. Renew. Sustain. Energy Rev. 2015, 47, 1016–1033. [Google Scholar] [CrossRef] [Green Version]
  21. Sawicki, D.; Wolska, A. The Unified semantic Glare scale for GR and UGR indexes. In Proceedings of the IEEE Lighting Conference of the Visegrad Countries, Karpacz, Poland, 13–16 September 2016. [Google Scholar] [CrossRef]
  22. Mead, A.; Mosalam, K. Ubiquitous luminance sensing using the Raspberry Pi and Camera Module system. Light. Res. Technol. 2017, 49, 904–921. [Google Scholar] [CrossRef]
  23. Huynh, T.T.M.; Nguyen, T.-D.; Vo, M.-T.; Dao, S.V.T. High Dynamic Range Imaging Using A 2x2 Camera Array with Polarizing Filters. In Proceedings of the 19th International Symposium on Communications and Information Technologies (ISCIT), Ho Chi Minh City, Vietnam, 25–27 September 2019; pp. 183–187. [Google Scholar]
Figure 1. Luminance device based on a DIY approach: (a) LED panel as built; (b) wood cube with halogen E27 bulb lamp as built; (c) LED panel finished; (d) wood cube with warm white halogen lamp finished.
Figure 2. Example of characterisation of the LED panel (the same approach was used to characterise the cube).
Figure 3. Equipment used: (a) reference camera photometer and low-cost Raspberry Pi with a wide-angle camera mounted on the 3D printed support; (b) spectroradiometer and light probe; (c) Konica Minolta luminance reference meter.
Figure 4. Flowchart used to create false-colour luminance map: (a) professional system; (b) low-cost system.
Figure 5. Plan view of the setup for acquiring the luminance mapping of the selected region of the LED panel and cube panel.
Figure 6. Setup for vignetting assessment: LED panel set as neutral white with 100% of intensity. Illuminated area = 2 × 2 cm2. Low-cost camera positioned at 60 cm from the LED panel.
Figure 7. Lens shading effect pre-assessment: (a) relative distance 0 = centre of the image, (b) relative distance 1 = corner of the image.
Figure 8. Lens shading effect post-assessment: relative distance 0 = centre of the image, relative distance 1 = corner of the image.
Figure 9. Spectrum plot differentiated for the different configurations: in the legend, X = W (Warm LED white) or N (Neutral LED white) or C (Cool LED white) or NB (Blue filter over Neutral LED white) or NG (Green filter over Neutral LED white) or NR (Red filter over Neutral LED white) or H (Halogen) or D (Daylight) or F (Fluorescent) or I (Incandescent); 100/50 = intensity; 1/2/3 or 4/5/6 = positions.
Figure 10. S_pcomb as a function of different parameters: (a) CRI_Ra; (b) CCT; (c) Integral of spectral irradiance; (d) E. CRI_Ra = Colour Rendering Index, CCT = Correlated Colour Temperature, E = illuminance.
Figure 11. False-colour distribution of luminance in different scenarios: (a) office space with low-cost camera—Daylight; (b) office space with the professional camera—Daylight; (c) office space with the low-cost camera—Daylight and fluorescent light; (d) office space with the professional camera—Daylight and fluorescent light; (e) industrial building with the low-cost camera—Daylight; (f) industrial building with the professional camera—Daylight; (g) outdoor space with the low-cost camera; (h) outdoor space with the professional camera; (i) indoor space, living room at evening with the low-cost camera—LED; (l) indoor space, living room at evening with the professional camera—LED.
Figure 12. Flowchart used to calculate the UGR: (a) professional system; (b) low-cost system.
Figure 13. The 16 hue bins circle for different lighting sources: incandescent—I_6_100, halogen—H_6_100, daylight—D_3, fluorescent—F_6_100, warm white LED—W_3_100, cold white LED—C_3_100, neutral white LED—N_3_100, blue LED light—NB_3_100, red LED light—NR_3_100 and green LED light—NG_3_100.
Table 1. Lighting characteristics of the reference camera photometer.
Variable | Value
Integral spectral mismatch for halogen metal discharge lamps | 2–9 [%]
Integral spectral mismatch for high-pressure sodium discharge lamps | 7–13 [%]
Integral spectral mismatch for fluorescent lamps | 8–10 [%]
Integral spectral mismatch for LED white | 5–12 [%]
Calibration uncertainty ΔL | 2.5 [%]
Repeatability ΔL | 0.5–2 [%]
Uniformity ΔL | ±2 [%]
Table 2. Luminance values of the selected regions for the different configurations (3 × 3 mesh).
Configuration | Min Luminance [cd/m2] | Mean Luminance [cd/m2] | Max Luminance [cd/m2]
50_C | 1072 | 1081 | 1089
100_C | 1890 | 1920 | 1950
50_W | 1046 | 1062 | 1075
100_W | 1878 | 1918 | 1944
50_N | 1018 | 1032 | 1048
100_N | 1826 | 1849 | 1880
Cube_H | 370 | 373 | 376
Cube_F | 777 | 783 | 790
Cube_I | 738 | 744 | 750
Table 3. Luminance values of the selected regions for the different configurations considering the camera data and S coefficient.
Configuration 1 | Camera Photometer [cd/m2] | Default Raspberry Values [-] | S_pcomb Factor Mean [-] | Raspberry Corrected Value [cd/m2] 2 | E [lx] | CCT [K] | CRI_Ra [-] | Integral of Spectral Irradiance [mW/m2]
100_C_1 | 1961 | 17163 | 0.114257 | 1802.115 | 36 | 6471 | 70.4 | 109.36
100_N_1 | 1880 | 16645 | 0.112947 | 1747.725 | 35 | 4143 | 71.5 | 99.42
100_W_1 | 1944 | 16437 | 0.118270 | 1725.885 | 36 | 3068 | 68.6 | 97.5
100_C_2 | 1956 | 17144 | 0.114092 | 1800.12 | 90 | 6428 | 72.7 | 279.99
100_N_2 | 1873 | 16590 | 0.112899 | 1741.95 | 87 | 4215 | 74 | 257.82
100_W_2 | 1946 | 16929 | 0.114951 | 1777.545 | 89 | 3013 | 70.6 | 251.93
100_C_3 | 1844 | 16894 | 0.109151 | 1773.87 | 271 | 6430 | 71.8 | 841.3
100_N_3 | 1768 | 16168 | 0.109352 | 1697.64 | 260 | 4206 | 73.4 | 768.71
100_W_3 | 1836 | 16156 | 0.113642 | 1696.38 | 271 | 3008 | 69.9 | 758.82
100_NG_3 | 613 | 6388 | 0.095961 | 670.74 | 86 | 7195 | 40.4 | 203.81
100_NG_2 | 651 | 6535 | 0.099617 | 686.175 | 30 | 7218 | 40.5 | 72.62
100_NG_1 | 662 | 6548 | 0.101100 | 687.54 | 13 | 7120 | 42.2 | 33.13
100_NR_3 | 463 | 4270 | 0.108431 | 448.35 | 69 | 2189 | 28.5 | 253.05
100_NR_2 | 494 | 4374 | 0.112940 | 459.27 | 20 | 2170 | 31.1 | 85.18
100_NR_1 | 472 | 4264 | 0.110694 | 447.72 | 9 | 2093 | 39.5 | 41.07
100_NB_3 | 442 | 4947 | 0.089347 | 519.435 | 102 | 17022 | 50.8 | 195.23
100_NB_2 | 473 | 5279 | 0.089600 | 554.295 | 31 | 17186 | 46.2 | 100.36
100_NB_1 | 481 | 5318 | 0.090448 | 558.39 | 15 | 16515 | 51.4 | 50.36
50_C_1 | 1083 | 8851 | 0.122359 | 929.355 | 20 | 6376 | 72.8 | 62.5
50_N_1 | 1036 | 8596 | 0.120521 | 902.58 | 19 | 4174 | 74.1 | 57.67
50_W_1 | 1075 | 8506 | 0.126381 | 893.13 | 20 | 3003 | 70.7 | 56.29
50_C_2 | 1057 | 9918 | 0.106574 | 1041.39 | 50 | 6398 | 71.6 | 155.53
50_N_2 | 1017 | 9465 | 0.107448 | 993.825 | 47 | 4201 | 74.6 | 139.9
50_W_2 | 1058 | 9702 | 0.109050 | 1018.71 | 50 | 2928 | 69.5 | 140.27
50_C_3 | 988 | 9774 | 0.101085 | 1026.27 | 64 | 6408 | 71.9 | 459.01
50_N_3 | 937 | 9400 | 0.099681 | 987 | 142 | 4186 | 73.4 | 420.88
50_W_3 | 990 | 9565 | 0.103502 | 1004.325 | 148 | 3010 | 70.3 | 416.62
50_NG_3 | 333 | 3543 | 0.093988 | 372.015 | 49 | 7192 | 40.3 | 116.2
50_NG_2 | 353 | 3552 | 0.099381 | 372.96 | 17 | 7146 | 39.8 | 41.41
50_NG_1 | 361 | 3660 | 0.098634 | 384.3 | 7 | 7049 | 43.2 | 19.51
50_NR_3 | 253 | 2209 | 0.114531 | 231.945 | 32 | 2187 | 28.6 | 139.14
50_NR_2 | 268 | 2224 | 0.120504 | 233.52 | 11 | 2109 | 36.7 | 48.01
50_NR_1 | 270 | 2213 | 0.122006 | 232.365 | 5 | 1996 | 49.2 | 22.63
50_NB_3 | 182 | 1932 | 0.094203 | 202.86 | 53 | 17168 | 45.6 | 172.12
50_NB_2 | 252 | 2640 | 0.095455 | 277.2 | 16 | 18485 | 40.2 | 51.24
50_NB_1 | 261 | 2671 | 0.097716 | 280.455 | 8 | 17191 | 51 | 28.67
100_H_4 | 334 | 7566 | 0.044145 | 317.772 | 5 | 2147 | 93.7 | 40.97
100_H_5 | 332 | 7560 | 0.043915 | 317.52 | 9 | 2205 | 96.5 | 80.42
100_H_6 | 316 | 8000 | 0.039500 | 336 | 25 | 2272 | 97.2 | 212.55
100_F_4 | 781 | 5496 | 0.142103 | 747.456 | 10 | 2198 | 82.05 | 32.96
100_F_5 | 766 | 5649 | 0.135546 | 768.264 | 18 | 2268 | 83.8 | 55.51
100_F_6 | 757 | 5812 | 0.130282 | 790.432 | 37 | 2262 | 83.5 | 105.71
100_I_4 | 748 | 16657 | 0.044894 | 749.565 | 8 | 2131 | 95.8 | 71.66
100_I_5 | 750 | 17242 | 0.043510 | 775.89 | 15 | 2138 | 97.2 | 141.38
100_I_6 | 749 | 16162 | 0.046312 | 727.29 | 31 | 2221 | 97.3 | 274.74
D_1 | 101 | 912 | 0.110746 | 105.792 | 255 | 4913 | 95.9 | 1210.23
D_2 | 104 | 891 | 0.116723 | 103.356 | 113 | 5369 | 83.2 | 576.69
D_3 | 106 | 875 | 0.121143 | 101.5 | 107 | 4804 | 95.2 | 449.3
1 C = cool white, N = neutral white, W = warm white, NG = green filter, NR = red filter, NB = blue filter, H = halogen, F = fluorescent, I = incandescent, D = daylight. 2 Values obtained by considering an average S_pcomb = 0.105 for all configurations with the LED panel, 0.042 for configurations with halogen lamps, 0.116 for configurations with daylight, 0.136 for fluorescent and 0.045 for incandescent lamps.
Table 4. UGR values for the different scenarios and related sensation based on a 9-point scale.
Scenario No. | Method 1 (Low-Cost) | Method 2 (Low-Cost) | Method a (Professional) | Method b (Professional) | Method c (Professional)
1 | 20.15 (unacceptable 2) | 20.43 (unacceptable 2) | 21.99 1 (unacceptable 2) | 22.72 1 (just uncomfortable 2) | 21.50 1 (unacceptable 2)
2 | 21.16 (unacceptable 2) | 21.35 (unacceptable 2) | 21.87 1 (unacceptable 2) | 22.43 1 (just uncomfortable 2) | 20.90 1 (unacceptable 2)
3 | 16.42 (just acceptable 2) | 24.08 (just uncomfortable 2) | 19.37 1 (unacceptable 2) | 17.84 1 (just acceptable 2) | 16.22 1 (just acceptable 2)
4 | 21.95 (unacceptable 2) | 27.95 (uncomfortable 2) | 25.99 1 (uncomfortable 2) | 21.56 1 (unacceptable 2) | 22.67 1 (just uncomfortable 2)
5 | 0.00 (imperceptible 2) | 2.13 1 (imperceptible 2) | 1.97 1 (imperceptible 2) | 2.08 1 (imperceptible 2) | 0.00 1 (imperceptible 2)
1 weighted with solid angle Ωp [18]. 2 according to the 9-point Hopkinson's glare sensation scale [20,21].
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
