Evaluation of Applicability of Various Color Space Techniques of UAV Images for Evaluating Cool Roof Performance

Abstract: Global warming is intensifying worldwide, and urban heat islands are occurring as urbanization progresses. The cool roof method is one alternative for reducing the urban heat island phenomenon and lowering the heat on building roofs for a comfortable indoor environment. In this study, a cool roof evaluation was performed using an unmanned aerial vehicle (UAV) and a red, green and blue (RGB) camera instead of the laser thermometer and thermal infrared sensor used in existing cool roof evaluations. When using a UAV, an RGB sensor is used instead of an expensive infrared sensor. Various color space techniques derived from RGB images, namely the light-reflectance value (LRV), hue saturation value (HSV), hue saturation lightness (HSL), and YUV (luma component (Y) and two chrominance components, called U (blue projection) and V (red projection)), are applied to determine which color space technique is suitable for cool roof evaluation. This case study shows the following quantitative results: the white roof with the lowest temperature (average surface temperature: 44.1 °C; average indoor temperature: 33.3 °C) showed the highest HSV, while the black roof with the highest temperature (average surface temperature: 73.4 °C; average indoor temperature: 37.1 °C) showed the lowest HSV. In addition, HSV showed the highest correlation in both the Pearson correlation coefficient and linear regression analyses when the correlations among the brightness, surface temperature, and indoor temperature of the four color space techniques were analyzed. This study is considered a valuable reference for using RGB cameras and the HSV color space technique, instead of expensive thermal infrared cameras, when evaluating cool roof performance.


Introduction
Temperature rises are intensifying with global warming worldwide. Abnormally high temperatures have occurred in Central Africa, Europe, the Middle East, Alaska, and South America [1,2]. Moreover, with the development of industry, the urban heat island phenomenon has accompanied urbanization, population increase, and vegetation reduction. The phenomenon is mainly caused by asphalt roads, concrete artificial structures, and high-rise buildings. The solar radiation incident on building roofs raises the external surface temperature by up to 50 to 60 °C [3,4]. Such structures adversely affect the living environment because they raise the temperature of the city, thereby causing problems for residential life and increasing the cooling load [5]. In an effort to reduce the urban heat island phenomenon, research is being conducted to lower the surface temperature of building roofs.
The building roof temperature is important because the building roof comprises approximately 20% to 25% of the city's surface, and the energy consumption for cooling the city building is higher than that of non-urban buildings [6,7]. Cool roofs, rooftop greening, sprinkling treatment, solar power generation, and dual roofs, among others, are being investigated to reduce roof temperature [8][9][10][11][12].
Some of these methods are difficult to install because of structural problems and high installation and maintenance costs. In contrast, for cool roofs, white or light-colored paints that reflect sunlight well are applied, exploiting the difference in heat absorption between colors to reduce the heat accumulation in the roof. This approach can easily be applied to an existing building and is excellent in terms of cost because it is easy to install after the initial design, construction, and completion of the building [13,14].
Two kinds of previous studies on cool roof evaluation have been reported. First, evaluations according to environmental conditions, such as the exterior wall material and thickness, roof condition, window insulation and size, and solar radiation energy, have been performed [15][16][17][18][19]. Second, only the roof surface temperature was evaluated, and the external factors were excluded [20,21]. Satellite images, handheld thermal infrared (TIR) images, and laser thermometers are frequently used in existing methods of roof surface temperature evaluation. Moreover, studies using unmanned aerial vehicle (UAV) infrared cameras have recently been conducted [22,23]. However, satellite images have a low spatial resolution; hence, although the analysis of large areas is possible, local analysis is difficult. The handheld TIR image and the laser thermometer have the disadvantage that obtaining the overall temperature of the rooftop surface is difficult, and the TIR camera for a UAV is expensive. To overcome these shortcomings, a high spatial resolution is obtained, and the overall image acquisition of the roof surface is realized herein: the cool roof evaluation is conducted using an RGB camera on a UAV instead of a TIR camera. Many studies have been conducted on the applicability of near-field monitoring because of the advantage of the UAV photogrammetry technique in obtaining high-resolution images [24][25][26]. RGB cameras are cheaper than TIR cameras and can fly without restrictions in legal airspace, except for military areas; hence, centimeter-level high-resolution images of the entire roof can be acquired. After applying various color space techniques to the images acquired by the RGB camera, we obtain the correlation between the roof surface temperature from the TIR images acquired by a thermal infrared camera and the indoor temperature obtained by a digital thermometer.
The applied color space technologies are light-reflectance value (LRV), hue saturation value (HSV), hue saturation lightness (HSL), and YUV (luma component (Y) and two chrominance components, called U (blue projection) and V (red projection)). We evaluate the applicability of the cool roof evaluation by the color space technique through each correlation.

Study Area and Equipment
The selected study area was the 9th building of Kyungpook National University Sangju Campus in Sangju, Gyeongsangbuk-do, Republic of Korea (Figure 1). The colors used to evaluate the cool roof performance are as follows: (1) white, which is the most effective color in cool roof studies; (2) gray, which is similar to cement color; (3) green, which is most commonly used; (4) blue, which is used most often in factories; and (5) black, which absorbs the most sunlight. The gray color represents the color of the initial building; however, in the absence of maintenance, it changes to dark gray or black, so black was added. Blue and green are waterproof paint colors used mainly in South Korea. Overall, five colors were applied (Figure 1). Building No. 9 was chosen because the interior space of the fourth floor is divided equally, and the rooms indicated by the circle in the figure are not currently used; hence, the experiment could be conducted under the same conditions (Figure 2). Building No. 9 has few residents, and few factors affect the surface temperature because no equipment is used in the experiment rooms, except for the outdoor unit on the rooftop. Figure 3 shows a room with indoor temperature measurements.
The UAV images were acquired using an Inspire 1, which can fly for up to 18 min, while the RGB images for applying the color space techniques were acquired using a Zenmuse X3 camera. A Zenmuse XT630 TIR camera was used to measure the roof surface temperature for comparing and evaluating the color space techniques. The indoor temperature was measured using a Xiaomi digital thermometer (Table 1). The Zenmuse X3 is a 12.4-megapixel wide-angle camera with a 94° field of view (FOV). Meanwhile, the Zenmuse XT630 is a system that displays the difference in the amount of infrared radiation emitted by an object as a temperature value. The camera angle of a UAV may change due to vibration, but the Zenmuse XT630 gimbal keeps the angular vibration range within 0.03°. Furthermore, its temperature accuracy is ±5%. The temperature display unit of the digital thermometer is 0.1 °C, and its temperature accuracy is ±0.3 °C.

Unmanned Aerial Vehicle (UAV) Data Acquisition
The UAV images were taken during the third week of August 2018, when the temperature was the highest, and between 12:00 and 13:00, when the Sun was at its highest [27,28]. Moreover, the photographs were taken on a clear day without clouds (external average temperature: 31.1 °C; relative humidity: 60.4%; wind speed: 0.69 m/s) to avoid the influence of shadows. Pix4D Capture, a UAV flight planning software linked to Google Earth and OpenStreetMap, was used to maintain a constant flight altitude of 70 m with 70% overlap along the automatic flight route during the data acquisition (Figure 4). The UAV flew at its lowest speed of 3 m/s to reduce motion blur. A total of 149 UAV images were acquired, and an orthophoto was made using Agisoft PhotoScan.
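As a rough check on the achievable resolution, the ground sample distance (GSD) at the 70 m flight altitude can be estimated from the camera's 94° FOV. The sketch below assumes the Zenmuse X3's 4000 × 3000 pixel image format and treats the 94° FOV as horizontal, so it is only an approximation.

```python
import math

def ground_sample_distance(altitude_m, fov_deg, image_width_px):
    """Approximate ground footprint width and per-pixel GSD for a nadir view."""
    ground_width_m = 2 * altitude_m * math.tan(math.radians(fov_deg) / 2)
    return ground_width_m / image_width_px

# Flight altitude and FOV from the text; the image width is an assumed X3 spec
gsd = ground_sample_distance(70, 94, 4000)
print(f"approx. GSD: {gsd * 100:.1f} cm/pixel")
```

A value of a few centimeters per pixel is consistent with the centimeter-level resolution mentioned above.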

Lens distortion correction was performed before image matching using Agisoft PhotoScan software [29,30]. The model used for the lens distortion correction was Brown's distortion model, which was calculated as follows (Equations (1) to (6)):

x = X/Z (1)

y = Y/Z (2)

x' = x(1 + K1r² + K2r⁴ + K3r⁶ + K4r⁸) + (P2(r² + 2x²) + 2P1xy)(1 + P3r² + P4r⁴) (3)

y' = y(1 + K1r² + K2r⁴ + K3r⁶ + K4r⁸) + (P1(r² + 2y²) + 2P2xy)(1 + P3r² + P4r⁴) (4)

u = w × 0.5 + cx + x'f + x'B1 + y'B2 (5)

v = h × 0.5 + cy + y'f (6)

where r² = x² + y²; X, Y, and Z are the point coordinates in the local camera coordinate system; u and v denote the projected point coordinates in the image coordinate system (in pixels); f is the focal length; cx and cy are the principal point offsets; K1, K2, K3, and K4 are the radial distortion coefficients; P1, P2, P3, and P4 are the tangential distortion coefficients; B1 and B2 represent the affinity and non-orthogonality (skew) coefficients, respectively; and w and h are the image width and height in pixels, respectively. Table 2 shows the residuals for the images and the interior orientation parameters. The FC350 camera used on the Inspire 1 is a wide-angle camera; hence, the residuals grow toward the image edges. The FC350 is a low-cost camera; as a result, its manufacturing process is less precise, and the residuals can be large [31].
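For illustration, Equations (1) to (6) can be implemented directly. The function below is a sketch of the projection with Brown's distortion model; the coefficient values in any real run would be the calibrated interior orientation parameters from Table 2, which are not reproduced here.

```python
def brown_project(X, Y, Z, f, cx, cy, w, h,
                  K=(0.0, 0.0, 0.0, 0.0), P=(0.0, 0.0, 0.0, 0.0),
                  B1=0.0, B2=0.0):
    """Project a 3D point in the local camera frame to pixel coordinates (u, v)."""
    K1, K2, K3, K4 = K
    P1, P2, P3, P4 = P
    x = X / Z                                   # Equation (1)
    y = Y / Z                                   # Equation (2)
    r2 = x * x + y * y                          # r^2
    radial = 1 + K1 * r2 + K2 * r2**2 + K3 * r2**3 + K4 * r2**4
    tang = 1 + P3 * r2 + P4 * r2**2
    xp = x * radial + (P2 * (r2 + 2 * x * x) + 2 * P1 * x * y) * tang  # Eq. (3)
    yp = y * radial + (P1 * (r2 + 2 * y * y) + 2 * P2 * x * y) * tang  # Eq. (4)
    u = w * 0.5 + cx + xp * f + xp * B1 + yp * B2                      # Eq. (5)
    v = h * 0.5 + cy + yp * f                                          # Eq. (6)
    return u, v
```

With all distortion coefficients set to zero, the function reduces to an ideal pinhole projection, which is a convenient sanity check.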
The processes of extracting feature points using the scale-invariant feature transform (SIFT) technique, building high-density point clouds using the structure-from-motion (SfM) technique, and converting relative coordinates through ground control point (GCP) input were required to generate an orthophoto. The SIFT method is suitable for feature point extraction because it is invariant to changes in image rotation and scale [32,33]. The SfM method connects the feature points extracted from two adjacent images, estimates the relative pose and orientation of the matched images using epipolar geometry, and estimates the 3D positions of the feature points [34,35]. For the GCP input, 19 points were acquired through a virtual reference station survey, one of the global navigation satellite system survey methods; seven of these were used as GCPs, and 12 were used as check points (CPs).
The evaluation using the CPs showed that an orthophoto with high positional accuracy was produced, with horizontal maximum errors of x = 0.11 m and y = 0.15 m and a vertical maximum error of z = 0.16 m (Figure 1).
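The positional accuracy check amounts to taking the maximum absolute coordinate differences between the surveyed and orthophoto-derived CP positions. A minimal sketch with hypothetical residuals (not the study's actual CP data):

```python
# Hypothetical CP residuals in meters: (dx, dy, dz) per check point -- illustration only
residuals = [
    (0.04, -0.08, 0.10),
    (-0.11, 0.15, -0.16),
    (0.03, 0.02, 0.05),
]

max_x = max(abs(dx) for dx, _, _ in residuals)
max_y = max(abs(dy) for _, dy, _ in residuals)
max_z = max(abs(dz) for _, _, dz in residuals)
print(f"max errors: x = {max_x:.2f} m, y = {max_y:.2f} m, z = {max_z:.2f} m")
```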

Red, Green and Blue (RGB) Image Data Processing
Four color space techniques were applied herein, namely LRV, HSV, HSL, and YUV. As defined by the International Commission on Illumination (CIE), the LRV technique is an indicator of light reflectance based on the CIELAB international color standard (also known as CIE L*a*b* or sometimes abbreviated as simply the "Lab" color space), where L* is lightness, a* is the red–green axis, and b* is the yellow–blue axis. The light reflectance is calculated using the L* of the three values. LRV is a measure of the amount of visible and usable light reflected in all directions and at all wavelengths when light strikes a surface [36,37]. This scale is used to identify the amount of light that a color reflects or absorbs. The measurements range from 0% (absorbing all visible light) to 100% (reflecting all visible light).
The HSV color space, which is the color space closest to human intuition based on human cognition, is expressed in terms of hue, saturation, and value (brightness). Hue is the relative angle on the color ring, where red, the longest wavelength in the visible spectrum, is 0°. The hue values therefore range from 0° to 360°, with 360° and 0° indicating the same red color. Saturation is the intensity of a particular color; the saturation value indicates the degree of color concentration, where the deepest state of a specific color is 100% and a saturation of 0% is achromatic. Value (brightness) represents the degree of brightness, where white is 100% and black is 0% [38].
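The RGB-to-HSV conversion itself is standard; in Python it is available in the stdlib `colorsys` module (MATLAB's `rgb2hsv` is equivalent). The roof paint RGB triples below are illustrative assumptions, not measured values.

```python
import colorsys

# Illustrative RGB triples for three of the roof colors (assumed, not measured)
paints = {"white": (240, 240, 240), "gray": (128, 128, 128), "black": (30, 30, 30)}

for name, (r, g, b) in paints.items():
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    print(f"{name}: hue = {h * 360:.0f} deg, saturation = {s:.2f}, value = {v:.2f}")
```

For achromatic pixels the hue is reported as 0° and the value equals the maximum RGB channel, so a white roof yields a value near 1 and a black roof a value near 0.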
The HSL color space represents hue, saturation, and lightness (luminance). The color luminance is a measure of perceived lightness. Like the HSV color space, the HSL color space is black at one end; however, it is white at the other. The most saturated colors occur at the middle lightness value [39].
The YUV color space is generally used in video transmission, computer graphics, and visualization in scientific computing [40]. It can also be considered similar to the retina of the human eye: the luminance (Y channel) describes the light intensity, just like the rod cells in the retina, which makes the YUV color space suitable for black-and-white display devices. The two channels called "U" and "V" carry the color information of the chrominance components. Because the luminance is already separated in the YUV color space, it takes up little bandwidth compared to the RGB color space [41].
Only the elements related to the brightness value were used among the four color space techniques. We evaluated only the brightness value because the influence of light absorption and reflection depends on the degree of brightness. It was calculated using the L channel in LRV, the V channel in HSV, the L channel in HSL, and the Y channel in YUV. The four color space techniques were calculated using MATLAB. Table 3 presents the color expression methods and formulas.

Table 3. Values of the color space techniques (i.e., light-reflectance value (LRV), hue saturation value (HSV), hue saturation lightness (HSL), and YUV), surface temperature, and indoor temperature according to color.
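The brightness elements listed above can be sketched from an RGB triple as follows. This is an assumed re-implementation, not the authors' MATLAB code: the HSV value is the channel maximum, the HSL lightness is the mid-range, the YUV luma is the BT.601 weighted sum, and the LRV is approximated here by CIE L* computed from linearized sRGB.

```python
def brightness_channels(r, g, b):
    """Brightness element of each color space from an 8-bit RGB triple."""
    rf, gf, bf = r / 255, g / 255, b / 255
    mx, mn = max(rf, gf, bf), min(rf, gf, bf)

    v_hsv = mx                                      # HSV value (V channel)
    l_hsl = (mx + mn) / 2                           # HSL lightness (L channel)
    y_yuv = 0.299 * rf + 0.587 * gf + 0.114 * bf    # YUV luma (Y channel, BT.601)

    # LRV via CIE L*: linearize sRGB, take relative luminance Y, convert to L*
    def linearize(c):
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    Y = 0.2126 * linearize(rf) + 0.7152 * linearize(gf) + 0.0722 * linearize(bf)
    lrv = 116 * Y ** (1 / 3) - 16 if Y > 0.008856 else 903.3 * Y

    return {"LRV": lrv, "HSV": v_hsv, "HSL": l_hsl, "YUV": y_yuv}
```

A pure white pixel gives LRV ≈ 100 and V = L = Y = 1, while pure black gives 0 for all four, matching the expectation that lighter roofs reflect more light.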


Thermal Infrared (TIR) Image Data Processing
The color temperatures were acquired using the Inspire 1, the Zenmuse XT640, and the DJI GO application. The DJI GO application allows one to view the image of the UAV-mounted TIR camera in real time and check the temperature. The data were acquired at a height of approximately 50 m such that the surface temperature of every color could be seen at once. The surface temperatures were extracted from the TIR images using the FLIR Tools+ software, which converts the radiated values to degrees Fahrenheit or Celsius and provides the maximum, minimum, and average temperatures within a region of interest (ROI). The surface temperature of five spots per color was obtained herein (Figure 5).
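The per-ROI statistics that FLIR Tools+ reports can be reproduced with NumPy once the radiometric TIR image has been converted to a temperature raster. The array below is synthetic illustration data, not the study's measurements.

```python
import numpy as np

# Synthetic 512 x 640 temperature raster in deg C (stand-in for a converted TIR frame)
temps = np.random.default_rng(0).uniform(40.0, 75.0, size=(512, 640))

def roi_stats(raster, row, col, size=5):
    """Maximum, minimum, and average temperature in a size x size spot."""
    roi = raster[row:row + size, col:col + size]
    return float(roi.max()), float(roi.min()), float(roi.mean())

t_max, t_min, t_mean = roi_stats(temps, 100, 200)
print(f"spot: max = {t_max:.1f}, min = {t_min:.1f}, mean = {t_mean:.1f} deg C")
```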
Adapted from Li et al.
(2005) [42] separated luminance; hence, it takes up little bandwidth compared to the RGB color space [41]. Only the elements related to the brightness value were used among the four color space techniques. We evaluated only the brightness value because the influence of light absorption and reflection depends on the degree of brightness. We calculated it using the L channel in LRV, V channel in HSV, L channel in HSL, and Y channel in YUV. The four color space techniques were calculated using MATLAB. Table 3 presents the color expression method and formulas. Table 3. Values of the color space technology (i.e., light-reflectance value (LRV), hue saturation value (HSV), hue saturation lightness (HSL), and YUV), surface temperature, and indoor temperature according to color.

Thermal Infrared (TIR) Image Data Processing
The color temperature was acquired using Inspire 1, Zenmuse XT640, and DJI GO application. The DJI GO application allows one to view the image of a UAV-mounted TIR camera in real time and check the temperature. The data were acquired at a height of approximately 50 m such that the surface temperature for each color could be seen at once. The TIR images were obtained with the surface temperature using Flir Tools + S/W. Flir Tools + S/W converts radiated values to degrees Fahrenheit or Celsius and provides the maximum, minimum, and average temperatures within a single point of temperature (i.e., region of interest). The surface temperature of five spots per color was obtained herein using Flir Tools + software ( Figure 5).

Adapted from Sambyal
(2015) [43] separated luminance; hence, it takes up little bandwidth compared to the RGB color space [41]. Only the elements related to the brightness value were used among the four color space techniques. We evaluated only the brightness value because the influence of light absorption and reflection depends on the degree of brightness. We calculated it using the L channel in LRV, V channel in HSV, L channel in HSL, and Y channel in YUV. The four color space techniques were calculated using MATLAB. Table 3 presents the color expression method and formulas. Table 3. Values of the color space technology (i.e., light-reflectance value (LRV), hue saturation value (HSV), hue saturation lightness (HSL), and YUV), surface temperature, and indoor temperature according to color.

Thermal Infrared (TIR) Image Data Processing
The color temperature was acquired using Inspire 1, Zenmuse XT640, and DJI GO application. The DJI GO application allows one to view the image of a UAV-mounted TIR camera in real time and check the temperature. The data were acquired at a height of approximately 50 m such that the surface temperature for each color could be seen at once. The TIR images were obtained with the surface temperature using Flir Tools + S/W. Flir Tools + S/W converts radiated values to degrees Fahrenheit or Celsius and provides the maximum, minimum, and average temperatures within a single point of temperature (i.e., region of interest). The surface temperature of five spots per color was obtained herein using Flir Tools + software ( Figure 5).

Figure 5. Temperature acquisition screen of the Flir Tools+ software. The closer the color is to red, the higher the temperature; the closer the color is to blue, the lower the temperature.

Figure 6 shows the results obtained when the color space techniques were applied using MATLAB: Figure 6a represents the LRV, Figure 6b the HSV, Figure 6c the HSL, and Figure 6d the YUV.

In all the color space techniques, the highest value was observed on the white roof, while the lowest value was observed on the black roof (Table 4). Light gray was used to express the color of cement in the early days of construction. If cement is left for a long time without proper maintenance after construction, the color may turn dark gray or black; in this case, the index would be close to black. Conversely, if the color fades over time, the blue color will show an index close to white.
Green, which is often used as a general waterproof paint color, does not have an index as low as that of black, but is close to it. A comparison of each index value obtained by the color space techniques with the surface and indoor temperatures showed that white, which had the highest color space value, had the lowest temperature. By contrast, black, which showed the lowest color space value, exhibited the highest temperature. In the color space techniques, colors closer to white showed lower temperatures, while colors closer to black showed higher temperatures. In other words, the green and blue colors used as waterproof paint increased the surface and indoor temperatures in summer.
Pearson correlation and linear regression analyses were used to analyze the correlation between the brightness values of the four color space techniques and the surface and indoor temperatures. The Pearson correlation coefficient ranges between −1 and 1: the closer the value is to −1, the stronger the negative correlation; the closer it is to +1, the stronger the positive correlation; and a value of 0 indicates the absence of a correlation. For the linear regression analysis, the closer the coefficient of determination is to 1, the better the fit of the relationship (Table 5).

Table 6. Relationship between the color space technique and the indoor temperature.

In conclusion, the HSV value among the four color space techniques had a higher correlation with both the surface and indoor temperatures than the values of the other color space techniques.
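The correlation analysis above can be sketched as follows. The brightness and temperature values here are hypothetical placeholders standing in for the per-color measurements (only the white 44.1 °C and black 73.4 °C surface averages come from the text), so the printed coefficients are illustrative, not the paper's reported −0.95 and 0.91.

```python
import numpy as np

# Hypothetical HSV V-channel brightness per roof color (white ... black)
hsv_v  = np.array([0.95, 0.75, 0.55, 0.40, 0.15])
# Hypothetical average surface temperatures in degrees Celsius
surf_c = np.array([44.1, 52.0, 60.5, 66.0, 73.4])

# Pearson correlation coefficient: expected to be strongly negative,
# since brighter roofs are cooler
r = np.corrcoef(hsv_v, surf_c)[0, 1]

# Ordinary least-squares line and its coefficient of determination
slope, intercept = np.polyfit(hsv_v, surf_c, 1)
r_squared = r ** 2
```

Repeating the same computation with indoor temperatures, and for the LRV, HSL, and YUV brightness values, yields the entries of Tables 5 and 6.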

Conclusions
In this study, four color space techniques were applied using the RGB images of the UAV. The brightness values of the color space techniques for each rooftop color were then compared with the surface and indoor temperatures to evaluate the cool roof performance. As a result, the white cool roof (average surface temperature: 44.1 °C; average indoor temperature: 33.3 °C) showed the lowest temperature, whereas the black roof (average surface temperature: 73.4 °C; average indoor temperature: 37.1 °C) exhibited the highest temperature. Correspondingly, the white color showed the highest color space values, whereas the black color showed the lowest. Among the four techniques, HSV showed the strongest Pearson correlation with the surface temperature (−0.95) and the best linear regression fit (0.91); it also showed the strongest correlation with the indoor temperature (−0.97) and linear regression fit (0.94). These results verify that RGB images and the HSV technique can be used instead of thermometers or thermal infrared sensors to evaluate cool roofs.
This study moved away from field-based approaches, which are subject to uncertainty, and applied UAV photogrammetry, a remote-sensing technique. In addition, we confirmed that cool roof performance evaluation is possible even when only a low-cost RGB sensor, instead of an expensive thermal infrared sensor, is mounted on the UAV. The evaluation of cool roof technology using the RGB sensor and the HSV color space technique confirmed environmental benefits, such as mitigation of the urban heat island effect, comfort benefits from acceptable indoor thermal conditions, and CO2 reduction through peak power demand reduction. Evaluating cool roof performance is essential not only for these reasons, but also for addressing the intensifying heat phenomena worldwide. Therefore, an organization like the European Cool Roofs Council should also be established in Korea. In a future study, we will calculate artificial temperature images using a correlation function derived from the HSV values and compare them with TIR images. We will also verify the possibility of evaluating cool roofs with RGB sensors and color space techniques in areas with many buildings.