Article

Assessing RGB Color Reliability via Simultaneous Comparison with Hyperspectral Data on Pantone® Fabrics

by Cindy Lorena Gómez-Heredia 1,*, Jose David Ardila-Useda 2, Andrés Felipe Cerón-Molina 2, Jhonny Osorio-Gallego 2 and Jorge Andrés Ramírez-Rincón 2,*
1 Grupo de Películas Delgadas y Nanofotónica, Departamento de Física, Facultad de Ciencias, Pontificia Universidad Javeriana, Bogotá 110231, Colombia
2 Grupo de Investigación en Ciencias y Educación (ICE), Facultad de Ingeniería, Universidad de América, Bogotá 110211, Colombia
* Authors to whom correspondence should be addressed.
J. Imaging 2026, 12(3), 116; https://doi.org/10.3390/jimaging12030116
Submission received: 18 July 2025 / Revised: 28 August 2025 / Accepted: 17 September 2025 / Published: 10 March 2026

Abstract

Accurate color property measurements are critical for advancing artificial vision in real-time industrial applications. RGB imaging remains widely used due to its practicality, accessibility, and high spatial resolution. However, significant uncertainties in extracting chromatic information highlight the need to define when conventional digital images can reliably provide accurate color data. This work compares six chromatic properties across 700 Pantone® TCX fabric samples, using optical data acquired simultaneously from hyperspectral (HSI) and digital (RGB) cameras. The results indicate that the accurate interpretation of optical information from RGB (sRGB and REC2020) images is significantly influenced by lightness (L*) values. Samples with bright and unsaturated colors (L* > 50) reach ratio-to-performance-deviation (RPD) values above 2.5 for four properties (L*, a*, b*, h_ab), indicating a good correlation between HSI and RGB information. Absolute color difference comparisons (ΔE_a) between HSI and RGB images yield values exceeding 5.5 units for red-yellow-green samples and up to 9.0 units for blue and purple tones. In contrast, relative color difference (ΔE_r) comparisons show a significant decrease, with values falling below 3.0 for all lightness values, indicating the practical equivalence of both methodologies according to the Two One-Sided Test (TOST) statistical analysis. These results confirm that RGB imagery achieves reliable color consistency when evaluated against a practical reference.

Graphical Abstract

1. Introduction

Color appearance is a primary sensory attribute commonly linked to the quality, purity, and chemical composition of an object. Rather than representing intrinsic properties of a sample, color reflects the visual response arising from its optical characteristics interacting with an illumination source [1,2,3,4]. Important processes in industries such as automotive, textile, food, pharmaceutical, cosmetic, and packaging require thorough color inspection for tasks such as paint matching, fabric dyeing, garment color consistency, ensuring food and beverage quality, medication identification and quality, hair dye formulation, makeup consistency, and packaging design [5,6]. These processes are crucial for enhancing product appeal, preserving brand consistency, and ensuring consumer satisfaction. However, color evaluation has traditionally relied on visual inspection by trained personnel, whose assessments may vary due to factors such as lighting conditions, individual perceptual differences, and visual fatigue, introducing inherent ambiguities and inconsistencies [7,8].
In recent decades, advances in artificial and computer vision, driven by image/video processing technologies and sophisticated algorithms, have enabled automated identification, classification, and real-time assessment in industries such as pharmaceuticals [9,10], food [11,12], and textiles [13]. In agriculture, Rodríguez-Pulido et al. integrated color and NIR spectral data for simultaneous pixel- and object-level evaluation of grape and seed samples. Similarly, machine vision applied to strawberries combined shape, size, and color analysis through line drawing, K-means clustering, and Multi-attribute Decision Making Theory, achieving color accuracy of 88.8%, shape classification above 90%, and size error detection of 5% within three seconds [14]. In textiles, Çelik et al. designed a fabric inspection system using a feedforward neural network with accuracies of 93.4% (defect detection) and 96.3% (classification) [15]. Later, Dlamini et al. developed a system combining image acquisition hardware with YOLOv4-based processing, achieving 95.3% precision at 34 fps [16].
Recently, hyperspectral imaging (HSI) has emerged as a cutting-edge tool in production lines, medical diagnostics, precision agriculture [17,18,19], and packaging [20], replacing traditional methods for evaluating sensory/color and physicochemical properties. Unlike point spectrometers or colorimeters, HSI simultaneously captures high spatial (millions of pixels) and spectral (numerous bands) information, proving valuable both industrially and scientifically [21]. However, its large-scale or even laboratory use is limited by high equipment costs and the expertise required for acquisition, processing, and analysis [22,23,24]. Additionally, both spectroscopy and HSI generate vast datasets across full wavelength ranges that require careful preprocessing before modeling [25,26]. In contrast, RGB imaging, using simple digital cameras with color filters to capture three spectral bands, remains preferable where analysis areas are limited, high spatial resolution and short shutter times are needed, or conditions challenge instrument durability [27].
Although RGB data provides spectral information from only three broad bands, accurate classification can be achieved by leveraging morphological and spatial features [28]. A review in Ref. [29] compiles studies on spectroscopy, colorimeters, and hyperspectral imaging combined with Machine Learning for rapid and non-destructive crop weed discrimination. Conventional spectroscopy delivers extensive VIS/IR data strongly correlated with plant species, though optimal ranges depend on crop and weed characteristics. Hyperspectral imaging shows high potential for real-time identification but requires optimized input variable selection. In contrast, RGB imaging, despite its limited spectral resolution, has proven highly effective for plant classification. Recent efforts focus on training Neural Networks to expand RGB data into more than 50 optical bands, with promising results, though their applicability remains dependent on specific training, testing, and experimental datasets [30,31,32].
In the visible range (380–780 nm), RGB imaging enables color contrast generation through channel intensity changes, facilitating chromatic variation estimation, mainly via lightness, with applications in agriculture and healthcare [33,34,35,36]. This approach remains widely used, even with higher resolution images, as it is simple, reliable, and requires no advanced processing skills [37,38]. However, precise and absolute color calculations rely on spectrometers or hyperspectral cameras, which accurately determine CIE-L*a*b* coordinates [39]. For instance, Lasarte et al. enhanced RGB-CCD measurements with chromatic filters, achieving over 96% color reproduction accuracy and improved appearance parameter predictions using Hunt'94 or CIECAM02 models compared to CIE-L*a*b* [40]. Similarly, hyperspectral imaging outperformed RGB (95.83% vs. 66.2% accuracy) in discriminating apple maturity under different storage conditions [41]. Nonetheless, RGB imaging struggles to match HSI precision due to the non-uniformity of the visible spectrum and the variability of color representation across RGB models (sRGB, Adobe RGB, ProPhoto RGB, REC2020, etc.) [42,43]. Consequently, defining the conditions under which RGB images can reliably reproduce chromatic properties is essential to expand machine vision applications in color-sensitive fields.
This work presents a comprehensive and simultaneous comparison of six chromatic properties, L* (lightness), a*, b*, C*_ab (chroma), h_ab (hue), and S (saturation), across 700 Pantone® TCX fabric samples, using optical information captured from both hyperspectral and digital images. The experimental setup involved capturing HSI and RGB images of the fabric samples under uniform illumination conditions. A custom Python algorithm was developed to automatically normalize the images relative to a white reference, isolate each sample, extract its optical data, convert it to the CIE-L*a*b* color space, and conduct a detailed comparison between the two imaging modalities. This methodological framework enabled a robust and scalable evaluation of chromatic fidelity across a large and diverse dataset. The results offer valuable insights into the conditions under which RGB images in the sRGB and REC2020 representations (different color gamuts) can serve as a reliable alternative to hyperspectral imaging for accurate color characterization, highlighting the practical potential of RGB systems in applications where hyperspectral imaging may not be feasible.

2. Materials and Methods

2.1. Experimental Setup

HSI and RGB images were simultaneously acquired under natural sun lighting on a clear day around noon with oblique incidence to ensure uniform and stable illumination over the area of analysis. The experimental setup included two cameras (HSI and RGB) aligned frontally at a fixed height of 45 cm above the samples. A diffuser surface was employed to homogenize the illumination, effectively minimizing shadows and saturation artifacts (see Figure 1a). As white references, two Spectralon targets were placed along the longitudinal and transverse directions to verify the stability and homogeneity of the spectral power distribution (SPD) in each image (Figure 1b), thereby reducing errors in both HSI and RGB data during capture (30 s for HSI and 1/60 s for RGB) [44]. The SPD variations measured in each image were below 3% (Figure 1c), which was considered sufficiently homogeneous for the 17 × 27 cm² scene analyzed in this work. For the hyperspectral camera, normalization was performed automatically using the "simultaneous reference mode", whereas for the RGB images, it was applied during the computational processing stage. The hyperspectral camera was operated with an integration time of 3 ms. RGB images were acquired in PRO mode, with an ISO setting of 6400 and autofocus enabled.

2.1.1. Hyperspectral and Digital Cameras

Hyperspectral images were acquired using the Specim IQ handheld camera, featuring a resolution of 512 × 512 pixels and 204 spectral bands within the 397 to 1003 nm range (±2.97 nm). The camera utilizes an NVIDIA Tegra K1 processor (SPECIM, Spectral Imaging Ltd., Oulu, Finland) with CMOS technology and a 5 Mpx viewfinder, employing an optical motion engine in a pushbroom configuration for image generation [45]. Conventional digital RGB images were captured using a Samsung S20 mobile phone with a 16-megapixel CMOS camera (3024 × 4032 resolution). Images were saved in RAW format to retain uncompressed and unprocessed data, subsequently displayed in the sRGB and REC2020 formats [46], which cover 35% and 72% of the visible color space, respectively [43,47].

2.1.2. Pantone TCX

To compare the color properties derived from HSI and RGB images, 700 fabric samples from the Pantone® Fashion, Home + Interiors Cotton Planner—an industry-standard reference in the textile sector—were used. Twenty pages, each containing 35 samples, were selected to represent a broad range of hues across the chromatic circle [48], as shown in Figure S1 [49,50,51] of the Supplementary Materials. The optical data from the TCX fabrics were converted to the CIE-L*a*b* color space using a standard 2° observer and D65 illuminant.

2.2. Computational Process

The optical information contained in both the hyperspectral fingerprint and RGB values must be analyzed at the pixel level. To facilitate this process, a Python-based tool was developed to automatically normalize, read, process, segment, and compute the chromatic properties of each sample. Figure 2 presents a flowchart illustrating the algorithmic steps used to convert the images into the CIE-L*a*b* color space.

2.2.1. Image Reading

The algorithm employs two parallel processing lines that merge during the comparison of the results. The initial step involves reading the images using the libraries Spectral (HSI) and OpenCV (RGB). The images are stored as 3D hypercubes with dimensions [m_H, n_H, 204] for HSI and [m_R, n_R, 3] for RGB, corresponding to the number of rows (m), columns (n), and spectral bands [52].

2.2.2. Processing

The hypercubes are "unfolded" using the NumPy library into 2D arrays of dimensions [m_H·n_H, 204] and [m_R·n_R, 3], facilitating the matrix operations for transformation into the CIE-L*a*b* color space. The hyperspectral data is limited to the 400–780 nm range (130 bands), within which the color properties are calculated [53,54,55]. The standard deviation of the optical data across pixels is used to identify and exclude the white reference and background from the analysis [56]. For RGB images, the white reference coordinates are similarly employed to normalize and scale the data from 0 to 1, a necessary step for the application of the XYZ conversion matrices [57,58].
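The unfolding and white-reference normalization steps can be sketched as follows. This is a minimal illustration with a synthetic array; the function and variable names are ours, not those of the paper's tool:

```python
import numpy as np

def unfold(cube):
    """Unfold a [m, n, bands] hypercube into a [m*n, bands] 2D array."""
    m, n, bands = cube.shape
    return cube.reshape(m * n, bands)

def normalize_to_white(pixels, white_rows):
    """Scale each channel by the mean white-reference response (0-1 range)."""
    white = pixels[white_rows].mean(axis=0)
    return np.clip(pixels / white, 0.0, 1.0)

# Synthetic 4x4 RGB image with three channels (stand-in for a real capture)
rgb_cube = np.full((4, 4, 3), 128.0)
pixels = unfold(rgb_cube)                             # shape (16, 3)
norm = normalize_to_white(pixels, white_rows=[0, 1])  # rows 0-1 act as the white patch
```

The same `unfold` call applies unchanged to an HSI cube of shape [m_H, n_H, 130] once the 400–780 nm bands have been selected.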

2.2.3. Segmentation

The segmentation of the 35 samples in each image was performed using the Segment Anything Model (SAM) via the Segment Anything library [59]. A total of 700 samples were evaluated; however, they were confined within 20 structured images, each containing 35 objects arranged in a fixed grid (columns 1–5 and rows 1–7, see Figure 2). This deterministic layout enabled systematic indexing and direct correspondence between RGB and HSI images. In this controlled setup, segmentation accuracy was visually verified, and no ambiguities such as partial boundaries, multiple segment assignments, or confusion with the background were observed, which implies confidence in the segmentation results. For HSI images, a dimensional reduction is required using the bands of red (600 nm), green (550 nm), and blue (452 nm), preserving the spectral fingerprint of each sample. This process generates N matrices (masks) associated with the number of identified objects. In this study, 2100 matrices were obtained (35 fabrics × 20 photos × 3 types: HSI, sRGB, REC2020), with dimensions [m′_H·n′_H, 130] for HSI and [m′_R·n′_R, 3] for RGB. Note that m′·n′ ≪ m·n, as each object occupies a small fraction of the total image area.
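Assuming the segmentation masks are already available, the grid indexing step can be sketched by sorting mask centroids into the fixed 5 × 7 layout. This is a simplified illustration; the function name and the sort-by-position strategy are ours, not necessarily those of the paper's tool:

```python
import numpy as np

def grid_index(centroids, n_cols=5, n_rows=7):
    """Assign each (x, y) centroid a (row, col) slot in a fixed grid.

    Sort by vertical position into rows, then by horizontal position
    within each row; assumes exactly one object per grid cell, as in
    the structured Pantone page layout.
    """
    order = np.argsort([c[1] for c in centroids])  # sort by y
    grid = {}
    for r in range(n_rows):
        row_ids = order[r * n_cols:(r + 1) * n_cols]
        row_ids = sorted(row_ids, key=lambda i: centroids[i][0])  # sort by x
        for c, idx in enumerate(row_ids):
            grid[(r + 1, c + 1)] = idx
    return grid

# Synthetic centroids for a 5x7 page, with slight horizontal jitter per row
cents = [(10 + 20 * c + 0.3 * r, 10 + 20 * r) for r in range(7) for c in range(5)]
grid = grid_index(cents)
```

Because the layout is deterministic, the same (row, col) key retrieves corresponding objects from the HSI and RGB mask sets.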

2.2.4. CIE- L * a * b * Transformation

The matrices generated are systematically numbered and organized to facilitate comparisons. The data are then transformed into XYZ coordinates through matrix operations using Equation (S5) (HSI) [60,61,62,63,64,65,66] and Equation (S9) (RGB) [47,67,68,69], from which the L*a*b* values are calculated using Equations (S6)–(S8). The chromatic information reported in the Results section corresponds to the average over 550 pixels for HSI and about 23,000 pixels for RGB, with surface uniformities higher than 95% in all cases, indicating the homogeneity of the samples analyzed. The algorithm then separates and organizes the information into four quadrants (Q) based on their position in the CIE-L*a*b* color space, mainly using the hue (h_ab) parameter: Q1 (0–90°), Q2 (90–180°), Q3 (180–270°), and Q4 (270–360°), as illustrated in the chromatic circle (Figure S1) of the Supplementary Materials.
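The transformation chain can be illustrated for a single pixel using the standard linear-sRGB → XYZ (D65) matrix and the CIE-L*a*b* formulas. This is a generic sketch consistent with, but not copied from, the paper's Equations (S5)–(S9); the REC2020 path would only swap in the corresponding matrix:

```python
import numpy as np

# Standard linear-sRGB -> XYZ (D65) matrix (IEC 61966-2-1)
M_SRGB = np.array([[0.4124, 0.3576, 0.1805],
                   [0.2126, 0.7152, 0.0722],
                   [0.0193, 0.1192, 0.9505]])
WHITE_D65 = np.array([0.9505, 1.0000, 1.0890])  # Xn, Yn, Zn

def xyz_to_lab(xyz, white=WHITE_D65):
    """CIE-XYZ -> CIE-L*a*b* for a D65 white and 2-degree observer."""
    t = xyz / white
    d = 6 / 29
    f = np.where(t > d**3, np.cbrt(t), t / (3 * d**2) + 4 / 29)
    L = 116 * f[1] - 16
    a = 500 * (f[0] - f[1])
    b = 200 * (f[1] - f[2])
    return L, a, b

def hue_quadrant(a, b):
    """Hue angle h_ab in degrees and its quadrant Q1-Q4 (90-degree sectors)."""
    h = np.degrees(np.arctan2(b, a)) % 360
    return h, int(h // 90) + 1

xyz_red = M_SRGB @ np.array([1.0, 0.0, 0.0])  # linear sRGB pure red
L, a, b = xyz_to_lab(xyz_red)
h, q = hue_quadrant(a, b)
```

For pure sRGB red this yields L* ≈ 53, a* ≈ 80, b* ≈ 67, with h_ab ≈ 40°, i.e., quadrant Q1, matching the quadrant convention above.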

2.2.5. HSI–RGB Color Comparison

Color differences have been reported as a fundamental parameter in both academic research and industrial applications of chromatic inspection. They are particularly useful for establishing perceptual similarities through direct comparison of a sample's color properties with respect to a reference, as well as for evaluating the equivalence between different colorimetric instruments [70,71]. Several mathematical definitions for the color difference calculation are currently used [72], which mainly differ in the number of parameters involved. The simplest definition employs the Euclidean distance between two points in CIE-L*a*b* space (Equation (S4)). However, due to the strong dependence of visual color perception on hue and lightness, alternative definitions such as CIEDE2000 (ΔE) have been adopted for assessing these differences [73]. Detailed descriptions of each parameter involved in Equation (1) are available in Ref. [74].
ΔE = [ (ΔL′/(K_L·S_L))² + (ΔC′/(K_C·S_C))² + (ΔH′/(K_H·S_H))² + R_T·(ΔC′/(K_C·S_C))·(ΔH′/(K_H·S_H)) ]^(1/2). (1)
In our work, the CIEDE2000 color difference was used to analyze the equivalence of the chromatic information obtained from HSI and RGB images in both representations (sRGB and REC2020) through two comparative metrics [75,76]:
  • Absolute color difference (ΔE_a): calculated by directly comparing the color properties obtained for each sample from HSI and RGB images. This quantifies the accuracy of RGB data in reproducing a color perception similar to that of HSI.
  • Relative color difference (ΔE_r): calculated by selecting one sample as a reference within each image and computing the color differences for the remaining 34 samples. This process was performed automatically with the Python tool, using as reference the first sample in the upper-left corner of the HSI and RGB images independently (see Figure 2). The results were subsequently compared to quantify the reliability of RGB data in reproducing color differences between samples within the same image.
Although a unique threshold for perceptual similarity in a color dataset has not been established, recent academic and industrial studies indicate that color differences below 3 units (ΔE < 3) are generally sufficient to consider two or more processes chromatically equivalent [77,78,79,80].
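The distinction between the two metrics can be illustrated with the simpler Euclidean CIE76 formula (the paper's Equation (S4)); the actual analysis uses CIEDE2000, and all L*a*b* values below are hypothetical:

```python
import numpy as np

def delta_e_ab(lab1, lab2):
    """Euclidean CIE76 color difference (the paper's Equation (S4));
    the study itself applies CIEDE2000, Equation (1)."""
    return float(np.linalg.norm(np.asarray(lab1) - np.asarray(lab2)))

# Hypothetical L*a*b* values for three samples as seen by HSI and RGB
hsi = [(70, 10, 20), (55, -15, 30), (40, 5, -25)]
rgb = [(68, 12, 21), (53, -12, 33), (36, 9, -29)]

# Absolute difference: HSI vs RGB for the same sample
dE_a = [delta_e_ab(h, r) for h, r in zip(hsi, rgb)]

# Relative difference: each modality measured against its own reference
# (sample 0), then the two sets of within-image distances are compared
dE_r_hsi = [delta_e_ab(hsi[0], s) for s in hsi[1:]]
dE_r_rgb = [delta_e_ab(rgb[0], s) for s in rgb[1:]]
d = [abs(x - y) for x, y in zip(dE_r_hsi, dE_r_rgb)]
```

Even when the absolute differences dE_a are sizeable, the per-sample discrepancies d between the two sets of relative distances can stay small, which is the effect quantified in Section 4.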

3. Results

3.1. Optical Characterization

Figure 3 shows the reflectance spectra in the visible range of 16 Pantone® fabric samples whose chromatic properties are representative of the CIE-L*a*b* color space. Moreover, the typical spectral sensitivity of conventional cameras for the R (red), G (green), and B (blue) bands is included (vertical dashed lines). For the red-yellow group (Figure 3a), a notably higher reflectance is observed in the red region (>600 nm) compared to the blue one (<500 nm); therefore, their chromatic properties are expected to be in the first quadrant (Q1) of the chromatic circle. In the case of samples 5 to 8 (Figure 3b), although the reflectance is also high in the red region, there is also a significant increase (>50%) of the signal in the green band; therefore, these colors will be a combination of yellow and green tones (Q2). The spectra of samples 9 to 12 (Figure 3c—Q3), as well as of samples 13 to 16 (Figure 3d—Q4), present similar characteristics in the blue region but mainly differ in the signal intensity in the green and red bands. Likewise, the comparison between samples 3–4 and 5–6 exhibits comparable spectral signatures with different intensities, resulting in lighter or darker colors associated with increases or decreases in the reflectance signals. The latter can be observed in curves 10–12 and 14–16, where the colors tend to be darker (low lightness) because a large portion of the visible light is absorbed. These findings are further supported by the color representation of the 16 samples in both sRGB and REC2020, as shown in Table S1 of the Supplementary Materials.

3.2. Color Properties from Hyperspectral Images (HSI)

The color properties of the 700 samples analyzed in this study were derived from the examination of HSI and RGB images using the computational tool and methodology outlined in Section 2.2. The chromatic information obtained from the HSI is summarized through the statistical analysis presented in Table 1. It is worth noting that each quadrant includes samples with lightness (L*) values ranging from 20 to 95, encompassing both light and dark color appearances. Similarly, the a* and b* coordinates, along with the derived chroma (C*_ab) and saturation (S) values, span from very low intensities (<5) to values approaching the limits of the visible color space (>80). The hue (h_ab) values also cover the entire chromatic circle, ranging from 0° in Q1 to 360° in Q4. Furthermore, the HSI-derived chromatic properties were compared with the reference values provided by the manufacturer for all 700 samples, as reported online [81]. Average differences (ΔP̄) below 3.0 were observed for each property across all quadrants, highlighting the reliability of the HSI data and, subsequently, the accuracy of the color properties calculated from the hyperspectral images. This comprehensive coverage underscores the dataset's robust statistical significance in characterizing chromatic properties within the CIE-L*a*b* color space.
The mean and median values of L* and C*_ab across all groups are below 70 and 35, respectively, indicating a general tendency toward darker colors. This may hinder accurate interpretation of optical information from RGB images, particularly in Q3, due to (i) the limited range of colors represented in each color space and (ii) the expansion of the visible color gamut as L* decreases [82,83].

3.3. Comparative Analysis of Color Properties from HSI and RGB Images

Figure 4 presents scatter plots for the L*, a*, b*, C*_ab, h_ab, and S color properties, comparing the values obtained from RGB images in both the sRGB and REC2020 color spaces with reference values generated from the HSI data. A dashed blue line in each plot represents the perfect correlation between HSI and RGB. The lightness (Figure 4a) shows a better HSI–RGB agreement for values higher than 60 units. For lower values, L* exhibits an underestimation related to the RGB camera's limited capability to capture and interpret optical information under conditions of low reflectance/transmittance, which in turn affects the calculation of the Y stimulus. Consequently, the reproduction of a* and b* is seriously compromised for the most intense colors, where a noticeable increase in dispersion is observed for a* < 0 and b* > 0, specifically for chromatic coordinates located in Q2 and Q3 (Figure 4b,c). Likewise, the chroma (Figure 4d) exhibits an overestimation for C*_ab > 40 in colors with relatively high purity. The best HSI–RGB relationship is found in the region C*_ab < 30, where a* and b* tend to zero and the colors lie close to the white reference.
On the other hand, the hue demonstrates excellent reproducibility between optical and digital information (Figure 4e). Therefore, reliable chromatic contrasts for the identification and classification of colors within a set of samples can be achieved. The regions of major dispersion of h_ab are observed in both cases at quadrant transitions, when a* and/or b* approach zero or change sign. This significantly affects the tangent function used to calculate h_ab, producing different tones than expected and even changing the color space quadrant. The best reproduction occurs in Q2, where the samples have the highest L* values and the lowest S values, resulting in a better HSI–RGB correlation. Conversely, in the range between 180° and 270° (Q3), where the a* and b* values (and consequently h_ab) exhibit the greatest uncertainty, a poor correlation is observed.
Saturation is the color property with the lowest HSI–RGB reproducibility (Figure 4f) and shows the highest difference between the sRGB and REC2020 representations. In the first case, values of up to 90%, associated with high-purity color samples, have been obtained. However, this prediction is incorrect when considering the reference dataset information in Table 1, where the maximum value of S obtained is lower than 80%. When the color saturation in the digital representation approaches 100%, at least one of the RGB channels tends toward zero. As a result, the xy chromaticity coordinates shift toward the boundary of the color space (see Figure S2 in the Supplementary Materials), thereby limiting the accuracy of color representation. The greatest disparities in saturation are found in Q4 (corresponding to the line of purples in the visible color space), where the colors are primarily combinations of the red and blue channels, conditions that, as previously mentioned, hinder accurate digital representation.
It is worth noting that although REC2020 encompasses approximately 72% of the visible color space and is expected to provide predictions closer to those from HSI, differences in saturation (S) of up to 30 units are still observed. These discrepancies can be attributed to the optical information obtained from the RGB camera, which is strongly conditioned by the spectral response of the Bayer array filters (RGGB) in the cellphone, thereby limiting the information captured in each channel [84,85]. Although the RGB data extracted from RAW files yield sRGB and REC2020 values that appear quite different (Table S1), the corresponding conversion matrices sRGB → XYZ and REC2020 → XYZ (Equation (S9)) produce color data with similar chromaticity coordinates (xy), i.e., similar perceptual colors, as shown in Figure 4. This outcome suggests that the rescaling process from RAW to REC2020 is essentially artificial and does not expand the effective capacity (gamut) of the sensor to generate more accurate images. Consequently, from the outset, the range of color information that it can reproduce, particularly for saturated and pure colors, is inherently limited [86,87].

4. Discussion

The scatter plots presented in Figure 4 offer a powerful tool to provide a comprehensive view of when chromatic properties can be successfully reproduced through RGB information, as well as the specific value ranges where this reproducibility holds true. However, the proximity of the color properties in both representations (sRGB and REC2020), considering the differences in color space, can lead to ambiguous interpretations. To evaluate the accuracy of the proposed method in reproducing HSI chromatic information through the sRGB and REC2020 representations, the ratio-to-performance deviation (RPD) is calculated. This metric quantifies the deviation between predicted (RGB) and actual (HSI) values by comparing the standard deviation (SD) of the actual data with the root mean square error (RMSE) of the predicted values. Based on the RPD results, the relationship between datasets can be classified as Excellent (RPD > 3), Good (2.5 < RPD < 3.0), Approximate (2.0 < RPD < 2.5), or Unsatisfactory (RPD < 2.0) [88,89]. Table 2 presents the evaluation of how accurately each color property is represented across different quadrants and lightness intervals within each RGB representation.
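A minimal sketch of the RPD computation and the classification bands above (the L* values are hypothetical):

```python
import numpy as np

def rpd(actual, predicted):
    """Ratio-to-performance deviation: SD of the reference (HSI) data
    divided by the RMSE of the predictions (RGB) against it."""
    actual = np.asarray(actual, float)
    predicted = np.asarray(predicted, float)
    rmse = np.sqrt(np.mean((predicted - actual) ** 2))
    return np.std(actual, ddof=1) / rmse

def classify(r):
    """Qualitative bands used in Table 2."""
    if r > 3.0:
        return "Excellent"
    if r > 2.5:
        return "Good"
    if r > 2.0:
        return "Approximate"
    return "Unsatisfactory"

hsi_L = [20, 35, 50, 65, 80, 95]   # hypothetical reference L* values (HSI)
rgb_L = [21, 34, 51, 66, 79, 96]   # hypothetical RGB predictions
r = rpd(hsi_L, rgb_L)
```

With these tightly matched values the RMSE is 1 unit against a large spread, so the RPD falls in the Excellent band; larger prediction errors relative to the data spread pull it toward Unsatisfactory.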
Lightness is the parameter that shows the best reliability, especially for quadrants Q1 and Q2, where the highest L* values in the dataset are found. In contrast, dark and highly saturated colors exhibit RPD values below 1.0, highlighting the strong dependence of RGB information on sample lightness. The uncertainty in the a* and b* values becomes more pronounced for L* ≤ 50, and this effect is amplified in the chroma (C*_ab) parameter due to their quadratic relationship (Equation (S1)). This lightness dependence also affects a* and b*, with more accurate RGB-based predictions observed when L* is greater than 50. Generally, for bright and unsaturated colors (L* > 50), the sRGB and REC2020 predictions show remarkable agreement with HSI, with RPDs > 2.5 for at least four of the six chromatic properties assessed (classified as Good or Excellent). This consistency between both RGB representations indicates that the matrices defined for each CIE-XYZ transformation (Equation (S9)) enable the recovery of similar color data from any RGB image.

Color Differences Between HSI and RGB Images

Considering that the number of perceivable colors in the visible space decreases as the Y stimulus (luminance) increases [82,83], the proportion of colors encompassed by the RGB representations becomes comparatively higher. Therefore, it is expected that as the chromatic properties of a sample approach the reference white point (Y → 1), its digital reproduction will more closely align with the HSI image information. Figure 5a presents the scatter plot of the Y stimulus, derived from the HSI data, for each sample alongside the corresponding HSI–RGB absolute color difference defined in Section 2.2.5 (ΔE_a). A general trend indicates that ΔE_a decreases as Y increases. For Y > 0.7 (i.e., L* > 87), when the chromatic coordinates (xyY) of the sample are near the white point, ΔE values below 3.0 are obtained. In this zone (Y > 0.7), the gamut generated by each RGB representation encompasses several samples in the set, ensuring a highly reliable color interpretation. This is supported by the chromaticity diagrams in Figure 5b, where the coordinates calculated using Equation (S10) for six samples with varying Y values are displayed. Although all colors seem to be contained within the sRGB gamut, the volume defined by the xyY coordinates in Figure 5c reveals that those with the lowest Y values fall outside this volume, indicating that their properties cannot be accurately interpreted using RGB information.
The average values of the absolute color differences (ΔE_a), grouped by quadrants and threshold values of L* (directly related to Y), are presented in Table 3. In all quadrants, particularly in Q3, the ΔE_a values exceed 5.0 units, indicating a weak correlation between HSI and RGB data. For darker colors (L* < 50), these differences surpass 7.4 units in both RGB representations. In contrast, better reproducibility is observed when L* > 50, leading to a noticeable improvement, with ΔE_a < 5.0 for lighter colors (L* > 75). Although such color differences are accepted by many authors, these values remain too high to establish a reliable correspondence between HSI and RGB [72,90,91,92].
Finally, Table 4 shows the analysis of the relative color difference ΔE_r (as defined in Section 2.2.5) for the HSI and RGB images, conducted using the Two One-Sided Test (TOST). This statistical method evaluates the equivalence of two approaches in reproducing similar results within a predefined tolerance.
The average values of the relative color differences for the hyperspectral (ΔĒ_r,HSI) and RGB (ΔĒ_r,RGB) images decrease notably as the considered L* values increase. This is because the reference sample used to compute the relative differences in each case corresponds to the one with the highest lightness (first sample in the upper-left corner; see Figure 2). A similar effect is observed when comparing the relative color differences obtained from HSI and RGB for each sample (d̄), with values below 1.0 indicating an excellent reproduction of color variations for samples with L* > 75. Despite the high dispersion of the data in each interval (σ_d), the 95% confidence interval (95% CI) comparing HSI and RGB indicates that most samples fall within a relative color difference below 3.0 units for L* < 50, 2.0 units for L* > 50, and even below 1.0 units for L* > 75. Although the TOST results show data with statistically significant differences under the null hypothesis (p < 0.05), with effect sizes ranging from medium to small [93,94], the relative color differences found comparing HSI and RGB remain within the 3.0-unit tolerance interval, supporting the practical equivalence of both methodologies for any lightness value.
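The paired TOST used here can be sketched as two one-sided tests against the ±3.0-unit tolerance. This version uses a normal approximation instead of the t distribution for brevity (adequate for the large sample counts involved), and the difference values are hypothetical:

```python
import math

def tost_paired(diffs, delta=3.0, alpha=0.05):
    """Two one-sided tests for equivalence of paired differences
    within +/-delta, using a normal approximation."""
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((x - mean) ** 2 for x in diffs) / (n - 1)
    se = math.sqrt(var / n)
    z_low = (mean + delta) / se    # H1: true mean > -delta
    z_high = (mean - delta) / se   # H1: true mean < +delta
    p_low = 1 - 0.5 * (1 + math.erf(z_low / math.sqrt(2)))   # upper-tail p
    p_high = 0.5 * (1 + math.erf(z_high / math.sqrt(2)))     # lower-tail p
    p = max(p_low, p_high)         # TOST p-value
    return p, p < alpha            # equivalence claimed when p < alpha

# Hypothetical per-sample HSI-vs-RGB relative color differences
diffs = [0.4, 0.9, -0.2, 0.7, 0.1, 0.5, -0.4, 0.8, 0.3, 0.6,
         0.2, -0.1, 0.5, 0.9, 0.0, 0.4, 0.7, -0.3, 0.6, 0.2]
p, equivalent = tost_paired(diffs)
```

Because both one-sided tests must reject, a significant difference from zero (the standard t-test result mentioned above) does not preclude equivalence within the tolerance band.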
These results indicate that, unlike absolute color differences, a relative color comparison does not require RGB to encompass the entire visible color space. Instead, knowing the coordinates of a specific reference point (the reference sample) within the restricted space is sufficient to reliably quantify distances (color differences) to other points. In this way, the spectral bias inherent to RGB images is effectively canceled when computing distances between two points, making the resulting information equivalent to that obtained from hyperspectral data.
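The relative-difference analysis above can be sketched as follows: compute ΔE_r against a shared reference sample separately for HSI and RGB, then apply a one-sample TOST to the per-sample discrepancies. This is an illustrative approximation under stated assumptions (synthetic data, a symmetric tolerance of 3.0 units, and SciPy's `ttest_1samp` as the underlying one-sided test), not the authors' exact pipeline:

```python
import numpy as np
from scipy import stats

def relative_differences(labs, ref):
    """Relative color difference: Lab distance of each sample to a shared reference."""
    labs = np.asarray(labs, dtype=float)
    return np.sqrt(np.sum((labs - np.asarray(ref, dtype=float)) ** 2, axis=-1))

def tost_equivalence(d, tol):
    """One-sample TOST: test whether the mean of d lies within (-tol, +tol).

    Runs two one-sided t-tests against the tolerance bounds and returns the
    larger p-value; a small value supports equivalence within +/- tol.
    """
    d = np.asarray(d, dtype=float)
    p_lower = stats.ttest_1samp(d, -tol, alternative='greater').pvalue
    p_upper = stats.ttest_1samp(d, tol, alternative='less').pvalue
    return max(p_lower, p_upper)

# Illustrative per-sample discrepancies between HSI and RGB relative
# differences, tested against a 3.0-unit tolerance (synthetic numbers)
d = np.array([0.4, 1.1, 0.8, 1.6, 0.9, 1.2, 0.7, 1.4])
p_tost = tost_equivalence(d, tol=3.0)  # small p supports practical equivalence
```

In this convention, a small TOST p-value rejects the null hypothesis of non-equivalence, which is how mean discrepancies well inside the tolerance interval support practical equivalence.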

5. Conclusions

Six chromatic properties defined in the CIE-L*a*b* space were calculated and compared using HSI and RGB images in both the sRGB and REC2020 representations. A Python algorithm was developed to process, individualize, and analyze data from 700 Pantone® fabric samples, aiming to establish a correlation between the color information obtained from each type of image. The results indicate that the accurate interpretation of the optical information from RGB images is strongly influenced by the lightness (L*) values, owing to the changes in the CIE-L*a*b* color space as it approaches the white reference. For samples with L* values greater than 50 (lighter colors), RPD values above 2.5 were found for four of the considered properties (L*, a*, b*, hab), indicating a good correlation between HSI and RGB data. The absolute color differences (ΔE_a) obtained from the direct comparison of HSI and RGB data exceeded 5.0 units for samples in the red-yellow-green quadrants (Q1 and Q2), rising to 9.0 units for blues and purples (Q3 and Q4) in both the sRGB and REC2020 representations. These discrepancies were attributed to the optical information captured by the RGB camera, which is strongly conditioned by the spectral response of the Bayer-array filters (RGGB) in the cellphone, thereby limiting the accuracy of the information in each channel. These differences decreased significantly in the relative color difference analysis (ΔE_r), which uses a reference sample for each HSI and RGB image. In this case, values below 3.0 units were achieved across the entire color space, reaching as low as 1.0 unit for light-colored samples (L* > 75). Statistical analysis using the Two One-Sided Test (TOST) indicates that, although HSI and RGB data show statistically significant differences under the null hypothesis (p < 0.05), the relative differences between them remain within the 3.0-unit tolerance interval commonly accepted in most academic and industrial applications.
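The RGB branch of such a pipeline rests on the standard sRGB-to-CIE-L*a*b* conversion. A compact, self-contained sketch is given below; the transfer function, matrix, and D65 white point are the published IEC/CIE values, and this is a reference implementation rather than the authors' exact code:

```python
import numpy as np

# Standard sRGB -> XYZ (D65) matrix and D65 reference white (published values)
M_SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                          [0.2126, 0.7152, 0.0722],
                          [0.0193, 0.1192, 0.9505]])
WHITE_D65 = np.array([0.95047, 1.00000, 1.08883])

def srgb_to_lab(rgb):
    """Convert sRGB values in [0, 1] to CIE-L*a*b* (D65, 2-degree observer)."""
    rgb = np.asarray(rgb, dtype=float)
    # Undo the sRGB transfer function (piecewise gamma)
    lin = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    xyz = lin @ M_SRGB_TO_XYZ.T
    # XYZ -> L*a*b* via the CIE cube-root compression
    t = xyz / WHITE_D65
    eps = (6 / 29) ** 3
    f = np.where(t > eps, np.cbrt(t), t / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[..., 1] - 16
    a = 500 * (f[..., 0] - f[..., 1])
    b = 200 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)

# Sanity check: pure white should map close to L* = 100, a* = b* = 0
white_lab = srgb_to_lab([1.0, 1.0, 1.0])
```

A REC2020 variant would swap in the REC2020 primaries' matrix and transfer function; everything downstream (chroma, hue, saturation, ΔE) operates on the resulting L*a*b* coordinates unchanged.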
Although RGB shows absolute limitations (with high ΔE_a values in red, blue, and purple tones), the use of relative color differences (ΔE_r) combined with statistical validation makes it a reliable and practical tool for comparative analyses. This approach results in a robust and validated protocol that can partially replace more expensive HSI systems in applications where absolute spectral reproduction is not essential, but relative color consistency is critical. Such capability is highly relevant for industries where reliable and cost-effective color assessment is required, including textiles and fashion (quality control of dyes and fabrics), printing and packaging (color matching and consistency across batches), food and beverages (monitoring freshness and product appearance), and digital imaging and display technologies (standardization of color rendering).
This work provides a solid first step toward RGB-based segmentation for fabric color analysis, offering valuable insights into its potential and limitations. Still, methodological extensions—such as incorporating multiple RGB devices with different Bayer filters, testing under varied illumination sources, and applying the approach to different material types—could significantly strengthen its robustness and broaden its applicability in industrial contexts.

Supplementary Materials

The following supporting information can be downloaded at https://www.mdpi.com/article/10.3390/jimaging12030116/s1: Figure S1: (a) CIE-L*a*b* color space used to represent chromatic properties, with the L-axis associated with lightness ranging from 0 (black) to 100 (white). (b) CIE-L*a*b* plane divided into four quadrants based on the a and b values, representing the chroma (C*ab) and hue (hab) of a given color; Figure S2: (a) CIE-xy chromaticity diagram used to represent the visible color space (horseshoe) and the gamuts of sRGB (solid line) and REC2020 (dashed line). (b) The MacAdam ellipses make it possible to determine the region where a group of colors produces the same visual effect; Table S1: Digital data for the red (R), green (G) and blue (B) channels of the 16 samples of Figure 3 (main document), obtained from the RGB camera. The colored squares were generated using the sRGB and REC2020 representations to highlight the differences (Diff) in color interpretation.

Author Contributions

Conceptualization, J.A.R.-R.; methodology, J.D.A.-U. and A.F.C.-M.; software, A.F.C.-M. and J.O.-G.; validation, J.A.R.-R., J.D.A.-U. and J.O.-G.; formal analysis, J.A.R.-R. and C.L.G.-H.; writing—original draft preparation, J.A.R.-R. and J.D.A.-U.; writing—review and editing, C.L.G.-H. and J.A.R.-R.; funding acquisition, J.A.R.-R. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Fundación Universidad de América, grant number IHU-009, and the APC was funded by Pontificia Universidad Javeriana.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding authors.

Acknowledgments

The authors thank the Universidad de América for the financial support to develop this project.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
HSI	Hyperspectral imaging
Q	Quadrant
RPD	Ratio to performance deviation
CIE	Commission Internationale de l'Éclairage
L*	Lightness
C*ab	Chroma
hab	Hue
TOST	Two One-Sided Test
SPD	Spectral Power Distribution

References

  1. Huang, M.; Liu, H.; Cui, G.; Luo, M.R. Testing Uniform Colour Spaces and Colour-difference Formulae Using Printed Samples. Color Res. Appl. 2012, 37, 326–335. [Google Scholar] [CrossRef]
  2. Montag, E.D.; Wilber, D.C. A Comparison of Constant Stimuli and Gray-scale Methods of Color Difference Scaling. Color Res. Appl. 2003, 28, 36–44. [Google Scholar] [CrossRef]
  3. Vernet, S.; Dinet, E.; Trémeau, A.; Colantoni, P. Experimental Protocol for Color Difference Evaluation Under Stabilized LED Light. J. Imaging 2024, 11, 4. [Google Scholar] [CrossRef] [PubMed]
  4. Miller, M.E. Scenes and Lighting. In Color in Electronic Display Systems: Advantages of Multi-Primary Displays; Springer International Publishing: Cham, Switzerland, 2019; pp. 39–66. [Google Scholar]
  5. Ohta, N.; Robertson, A.R. Colorimetry: Fundamentals and Applications; John Wiley & Sons: Hoboken, NJ, USA, 2006. [Google Scholar]
  6. Wu, D.; Sun, D.-W. Colour Measurements by Computer Vision for Food Quality Control—A Review. Trends Food Sci. Technol. 2013, 29, 5–20. [Google Scholar] [CrossRef]
  7. Mak, K.L.; Peng, P.; Yiu, K.F.C. Fabric Defect Detection Using Morphological Filters. Image Vis. Comput. 2009, 27, 1585–1592. [Google Scholar] [CrossRef]
  8. Tiffin, J.; Kuhn, H.S. Color Discrimination in Industry. Arch. Ophthalmol. 1942, 28, 851–859. [Google Scholar] [CrossRef]
  9. Dugonik, B.; Golob, M.; Marhl, M.; Dugonik, A. Optimizing Digital Image Quality for Improved Skin Cancer Detection. J. Imaging 2025, 11, 107. [Google Scholar] [CrossRef]
  10. Khalkhali, V.; Lee, H.; Nguyen, J.; Zamora-Erazo, S.; Ragin, C.; Aphale, A.; Bellacosa, A.; Monk, E.P.; Biswas, S.K. MST-AI: Skin Color Estimation in Skin Cancer Datasets. J. Imaging 2025, 11, 235. [Google Scholar] [CrossRef]
  11. Hoshino, H.; Shindo, T.; Hiraguri, T.; Itoh, N. RGB Color Space-Enhanced Training Data Generation for Cucumber Classification. J. Imaging 2025, 11, 120. [Google Scholar] [CrossRef]
  12. Xu, L.; Zhao, Y. Automated Strawberry Grading System Based on Image Processing. Comput. Electron. Agric. 2010, 71, S32–S39. [Google Scholar] [CrossRef]
  13. Ono, S. A Color-Based Multispectral Imaging Approach for a Human Detection Camera. J. Imaging 2025, 11, 93. [Google Scholar] [CrossRef] [PubMed]
  14. Rodríguez-Pulido, F.J.; Gordillo, B.; Heredia, F.J.; González-Miret, M.L. CIELAB—Spectral Image MATCHING: An App for Merging Colorimetric and Spectral Images for Grapes and Derivatives. Food Control 2021, 125, 108038. [Google Scholar] [CrossRef]
  15. Çelik, H.I.; Dülger, L.C.; Topalbekiroǧlu, M. Development of a Machine Vision System: Real-Time Fabric Defect Detection and Classification with Neural Networks. J. Text. Inst. 2014, 105, 575–585. [Google Scholar] [CrossRef]
  16. Dlamini, S.; Kao, C.-Y.; Su, S.-L.; Jeffrey Kuo, C.-F. Development of a Real-Time Machine Vision System for Functional Textile Fabric Defect Detection Using a Deep YOLOv4 Model. Text. Res. J. 2022, 92, 675–690. [Google Scholar] [CrossRef]
  17. Teke, M.; Deveci, H.S.; Haliloglu, O.; Gurbuz, S.Z.; Sakarya, U. A Short Survey of Hyperspectral Remote Sensing Applications in Agriculture. In Proceedings of the RAST 2013—Proceedings of 6th International Conference on Recent Advances in Space Technologies, Istanbul, Turkey, 12–14 June 2013. [Google Scholar]
  18. Liu, F.; Xiao, Z. Disease Spots Identification of Potato Leaves in Hyperspectral Based on Locally Adaptive 1D-CNN. In Proceedings of the 2020 IEEE International Conference on Artificial Intelligence and Computer Applications, ICAICA 2020, Dalian, China, 27–29 June 2020. [Google Scholar]
  19. Carroll, M.W.; Glaser, J.A.; Hellmich, R.L.; Hunt, T.E.; Sappington, T.W.; Calvin, D.; Copenhaver, K.; Fridgen, J. Use of Spectral Vegetation Indices Derived from Airborne Hyperspectral Imagery for Detection of European Corn Borer Infestation in Iowa Corn Plots. J. Econ. Entomol. 2008, 101, 1614–1623. [Google Scholar] [CrossRef]
  20. Medus, L.D.; Saban, M.; Francés-Víllora, J.V.; Bataller-Mompeán, M.; Rosado-Muñoz, A. Hyperspectral Image Classification Using CNN: Application to Industrial Food Packaging. Food Control 2021, 125, 107962. [Google Scholar] [CrossRef]
  21. Khan, M.J.; Khan, H.S.; Yousaf, A.; Khurshid, K.; Abbas, A. Modern Trends in Hyperspectral Image Analysis: A Review. IEEE Access 2018, 6, 14118–14129. [Google Scholar] [CrossRef]
  22. Corrales, D.C. Toward Detecting Crop Diseases and Pest by Supervised Learning. Ing. Univ. 2015, 19, 207–228. [Google Scholar] [CrossRef]
  23. Jordan, M.I.; Mitchell, T.M. Machine Learning: Trends, Perspectives, and Prospects. Science 2015, 349, 255–260. [Google Scholar] [CrossRef]
  24. Signoroni, A.; Savardi, M.; Baronio, A.; Benini, S. Deep Learning Meets Hyperspectral Image Analysis: A Multidisciplinary Review. J. Imaging 2019, 5, 52. [Google Scholar] [CrossRef]
  25. Ramírez-Rincón, J.A.; Palencia, M.; Combatt, E.M. Separation of Optical Properties for Multicomponent Samples and Determination of Spectral Similarity Indices Based on FEDS0 Algorithm. Mater. Today Commun. 2022, 33, 104528. [Google Scholar] [CrossRef]
  26. Ramírez-Rincón, J.A.; Palencia, M.; Combatt, E.M. Determining Relative Values of PH, CECe, and OC in Agricultural Soils Using Functional Enhanced Derivative Spectroscopy (FEDS0) Method in the Mid-Infrared Region. Infrared Phys. Technol. 2023, 133, 104864. [Google Scholar] [CrossRef]
  27. Lee, D.J.; Archibald, J.K.; Chang, Y.C.; Greco, C.R. Robust Color Space Conversion and Color Distribution Analysis Techniques for Date Maturity Evaluation. J. Food Eng. 2008, 88, 364–372. [Google Scholar] [CrossRef]
  28. Nishidate, I.; Kawauchi, S.; Sato, S.; Sato, M.; Aizu, Y.; Kokubo, Y. RGB Camera-Based Functional Imaging of In Vivo Biological Tissues. In Optical Design and Testing VIII; SPIE: Bellingham, WA, USA, 2018. [Google Scholar]
  29. Su, W.H. Advanced Machine Learning in Point Spectroscopy, Rgb-and Hyperspectral-Imaging for Automatic Discriminations of Crops and Weeds: A Review. Smart Cities 2020, 3, 767–792. [Google Scholar] [CrossRef]
  30. Zhang, J.; Su, R.; Fu, Q.; Ren, W.; Heide, F.; Nie, Y. A Survey on Computational Spectral Reconstruction Methods from RGB to Hyperspectral Imaging. Sci. Rep. 2022, 12, 11905. [Google Scholar] [CrossRef] [PubMed]
  31. Connah, D.; Westland, S.; Thomson, M.G.A. Recovering Spectral Information Using Digital Camera Systems. Color. Technol. 2001, 117, 309–312. [Google Scholar] [CrossRef]
  32. Eem, J.K.; Shin, H.D.; Park, S.O. Reconstruction of Surface Spectral Reflectances Using Characteristic Vectors of Munsell Colors. In Proceedings of the Final Program and Proceedings—IS and T/SID Color Imaging Conference, Scottsdale, AZ, USA, 15–18 November 1994. [Google Scholar]
  33. Kim, I.; Kim, M.S.; Chen, Y.R.; Kong, S.G. Detection of Skin Tumors on Chicken Carcasses Using Hyperspectral Fluorescence Imaging. Trans. Am. Soc. Agric. Eng. 2004, 47, 1785–1792. [Google Scholar] [CrossRef]
  34. Vargas, A.M.; Kim, M.S.; Tao, Y.; Lefcourt, A.; Chen, Y.R. Safety Inspection of Cantaloupes and Strawberries Using Multispectral Fluorescence Imaging Techniques. In Proceedings of the ASAE Annual International Meeting 2004, Ottawa, ON, Canada, 1–4 August 2004. [Google Scholar]
  35. Elagamy, S.H.; Adly, L.; Abdel Hamid, M.A. Smartphone Based Colorimetric Approach for Quantitative Determination of Uric Acid Using Image J. Sci. Rep. 2023, 13, 21888. [Google Scholar] [CrossRef]
  36. Del Fiore, A.; Reverberi, M.; Ricelli, A.; Pinzari, F.; Serranti, S.; Fabbri, A.A.; Bonifazi, G.; Fanelli, C. Early Detection of Toxigenic Fungi on Maize by Hyperspectral Imaging Analysis. Int. J. Food Microbiol. 2010, 144, 64–71. [Google Scholar] [CrossRef]
  37. Lin, H.; Wang, Z.; Ahmad, W.; Man, Z.; Duan, Y. Identification of Rice Storage Time Based on Colorimetric Sensor Array Combined Hyperspectral Imaging Technology. J. Stored Prod. Res. 2020, 85, 101523. [Google Scholar] [CrossRef]
  38. Kabakeris, T.; Poth, A.; Intreß, J.; Schmidt, U.; Geyer, M. Detection of Postharvest Quality Loss in Broccoli by Means of Non-Colorimetric Reflection Spectroscopy and Hyperspectral Imaging. Comput. Electron. Agric. 2015, 118, 322–331. [Google Scholar] [CrossRef]
  39. ISO 11664-4:2008; International Commission on Illumination Standard Colorimetry—Part 4: CIE 1976 L*a*b* Colour Space. ISO: Geneva, Switzerland, 2008.
  40. de Lasarte, M.; Vilaseca, M.; Pujol, J.; Arjona, M.; Martínez-Verdú, F.M.; de Fez, D.; Viqueira, V. Development of a Perceptual Colorimeter Based on a Conventional CCD Camera with More than Three Color Channels. In Proceedings of the 10th Congress of the International Color Association, Granada, Spain, 9–13 May 2005; Volume 1, pp. 1247–1250. [Google Scholar]
  41. Garrido-Novell, C.; Pérez-Marin, D.; Amigo, J.M.; Fernández-Novales, J.; Guerrero, J.E.; Garrido-Varo, A. Grading and Color Evolution of Apples Using RGB and Hyperspectral Imaging Vision Cameras. J. Food Eng. 2012, 113, 281–288. [Google Scholar] [CrossRef]
  42. International Electrotechnical Commission. Multimedia Systems and Equipment—Colour Measurement and Management—Part 2-1: Colour Management—Default RGB Colour Space—SRGB; International Electrotechnical Commission: Geneva, Switzerland, 2000. [Google Scholar]
  43. Süsstrunk, S.; Buckley, R.; Swen, S. Standard RGB Color Spaces. In Color and Imaging Conference; Society of Imaging Science and Technology: Springfield, VA, USA, 1999; Volume 7, pp. 127–134. [Google Scholar]
  44. Foster, D.H.; Amano, K. Hyperspectral Imaging in Color Vision Research: Tutorial. J. Opt. Soc. Am. A 2019, 36, 606. [Google Scholar] [CrossRef] [PubMed]
  45. Behmann, J.; Acebron, K.; Emin, D.; Bennertz, S.; Matsubara, S.; Thomas, S.; Bohnenkamp, D.; Kuska, M.; Jussila, J.; Salo, H.; et al. Specim IQ: Evaluation of a New, Miniaturized Handheld Hyperspectral Camera and Its Application for Plant Phenotyping and Disease Detection. Sensors 2018, 18, 441. [Google Scholar] [CrossRef]
  46. Mirhosseini, S.; Nasiri, A.F.; Khatami, F.; Mirzaei, A.; Aghamir, S.M.K.; Kolahdouz, M. A Digital Image Colorimetry System Based on Smart Devices for Immediate and Simultaneous Determination of Enzyme-Linked Immunosorbent Assays. Sci. Rep. 2024, 14, 2587. [Google Scholar] [CrossRef]
  47. Soneira, R.M. Display Color Gamuts: NTSC to Rec. 2020. Inf. Disp. 2016, 32, 26–31. [Google Scholar] [CrossRef]
  48. León, K.; Mery, D.; Pedreschi, F.; León, J. Color Measurement in L*a*b* Units from RGB Digital Images. Food Res. Int. 2006, 39, 1084–1091. [Google Scholar] [CrossRef]
  49. Zhbanova, V.L. Research into Methods for Determining Colour Differences in the CIELAB Uniform Colour Space. Light Eng. 2020, 28, 53–59. [Google Scholar] [CrossRef]
  50. Hill, B.; Roger, T.; Vorhagen, F.W. Comparative Analysis of the Quantization of Color Spaces on the Basis of the CIELAB Color-Difference Formula. ACM Trans. Graph. 1997, 16, 109–154. [Google Scholar] [CrossRef]
  51. Misue, K.; Kitajima, H. Design Tool of Color Schemes on the CIELAB Space. In Proceedings of the 2016 20th International Conference Information Visualisation (IV), Lisbon, Portugal, 19–22 July 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 33–38. [Google Scholar]
  52. Gao, L.; Smith, R.T. Optical Hyperspectral Imaging in Microscopy and Spectroscopy—A Review of Data Acquisition. J. Biophotonics 2015, 8, 441–456. [Google Scholar] [CrossRef]
  53. Ciaccheri, L.; Adinolfi, B.; Mencaglia, A.A.; Mignani, A.G. Smartphone-Enabled Colorimetry. Sensors 2023, 23, 5559. [Google Scholar] [CrossRef] [PubMed]
  54. Wright, W.D. A Re-Determination of the Trichromatic Coefficients of the Spectral Colours. Trans. Opt. Soc. 1929, 30, 141–164. [Google Scholar] [CrossRef]
  55. Guild, J. The Colorimetric Properties of the Spectrum. Philos. Trans. R. Soc. A 1931, 230, 149–187. [Google Scholar] [CrossRef]
  56. Dorrepaal, R.; Malegori, C.; Gowen, A. Tutorial: Time Series Hyperspectral Image Analysis. J. Near Infrared Spectrosc. 2016, 24, 89–107. [Google Scholar] [CrossRef]
  57. Chernov, V.; Alander, J.; Bochko, V. Integer-Based Accurate Conversion between RGB and HSV Color Spaces. Comput. Electr. Eng. 2015, 46, 328–337. [Google Scholar] [CrossRef]
  58. Ganesan, P.; Rajini, V.; Rajkumar, R.I. Segmentation and Edge Detection of Color Images Using CIELAB Color Space and Edge Detectors. In Proceedings of the INTERACT-2010, Chennai, India, 3–5 December 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 393–397. [Google Scholar]
  59. Kirillov, A.; Mintun, E.; Ravi, N.; Mao, H.; Rolland, C.; Gustafson, L.; Xiao, T.; Whitehead, S.; Berg, A.C.; Lo, W.-Y.; et al. Segment Anything. arXiv 2023. [Google Scholar] [CrossRef]
  60. Bonfatti Júnior, E.A.; Lengowski, E.C. Colorimetria Aplicada à Ciência e Tecnologia Da Madeira. Pesqui. Florest. Bras. 2018, 38, 1–13. [Google Scholar] [CrossRef]
  61. Autran, C.d.S.; Gonçalez, J.C. Caracterização colorimétrica das madeiras de muirapiranga (Brosimum rubescenstaub.) e de seringueira (Hevea brasiliensis, clone tjir 16 müll arg.) visando à utilização em interiores. Ciência Florest. 2006, 16, 445–451. [Google Scholar] [CrossRef]
  62. Bianco, S. Reflectance Spectra Recovery from Tristimulus Values by Adaptive Estimation with Metameric Shape Correction. J. Opt. Soc. Am. A 2010, 27, 1868. [Google Scholar] [CrossRef]
  63. Xu, Y.; Zhang, C.; Gao, C.; Wang, Z.; Li, C. A Hybrid Adaptation Strategy for Reconstruction Reflectance Based on the given Tristimulus Values. Color Res. Appl. 2020, 45, 603–611. [Google Scholar] [CrossRef]
  64. Wu, G.; Qian, L.; Hu, G.; Li, X. Spectral Reflectance Recovery from Tristimulus Values under Multi-Illuminants. J. Spectrosc. 2019, 2019, 3538265. [Google Scholar] [CrossRef]
  65. Cao, B.; Liao, N.; Li, Y.; Cheng, H. Improving Reflectance Reconstruction from Tristimulus Values by Adaptively Combining Colorimetric and Reflectance Similarities. Opt. Eng. 2017, 56, 053104. [Google Scholar] [CrossRef]
  66. Wu, G.; Shen, X.; Liu, Z.; Yang, S.; Zhu, M. Reflectance Spectra Recovery from Tristimulus Values by Extraction of Color Feature Match. Opt. Quantum. Electron. 2016, 48, 64. [Google Scholar] [CrossRef]
  67. Froehlich, J.; Kunkel, T.; Atkins, R.; Pytlarz, J.; Daly, S.; Schilling, A.; Eberhardt, B. Encoding Color Difference Signals for High Dynamic Range and Wide Gamut Imagery. Color Imaging Conf. 2015, 23, 240–247. [Google Scholar] [CrossRef]
  68. Hunt, R.W.G.; Pointer, M.R. Measuring Colour; Wiley: Hoboken, NJ, USA, 2011; ISBN 9781119975373. [Google Scholar]
  69. López, F.; Valiente, J.M.; Baldrich, R.; Vanrell, M. Fast Surface Grading Using Color Statistics in the CIE Lab Space. In Iberian Conference on Pattern Recognition and Image Analysis; Springer: Berlin/Heidelberg, Germany, 2005; pp. 666–673. [Google Scholar]
  70. Farah, R.I. Agreement between Digital Image Analysis and Clinical Spectrophotometer in CIEL*C*h° Coordinate Differences and Total Color Difference (ΔE) Measurements of Dental Ceramic Shade Tabs. Int. J. Esthet. Dent. 2016, 11, 234–245. [Google Scholar]
  71. Milić, N.; Novaković, D.; Kašiković, N. Measurement Uncertainty in Colourcharacterization of Printed Textile Materials. J. Graph. Eng. Des. 2011, 2, 16–25. [Google Scholar] [CrossRef]
  72. Dattner, M.; Bohn, D. Characterization of Print Quality in Terms of Colorimetric Aspects. In Printing on Polymers; Elsevier: Amsterdam, The Netherlands, 2016; pp. 329–345. [Google Scholar]
  73. Robertson, A.R. Historical Development of CIE Recommended Color Difference Equations. Color Res. Appl. 1990, 15, 167–170. [Google Scholar] [CrossRef]
  74. Luo, M.R.; Cui, G.; Rigg, B. The Development of the CIE 2000 Colour-difference Formula: CIEDE2000. Color Res. Appl. 2001, 26, 340–350. [Google Scholar] [CrossRef]
  75. Roy Choudhury, A.K. Colour-Difference Assessment. In Principles of Colour and Appearance Measurement; Elsevier: Amsterdam, The Netherlands, 2015; pp. 55–116. [Google Scholar]
  76. X-RITE Pantone Tips for Defining a Realistic Pass/Fail Tolerance. Available online: https://www.xrite.com/blog/tips-to-define-tolerances (accessed on 17 March 2024).
  77. Dziki, P.; Pieszczek, L.; Daszykowski, M. Toward More Efficient and Effective Color Quality Control for the Large-scale Offset Printing Process. J. Chemom. 2024, 38, e3543. [Google Scholar] [CrossRef]
  78. Boruczkowska, H.; Boruczkowski, T.; Bronkowska, M.; Prajzner, M.; Rytel, E. Comparison of Colour Measurement Methods in the Food Industry. Processes 2025, 13, 1268. [Google Scholar] [CrossRef]
  79. Zhai, X.; Xue, Y.; Sun, Y.; Ma, X.; Ban, W.; Marappan, G.; Tahir, H.E.; Huang, X.; Wu, K.; Chen, Z.; et al. Colorimetric Food Freshness Indicators for Intelligent Packaging: Progress, Shortcomings, and Promising Solutions. Foods 2025, 14, 2813. [Google Scholar] [CrossRef]
  80. Anokye-Bempah, L.; Styczynski, T.; Ristenpart, W.D.; Donis-González, I.R. A Universal Color Curve for Roasted Arabica Coffee. Sci. Rep. 2025, 15, 24192. [Google Scholar] [CrossRef]
  81. Pantone Pantone Connect. Available online: https://www.pantone.com/pantone-connect (accessed on 28 July 2024).
  82. MacAdam, D.L. Visual Sensitivities to Color Differences in Daylight. J. Opt. Soc. Am. 1942, 32, 247. [Google Scholar] [CrossRef]
  83. MacAdam, D.L. Maximum Visual Efficiency of Colored Materials. J. Opt. Soc. Am. 1935, 25, 361. [Google Scholar] [CrossRef]
  84. Li, X.; Gunturk, B.; Zhang, L. Image Demosaicing: A Systematic Survey. In Visual Communications and Image Processing 2008; Pearlman, W.A., Woods, J.W., Lu, L., Eds.; SPIE: Bellingham, WA, USA, 2008; p. 68221J. [Google Scholar]
  85. Shortis, M.R.; Seager, J.W.; Harvey, E.S.; Robson, S. Influence of Bayer Filters on the Quality of Photogrammetric Measurement. In Videometrics VIII; Beraldin, J.-A., El-Hakim, S.F., Gruen, A., Walton, J.S., Eds.; SPIE: Bellingham, WA, USA, 2005; p. 164. [Google Scholar]
  86. Hijazi, A.; Al-Masri, A.; Rawashdeh, N. On the Use of Bayer Sensor Color Cameras in Digital Image Correlation. In Proceedings of the 2022 11th International Symposium on Signal, Image, Video and Communications (ISIVC), El Jadida, Morocco, 18–20 May 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1–7. [Google Scholar]
  87. Maschal, R.A., Jr.; Young, S.S.; Reynolds, J.; Krapels, K.; Fanning, J.; Corbin, T. Review of Bayer Pattern CFA Demosaicing with New Quality Assessment Algorithms. In Infrared Imaging Systems: Design, Analysis, Modeling, and Testing XXI; Holst, G.C., Krapels, K.A., Eds.; SPIE: Bellingham, WA, USA, 2010; p. 766215. [Google Scholar]
  88. Nawar, S.; Buddenbaum, H.; Hill, J.; Kozak, J. Modeling and Mapping of Soil Salinity with Reflectance Spectroscopy and Landsat Data Using Two Quantitative Methods (PLSR and MARS). Remote Sens. 2014, 6, 10813–10834. [Google Scholar] [CrossRef]
  89. Chang, C.-W.; Laird, D.A.; Mausbach, M.J.; Hurburgh, C.R. Near-Infrared Reflectance Spectroscopy–Principal Components Regression Analyses of Soil Properties. Soil Sci. Soc. Am. J. 2001, 65, 480–490. [Google Scholar] [CrossRef]
  90. Luo, M.R.; Rigg, B. BFD (l:C) Colour-difference Formula Part-1 Development of the Formula. J. Soc. Dye. Colour. 1987, 103, 86–94. [Google Scholar] [CrossRef]
  91. Burgos-Fernández, F.J.; Vilaseca, M.; Perales, E.; Chorro, E.; Martínez-Verdú, F.M.; Fernández-Dorado, J.; Pujol, J. Validation of a Gonio-Hyperspectral Imaging System Based on Light-Emitting Diodes for the Spectral and Colorimetric Analysis of Automotive Coatings. Appl. Opt. 2017, 56, 7194. [Google Scholar] [CrossRef]
  92. Gravesen, J. The Metric of Colour Space. Graph. Models 2015, 82, 77–86. [Google Scholar] [CrossRef]
  93. Ellis, P.D. The Essential Guide to Effect Sizes; Cambridge University Press: Cambridge, UK, 2010; ISBN 9780521194235. [Google Scholar]
  94. Fritz, C.O.; Morris, P.E.; Richler, J.J. Effect Size Estimates: Current Use, Calculations, and Interpretation. J. Exp. Psychol. Gen. 2012, 141, 2–18. [Google Scholar] [CrossRef]
Figure 1. (a) Experimental setup used for the acquisition and analysis of hyperspectral and RGB images. (b) Real captured scene illustrating the arrangement of 35 fabrics and two Spectralon references in each image, used to verify the stability and homogeneity of sunlight. (c) Average solar spectra obtained from the longitudinal (black) and transverse (red) Spectralon references. The error bars indicate the intensity variations across the analyzed area.
Figure 2. Flowchart of the Python tool (v. 3.12) developed for reading, processing, segmenting, and transforming hyperspectral and RGB images into the CIE-L*a*b* color space. The images illustrate the step-by-step transformation applied to data as part of the computational algorithm.
Figure 3. Reflectance spectra of 16 representative Pantone® fabric samples (TCX) in the visible range (400–750 nm), separated into light (solid lines) and dark (dashed lines) colors in the chromatic regions of (a) red-yellow, (b) yellow-green, (c) green-blue and (d) blue-purple. The detection bands R (red), G (green) and B (blue) are included (vertical dashed lines) as a reference to correlate the spectral fingerprint with the corresponding RGB data.
Figure 4. Scatter plots of the chromatic properties (a) lightness L*, (b) a*, (c) b*, (d) chroma, (e) hue and (f) saturation, obtained from hyperspectral and RGB images in the sRGB (black) and REC2020 (red) representations. The dashed blue line is included in each case as a reference to observe the region of better agreement between HSI and RGB data. The icons in the saturation plot are separated by quadrant: squares (Q1), circles (Q2), triangles (Q3) and stars (Q4).
Figure 5. (a) Scatter plot of the Y-stimulus obtained from HSI images and the absolute color difference (ΔE) between HSI and RGB data, for the 700 Pantone® fabric samples (TCX) in the sRGB (black) and REC2020 (red) representations. The icons are separated by quadrant: squares (Q1), circles (Q2), triangles (Q3) and stars (Q4). The coordinates of six samples with different Y-stimulus values are displayed in (b) CIE-xy chromaticity and (c) CIE-xyY diagrams, which illustrate the spectral combinations that can produce the colors perceivable by the human eye. Circles indicate the samples outside the sRGB volume.
Table 1. Statistical analysis of the chromatic properties obtained using hyperspectral images, separated by quadrant (Q), in terms of L* (lightness), a*, b*, C*ab (chroma), hab (hue) and S (saturation) for the 700 Pantone® fabric samples (TCX).
| Quadrant (N) | Statistic | L* | a* | b* | C*ab | hab | S |
|---|---|---|---|---|---|---|---|
| Q1 (271) | x̄ | 59.11 | 24.58 | 18.56 | 34.00 | 39.20 | 45.78 |
| | x_med | 55.72 | 16.62 | 11.85 | 27.61 | 33.93 | 46.59 |
| | x_min | 27.02 | 0.02 | 0.08 | 1.93 | 0.22 | 3.63 |
| | x_max | 93.15 | 59.51 | 86.45 | 86.55 | 89.85 | 80.60 |
| | P̄ | 1.35 | 2.81 | 2.02 | 2.38 | 1.92 | 2.37 |
| Q2 (168) | x̄ | 65.41 | −15.44 | 21.35 | 29.54 | 129.06 | 38.86 |
| | x_med | 65.88 | −13.18 | 15.39 | 26.67 | 129.85 | 38.48 |
| | x_min | 31.07 | −50.69 | 0.31 | 3.50 | 90.14 | 5.20 |
| | x_max | 93.73 | −0.02 | 81.20 | 81.35 | 179.39 | 68.36 |
| | P̄ | 1.96 | 1.10 | 1.45 | 1.35 | 2.06 | 2.85 |
| Q3 (126) | x̄ | 56.70 | −19.39 | −19.73 | 31.27 | 226.67 | 48.52 |
| | x_med | 55.03 | −18.42 | −21.61 | 32.27 | 227.20 | 50.22 |
| | x_min | 30.60 | −46.16 | −36.13 | 3.60 | 180.04 | 11.69 |
| | x_max | 83.73 | −0.27 | −0.03 | 46.16 | 269.52 | 67.50 |
| | P̄ | 2.72 | 0.78 | 2.81 | 1.82 | 2.23 | 2.39 |
| Q4 (135) | x̄ | 49.79 | 22.18 | −12.28 | 28.08 | 325.64 | 47.53 |
| | x_med | 45.34 | 19.11 | −12.38 | 29.31 | 333.50 | 52.19 |
| | x_min | 23.79 | 0.79 | −34.89 | 2.44 | 271.74 | 6.23 |
| | x_max | 95.72 | 56.16 | −0.10 | 56.20 | 359.23 | 76.09 |
| | P̄ | 1.96 | 2.51 | 1.86 | 2.75 | 2.20 | 2.55 |

N: number of samples; x̄: average value; x_med: median; x_min: minimum; x_max: maximum; P̄: average difference between HSI and Pantone® CIE-L*a*b* properties.
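The polar quantities in Table 1 follow directly from the Cartesian a*, b* coordinates: chroma is the radial distance and hue is the angle, with the quadrants Q1–Q4 corresponding to 90° sectors of the hue angle. A minimal Python sketch of these standard relations (the sample coordinates below are the Q1 averages from the table; note that the mean of per-sample chroma reported there differs from the chroma of the mean a*, b*):

```python
import math

def chroma_hue(a, b):
    """Chroma C*ab and hue angle hab (degrees, in [0, 360)) from CIE a*, b*."""
    c_ab = math.hypot(a, b)                        # C*ab = sqrt(a*^2 + b*^2)
    h_ab = math.degrees(math.atan2(b, a)) % 360.0  # hab measured from +a* axis
    return c_ab, h_ab

def quadrant(h_ab):
    """Map a hue angle to the 90-degree sectors used in Table 1 (Q1..Q4)."""
    return f"Q{int(h_ab // 90) + 1}"

# Average a*, b* coordinates of quadrant Q1 from Table 1
c, h = chroma_hue(24.58, 18.56)
print(round(c, 2), round(h, 1), quadrant(h))
```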
Table 2. Accuracy of RGB images in reproducing the chromatic properties L* (lightness), a*, b*, C*ab (chroma), hab (hue), and S (saturation), evaluated using the ratio of performance to deviation (RPD). The information is separated by quadrant (Q) and L* interval for the sRGB and REC2020 representations.
| Samples | Representation | L* | a* | b* | C*ab | hab | S |
|---|---|---|---|---|---|---|---|
| Q1 | sRGB | 4.67 | 3.07 | 1.65 | 2.15 | 1.17 | 2.42 |
| | REC2020 | 4.41 | 3.32 | 1.64 | 2.13 | 0.85 | 2.50 |
| Q2 | sRGB | 3.22 | 1.94 | 4.36 | 3.18 | 2.45 | 2.31 |
| | REC2020 | 3.23 | 1.83 | 3.57 | 2.80 | 2.31 | 2.17 |
| Q3 | sRGB | 1.75 | 1.01 | 0.96 | 0.83 | 1.20 | 0.93 |
| | REC2020 | 1.60 | 1.46 | 0.87 | 0.84 | 1.41 | 0.87 |
| Q4 | sRGB | 2.53 | 1.34 | 1.64 | 1.17 | 0.37 | 1.30 |
| | REC2020 | 2.41 | 1.44 | 1.75 | 1.20 | 0.74 | 1.28 |
| L* ≤ 50 | sRGB | 0.87 | 2.56 | 1.89 | 1.66 | 2.07 | 1.43 |
| | REC2020 | 0.83 | 2.77 | 2.08 | 1.74 | 2.29 | 1.44 |
| L* > 50 | sRGB | 3.48 | 3.38 | 2.85 | 2.08 | 3.61 | 2.44 |
| | REC2020 | 3.28 | 3.91 | 2.57 | 1.97 | 3.14 | 2.39 |

All entries are RPD values for the CIE-L*a*b* color properties.
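The RPD metric used in Table 2 is conventionally defined as the standard deviation of the reference data divided by the RMSE of the predictions, so RPD > 2.5 (the threshold quoted in the abstract) means the prediction error is small relative to the natural spread of the data. A minimal sketch with illustrative, not measured, values:

```python
import math
import statistics

def rpd(reference, predicted):
    """Ratio of performance to deviation: SD of the reference (e.g. HSI)
    values divided by the RMSE of the predicted (e.g. RGB) values."""
    sd = statistics.stdev(reference)
    rmse = math.sqrt(
        sum((p - r) ** 2 for p, r in zip(predicted, reference)) / len(reference)
    )
    return sd / rmse

# Hypothetical data: predictions within ~1 unit of a reference spanning
# 40 units give a high RPD (SD = 15.81, RMSE = 1.0).
ref = [10.0, 20.0, 30.0, 40.0, 50.0]
pred = [11.0, 19.0, 31.0, 41.0, 49.0]
print(round(rpd(ref, pred), 2))  # 15.81
```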
Table 3. Average absolute color differences (Ē_a) obtained by comparing HSI and RGB data in the sRGB and REC2020 representations. The information is separated by quadrant (Q) and L* interval for the 700 Pantone® fabric samples (TCX). The standard deviation (σ) is included to show the precision of the results.
| Samples | HSI vs. sRGB: Ē_a ± σ | HSI vs. REC2020: Ē_a ± σ |
|---|---|---|
| Q1 | 5.68 ± 2.03 | 5.57 ± 2.03 |
| Q2 | 5.57 ± 2.51 | 5.71 ± 2.42 |
| Q3 | 9.69 ± 2.33 | 10.52 ± 2.28 |
| Q4 | 6.80 ± 1.45 | 6.74 ± 1.66 |
| L* < 50 | 7.54 ± 1.73 | 7.43 ± 1.59 |
| L* > 50 | 5.72 ± 2.67 | 5.68 ± 2.51 |
| L* > 75 | 3.91 ± 1.77 | 4.03 ± 1.86 |
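Assuming the Euclidean CIE76 definition, the per-sample color difference behind the averages in Table 3 is simply the distance between two CIE-L*a*b* coordinates. A minimal sketch (the two patch readings are hypothetical, not from the dataset):

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two
    CIE-L*a*b* triples (L*, a*, b*)."""
    return math.dist(lab1, lab2)

# Hypothetical HSI and RGB readings of the same fabric patch
hsi = (55.0, 20.0, 10.0)
rgb = (53.0, 22.0, 9.0)
print(delta_e76(hsi, rgb))  # 3.0, below the ~5.5-unit averages in Table 3
```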
Table 4. Summary of the Two One-Sided Test analyses of the relative color difference (E_r) calculated by comparing HSI and RGB data. The information is separated by L* interval for the 700 Pantone® fabric samples (TCX).
| Quantity | Representation | L* < 50 | L* > 50 | L* > 75 |
|---|---|---|---|---|
| Ē_r (HSI) | — | 27.72 | 16.15 | 6.73 |
| Ē_r (RGB) | sRGB | 31.69 | 17.89 | 7.60 |
| | REC2020 | 32.13 | 18.55 | 7.82 |
| d̄ | sRGB | 2.48 | 1.44 | 0.29 |
| | REC2020 | 2.66 | 1.95 | 0.37 |
| σ_d | sRGB | 3.27 | 2.04 | 0.73 |
| | REC2020 | 3.41 | 2.11 | 0.84 |
| CI 95% | sRGB | [2.07, 2.89] | [1.24, 1.63] | [0.17, 0.41] |
| | REC2020 | [2.23, 2.98] | [1.74, 1.83] | [0.23, 0.51] |
| p-value (paired t) | sRGB | <<0.05 | <<0.05 | <<0.05 |
| | REC2020 | <<0.05 | <<0.05 | <<0.05 |
| Effect size | sRGB | 0.75 | 0.70 | 0.40 |
| | REC2020 | 0.77 | 0.72 | 0.44 |
| Conclusion | sRGB, REC2020 | Relative color differences equivalent to HSI | | |

Ē_r: average relative color difference; d̄: average difference in the HSI vs. RGB comparison; σ_d: standard deviation of d̄; CI 95%: 95% confidence interval.
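The equivalence conclusion in Table 4 rests on the TOST logic: two one-sided paired tests against a pre-set equivalence margin, with equivalence claimed when both null hypotheses (mean difference beyond the margin on either side) are rejected. A minimal sketch under a large-sample normal approximation, which is reasonable at the study's sample sizes; the margin of 3.0 units and the toy data are illustrative assumptions, not the study's values:

```python
import math
import statistics
from statistics import NormalDist

def tost_paired(diffs, margin, alpha=0.05):
    """Two One-Sided Tests on paired differences, using a normal
    approximation for the test statistics. Returns True when both
    one-sided p-values fall below alpha, i.e. the mean difference
    is statistically inside (-margin, +margin)."""
    n = len(diffs)
    d_bar = statistics.mean(diffs)
    se = statistics.stdev(diffs) / math.sqrt(n)
    z = NormalDist()
    p_low = 1.0 - z.cdf((d_bar + margin) / se)  # H0: mean diff <= -margin
    p_high = z.cdf((d_bar - margin) / se)       # H0: mean diff >= +margin
    return max(p_low, p_high) < alpha           # True => equivalent

# Toy paired differences centred near 0.3 with a margin of 3.0:
# well inside the margin, mirroring the d-bar values in Table 4.
diffs = [0.2, 0.4, 0.3, 0.25, 0.35] * 20
print(tost_paired(diffs, margin=3.0))  # True
```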