Color and Texture Analysis of Textiles Using Image Acquisition and Spectral Analysis in Calibrated Sphere Imaging System-II

The application of device-dependent vision systems is growing exponentially, but these systems face challenges in precisely imitating the human perception models established by the device-independent systems of the Commission Internationale de l'Éclairage (CIE). We previously discussed the theoretical treatment and experimental validation of a calibrated integrating sphere imaging system developed to imitate the visible spectroscopy environment. An RGB polynomial function was derived to obtain a meaningful interpretation of color features. In this study, we dyed three different types of textured materials in the same bath with a yellow reactive dye at incremental concentrations to examine their color difference profiles. Three typical cotton textures were then dyed with three ultra-RGB Remozol reactive dyes and their combinations. Concentrations of 1%, 2%, 3%, and 4% were chosen for each dye, followed by their binary and ternary mixtures. The aim was to verify the fundamental spectral feature mapping in various imaging color spaces and spectral domains. The findings are quite interesting and help us understand the ground truth behind working in the two domains. In addition, the trends of color mixing, CIE color difference, CIExy (chromaticity) color gamut, and RGB gamut, and their distinguishing features, were verified. Human perception accuracy was also compared in both domains to clarify the influence of texture. These fundamental experiments and observations on human perception and calibrated imaging color space could clarify the expected precision in both domains.


Introduction
We discussed the significance of this study, the prior art, and the theoretical treatment of imaging from an integrating sphere in our previous paper [1]. We experimented with textile texture and color by varying these factors in a controlled manner (red, blue, yellow, and cyan dyes at various concentrations: 0.25%, 0.5%, 0.75%, 1%, 1.5%, 2%, 3%, 4%, 5%, and 6%). In addition, a simple calibration technique was proposed and validated, describing how unique digital color signatures can be derived from calibrated RGB to extract the best features for color and texture. This alter ego of the reflectance function, missing in the imaging domain, was experimentally validated for visualization, identification, and qualitative and quantitative color-texture analysis [1].
The present investigation aims to conduct a qualitative and quantitative analysis of color perception in terms of DERGB and DE precision using our proposed method, along with a study of color combinations. Further, we studied various RGB spaces, critically represented with varied texture and color combinations.
Many applications, such as spectral measurement, image processing, and human vision, require congruent and precise results in real-world problems. Complex operations, such as calibration protocols, device profiling, illumination uniformity, viewing geometry, and device characterization, are used to bridge the gap between device-dependent and device-independent color transformations. In most cases, the characterization or prediction models are prone to theoretical assumption errors [1-3] as well as practical imperfections or limitations [4,5]. The illumination source's properties and its uniformity over the material are likely the main offenders and play a crucial role in the precise estimation of color and texture qualities [6,7].
Most of the challenges were discussed in our previous article, but it is worth mentioning some conclusive inferences from a few current researchers. Nie et al. (2023) [8] reported serious interference from specular reflections in endoscopic images, which contributes substantially to errors in computer vision algorithms. Abdulateef and Hasoon (2023) [9] studied the limitations of image analysis, which essentially needs a clear, bright, shadow-free RGB image to obtain accurate results. As stated by Lin and Finlayson (2023) [10], "Surprisingly, we show that all compared algorithms-regardless of their model complexity-degrade to broadly the same level of performance." In fact, device-dependent RGB color spaces were knowingly evolved with a compressed version of the CIE gamut and widely encouraged as a tool for easy communication, real-time application, business, and so on, with the growth of computers, phones, and other digital media [1,3]. Today, it remains a major challenge to train AI systems and advanced algorithms accurately with prior domain knowledge for better identification, classification, and prediction of subjects or materials of interest. Nature has many colors, and how they interact physically with our color perception may be beyond our knowledge. In physical terms, reflection, transmission, absorption, and scattering within the visible range of light involve numerous materials, such as dyes, pigments, and biomaterials, across varied applications.
Materials show unique light reflectance and absorption properties over the visible range. If a substrate is dyed, we can see a family of curves as the dye concentration increases (e.g., Figure 1). Interestingly, textile materials are quite suitable for experimenting with varied textures and dyes. We dyed three different types of textured materials with a yellow reactive dye at incremental concentrations in the same bath. Dyeing experiments with three ultra-RGB Remozol reactive dyes and their combinations were then carried out on the same substrate over four concentration ranges (single, binary, and ternary mixtures) to validate the fundamental spectral feature mapping in both domains and in varied imaging color spaces. The motivation behind these ground truth experiments was to analyze the critical issue of human perception and computer vision: the device-independent human perception CIE model and current progress in digital image processing, with a simplified explanation.

Materials and Methods
In our previous study, we discussed the development of a novel integrating sphere imaging system with a theoretical explanation. We experimented with textile properties by varying the texture in a controlled way and coloring the samples with red, blue, yellow, and cyan dyes at various concentrations (0.25%, 0.5%, 0.75%, 1%, 1.5%, 2%, 3%, 4%, 5%, and 6%). We derived calibrated RGB polynomials and compared them with spectral measurement profiles. Procedures to precisely define the qualitative and quantitative influences of color and texture, as well as color prediction capabilities, were also planned and investigated. The present investigation aims to conduct a qualitative and quantitative analysis of the precision of our proposed method, along with an analysis of color combinations. Three types of bleached cotton fabrics (plain, twill, and modified twill weaves) were initially dyed in the same dye bath with a yellow reactive dye (Levafix Brilliant Yellow 4GL; vinyl sulphone class) at incremental concentrations of 0.25%, 0.5%, 0.75%, 1%, 1.5%, 2%, 2.5%, 3%, 4%, and 5% (10 shades for each of the three textures). Further dyeing was carried out with three ultra-RGB Remozol reactive dyes and their combinations (single, binary, and ternary mixtures) on the same substrate at four concentrations: 1%, 2%, 3%, and 4% (Ultra-RGB Carmen (Dye A), Navy Blue (Dye B), and Red (Dye C)). The dyes were provided by Dystar, Hong Kong. A Minolta 2600D spectrophotometer was used to measure reflectance (360 to 740 nm at 10 nm intervals), XYZ, CIE L*a*b*, and color difference (DE). Images of plain, twill, and modified twill samples of the same percentage shade were taken together (Figure 2) and, after calibrating the diffused imaging system with a white tile, their average RGB values were computed in MATLAB. All the detailed measurement values are provided as a Supplementary File.
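The white-tile calibration and average-RGB computation described above were done in MATLAB; the following is a minimal Python sketch of the same two steps. The array shapes, the 0-255 scaling, and the per-channel gain model are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def calibrate_rgb(sample_img, white_img):
    """Per-channel gain correction: scale so the white tile maps to 255.
    Both inputs are H x W x 3 float arrays (illustrative model)."""
    gain = 255.0 / white_img.reshape(-1, 3).mean(axis=0)  # gain from white-tile average
    return np.clip(sample_img * gain, 0.0, 255.0)

def average_rgb(img):
    """Average R, G, B over all pixels of the (calibrated) sample region."""
    return img.reshape(-1, 3).mean(axis=0)

# Illustrative stand-ins for captured images (4 x 4 x 3)
white = np.full((4, 4, 3), 240.0)
sample = np.full((4, 4, 3), 120.0)
r, g, b = average_rgb(calibrate_rgb(sample, white))
```

With these synthetic arrays the gain is 255/240, so each calibrated sample channel averages 127.5; in practice the white tile and fabric images would come from the sphere system.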

Three Types of Textures with Incremental Yellow Color Variations
Textile samples with three different textures were dyed in the same bath so that color uptake could be verified in both domains and a quantitative and qualitative analysis of color and the color difference between textiles could be conducted effectively. The integrating sphere imaging system was initially calibrated with the white plate, and all three kinds of textures dyed in the same bath were measured together (Figure 2) to ensure clarity in the determination of RGB values, denoted as Canon D450 (Camera) RGB.
Figure 3 shows the sample images, and Table 1 shows the experimental RGB values and the color differences DE (root-mean-square difference of CIE L*a*b*) and DERGB (root-mean-square difference of R, G, and B), using the first plain-weave sample Y42 as a reference.
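Both difference measures above are Euclidean distances in their respective spaces (DE in the classic CIE76 sense). A minimal sketch, with illustrative values rather than the measured data:

```python
import math

def delta_e(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two L*a*b* triples."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

def delta_e_rgb(rgb1, rgb2):
    """DERGB: the analogous Euclidean distance between calibrated RGB triples."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(rgb1, rgb2)))

# Illustrative values only
de = delta_e((60.0, 5.0, 40.0), (58.0, 6.0, 37.0))          # sqrt(4 + 1 + 9)
drgb = delta_e_rgb((100.0, 150.0, 200.0), (103.0, 154.0, 200.0))  # sqrt(9 + 16)
```

Because the calibrated RGB channels span a wider numeric range than L*a*b*, DERGB values are typically larger than DE for the same pair, which is consistent with the greater discrimination reported below.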

In both domains, their spectral and derived imaging RGB polynomials are pronounced. A visual difference is observed in the higher wavelength ranges of red and green (1-r, 2-g, and so on in the RGB polynomial graph in Figure 4).

Color Difference for Human Perception

From Figure 4 and Table 1, it can be observed that the textures can be visually distinguished into three families: plain, twill, and modified twill. The highest was modified twill (on top), followed by twill (middle), and the lowest was plain (below). The same trend can be seen in the CIE L* values in the spectral domain as well as in the intensity values ((R+G+B)/3) in the calibrated imaging domain in Figure 7. The color difference can be distinguished much better in terms of DERGB from the proposed system than in terms of DE in CIE L*a*b* (Figure 5). The DERGB and DE readings follow clear trends as the color concentration increases within each of the three textures. Further, we ranked all samples to compare perception in both domains in terms of DE and DERGB.

The experimental RGBs, color difference DE, and DERGB of the textures (Y42 as reference) are given in Table 1 and plotted in Figure 5. More discrimination is noticed in the case of DERGB compared with DE. From the color perception point of view, images were ranked in terms of measured DE and DERGB estimated from the images, using Y42's RGB as the reference. The congruent ranks are highlighted in italic with thick borders, and the flipped reading sets are in bold (color difference ranks that flip between DE and DERGB). It can be observed that these flipped sets (Y46, Y48), (Y53, Y55), and (Y65, Y70) have close DE values (differing by 0.022, 0.287, and 0.485, respectively).

If the color difference (DE) is less than 0.5, it can be perceived as the same color, and this is accepted for industrial applications as well [11,12]. On careful observation of this set of samples (shown in Figure 6), it can easily be noticed that the left-side samples Y46, Y53, and Y65 may be perceived as more deeply colored than the samples Y48, Y55, and Y70, respectively. In fact, texture clearly has a greater impact on imaging systems, whereas spectral reflection measurement was designed for color (reflection spectra in the visible wavelength range).

Critically, these color perception mismatches, or zones of confusion, are always debated, and the errors are profoundly significant. This is because, in most cases, the imaging domain RGB parameters are derived with many assumptions and a lack of calibration, illumination information, viewing geometry, and so on, while being transformed from or to CIE spaces. The reason for this is that human perception models only accept the CIE system, because standardization of an RGB color space is nearly impossible [3,13]. These theoretical errors have been reported for decades, and it is evident that the images obtained from this calibrated imaging system could be closer to our color perception and appearance under texture change. In addition, a color image with calibrated RGBs can be much more useful than spectral data for real-world applications. Complex algorithms for domain transfer from device-dependent to device-independent systems are progressively being used for various important applications, including medical imaging; however, these conversions may cause serious errors and outliers when generalized. Sciuto et al. (2017) [14] and Lo Sciuto et al. (2021) [15] reported improved network classifiers and feature extraction algorithms for better recognition of organic solar cell defects.

The typical RGB values of standard illuminants in different device-dependent RGB spaces are given in Table 2. These values can be computed and were available from the spectral calculator spreadsheet by Bruce Justin Lindbloom [16], which was used later for RGB calculations and visualizations in various RGB color spaces. The application of device-dependent systems has been growing exponentially for AI, cloud computing, virtual reality, and much more. Critically, device-independent systems like CIE L*a*b* are always associated with their illuminant and observer pairs. However, image processing researchers rarely consider this and incorrectly assume many parameters when computing RGB associations for empirical models with complex algorithms.
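The ranking comparison described above (ordering samples by DE and by DERGB, then flagging adjacent pairs whose order flips between the two rankings) can be sketched as follows; the sample values are illustrative stand-ins, not the measured data:

```python
def rank_order(values):
    """Return sample indices sorted by ascending color difference."""
    return sorted(range(len(values)), key=lambda i: values[i])

def flipped_pairs(de, de_rgb):
    """Adjacent pairs in the DE ranking whose order reverses under DERGB."""
    r_de, r_rgb = rank_order(de), rank_order(de_rgb)
    flips = []
    for a, b in zip(r_de, r_de[1:]):        # a precedes b in the DE ranking
        if r_rgb.index(a) > r_rgb.index(b):  # ...but follows b under DERGB
            flips.append((a, b))
    return flips

# Illustrative values: samples 1 and 2 are nearly tied in DE and swap under DERGB
de_vals = [0.0, 1.00, 1.02, 2.5]
de_rgb_vals = [0.0, 5.0, 4.0, 9.0]
flips = flipped_pairs(de_vals, de_rgb_vals)
```

As in the flipped sets (Y46, Y48), (Y53, Y55), and (Y65, Y70), swaps occur precisely where DE values are nearly tied, so the two metrics disagree only within perceptually ambiguous zones.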

Color Intensity for Quantitative Evaluation
The intensity values were computed as (R+G+B)/3 for the 10 incremental dye concentrations (Table 3). It can easily be noticed that the estimated intensity decreases as the dye concentration increases, which implies lower surface reflection with higher dye absorption. The relationship between color concentration and intensity was established through quantitative evaluation (Figure 7). Here, it is relevant to investigate whether the calibrated image intensity could imitate absorbance or dye uptake (K/S), i.e., whether the intensity can be estimated reasonably well when the concentration is known. The measured intensities of the three textures (30 in total) were plotted against the 10 incremental concentrations, as shown in Figure 7. A high coefficient of determination was observed in all cases (R² > 0.99), confirming good prediction capability from calibrated imaging.
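The fit behind Figure 7 can be reproduced in outline as follows (Python with NumPy; the intensity values here are synthetic stand-ins for the measured data in Table 3, generated with an assumed linear trend plus noise):

```python
import numpy as np

# The 10 incremental dye concentrations (% shade) used for the yellow series
conc = np.array([0.25, 0.5, 0.75, 1.0, 1.5, 2.0, 2.5, 3.0, 4.0, 5.0])

# Synthetic intensities: reflection (hence (R+G+B)/3) falls as dye uptake rises
intensity = 220.0 - 25.0 * conc + np.random.default_rng(0).normal(0, 0.5, conc.size)

# Least-squares line and coefficient of determination R^2
slope, intercept = np.polyfit(conc, intensity, 1)
pred = slope * conc + intercept
r2 = 1.0 - np.sum((intensity - pred) ** 2) / np.sum((intensity - intensity.mean()) ** 2)
```

For each texture the same fit is repeated on its own 10 intensities; an R² above 0.99, as reported, means concentration can be read back from calibrated intensity with little error.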

Color Combinations and Verification of Various Color Space and CIE Chromaticity Visualizations

Further, we conducted ground truth experiments with the three dyes using equal proportions of their primary, secondary, and ternary mixtures (Table 4 and Figure 8). Various color space RGB representations of the dye mixtures (BC at 1, 2, 3, and 4%) were computed [14] and are represented in Figure 9.
The reflectance measurement was conducted using spectrophotometry, and calibrated imaging RGBs were calculated in MATLAB. The spreadsheet by Bruce Justin Lindbloom was used to calculate various device-dependent RGB standards, and all raw data are provided in an Excel file. Typical binary combinations of 1, 2, 3, and 4% of Dye B and Dye C in three major color space RGB representations (Apple RGB, Adobe RGB, and ProPhoto RGB) are given in Figure 9. It is evident that they could be well represented in both qualitative and quantitative analyses (curves of the same family have similar spectral reflectance). In addition, we investigated the similarity of their representations in the CIE chromaticity (CIExy) diagram (Figure 10) and the 3D representation of the linearity of individual dyes and dye mixtures in both domains (Figure 11). The physical significance of this is that a two-dye mixture in a particular proportion will lie on a straight line until it becomes saturated. These experimental findings demonstrate that both qualitative and quantitative evaluations are possible in the calibrated digital domain and that they are comparable to spectral or device-independent systems. The calibrated red, green, blue, RG, RB, GB, and RGB polynomial expansion can be treated as an alter ego of spectral responses for any color space, including dye mixtures, for practical applications. The CIE XYZ and 3D RGB trends revealed the distinct dye and combination profiles.
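The CIExy projection behind Figure 10, and a check of the straight-line behavior of a mixture series, can be sketched as follows. The function names and tolerance are illustrative, and the collinearity test is a geometric check on the plotted points, not a colorimetric law:

```python
def xy_chromaticity(X, Y, Z):
    """Project CIE XYZ tristimulus values onto the xy chromaticity plane."""
    s = X + Y + Z
    return X / s, Y / s

def collinear(p, q, r, tol=1e-6):
    """True if three chromaticity points lie on (nearly) one straight line,
    judged by twice the area of the triangle they span."""
    (x1, y1), (x2, y2), (x3, y3) = p, q, r
    area2 = (x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)
    return abs(area2) < tol

x, y = xy_chromaticity(20.0, 30.0, 50.0)  # -> (0.2, 0.3)
on_line = collinear((0.2, 0.2), (0.3, 0.3), (0.4, 0.4))
```

Applying `collinear` to successive points of a two-dye mixture series is one way to quantify the "straight line until saturation" trend observed in both the CIExy and RGB gamut plots.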

Reflectance Prediction in Terms of Calibrated RGB Polynomial Regression
As we discussed earlier how the proposed RGB polynomial could potentially be used as an alter ego for the reflectance function, we investigated further how well the reflectance function (31 values over the visible wavelengths of 400-700 nm at 10 nm intervals) could be predicted. It is logical that the mixture of primary RGBs generates a wide gamut of colors.
Here we have taken these 27 dye mixture samples in a similar fashion to how a database is prepared for computer color matching prediction. It can be mathematically denoted as follows: R(λ) = f(RGB polynomial). We used 3, 8, 11, 20, and 23 argument coefficients here, and did not try more arguments as they caused more complexity and prediction errors in our earlier investigation [5,17,18]. The predicted reflectances were then converted into theoretical CIE L*a*b* values for various illuminant-observer pairs to calculate the predicted DE. All the experimental data (actual and predicted reflectance, coefficients, and DE) for the various illuminant-observer pairs are provided as a Supplementary File.
Figure 12 clearly shows that the reflectance function is well predicted from the 23-argument polynomial RGB. The arguments for the polynomial RGB function are denoted below. Table 5 summarizes the color difference results for the five RGB polynomial argument sets, derived from the predicted reflectance values. In Figure 12a, the experimental reflectance is plotted as a continuous line, and the predicted reflectance of the 23-argument model is marked with *.
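The regression step can be outlined as below. The exact 3-, 8-, 11-, 20-, and 23-argument term sets are given in the Supplementary File, so the 11-term expansion used here is an illustrative assumption, and the data are synthetic:

```python
import numpy as np

def poly_args(rgb):
    """An assumed 11-term RGB polynomial expansion:
    1, R, G, B, RG, RB, GB, R^2, G^2, B^2, RGB (RGB scaled to [0, 1])."""
    r, g, b = rgb
    return np.array([1.0, r, g, b, r*g, r*b, g*b, r*r, g*g, b*b, r*g*b])

def fit_reflectance(rgbs, reflectances):
    """Least-squares coefficient matrix mapping polynomial terms to 31 %R values."""
    A = np.vstack([poly_args(c) for c in rgbs])              # (n_samples, 11)
    coeffs, *_ = np.linalg.lstsq(A, reflectances, rcond=None)
    return coeffs                                            # (11, 31)

def predict_reflectance(coeffs, rgb):
    """Reconstruct a 31-point reflectance curve from one calibrated RGB triple."""
    return poly_args(rgb) @ coeffs

# Synthetic check with 27 samples, as in the dye-mixture set
rng = np.random.default_rng(1)
rgbs = rng.uniform(0.0, 1.0, (27, 3))
true_coeffs = rng.normal(size=(11, 31))
refl = np.vstack([poly_args(c) for c in rgbs]) @ true_coeffs
fitted = fit_reflectance(rgbs, refl)
```

The predicted 31-point curves would then be converted to CIE L*a*b* under each illuminant-observer pair to obtain the predicted DE values summarized in Table 5.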

Conclusions
In fact, exact spectral reconstruction is even more difficult to achieve. These challenges can be easily understood if we revisit the development of the CIE systems themselves, with defined illuminants, observer functions (cleverly designed with real inputs from expert human observers), and viewing geometry set-ups. For example, we would need to predict 31 values (%R from 400 to 700 nm at 10 nm intervals) from three colorimetric readings of CIE L*, a*, and b*. The systems required to do this are well established and utilized for specific domain applications today, unlike RGB systems, which have evolved for producing pleasant images with little formal know-how.
In current practice, the transformation of device-dependent imaging parameters into device-independent CIE parameters is mandatory, as human perception models only accept the CIE system. Before pre- and post-processing of color images in a compressed digital color space, or before mapping specific RGBs to a specific spectrophotometric or colorimetric reading under a certain illuminant and observer, domain knowledge is a definite prerequisite. Specifically, the color and appearance perception mismatches, or zones of confusion, are being critically debated in real-world applications. In fact, humans can differentiate more colors than the CIE systems encode. The fundamental cause of these errors becomes profoundly significant in critical decision-making applications when imaging domain parameters derived with many assumptions and a lack of calibration, illumination information, uniformity, and so on are transformed to CIE spaces.
Previously, we proposed an alternative reflectance function obtained from the calibrated sphere imaging system, with a theoretical explanation, to analyze and validate the close relationship of texture and color. Here, with three different textures and incremental color depths of three color combinations, we experimentally validated the qualitative analysis of color and texture in both domains. The concentration of a particular color can be estimated from the calibrated image intensity. The human perception of color differences and its ambiguity are explained. The color difference in terms of DERGB is perceived well, and texture has a large influence on it. The reflectance predictions from polynomial regression RGB models were found to be reasonably accurate. The various RGB spaces and CIExyz representations of color combinations were found to be congruent; finally, precision can be ensured if an image is well calibrated under diffuse and uniform illumination constancy.

Figure 1. A typical % reflectance and K/S profile of ultra orange RGB dye.

Figure 2. Measurement of three textures together in a calibrated integrated sphere imaging system.

Figure 3. Incremental yellow dyed samples for three textures.

Figure 4. Spectral and RGB polynomial representations of all 30 yellow samples.

Figure 5. DERGB and DE of three textures, dyed with yellow in the same bath.

Figure 6. Image ranking according to the color difference DE and DERGB for all 30 samples.

Figure 8. Images of the dyed samples; three dyes: primary, secondary, and ternary mixtures.

Figure 11. 3D plots of CIExyz and RGBs of all 27 dyed samples.

Figure 12. (a) Reflectance functions of 27 samples predicted from a 23-argument RGB polynomial. It is noteworthy that these reflectance models would be more accurate where the same substrate and dye combinations are used, and it is highly probable that their predictions would ensure nonmetameric matches for all sets of ASTM illuminant-observer pairs. The experimental and predicted CIE L*a*b* values are plotted in (b-d): (b) CIE L* vs. Lp*, (c) CIE a* vs. ap*, and (d) CIE b* vs. bp*.

Table 1. Experimental RGBs, color difference DE, and DERGB of three textures (Y42 as reference).

Table 2. Typical RGB values of standard illuminants in different device-dependent RGB spaces.

Table 3. Intensity of all three textured samples with respect to 10 incremental concentration ranges.

Table 4. Experimental RGBs of dyed samples; three dyes: primary, secondary, and ternary mixtures.

Table 5. DE from predicted reflectance for D65_64.
