Designing Quadcolor Cameras with Conventional RGB Channels to Improve the Accuracy of Spectral Reflectance and Chromaticity Estimation
Abstract
1. Introduction
2. Materials and Assessment Metrics
2.1. Camera Spectral Sensitivities
2.2. Color Samples
2.3. Assessment Metrics
3. Methods
3.1. Color Filter Spectral Transmittance
3.2. Optimization Cost Function
3.3. II-LUT Method
- STEP 1: Convert the signal vector C of the test sample into the normalized signal vector c.
- STEP 2: Locate the tetrahedron enclosing vector c in the normalized signal space.
- STEP 3: Solve the coefficients βk, k = 1, 2, 3, and 4, from Equations (12) and (13).
- STEP 4: Calculate the coefficients ηk, k = 1, 2, 3, and 4, according to Equation (15).
- STEP 5: Calculate the reconstruction spectrum SRec according to Equation (14).
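The five steps above amount to a barycentric interpolation in the normalized signal space. Below is a minimal Python sketch of that reading; since Equations (12)–(15) are not reproduced in this excerpt, the sum-normalization in STEP 1, the Delaunay tessellation in STEP 2, and the unit-sum rescaling of ηk in STEP 4 are assumptions made for illustration, not the authors' exact formulation.

```python
import numpy as np
from scipy.spatial import Delaunay

def ii_lut_reconstruct(C_test, C_train, S_train):
    """Hedged sketch of STEPs 1-5 of the II-LUT method.

    C_test  : (4,)   quadcolor signals of the test sample
    C_train : (N, 4) quadcolor signals of the reference (LUT) samples
    S_train : (N, W) spectral reflectances of the reference samples
    """
    # STEP 1: normalize the signal vectors so they are irradiance independent
    # (assumed here: division by the component sum, leaving three free coordinates).
    s_test = C_test.sum()
    s_train = C_train.sum(axis=1)
    c_test = C_test / s_test
    c_train = C_train / s_train[:, None]

    # STEP 2: locate the tetrahedron enclosing c_test in normalized signal space
    # (assumed here: Delaunay tessellation of the reference points).
    tri = Delaunay(c_train[:, :3])
    simplex = int(tri.find_simplex(c_test[:3]))
    if simplex < 0:
        raise ValueError("test signal lies outside the reference tessellation")
    vertex_ids = tri.simplices[simplex]

    # STEP 3: barycentric coefficients beta_k of c_test in that tetrahedron
    # (the role of Equations (12) and (13)).
    T = tri.transform[simplex]
    b = T[:3] @ (c_test[:3] - T[3])
    beta = np.append(b, 1.0 - b.sum())

    # STEP 4: convert beta_k to eta_k (the role of Equation (15)). Linearity of
    # the camera model suggests eta_k proportional to beta_k / sum(C_k); fixing
    # the overall scale so the coefficients sum to one is an assumption made so
    # that a reference sample reconstructs to itself.
    eta = beta / s_train[vertex_ids]
    eta /= eta.sum()

    # STEP 5: reconstructed spectrum as a weighted sum of reference spectra
    # (the role of Equation (14)).
    return eta @ S_train[vertex_ids]
```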
4. Results and Discussion
4.1. The Use of Band-Pass Optical Filter
4.2. The Use of Notch Optical Filter
4.3. Recovered Spectrum Examples
5. Conclusions
- (1) The spectral sensitivity of the 4th channel with a peak wavelength between 500 nm and 550 nm reduces the color difference, owing to the improved fit of the camera spectral sensitivities to the color matching functions (CMFs). Compared with the case without a filter, the mean color difference ΔE00 is reduced from 0.3062 to 0.1096 when such a filter is used, but the mean spectral reflectance error ERef increases from 0.00928 to 0.01047.
- (2) The spectral sensitivity of the 4th channel with a peak wavelength of around 685 nm reduces the spectral reflectance error, because the sensitivities of the RGB channels are small in this wavelength region, but it degrades the fit to the CMFs. Compared with the case without a filter, the mean ERef and mean ΔE00 are reduced to 0.00867 and 0.3012, respectively, when such a filter is used.
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| λSGF (nm) | D5100 | RGBF | 400 | 400 | 480 | 480 | 680 | 680 | 700 | 700 |
|---|---|---|---|---|---|---|---|---|---|---|
| γ | NA | NA | 10^−3 | 1 | 10^−3 | 1 | 10^−3 | 1 | 10^−3 | 1 |
| ΔλF (nm) | NA | NA | 327.8 | 301.6 | 179.2 | 155.8 | 56.8 | 173 | 50 | 223.8 |
| ΔλE (nm) | NA | NA | 99.9 | 98.0 | 78.5 | 83.6 | 50.0 | 20.0 | 35.8 | 20.0 |
| CMFMisF | 0.1498 | 0.1495 | 0.1277 | 0.0973 | 0.1336 | 0.0986 | 0.1496 | 0.1491 | 0.1498 | 0.1490 |
| mean ΔE00 | 0.4778 | 0.3062 | 0.1655 | 0.1096 | 0.1733 | 0.1107 | 0.3022 | 0.2898 | 0.3012 | 0.2898 |
| mean ERef | 0.01785 | 0.00928 | 0.00962 | 0.01047 | 0.00959 | 0.01057 | 0.00875 | 0.00894 | 0.00867 | 0.00893 |
| Metric | Filter | w/o | SGF | SGF | SGF | ISGF | ISGF | ISGF | ISGF |
|---|---|---|---|---|---|---|---|---|---|
| | λSGF/λISGF (nm) | NA | 480 | 480 | 400 | 600 | 600 | 600 | 590 |
| | γ | NA | 10^−3 | 1 | 1 | 10^−3 | 10^−1.65 | 1 | 10^−3 |
| ERef | mean μ | 0.00928 | 0.00959 | 0.01057 | 0.01047 | 0.00787 | 0.00792 | 0.01055 | 0.00780 |
| | std σ | 0.00961 | 0.00934 | 0.01238 | 0.01198 | 0.00763 | 0.00788 | 0.01155 | 0.00756 |
| | PC50 | 0.00603 | 0.00661 | 0.00667 | 0.00677 | 0.00538 | 0.00534 | 0.00714 | 0.00520 |
| | PC98 | 0.04290 | 0.03872 | 0.04569 | 0.04498 | 0.03248 | 0.03257 | 0.04563 | 0.02872 |
| | MAX | 0.07328 | 0.07939 | 0.12590 | 0.14719 | 0.06071 | 0.07282 | 0.10320 | 0.06686 |
| GFC | mean μ | 0.99880 | 0.99891 | 0.99859 | 0.99862 | 0.99918 | 0.99917 | 0.99873 | 0.99923 |
| | std σ | 0.00342 | 0.00243 | 0.00458 | 0.00450 | 0.00231 | 0.00243 | 0.00316 | 0.00218 |
| | PC50 | 0.99962 | 0.99959 | 0.99961 | 0.99960 | 0.99972 | 0.99972 | 0.99957 | 0.99973 |
| | MIN | 0.93870 | 0.96364 | 0.89529 | 0.89440 | 0.94917 | 0.95098 | 0.93413 | 0.95152 |
| | RGF99 | 0.98311 | 0.99062 | 0.98124 | 0.98311 | 0.99156 | 0.98874 | 0.98874 | 0.99156 |
| ΔE00 | mean μ | 0.30618 | 0.17333 | 0.11066 | 0.10960 | 0.18919 | 0.18143 | 0.13121 | 0.20859 |
| | std σ | 0.36686 | 0.14776 | 0.09288 | 0.09193 | 0.19811 | 0.19134 | 0.11139 | 0.22091 |
| | PC50 | 0.18992 | 0.12470 | 0.08618 | 0.08561 | 0.13567 | 0.13044 | 0.10220 | 0.14965 |
| | PC98 | 1.53876 | 0.60145 | 0.37249 | 0.37480 | 0.67183 | 0.62592 | 0.40659 | 0.83240 |
| | MAX | 3.38551 | 0.99401 | 0.76597 | 0.76625 | 2.24135 | 2.24168 | 1.32258 | 2.24832 |
| SCI | mean μ | 3.03986 | 2.26793 | 2.19007 | 2.18351 | 2.41083 | 2.37555 | 2.26130 | 2.47910 |
| | std σ | 3.35378 | 1.79260 | 1.81088 | 1.79095 | 2.23688 | 2.24981 | 1.99434 | 2.35225 |
| | PC50 | 1.99793 | 1.75189 | 1.63791 | 1.63828 | 1.72560 | 1.70528 | 1.64829 | 1.75844 |
| | PC98 | 16.37406 | 8.80518 | 7.84793 | 7.83212 | 8.24166 | 8.12446 | 8.06511 | 9.14487 |
| | MAX | 27.56020 | 13.56906 | 19.45063 | 18.72469 | 28.13400 | 29.57345 | 20.46814 | 25.54052 |
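For readers scanning the table: GFC is the standard goodness-of-fit coefficient between the reconstructed and measured spectra, ΔE00 is the CIEDE2000 color difference between their CIELAB coordinates, and PC50/PC98 presumably denote the 50th/98th percentiles over the test samples. A minimal sketch of the spectral metrics follows; since Section 2.3 is not reproduced in this excerpt, treating ERef as a root-mean-square reflectance difference is an assumption made here for illustration.

```python
import numpy as np

def gfc(s_rec, s_ref):
    """Goodness-of-fit coefficient (standard form; cf. the GFC rows above).
    A value of 1.0 means the two spectra are identical up to a scale factor."""
    return abs(np.dot(s_rec, s_ref)) / (np.linalg.norm(s_rec) * np.linalg.norm(s_ref))

def rms_reflectance_error(s_rec, s_ref):
    """Root-mean-square reflectance difference; assumed reading of ERef
    (the paper's exact definition is given in Section 2.3, not shown here)."""
    return float(np.sqrt(np.mean((s_rec - s_ref) ** 2)))

# ΔE00 would additionally require integrating each spectrum against the CMFs
# under the chosen illuminant, converting to CIELAB, and applying CIEDE2000.
```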
| λISGF (nm) | D5100 | RGBF | 600 | 600 | 600 | 600 | 590 | 590 | 590 | 590 |
|---|---|---|---|---|---|---|---|---|---|---|
| γ | NA | NA | 10^−3 | 10^−1.65 | 10^−1.2 | 1 | 10^−3 | 10^−1 | 10^−0.5 | 1 |
| ΔλFS (nm) | NA | NA | 171.6 | 158.6 | 164.8 | 399.2 | 165.4 | 336 | 380.2 | 361 |
| ΔλE (nm) | NA | NA | 53.1 | 24.7 | 26.0 | 68.9 | 20.7 | 46.0 | 51.0 | 47.5 |
| CMFMisF | 0.1498 | 0.1495 | 0.1355 | 0.1363 | 0.1333 | 0.0979 | 0.1455 | 0.1077 | 0.1111 | 0.1112 |
| mean ΔE00 | 0.4778 | 0.3062 | 0.1892 | 0.1814 | 0.1796 | 0.1312 | 0.2085 | 0.1351 | 0.1347 | 0.1346 |
| mean ERef | 0.01785 | 0.00928 | 0.00787 | 0.00792 | 0.00808 | 0.01055 | 0.00780 | 0.00997 | 0.01011 | 0.01013 |
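The filters in the tables above are characterized only by a peak or center wavelength (λSGF, λISGF) and a width (ΔλF, ΔλFS, ΔλE). As an illustrative sketch only: if, as the names suggest, the SGF is a single-Gaussian band-pass transmittance and the ISGF its inverse (notch) counterpart, the curves would take the form below; the paper's actual filter parameterization (Section 3.1) is not reproduced in this excerpt.

```python
import numpy as np

def sgf_transmittance(wl_nm, peak_nm, fwhm_nm):
    """Assumed single-Gaussian band-pass transmittance for the 4th channel."""
    sigma = fwhm_nm / (2.0 * np.sqrt(2.0 * np.log(2.0)))  # FWHM -> standard deviation
    return np.exp(-0.5 * ((wl_nm - peak_nm) / sigma) ** 2)

def isgf_transmittance(wl_nm, center_nm, fwhm_nm):
    """Assumed inverse (notch) counterpart: one minus the Gaussian."""
    return 1.0 - sgf_transmittance(wl_nm, center_nm, fwhm_nm)

# Example shapes on a 400-700 nm grid; the widths reuse ΔλF/ΔλFS values from the
# tables purely as plausible magnitudes, not as the optimized filter designs.
wl = np.arange(400.0, 701.0, 5.0)
t_bandpass = sgf_transmittance(wl, peak_nm=480.0, fwhm_nm=155.8)
t_notch = isgf_transmittance(wl, center_nm=600.0, fwhm_nm=158.6)
```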
| Metrics | ΔE00 | ΔE00 | ΔE00 | ΔE00 | ERef | ERef | ERef | ERef |
|---|---|---|---|---|---|---|---|---|
| Camera | RGBF | SGF 480 nm | ISGF 590 nm | ISGF 600 nm | RGBF | SGF 480 nm | ISGF 590 nm | ISGF 600 nm |
| (a) | 0.43872 | 0.09394 | 0.06724 | 0.04982 | 0.01841 | 0.01161 | 0.01025 | 0.00969 |
| (b) | 0.14554 | 0.06427 | 0.15826 | 0.17569 | 0.01926 | 0.01794 | 0.0204 | 0.02247 |
| (c) | 0.06335 | 0.07416 | 0.04467 | 0.03945 | 0.00561 | 0.06686 | 0.0060 | 0.00647 |
| (d) | 0.09678 | 0.09912 | 0.18562 | 0.17538 | 0.00304 | 0.00362 | 0.00505 | 0.00485 |
| (e) | 1.33057 | 0.26433 | 0.64571 | 0.59835 | 0.01300 | 0.00645 | 0.00778 | 0.00843 |
| (f) | 0.46338 | 0.17233 | 0.32332 | 0.16304 | 0.02361 | 0.02504 | 0.01378 | 0.01546 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).