# Comparison of Imaging Models for Spectral Unmixing in Oil Painting

## Abstract


## 1. Introduction

## 2. Materials and Methods

#### 2.1. Imaging Models

#### 2.2. Mockup Samples Realisation and Imaging Setup

- X: endmember, 100% of that pigment;
- XY: ratio $1:1$ between the two pigments;
- Xy: ratio $2:1$, with X the more concentrated;
- Xyz: ratio $2:1:1$, with X the most concentrated.
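The naming scheme above maps directly to concentration vectors. The following hypothetical helper (not part of the paper's code) makes the convention explicit: upper-case letters carry twice the weight of lower-case ones, and weights are normalised to sum to one.

```python
# Hypothetical helper: expand a mixture code such as "X", "XY", "Xy", or
# "Xyz" into normalised concentrations. Upper-case letters weigh twice as
# much as lower-case letters, matching the 1:1, 2:1, and 2:1:1 ratios above.
def code_to_concentrations(code):
    weights = [2.0 if ch.isupper() else 1.0 for ch in code]
    total = sum(weights)
    return {ch.upper(): w / total for ch, w in zip(code, weights)}
```

For example, `code_to_concentrations("Xyz")` yields 0.5 for X and 0.25 for each of Y and Z.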

A calibration target with a known wavelength-dependent reflectance factor was included in the scene. The target served to estimate the light source spectrum and to compute the reflectance at the pixel level. The HyspexRAD software was used to perform radiometric correction [48]. Flat-field correction was applied to correct the spatial non-uniformities of the illumination field. Due to noise present at both ends of the spectrum, the first 10 and the last 10 spectral bands were omitted from the data, leaving spectra with 166 data points. The reflectance factor of each patch was obtained by averaging over a manually cropped area. Post-processing of the hyperspectral images and the analysis presented in the following sections were conducted in MATLAB (The MathWorks Inc., Natick, MA, USA).
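The reflectance-factor pipeline described above can be sketched as follows. This is an illustrative reconstruction, not the paper's MATLAB code: radiometric and flat-field corrections are assumed already applied, and the names `radiance_cube`, `target_mask`, and `known_target_reflectance` are placeholders.

```python
import numpy as np

# Sketch of the reflectance-factor estimation: the illuminant spectrum is
# estimated from the calibration target, each pixel is divided by it, and
# the noisy first/last bands are trimmed.
def estimate_reflectance(radiance_cube, target_mask, known_target_reflectance,
                         trim=10):
    """radiance_cube: (rows, cols, bands); target_mask: boolean (rows, cols)."""
    # Mean radiance over the calibration target area.
    target_radiance = radiance_cube[target_mask].mean(axis=0)
    # Illuminant spectrum implied by the target's known reflectance factor.
    illuminant = target_radiance / known_target_reflectance
    # Pixel-wise reflectance factor.
    reflectance = radiance_cube / illuminant
    # Drop the first and last `trim` noisy bands (e.g. 186 -> 166 points).
    return reflectance[:, :, trim:-trim]
```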

#### 2.3. Spectral Unmixing Method

#### 2.4. Evaluation Protocol

**Model expectation test**. For each mixture sample, the models produce spectra using the ground-truth concentration vector. This test explores the forward performance of the models and their spectral accuracy.

**Prediction task**. In this inversion test, only the identity of the pigments included in a mixture is used. All models are inverted in a facilitated unmixing, as the spectral library is pruned to contain only those pigments. The concentrations are retrieved through the optimisation algorithm, and both the spectral reconstruction and the concentration vector are evaluated via the proposed score w. This task evaluates the ability of each model to retrieve accurate concentrations, given that it cannot select endmembers absent from the mixture, while at the same time maintaining a good degree of spectral reconstruction.

**Unmixing**. In this instance no prior information is used. All models undergo the task of retrieving the concentrations, starting from the measured spectrum and the full spectral library. For each mixture sample, the best model is chosen by selecting the lowest w score. With the task of full unmixing, it is possible to evaluate all the characteristics observed in the previous tasks, spectral accuracy and concentration accuracy, plus the ability to detect the correct pigments present in a mixture.
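The constrained inversion at the core of the prediction and unmixing tasks can be sketched as below. The paper optimises with a Nelder-Mead simplex [49] and a composite score w; this illustrative version instead uses SciPy's SLSQP, because it enforces the non-negativity (NC) and sum-to-one (SC) constraints directly, and uses plain MSE with the additive model M1 standing in for any of the seven forward models.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative sketch: retrieve the concentration vector alpha for a (pruned
# or full) endmember library E (bands x q) so that the forward model best
# reproduces the measured spectrum y, subject to NC and SC.
def unmix(y, E):
    q = E.shape[1]

    def mse(alpha):
        # Additive forward model (M1); any other model could be swapped in.
        return np.mean((E @ alpha - y) ** 2)

    res = minimize(mse, x0=np.full(q, 1.0 / q), method="SLSQP",
                   bounds=[(0.0, 1.0)] * q,                      # NC
                   constraints={"type": "eq",
                                "fun": lambda a: a.sum() - 1.0})  # SC
    return res.x
```

In the prediction task, E contains only the pigments known to be in the mixture; in full unmixing, E is the entire spectral library.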

## 3. Results

#### 3.1. Model Expectation Test

#### 3.2. Prediction

#### 3.3. Spectral Unmixing

#### 3.4. Pigment Identification

#### 3.5. Mockup Painting

## 4. Conclusions

## Supplementary Materials

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## Abbreviations

Abbreviation | Meaning |
---|---|
MSI | Multispectral Imaging |
HSI | Hyperspectral Imaging |
SU | Spectral Unmixing |
SC | Sum-to-one Constraint |
NC | Non-negativity Constraint |
CH | Cultural Heritage |
PM | Pigment Mapping |
BRDF | Bidirectional Reflectance Distribution Function |
LIP | Logarithmic Image Processing |
MSE | Mean Square Error |
ROC | Receiver Operating Characteristic |
FPR | False Positive Rate |
TPR | True Positive Rate |

## References

- Seelan, S.K.; Laguette, S.; Casady, G.M.; Seielstad, G.A. Remote sensing applications for precision agriculture: A learning community approach. Remote Sens. Environ. **2003**, 88, 157–169.
- Melillos, G.; Agapiou, A.; Themistocleous, K.; Michaelides, S.; Papadavid, G.; Hadjimitsis, D.G. Field spectroscopy for the detection of underground military structures. Eur. J. Remote Sens. **2019**, 52, 385–399.
- Sabins, F.F. Remote sensing for mineral exploration. Ore Geol. Rev. **1999**, 14, 157–183.
- Bioucas-Dias, J.M.; Plaza, A.; Dobigeon, N.; Parente, M.; Du, Q.; Gader, P.; Chanussot, J. Hyperspectral unmixing overview: Geometrical, statistical, and sparse regression-based approaches. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. **2012**, 5, 354–379.
- Stagakis, S.; Vanikiotis, T.; Sykioti, O. Estimating forest species abundance through linear unmixing of CHRIS/PROBA imagery. ISPRS J. Photogramm. Remote Sens. **2016**, 119, 79–89.
- Song, X.; Jiang, X.; Rui, X. Spectral unmixing using linear unmixing under spatial autocorrelation constraints. In Proceedings of the 2010 IEEE International Geoscience and Remote Sensing Symposium, Honolulu, HI, USA, 2010; pp. 975–978.
- Bioucas-Dias, J.M. A variable splitting augmented Lagrangian approach to linear spectral unmixing. In Proceedings of the 2009 First Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing, Grenoble, France, 26–28 August 2009; pp. 1–4.
- Malegori, C.; Grassi, S.; Marques, E.J.N.; de Freitas, S.T.; Casiraghi, E. Vitamin C distribution in acerola fruit by near infrared hyperspectral imaging. J. Spectr. Imaging **2016**, 5, 1–4.
- Che, W.; Sun, L.; Zhang, Q.; Tan, W.; Ye, D.; Zhang, D.; Liu, Y. Pixel based bruise region extraction of apple using Vis-NIR hyperspectral imaging. Comput. Electron. Agric. **2018**, 146, 12–21.
- Devassy, B.M.; George, S. Contactless Classification of Strawberry Using Hyperspectral Imaging. In Proceedings of the 10th Colour and Visual Computing Symposium, Gjøvik, Norway, 16–17 September 2020.
- Bratchenko, I.A.; Alonova, M.V.; Myakinin, O.O.; Moryatov, A.A.; Kozlov, S.V.; Zakharov, V.P. Hyperspectral visualization of skin pathologies in visible region. Comput. Opt. **2016**, 40, 240–248.
- Mishra, P.; Lohumi, S.; Khan, H.A.; Nordon, A. Close-range hyperspectral imaging of whole plants for digital phenotyping: Recent applications and illumination correction approaches. Comput. Electron. Agric. **2020**, 178, 105780.
- Delaney, J.K.; Dooley, K.A.; Van Loon, A.; Vandivere, A. Mapping the pigment distribution of Vermeer’s Girl with a Pearl Earring. Herit. Sci. **2020**, 8, 4.
- Cucci, C.; Delaney, J.K.; Picollo, M. Reflectance hyperspectral imaging for investigation of works of art: Old master paintings and illuminated manuscripts. Acc. Chem. Res. **2016**, 49, 2070–2079.
- Delaney, J.K.; Zeibel, J.G.; Thoury, M.; Littleton, R.; Palmer, M.; Morales, K.M.; de La Rie, E.R.; Hoenigswald, A. Visible and infrared imaging spectroscopy of Picasso’s Harlequin musician: Mapping and identification of artist materials in situ. Appl. Spectrosc. **2010**, 64, 584–594.
- George, S.; Hardeberg, J.; Linhares, J.; MacDonald, L.; Montagner, C.; Nascimento, S.; Picollo, M.; Pillay, R.; Vitorino, T.; Webb, E. A study of spectral imaging acquisition and processing for cultural heritage. In Digital Techniques for Documenting and Preserving Cultural Heritage; ARC, Amsterdam University Press: Amsterdam, The Netherlands, 2018; pp. 141–158.
- Delaney, J.K.; Ricciardi, P.; Glinsman, L.; Palmer, M.; Burke, J. Use of near infrared reflectance imaging spectroscopy to map wool and silk fibres in historic tapestries. Anal. Methods **2016**, 8, 7886–7890.
- Padoan, R.; Steemers, T.; Klein, M.; Aalderink, B.; De Bruin, G. Quantitative hyperspectral imaging of historical documents: Technique and applications. In Proceedings of the 9th International Conference on NDT of Art, Jerusalem, Israel, 25–30 May 2008.
- Deborah, H.; George, S.; Hardeberg, J.Y. Pigment mapping of The Scream (1893) based on hyperspectral imaging. In Proceedings of the International Conference on Image and Signal Processing, Cherbourg, France, 30 June–2 July 2014; pp. 247–256.
- Khan, M.J.; Yousaf, A.; Abbas, A.; Khurshid, K. Deep learning for automated forgery detection in hyperspectral document images. J. Electron. Imaging **2018**, 27, 053001.
- Valdiviezo-N, J.C.; Urcid, G.; Lechuga, E. Digital restoration of damaged color documents based on hyperspectral imaging and lattice associative memories. Signal Image Video Process. **2017**, 11, 937–944.
- Deborah, H.; George, S.; Hardeberg, J.Y. Spectral-divergence based pigment discrimination and mapping: A case study on The Scream (1893) by Edvard Munch. J. Am. Inst. Conserv. **2019**, 58, 90–107.
- Rohani, N.; Salvant, J.; Bahaadini, S.; Cossairt, O.; Walton, M.; Katsaggelos, A. Automatic pigment identification on Roman Egyptian paintings by using sparse modeling of hyperspectral images. In Proceedings of the 2016 24th European Signal Processing Conference (EUSIPCO), Budapest, Hungary, 29 August–2 September 2016; pp. 2111–2115.
- Bioucas-Dias, J.M.; Figueiredo, M.A. Alternating direction algorithms for constrained sparse regression: Application to hyperspectral unmixing. In Proceedings of the 2010 2nd Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing, Reykjavik, Iceland, 14–16 June 2010; pp. 1–4.
- Grabowski, B.; Masarczyk, W.; Głomb, P.; Mendys, A. Automatic pigment identification from hyperspectral data. J. Cult. Herit. **2018**, 31, 1–12.
- Rohani, N.; Pouyet, E.; Walton, M.; Cossairt, O.; Katsaggelos, A.K. Nonlinear unmixing of hyperspectral datasets for the study of painted works of art. Angew. Chem. **2018**, 130, 11076–11080.
- Rohani, N.; Pouyet, E.; Walton, M.; Cossairt, O.; Katsaggelos, A.K. Pigment Unmixing of Hyperspectral Images of Paintings Using Deep Neural Networks. In Proceedings of the ICASSP 2019–2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Brighton, UK, 12–17 May 2019; pp. 3217–3221.
- Grillini, F.; Thomas, J.B.; George, S. Linear, Subtractive and Logarithmic Optical Mixing Models in Oil Painting. In Proceedings of the 10th Colour and Visual Computing Symposium, Gjøvik, Norway, 16–17 September 2020; Paper 7.
- Grillini, F.; Thomas, J.B.; George, S. Mixing models in close-range spectral imaging for pigment mapping in cultural heritage. In Proceedings of the International Colour Association (AIC) Conference 2020, Avignon, France, 26–27 November 2020; pp. 372–376.
- Yang, C.K.; Yang, H.L. Realization of Seurat’s pointillism via non-photorealistic rendering. Vis. Comput. **2008**, 24, 303–322.
- Ripstein, J. Multi-Layered Painting and Method Therefor. U.S. Patent 5,902,670, 11 May 1999.
- Wikipedia Contributors. Pigment—Wikipedia, The Free Encyclopedia. 2020. Available online: https://en.wikipedia.org/w/index.php?title=Pigment&oldid=956663998 (accessed on 3 March 2021).
- Nascimento, J.M.; Bioucas-Dias, J.M. Nonlinear mixture model for hyperspectral unmixing. In Image and Signal Processing for Remote Sensing XV; Int. Soc. Opt. Photonics **2009**, 7477, 74770I.
- Bartell, F.; Dereniak, E.; Wolfe, W. The theory and measurement of bidirectional reflectance distribution function (BRDF) and bidirectional transmittance distribution function (BTDF). In Radiation Scattering in Optical Systems; Int. Soc. Opt. Photonics **1981**, 257, 154–160.
- Burns, S.A. Subtractive Color Mixture Computation. arXiv **2017**, arXiv:1710.06364.
- Yule, J.; Nielsen, W. The penetration of light into paper and its effect on halftone reproduction. Proc. TAGA **1951**, 3, 65–76.
- Simonot, L.; Hébert, M. Between additive and subtractive color mixings: Intermediate mixing models. JOSA A **2014**, 31, 58–66.
- Kubelka, P. Ein Beitrag zur Optik der Farbanstriche (Contribution to the optics of paint). Z. Tech. Phys. **1931**, 12, 593–601.
- Yang, L.; Kruse, B. Revised Kubelka-Munk theory. I. Theory and application. JOSA A **2004**, 21, 1933–1941.
- Jourlin, M.; Pinoli, J.C. Logarithmic image processing: The mathematical and physical framework for the representation and processing of transmitted images. In Advances in Imaging and Electron Physics; Elsevier: Amsterdam, The Netherlands, 2001; Volume 115, pp. 129–196.
- Hecht, S. The visual discrimination of intensity and the Weber-Fechner law. J. Gen. Physiol. **1924**, 7, 235.
- Maheu, B.; Letoulouzan, J.N.; Gouesbet, G. Four-flux models to solve the scattering transfer equation in terms of Lorenz-Mie parameters. Appl. Opt. **1984**, 23, 3353–3362.
- Zhao, Y.; Berns, R.S. Predicting the spectral reflectance factor of translucent paints using Kubelka-Munk turbid media theory: Review and evaluation. Col. Res. Appl. **2009**, 34, 417–431.
- Vargas, W.E.; Niklasson, G.A. Applicability conditions of the Kubelka–Munk theory. Appl. Opt. **1997**, 36, 5580–5586.
- Kremer Pigmente GmbH & Co. KG. Available online: https://www.kremer-pigmente.com/en/ (accessed on 3 March 2021).
- Wrapson, L. Artists’ Footsteps, the Reconstruction of Pigments and Paintings; Archetype Publications: London, UK, 2012.
- Grillini, F.; Thomas, J.B.; George, S. VisNIR pigment mapping and re-rendering of an experimental painting. J. Int. Colour Assoc. **2021**, 26, 3–10.
- Pillay, R.; Hardeberg, J.Y.; George, S. Hyperspectral imaging of art: Acquisition and calibration workflows. J. Am. Inst. Conserv. **2019**, 58, 3–15.
- Nelder, J.A.; Mead, R. A simplex method for function minimization. Comput. J. **1965**, 7, 308–313.
- Youden, W.J. Index for rating diagnostic tests. Cancer **1950**, 3, 32–35.

**Figure 1.** Three possible configurations of pigment mixing. (**a**): Optical blending occurs when two materials are physically separated but mixed at the camera level. (**b**): A layered structure assumes that light is transmitted and reflected according to the properties of the different layers. (**c**): In intimate mixing, the components are not physically discernible.

**Figure 2.** Set of mockup samples realised for the experiment. The 175 painted patches are ordered according to a script, not considering perceptual similarities. Reproduced from [29] with permission from the International Colour Association (AIC).

**Figure 3.** Mockup painting realised with the same set of pigments used for the composition of the mixture samples. Reproduced from [47] with permission from the AIC.

**Figure 4.** Hyperspectral image acquisition setup. The light sources are placed at 45°, while the camera is at 0°, which avoids specular reflections and shadows. In the push-broom setup, the translation stage slides across the field of view of the camera at a speed synchronised with the integration time of the camera.

**Figure 5.** Experimental flow chart. The mockup samples are realised and acquired in a Hyperspectral Imaging (HSI) setup, then the reflectance factors of the patches are estimated (Section 2.2). Each imaging model is evaluated individually. Different features of the imaging models can be observed, depending on the task performed: the model expectation test evaluates spectral accuracy; the prediction task investigates spectral accuracy and concentration estimation; spectral unmixing comprises spectral accuracy, concentration estimation, and pigment identification.

**Figure 6.** Performances in the model expectation test. (**a**): Average MSE values and respective 95% confidence intervals. (**b**): Number of times each model has been selected as best or worst in terms of MSE.

**Figure 7.** Performances in the prediction task. All error bars refer to 95% confidence intervals. (**a**): Mean w score. (**b**): Number of times each model has been selected as best or worst in terms of w score. (**c**): Average MSE. (**d**): Mean concentration error $\Delta\alpha$.

**Figure 8.** Performances in the spectral unmixing task. All error bars refer to 95% confidence intervals. (**a**): Mean w score. (**b**): Number of times each model has been selected as best or worst in terms of w score. (**c**): Average MSE. (**d**): Mean concentration error $\Delta\alpha$.

**Figure 9.** Performances of pigment detection. (**a**): The scores are calculated at the concentration thresholds reported in Table 3. (**b**): The scores are computed at a fixed ${\alpha}_{T}=0.15$. The overall detection performance is slightly poorer when a fixed ${\alpha}_{T}$ is used, as observed in the small decreases in accuracy. At the same time, the trade-off between precision and recall yields very similar F1 scores in both conditions (**a**,**b**).

**Figure 10.** Pseudo-colour concentration maps related to each pigment (by **column**) and investigated imaging model (by **row**). The best performances are obtained by the subtractive-based models, whereas the additive-based models report the poorest results.

**Table 1.**Proposed imaging models divided into three main categories: additive (A), subtractive (S), and hybrid (H). The models ${M}_{4}$ and ${M}_{5}$ are indeed hybrid but have strong additive and subtractive tendencies, respectively.

Label | Name | Equation | Category |
---|---|---|---|
${M}_{1}$ | Additive | $Y={\sum}_{i=1}^{q}\alpha_i\rho_i$ | A |
${M}_{2}$ | Subtractive | $Y={\prod}_{i=1}^{q}\rho_i^{\alpha_i}$ | S |
${M}_{3}$ | Yule-Nielsen | $Y=\left({\sum}_{i=1}^{q}\alpha_i\rho_i^{1/\tau}\right)^{\tau}$ | H |
${M}_{4}$ | Additive-Subtractive | $Y=\tau{\sum}_{i=1}^{q}\alpha_i\rho_i+(1-\tau){\prod}_{i=1}^{q}\rho_i^{\alpha_i}$ | H/A |
${M}_{5}$ | Subtractive-Additive | $Y=\left({\sum}_{i=1}^{q}\alpha_i\rho_i^{\tau}\right)^{1/\tau}$ | H/S |
${M}_{6}$ | LIP additive | $Y=1-{\prod}_{i=1}^{q}(1-\rho_i)^{\alpha_i}$ | A |
${M}_{7}$ | LIP subtractive | $Y=1-\exp\left[-{\prod}_{i=1}^{q}\left(-\ln(1-\rho_i)\right)^{\alpha_i}\right]$ | S |
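The simpler forward models in Table 1 translate directly into code. The sketch below implements the four whose forms survive unambiguously in the source (M1, M2, M4, M6); `rho` holds the endmember spectra row-wise and `alpha` the concentrations, assumed to sum to one.

```python
import numpy as np

# Minimal sketch of four forward models from Table 1. rho: (q, bands)
# endmember reflectances; alpha: (q,) concentrations summing to one.
def m1_additive(rho, alpha):
    return alpha @ rho                        # M1: weighted arithmetic mean

def m2_subtractive(rho, alpha):
    # M2: weighted geometric mean of the endmember reflectances
    return np.prod(rho ** alpha[:, None], axis=0)

def m4_additive_subtractive(rho, alpha, tau):
    # M4: convex combination of M1 and M2, controlled by tau in [0, 1]
    return tau * m1_additive(rho, alpha) + (1 - tau) * m2_subtractive(rho, alpha)

def m6_lip_additive(rho, alpha):
    # M6: LIP additive mixing, i.e. the geometric mean applied to (1 - rho)
    return 1 - np.prod((1 - rho) ** alpha[:, None], axis=0)
```

For two flat spectra at 0.2 and 0.8 mixed 1:1, M1 predicts 0.5 while M2 predicts the darker 0.4, illustrating the additive/subtractive split the table's categories encode.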

**Table 2.**Pigments included in the set of mockups. The codes refer to the serial number assigned by the manufacturer. The labels identify the pigments and are arbitrarily assigned to better understand the results when the mockup painting is analysed.

Name | Code | Label |
---|---|---|

Kremer White | 46360 | W |

Ultramarine Blue | 45030 | B |

Naples Yellow | 43125 | Y |

Carmine | 23403 | C |

Vermilion | 42000 | V |

Viridian Green | 44250 | G |

Gold Ochre DD | 40214 | O |

**Table 3.** Concentration thresholds ${\alpha}_{T}$ computed for each model. The lowest and preferable values are obtained by the hybrid models ${M}_{3}$, ${M}_{4}$, and ${M}_{5}$, indicating that they can more confidently discard false positive detections.

| ${\mathit{M}}_{1}$ | ${\mathit{M}}_{2}$ | ${\mathit{M}}_{3}$ | ${\mathit{M}}_{4}$ | ${\mathit{M}}_{5}$ | ${\mathit{M}}_{6}$ | ${\mathit{M}}_{7}$ |
---|---|---|---|---|---|---|---|
${\alpha}_{T}$ | 0.26 | 0.18 | 0.13 | 0.13 | 0.15 | 0.30 | 0.20 |
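The thresholds above feed a simple detection rule: a pigment counts as detected when its retrieved concentration exceeds the model-specific ${\alpha}_{T}$, and detections are scored against the ground-truth palette. A hedged sketch of that scoring step (names illustrative, not the paper's code):

```python
# Thresholded pigment detection: concentrations above alpha_t are treated
# as detections, then scored with precision, recall, and F1 against the
# set of pigments actually present in the mixture.
def detection_scores(retrieved, truth, alpha_t):
    """retrieved: {pigment: concentration}; truth: set of pigments present."""
    detected = {p for p, a in retrieved.items() if a > alpha_t}
    tp = len(detected & truth)
    precision = tp / len(detected) if detected else 0.0
    recall = tp / len(truth) if truth else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

Raising $\alpha_T$ trades recall for precision, which is the trade-off Figure 9 reports when comparing model-specific thresholds against the fixed ${\alpha}_{T}=0.15$.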

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Grillini, F.; Thomas, J.-B.; George, S.
Comparison of Imaging Models for Spectral Unmixing in Oil Painting. *Sensors* **2021**, *21*, 2471.
https://doi.org/10.3390/s21072471

**AMA Style**

Grillini F, Thomas J-B, George S.
Comparison of Imaging Models for Spectral Unmixing in Oil Painting. *Sensors*. 2021; 21(7):2471.
https://doi.org/10.3390/s21072471

**Chicago/Turabian Style**

Grillini, Federico, Jean-Baptiste Thomas, and Sony George.
2021. "Comparison of Imaging Models for Spectral Unmixing in Oil Painting" *Sensors* 21, no. 7: 2471.
https://doi.org/10.3390/s21072471