Article

Preserving Colour Fidelity in Photogrammetry—An Empirically Grounded Study and Workflow for Cultural Heritage Preservation

by Miguel Antonio Barbero-Álvarez 1,*, Simon Brenner 2, Robert Sablatnig 2 and José Manuel Menéndez 1

1 GATV-SSR, Escuela Técnica Superior de Ingenieros de Telecomunicación, Universidad Politécnica de Madrid, 28040 Madrid, Spain
2 Computer Vision Lab, Faculty of Informatics, Technische Universität Wien, 1040 Vienna, Austria
* Author to whom correspondence should be addressed.
Heritage 2023, 6(8), 5700-5718; https://doi.org/10.3390/heritage6080300
Submission received: 3 July 2023 / Revised: 24 July 2023 / Accepted: 28 July 2023 / Published: 5 August 2023

Abstract:
In this paper, a study is performed to establish a process that respects the colour integrity of photogrammetric models of cultural heritage pieces. As a crucial characteristic of cultural heritage documentation, the colour of the pieces, being a valuable source of information, must be properly handled and preserved, since digital tools may alter its values or lose them to a degree. Different conditions for image acquisition, RGB value calculation, calibration, and photogrammetry have been combined and the results measured, in order to find an adequate procedure. Control over all colour transformations is enforced, with the blending operations during texture generation being the only unpredictable step in the pipeline. It is demonstrated that an excellent degree of colour preservation can be achieved when such control is applied to the acquisition and colour digitization factors, including the choice of their parameters. This paper aims to serve as a guideline for the correct handling of colour information and workflow, so that cultural heritage documentation can be performed with the highest degree of colour fidelity, filling the gap left by the absence of a standard procedure or conditions for optimal digital colour modelling of cultural heritage.

1. Introduction

Cultural heritage and technology have shared a common history in recent decades [1]. Today's digital tools aim to reach horizontally into every field of human knowledge, and cultural heritage, understood as the material and immaterial inheritance of the past, is no exception.
An important part of material cultural heritage consists of significant three-dimensional objects that can vary in size, physical aspect, usage, or connotations. Cultural heritage conservation is a field in science that aims to document and preserve these objects for the present and future. Digital tools and methods for documentation and conservation have been widely proposed in recent years [2], most of them adapted to the particular cultural heritage objects the proposals aim to work with, guaranteeing their safety and integrity [3].
Part of the documentation process of material objects consists of elaborating 3D models that accurately represent their current physical aspect. Such models can also serve slightly different purposes, such as virtual restoration and reconstruction of pieces that cannot be correctly restored in real life [4]. In all cases, the visual aspect of the objects should be respected and reproduced in its entirety during digitization. This includes textures, patterns, relief, and colour. Colour is especially important since, in tangible cultural heritage, it can be a crucial element associated with the object [5].
Concretely, photogrammetry is a widely used technique to acquire both the shape and surface colour of material cultural heritage, using a set of photographs of the object under analysis [6]. Photographic cameras can either return fully compressed or uncompressed, presentation-ready RGB images in any of the corresponding formats (such as JPEG or TIFF), or they can return raw data. These encode just how intensely each cell of the photoreceptor is stimulated by the arriving light [7]. These raw data are the first camera-internal information that can be retrieved, and the presentation-ready images are calculated using these, after undergoing additional processes like white balancing, colour correction, compression, etc.
Photogrammetry needs RGB images to construct a coloured 3D model. However, any presentation-ready RGB image generated directly on the camera or in conventional raw development software (e.g., Adobe Lightroom, CaptureOne) has already undergone a compression process whose parameters are partly unknown. This means that, when the goal is to represent the true colour of an item such as a cultural heritage piece, where preservation is fundamental, this path of RGB image generation is suboptimal: the internal coding process in the camera or development software may have modified some of the colour information perceived by the optic block. It is desirable, therefore, to keep control over which processes alter the raw data, in order to assure fidelity to the highest possible degree.
Nevertheless, in spite of the importance of preserving the integrity of visual characteristics in digital cultural heritage modelling, no standard or preferred procedure exists to achieve it at a holistic level. The work in this paper therefore aims to serve as a guideline for performing digital cultural heritage 3D modelling with minimal quality loss, and incorporates an adaptive type of colour calibration used in previous work [8] that ensures state-of-the-art accuracy in homogenizing colour between pictures. The given guidelines are easy to follow and implement, do not require overly specialized gear or expensive purchases, and are designed so that any cultural heritage preservation laboratory can follow them with basic installations and procedures.
The presented method therefore aims to produce RGB images perfectly usable for photogrammetry which are nonetheless generated from the raw data with full control and transparency regarding the colour transformations applied. The algorithm has been specifically designed to imitate the effect of in-camera image compression as described in the literature [7], while minimizing information loss so that the final result approximates the real colours of the pieces and all data manipulations remain transparent to the user.
The paper is structured as follows: first, an introduction to digital colorimetry and the state of the art is given to establish the context of the work. Then, the proposed method is presented, and a varied set of experimental subjects is described. After performing the process, relevant quality metrics are extracted and conclusions drawn.
The data acquisition has been performed in accordance with COSCH's indications on the digitization of cultural heritage [9].

2. Related Work

2.1. Colour Perception in Cameras

Similarly to the human eye, the colour information captured by different cameras varies, resulting in slight differences in the perceived colours [10]. The camera thus adds its internal uncertainty to the already existing external factors that alter the perceived colour, such as scene and illumination. In these cases, calibration is sought in order to neutralize and homogenize colour information [11,12].
The structural composition of cameras affects how colour is perceived, and the scene may be miscaptured due to odd sensor responses. Camera sensors do not have linear spectral responses, so some wavelengths might not be represented at their full intensity, leading to colour distortion [13].
The optic block can present chromatic aberrations that distort the radiation falling on each pixel of the sensor [14]. It has been shown that this phenomenon is due to the chemical and material composition of the sensors and optic blocks that make up the cameras [15].
The retrieval of colour information within a camera happens from the very start of the light capturing process. The aforementioned light intensity-perceiving sensors in the camera return internal values depending on their saturation, and colour filter arrays (CFAs), such as Bayer filters, are used to differentiate RGB stimuli [16].
From that point on, depending on the image standard followed by the camera's internal programming, additional operations such as white balancing or gamma compression may be applied to adjust the RGB values towards the expected outcome [7]. Therefore, a user can extract either untreated raw data, raw-to-RGB transformed data, or final compressed image data from a professional camera.

2.2. Colour in Material Cultural Heritage

Any piece categorized as material cultural heritage implies the existence of a strong visual component as a source of information, and is therefore amenable to multimedia technology.
Colour and colorimetric information is regarded as significant when inspecting pieces exhibited in museums. In this context, such information has the advantage that it can be captured by any third party with a camera, making it a non-invasive method of data acquisition [17].
Initiatives such as Colour and Space in Cultural Heritage (COSCH) [9] attempt to guide the correct handling of material heritage pieces for digital analysis; as its very name indicates, the colorimetric factor is regarded as crucial, including calibration and reconstruction.
The field of heritage preservation science has benefited from the progressive digitalization of global society, combined with considerable interest in the transversal possibilities of multimedia [3].
In a more specialised fashion, other initiatives, such as [18], concentrate on more particular aspects of colorimetry related to cultural heritage. As such, they highlight the importance of acquisition and digital treatment of cultural heritage representations, e.g., scanning, photography, or calibration.

2.3. Cultural Heritage 3D Models

3D models of material cultural heritage pieces are a solid way of documenting relevant visual information. Beyond their documentary worth, such models are considered to add value to the experience perceived by the public [19].
Other proposals see the possibility of exploiting cultural heritage 3D models for restoration purposes [4]. Manipulating a damaged heritage piece can be an irreversible process, so these models are useful for attempting to reproduce its hypothetical original visual information. In [4], external pigment data are applied over the model to reconstruct the likely colours that time has washed from the objects.
Photogrammetry is a widely used method to capture both the 3D shape and the surface colours of a piece at a given point in time [1] (see Figure 1 for illustration).
However, especially when heritage pieces are of considerable age or have been exposed to the elements, their current colour information is likely to differ from the original colours at the time the piece was created. Works such as [20] highlight the difficulty of preserving the original colours intact, especially after prolonged exposure to damaging elements.
However, in spite of the publicly acknowledged usefulness of virtual 3D representations of cultural heritage pieces [1], and the crucial role of colour, there exist no general guidelines on how to properly manage it so that models reproduce the external appearance of the real object as closely as possible. Similar guidelines exist for colour scanning of small regions of materials, but not at a global level [21].
Thus, this paper aims to contribute to the benefits of 3D modelling of cultural heritage pieces by demonstrating how to generate surface colours that are as truthful as possible, while taking into account the subjectivity of colour perception.

3. Materials and Methods

Figure 2 illustrates the process from image acquisition over image data manipulations to the photogrammetric reconstruction. The algorithm presented in this work consists of three fundamental steps:
1.
Debayering: recover RGB data from the raw file.
2.
Calibration: overcome perceived chromatic differences between images.
3.
Photogrammetry: build the Cultural Heritage model starting from a calibrated batch of images.

3.1. Debayering

Any digital camera can deliver the acquired scene as digital RGB data, but only as the result of a mathematically delicate process that transforms the light arriving at the objective into presentation-ready values, precisely the same process that can imply the compression and information loss this work intends to avoid. Therefore, the data are handled manually.
The most immediate data that can be recovered from any camera are the direct digitalization of the captured light, also known as raw data. Usually, they would be automatically transformed into RGB, in a sub-process known as debayering. Then, these intermediate RGB data (known as camera RGB) would be compressed and further processed into the final results [7].
The aim of this step is to directly transform the original raw image data to RGB by imitating the functioning of the camera’s internal optoelectronic block, according to the colour filter array (CFA) of the camera, while avoiding any other further operation that could jeopardise the captured colour’s integrity.
Whilst it is true that each camera perceives colour in a different way, there exists no single commonly accepted solution for this step.
Some solutions reduce the size of the resulting RGB image, since they discard the raw pixels that carry no information for any of the tiles of the CFA or Bayer filter. Considering that the target of this work is to avoid excessive loss of information, a debayering strategy that avoids pixel loss or modification of the original data is ideally sought. This can be achieved by respecting the original size of the image, even though some of the resulting pixels will be the product of a 2D interpolation. On the other hand, with a reduced image containing only the pixels that correspond to the sensing tiles of the Bayer filter, the fabrication of information that did not exist in the first place can be avoided.
Therefore, given these possibilities for retrieving RGB images that either (1) do not lose spatial information or (2) do not fabricate new information, it is a priori unknown which one will offer more fidelity. Thus, three possible procedures have been considered for this work, listed below. Note that further procedures exist beyond those listed, such as nearest-neighbour or spline interpolation.
The chosen debayering approaches were selected for their ease of implementation, their theoretical quality, and the fact that they soften the effect of fabricating "new" information in the interpolated pixels: bilinear interpolation yields an average between neighbours instead of their repetition or the introduction of non-linearities.
The latter two are implemented assuming that the CFA is a Bayer filter that interleaves colour-sensing tiles with non-valid tiles (with no filter action):
1.
Direct debayering: direct debayering of the raw image data of the NEF format, based on the standard procedure of the Python RawPy library [22]. The concrete algorithm is the adaptive homogeneity-directed demosaicing algorithm (AHD). It has the advantage of estimating colour while minimizing artifacts and errors; however, its conception based on filter banks makes the process non-linear, and it additionally loses a small stripe of pixels due to convolution operations [23].
2.
Bilinear interpolation: respecting the original pixel size of the raw image, the non-valid pixels are substituted by a bilinear interpolation of the Bayer tile pixels. It has the advantages of linearity, and therefore reversibility, and the fact that the full size of the image is preserved. The disadvantage may be a higher risk of producing colour artifacts due to its simple conception.
3.
Discard: all non-valid pixels are discarded, and the resulting image is one quarter of the size of the original raw file. Its obvious advantage is that no new information is fabricated, so all saved pixels bear true information. The disadvantage is, effectively, the loss of information, although the reduced size may ease calculations.
The resulting RGB images are represented in a space commonly known as the camera RGB: an RGB space whose gamut is imposed by non-standard primaries internally defined by the physical peculiarities of the components of the camera's optic block, as seen in Figure 3, left. The concept of camera RGB is therefore unique for each acquisition device. However, it is desirable to map the camera RGB to a standardized digital RGB space. In order to achieve this and make the images ready for photogrammetry, the following steps are performed.
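The two manual strategies above can be illustrated in a few lines of numpy. This is a minimal sketch assuming an RGGB Bayer layout and values scaled to [0, 1]; the function names are ours, and the direct path is in practice delegated to RawPy's postprocessing.

```python
import numpy as np

def discard_debayer(raw):
    """Strategy (3), "Discard": each 2x2 RGGB tile becomes one RGB pixel
    (the two green samples are averaged), yielding a quarter-size image
    in which no value has been fabricated."""
    r = raw[0::2, 0::2].astype(np.float64)
    g = (raw[0::2, 1::2].astype(np.float64) + raw[1::2, 0::2]) / 2.0
    b = raw[1::2, 1::2].astype(np.float64)
    return np.stack([r, g, b], axis=-1)

def _weighted_sum_3x3(img):
    # 3x3 weighted sum with kernel [[1,2,1],[2,4,2],[1,2,1]]/4,
    # zero-padded at the borders (a plain-numpy stand-in for a convolution).
    k = np.array([[0.25, 0.5, 0.25], [0.5, 1.0, 0.5], [0.25, 0.5, 0.25]])
    h, w = img.shape
    p = np.pad(img, 1)
    out = np.zeros((h, w))
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * p[dy:dy + h, dx:dx + w]
    return out

def bilinear_debayer(raw):
    """Strategy (2), bilinear interpolation: the original image size is
    kept, and each channel's missing samples are filled as a normalized
    weighted average of the sampled neighbours."""
    h, w = raw.shape
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True   # R sites
    masks[0::2, 1::2, 1] = True   # G sites
    masks[1::2, 0::2, 1] = True
    masks[1::2, 1::2, 2] = True   # B sites
    out = np.zeros((h, w, 3))
    for c in range(3):
        sampled = np.where(masks[..., c], raw, 0.0)
        out[..., c] = (_weighted_sum_3x3(sampled)
                       / _weighted_sum_3x3(masks[..., c].astype(float)))
    return out
```

Note the trade-off made explicit here: `discard_debayer` only rearranges measured samples, while `bilinear_debayer` fabricates the missing two thirds of each channel by interpolation, but preserves the original resolution.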

3.2. Colour Calibration

Photogrammetric 3D models are created from a set of images depicting the object under analysis from different viewpoints. Ideally, all of them should be acquired using the same camera parameters and under the same lighting conditions, since varying spectral characteristics of the lighting have an impact on the recorded colours [20]. But even identical imaging conditions cannot guarantee identical colorimetry in different images: the perceived maximum and minimum levels of white and black, as well as the automatic white balancing parameters of the camera, can differ for each instance. Furthermore, when acquiring pictures from different angles, the perceived lighting might change due to non-uniform illumination, light artifacts, random noise in the sensor, etc.
The calibration step aims to bring homogeneity to the perceived colour appearance among all the pictures of a single batch. It can also serve as a way to represent the probable true colour of the piece in the photogrammetry model, since it eliminates the colour bias of non-neutral illumination temperatures. Additionally, this step indirectly completes other procedures aimed at visual enhancement that cameras perform between debayering and image compression, such as white balancing and colour correction.
Before proceeding to calibration, the data need to be transformed from the camera RGB to some standardized digital RGB space. This is for two reasons:
  • A digital RGB space is defined by its R, G, and B primaries, which are specified in its documentation and effectively serve as limits to the gamut. With the limits of the space defined, its whole domain can be calibrated without the need to extrapolate colour values. This matters because a colour calibration consists fundamentally of algebraic displacements, resizings, and interpolations in the operational colour space towards a reference, which is a set of colours sparse in that space. Camera RGB values calculated from raw data lack these standardized primaries, so operating with them across disjoint sources is tedious.
  • Camera RGB values are direct translations of the perceived light stimuli, which do not necessarily correspond to concrete colours defined within the CIE plane and solid. Transformation into a standardized digital RGB not only ensures that all depicted colours can be accepted by presentation devices, but also ensures the correct perception of the visual information, which is ultimately a mandatory condition when dealing with content sensed by human eyes.
Obtaining images in any digital RGB space from this point is simple. The EXIF metadata of the raw images contain the matrix that transforms the camera RGB they are coded with into the master colour space XYZ. Once this is done, different digital RGB spaces can be obtained, knowing the operation beforehand. It usually requires a matrix multiplication and a gamma compression, whose parameters vary depending on the target space.
Equations (1) and (2) relate the XYZ space to any RGB space, standardized or not. X_w, Y_w and Z_w define the reference white point (usually a CIE-standardized illuminant) used in the transformation. The parameters S_r, S_g, S_b, X_r, Z_r, X_g, Z_g, X_b and Z_b are calculated with (3) and (4), using the xy coordinates x_r, y_r, x_g, y_g, x_b and y_b of each primary of the RGB space in use; Y_r, Y_g, and Y_b are equal to 1. In practice, the matrix is usually known, as it can be found in the standard for each RGB space or extracted from the camera EXIF metadata.
$$\begin{bmatrix} R \\ G \\ B \end{bmatrix} = \begin{bmatrix} S_r X_r & S_g X_g & S_b X_b \\ S_r Y_r & S_g Y_g & S_b Y_b \\ S_r Z_r & S_g Z_g & S_b Z_b \end{bmatrix}^{-1} \begin{bmatrix} X \\ Y \\ Z \end{bmatrix} \quad (1)$$

$$\begin{bmatrix} S_r \\ S_g \\ S_b \end{bmatrix} = \begin{bmatrix} X_r & X_g & X_b \\ Y_r & Y_g & Y_b \\ Z_r & Z_g & Z_b \end{bmatrix}^{-1} \begin{bmatrix} X_w \\ Y_w \\ Z_w \end{bmatrix} \quad (2)$$

$$X_{point} = x_{point} / y_{point} \quad (3)$$

$$Z_{point} = (1 - x_{point} - y_{point}) / y_{point} \quad (4)$$
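The construction described by Equations (1)-(4) can be sketched in numpy as follows. This is an illustration under our own naming, not the authors' code; in practice the matrix can simply be read from the standard of the target space or from the EXIF metadata, as noted above.

```python
import numpy as np

def xy_to_XZ(x, y):
    # Equations (3) and (4): chromaticity (x, y) to X and Z, with Y = 1
    return x / y, (1.0 - x - y) / y

def xyz_to_rgb_matrix(primaries, white_xy):
    """Builds the XYZ -> RGB matrix of Equation (1) from the primaries and
    white point of a digital RGB space. `primaries` is a list of
    chromaticity pairs [(x_r, y_r), (x_g, y_g), (x_b, y_b)]."""
    cols = []
    for x, y in primaries:           # order: R, G, B
        X, Z = xy_to_XZ(x, y)
        cols.append([X, 1.0, Z])     # Y_r = Y_g = Y_b = 1
    M = np.array(cols).T             # columns are the primaries' XYZ
    Xw, Zw = xy_to_XZ(*white_xy)
    S = np.linalg.solve(M, [Xw, 1.0, Zw])   # Equation (2)
    return np.linalg.inv(M * S)             # Equation (1)
```

As a sanity check, feeding in the sRGB primaries and the D65 white point reproduces the familiar RGB-to-XYZ matrix whose middle (luminance) row is approximately (0.2126, 0.7152, 0.0722), and RGB = (1, 1, 1) maps exactly to the white point.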
After transforming from XYZ to RGB, a gamma compression (Equation (5)) has to be applied. Its parameters depend on the space, but it always consists of a piecewise function composed of a scalar multiplication by a factor α for dark values (lower than a threshold β) and an exponential function otherwise. The end product stays in the range [0, 1], and is multiplied by the dynamic range of the final digital RGB space, 2^n, with n being the bit depth.
$$\gamma(q) = \alpha q, \qquad q \le \beta \quad (5)$$

$$\gamma(q) = q^{1/\gamma}, \qquad q > \beta \quad (6)$$

with q ∈ {R, G, B}.
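This compression and quantization step can be sketched as below. The simplified piecewise form of Equations (5) and (6) is followed; note that standard spaces such as sRGB additionally apply a scale and offset on the exponential branch, so the α, β, and γ values used in the test are only sRGB-like examples.

```python
import numpy as np

def gamma_compress(q, alpha, beta, g):
    """Equations (5) and (6): linear scaling by alpha below the threshold
    beta, exponent 1/g above it (simplified form, per the text)."""
    q = np.asarray(q, dtype=np.float64)
    return np.where(q <= beta, alpha * q, q ** (1.0 / g))

def to_digital(q_linear, alpha, beta, g, bit_depth):
    """Compresses [0, 1] values and quantizes them to the dynamic range of
    the target digital RGB space (2**bit_depth levels, so the maximum
    representable code is 2**bit_depth - 1)."""
    v = gamma_compress(np.clip(q_linear, 0.0, 1.0), alpha, beta, g)
    return np.round(v * (2 ** bit_depth - 1)).astype(np.uint16)
```

Keeping this step explicit, rather than leaving it to the camera firmware, is exactly the kind of control over colour transformations that the pipeline enforces.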
Next, after having translated the camera values to a standardized digital RGB space, the calibration can be performed. In previous articles, such as [8], an adaptive spatial calibration framework specifically designed for cultural heritage conservation has been presented, surpassing related state-of-the-art works in its quality metrics. It is therefore applied in this context, too.
The calibration process has been performed employing an X-Rite ColorChecker Classic (or Macbeth) chart, since it is arguably the most widely employed commercial solution, and probably the easiest to obtain for any cultural heritage conservation team. The colour chart references are known beforehand, since they are standardized. Every colour value will have a corresponding counterpart rearranged in the calibrated RGB space. Algebraically speaking, the colours will be connected in the three dimensions corresponding to the three image colour channels.
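The chart-based fitting principle can be sketched with a deliberately simple stand-in: a global least-squares affine correction mapping the measured patch values to their references. The calibration actually used in this work is the adaptive multidimensional interpolation of [8], which corrects colour space locally around each chart colour rather than globally; the sketch below only illustrates the chart-to-reference mechanism.

```python
import numpy as np

def fit_chart_correction(measured, reference):
    """Fits a 4x3 affine colour correction mapping measured chart patch
    RGBs (n x 3) to their standardized references (n x 3), in a
    least-squares sense. A simple stand-in for the adaptive method of [8]."""
    A = np.hstack([measured, np.ones((measured.shape[0], 1))])  # n x 4
    M, *_ = np.linalg.lstsq(A, reference, rcond=None)           # 4 x 3
    return M

def apply_correction(M, image_rgb):
    """Applies the fitted correction to an HxWx3 image in [0, 1]."""
    h, w, _ = image_rgb.shape
    flat = image_rgb.reshape(-1, 3)
    corrected = np.hstack([flat, np.ones((flat.shape[0], 1))]) @ M
    return np.clip(corrected, 0.0, 1.0).reshape(h, w, 3)
```

With a purely global colour cast, such an affine fit recovers the references exactly; the adaptive method of [8] is needed precisely because real illumination displaces different regions of the colour space unevenly.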

3.3. Photogrammetry

The calibrated images are used as inputs for photogrammetric reconstruction. For this work, the open source photogrammetry framework implemented in AliceVision Meshroom [24] has been used, which supports all necessary steps, from extracting and matching feature points in the input images, through point cloud reconstruction and meshing, to texturing.
In the context of this work, only the last stage of the pipeline, i.e., the creation of colour textures, is relevant. In order to retrieve the texture of a certain triangle, the input images in which the triangle is visible are determined as candidate texture sources. The sources are then blended together following a heuristic: more images are blended together in low frequencies than in high frequencies [25], leading to consistent global colours while preserving fine details [24]. In this process, images in which the reprojected triangle covers the largest area are prioritized [24]. The resulting texture patches associated with each triangle are then arranged in texture maps, following an optimization by Levy et al. [26]. An example of a textured 3D model and the corresponding texture map is shown in Figure 4.
It is evident that the calibrated colour values undergo a series of averaging and interpolation operations before arriving at the final texture.

3.4. Evaluation

All the steps until the photogrammetry process, which is performed by an external program, involve algorithms in which the handling of information is under control. During texture generation, however, colour information from multiple input images is merged in a non-trivial way that depends on the quality and arrangement of input images (see Section 3.3). It is therefore imperative to assess, after the model is generated, in which way colour information is modified by it, and if it implies a serious deviation from the calibrated pictures—which bear the unbiased colour information.
For this purpose, a reliable colour distance metric needs to be used. The ΔE CIEDE2000 difference, or ΔE00, defines a distance metric similar to the norm in the CIELAB space, considering differences related to the physical lightness L, hue H, and chroma C [27]. Equation (7) reflects its nature:
$$\Delta E_{00} = \sqrt{ \left( \frac{\Delta L'}{k_L S_L} \right)^2 + \left( \frac{\Delta C'}{k_C S_C} \right)^2 + \left( \frac{\Delta H'}{k_H S_H} \right)^2 + R_T \, \frac{\Delta C'}{k_C S_C} \, \frac{\Delta H'}{k_H S_H} } \quad (7)$$
The distance is measured in regions where the surface colour is known. The most favourable candidates are the colour patches of the ColorChecker chart, which are included in the final model and easy to locate (see Figure 4). For each colour patch, the average CIEDE2000 difference is calculated, along with the correlation between the values of the input images and the texture maps via the Pearson coefficient, which is used for image validation. Since the distribution of colour values within a patch is also valuable information on texture preservation, a correlation coefficient is a valid way to assess how well it is respected in the texture maps after photogrammetry.
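The per-patch comparison can be sketched as follows (our own helper names). The full CIEDE2000 difference of Equation (7) involves the lightness, chroma, and hue weighting terms and is best taken from an existing implementation (e.g., skimage.color.deltaE_ciede2000); the sketch therefore shows only the simpler half of the evaluation, the patch statistics and the Pearson coefficient.

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation between two flattened colour patch samples."""
    a = np.ravel(a).astype(np.float64) - np.mean(a)
    b = np.ravel(b).astype(np.float64) - np.mean(b)
    return float((a @ b) / np.sqrt((a @ a) * (b @ b)))

def patch_metrics(input_patch, texture_patch):
    """Compares one chart patch of a calibrated input image against the
    same patch in the rendered texture map: mean per-channel deviation of
    the patch averages, plus the Pearson coefficient over all patch values
    (which captures how well the value distribution is preserved)."""
    mean_dev = np.abs(input_patch.mean(axis=(0, 1))
                      - texture_patch.mean(axis=(0, 1)))
    return mean_dev, pearson(input_patch, texture_patch)
```

A correlation near 1 indicates that the texture blending has preserved the within-patch value structure, even if an average colour offset remains; the offset itself is what the ΔE00 average quantifies.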

4. Experiments

The experimental setup is designed to test the colour preservation for different combinations of digital RGB colour spaces, setups, and acquisition conditions.
The variation of the colour space is of crucial importance, since the gamuts are of different sizes. The digital RGB space chosen to code the final images effectively determines whether certain hues can be depicted at all. If the goal is to preserve colour fidelity to the highest degree, the digital RGB space with the biggest possible gamut should be employed. Even then, values from the camera RGB calculated from the raw data that fall outside the CIE plane will be lost either way, as they will be saturated to their closest equivalents on the borders of the gamut.
Therefore, experiments employing spaces in common technological usage (sRGB [28] and Adobe RGB [29]) have been performed, and Wide Gamut RGB, or wide RGB [30], has additionally been considered. The latter is designed for a wider coverage of the CIE plane (Figure 5).
Table 1 depicts the parameters of the three spaces in the experimental setup, defined by their primaries and their white points.
The colorimetric calibration approaches are evaluated for several acquisition setups, which are summarized in Table 2. Three of them ("Sun", "Fluor", "LED") are experimental setups shot in a laboratory with different illuminants; the "Mixed" setup is a production setup for an archaeological documentation project in the facilities of a museum. All images are acquired with a Nikon D4 DSLR camera, with a full-frame sensor and an image size of 4940 × 3292 pixels. According to the metadata, its CFA consists of a [Red,Green][Green,Blue] 2 × 2 Bayer filter.
In the lab setups, the chart is placed on a fixed pedestal. Images are taken with an AF Micro-Nikkor 60 mm lens, from a tripod at a fixed elevation angle. Between shots, the tripod with the camera is moved around the object manually, resulting in 35 to 49 images per object. Three lighting situations are examined:
  • Indirect sunlight—“Sun” set: Imaging took place on a sunny afternoon (10 August 2022, between 14:48 and 14:58), with two large windows facing south-west opened. The sunlight did not directly fall on the imaged objects.
  • Fluorescent room light—“Fluor” set: The aforementioned windows were closed and covered with black cloth; light was provided by an array of conventional fluorescent lights installed on the ceiling.
  • White LEDs—"LED" set: Two Lightpanels MicroPro were mounted on tripods on opposite sides of the object, at an elevation angle of approximately 45°.
The production setup (or "Mixed" set) was part of a documentation routine for Etruscan bronze mirrors1. The archaeological objects are placed on a turntable which is rotated manually, with 25 stops per full rotation (a rotation of ca. 14.4° between successive positions); for oblique rotation angles where it was not possible to keep the whole object in focus, multiple images with different focal planes were acquired. As the mirrors had to be fixed with a block of Ethafoam, capturing the whole surface required performing two imaging rounds, with the mirror turned upside down in between, and then merging the results. However, for the sake of these experiments, we use only one set of images per mirror (see the right of Figure 4 for an example). The table, turntable, and backdrop are covered with gray photographic paper to obtain a uniform background. The images are acquired with a Tamron 28–300 mm f/3.5–6.3 lens set to 100 mm focal length, from a fixed tripod. For lighting, two halogen lamps with softboxes are placed on both sides of the camera, symmetrically to the camera's optical axis. Additionally, ambient room illumination (fluorescent lamps) is present. The ColorChecker chart is placed on the turntable next to the object, as a basis for colorimetric calibration and validation. The different acquisition setups and examples of the resulting images are shown in Figure 6.
Thus, the experimental setup consists of building textured 3D models for every unique combination of three debayering techniques (Direct, Bilinear, and Discard), three RGB colour spaces (sRGB, Adobe RGB, and wide RGB), and four acquisition settings (Mixed, Sun, LED, and Fluor): 36 configurations in total.
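The experimental grid above amounts to a simple Cartesian product of the three factors, which can be enumerated directly (a trivial sketch; the tuple names mirror the labels used in the text):

```python
from itertools import product

# The three experimental factors of Section 4
DEBAYERING = ("Direct", "Bilinear", "Discard")
COLOUR_SPACES = ("sRGB", "Adobe RGB", "wide RGB")
ACQUISITION_SETS = ("Mixed", "Sun", "LED", "Fluor")

def all_configurations():
    """Every unique (debayering, colour space, acquisition set) combination:
    3 x 3 x 4 = 36 textured models to build and evaluate."""
    return list(product(DEBAYERING, COLOUR_SPACES, ACQUISITION_SETS))
```

Each of these 36 tuples corresponds to one full pipeline run (debayering, transformation, calibration, photogrammetry) whose chart patches are then evaluated as described in Section 3.4.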

5. Results

5.1. Visual Inspection

At first glance, it can be clearly seen that the calibration procedure removes the colorimetric bias of the acquisition conditions and highlights darker details in the representation of the objects under analysis. The Mixed test set was taken under an illuminant with a yellowish colour temperature, and this bias towards the yellow-orange part of the CIE plane is compensated in the resulting images (Figure 7), where the colour values concentrate at the centre of the space, where neutral hues leaning towards the gray zone abound (the effects of the calibration are visible in Figure 8). Additionally, the stretching towards the white point spreads the values in the lower range of colours, so further details in darker areas (such as in the metallic surface of the object) can be appreciated.
This phenomenon of "drifting" towards the neutral part of the RGB cube is seen in all other sets. It can also be observed in other items of the scene that are purposely not modelled by the photogrammetry. The recolouring after calibration gives them an unnatural appearance to human eyes due to the neutralization. It is worth mentioning that the recolouring strongly depends on the selection of hues in the colour chart. The employed ColorChecker Classic chart features a hue selection that is irregular in the RGB cube; therefore, under natural-light acquisition, its corresponding values in space are displaced differently and irregularly by the effects of the illumination. Since the calibration function is based on multidimensional interpolation, regions of space that are unevenly displaced will be unequally corrected in the calibrated gamut, explaining the odd colour appearance that may sometimes appear. Nevertheless, photogrammetric models featuring a compact colour selection, such as the bronze mirror in the Mixed set, are unaffected by this odd colour drifting; the neutralization of illumination is therefore satisfyingly achieved (in line with the conclusions of [8]).
It is also worth noticing how RGB value saturation behaves when transforming from the camera RGB into the different standardized RGB spaces with varying gamuts (Figure 9). In the Fluor set, the images calibrated in sRGB, the space with the smallest gamut, feature a different colour for the table in the scene than when calibrated in Adobe RGB or wide RGB. In those cases, the colour assigned to it is closer to the one perceived by humans in the uncalibrated images, or to the version rendered automatically by the camera. This already hints at the importance of employing the widest possible gamut for this application, especially when colours near the edge of the gamut are important.
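This saturation effect can be illustrated numerically: when a sufficiently saturated colour is converted into linear sRGB, one channel falls outside [0, 1] and must be clipped, which changes the ratio between channels and hence the hue. A minimal sketch, where the input XYZ value is a hypothetical saturated green:

```python
import numpy as np

# Linear sRGB <- XYZ matrix (IEC 61966-2-1, D65 white).
XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                        [-0.9689,  1.8758,  0.0415],
                        [ 0.0557, -0.2040,  1.0570]])

def xyz_to_srgb_linear(xyz):
    """Convert CIE XYZ values to linear sRGB (no gamma, no clipping)."""
    return xyz @ XYZ_TO_SRGB.T

# A hypothetical saturated green lying outside the sRGB gamut:
xyz = np.array([0.2, 0.4, 0.1])
rgb = xyz_to_srgb_linear(xyz)
clipped = np.clip(rgb, 0.0, 1.0)   # clipping alters channel ratios -> hue shift
print(rgb)       # the red channel comes out negative
print(clipped)
```

A wider target gamut (Adobe RGB, and wide RGB especially) leaves more of these values representable, so less information is destroyed by clipping.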
A similar phenomenon can be seen in the Sun and LED sets. Here, wide RGB achieves a neutralization similar to the previous ones, with a general air of neutral colour; it is striking, however, that the table is calibrated as a soft blue/grey rather than as yellow as in the Fluor set, even though the colours in the chart are well maintained. Arguably, the source illuminant in the scene is clearly an important factor in the correction (as seen in Figure 6c, where under LED illumination the table already appears ambiguously grey, which explains why after calibration it is coloured grey). Additionally, in the Sun set the drifting of the table is less pronounced than in the Fluor set, but it can still be recognized that the "green" tint on the table in sRGB drifts towards a softer blue/grey when compared with Adobe RGB, and even more so with wide RGB. In the LED and Mixed sets the effect is milder.
Another phenomenon that occurs is a rupture of the natural colour texture of the image, caused by assigning a value subrange to radically different calibrated values that visually contrast with their vicinity. This phenomenon was already observed in [8,31], but there it was infrequent and outweighed by the benefits of multidimensionality. It has to be considered, though, that in the previous study the calibration was applied to already compressed images (which are usually smoothed). In this study, starting from raw data seemingly plays a bigger role in how the calibration acts and how visible its effects are.
Generally, the pictures calibrated in the wide RGB space show fewer consequences of this effect, or none at all, as in the Mixed and Fluor sets. This is also linked to the fact that the wide RGB pictures look the most colour neutral of all the scenes. Only in the LED and Sun sets does a slight false colouring appear in the white regions for this space, as in the white patch (Figure 10). These are also the sets with the most neutral illuminant temperature; and, as seen in [8], under this sort of white, high-luminosity illumination the calibration function is unable to strongly displace and correct saturated colours, since there is barely any possible spatial colour shift in these cases. This therefore yields a calibration function that performs short hue shifts concentrated on the saturated, bright colours, thus failing to adequately cover the rest of the gamut. This provokes artifacts such as the ones seen in the background of Figure 11.
In line with the previously observed general colour drift correction, the sRGB space is generally the one that bears the most exaggerated miscolouring in all cases, followed by Adobe RGB (seen in the transition in Figure 11). The gamut size is another factor to take into account, since a larger gamut implies bigger domains of similar hues. Because lowly saturated colours in the scene are perceived differently depending on the illuminant, it has to be ensured that the colour content corrected and passed to the photogrammetry step remains within the RGB subspace bounded by the values of the colour chart. Proceeding this way, possible shifts in space are covered by the domain of the function, and unexpected "mispixelings" and extrapolations will not happen.
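Whether the image colours actually remain inside the chart-bounded subspace can be verified before calibration, for instance with a convex-hull membership test. The function and chart values below are hypothetical, for illustration only:

```python
import numpy as np
from scipy.spatial import Delaunay

def fraction_outside_chart_hull(pixels_rgb, chart_rgb):
    """Fraction of pixels that fall outside the convex hull of the chart
    colours, i.e. where the calibration function would have to extrapolate."""
    hull = Delaunay(chart_rgb)
    outside = hull.find_simplex(pixels_rgb) < 0   # -1 means "not in any simplex"
    return outside.mean()

# Hypothetical chart: the 8 corners of a sub-cube of RGB space.
chart = np.array([[r, g, b] for r in (0.1, 0.9)
                            for g in (0.1, 0.9)
                            for b in (0.1, 0.9)], dtype=float)
inside_px = np.full((100, 3), 0.5)      # well inside the chart hull
outside_px = np.full((100, 3), 0.95)    # beyond the chart gamut
print(fraction_outside_chart_hull(inside_px, chart))   # 0.0
print(fraction_outside_chart_hull(outside_px, chart))  # 1.0
```

A non-negligible fraction of out-of-hull pixels warns that the chosen chart does not surround the object's colours and that extrapolation artifacts like those above can be expected.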
The debayering strategies do not affect colour perception as visibly as the other factors do. Whether the quality metrics reflect any difference between them remains to be seen.
After feeding the pictures into the corresponding photogrammetry processes, the same colorimetric phenomena mentioned in the previous paragraphs can be observed. This offers assurance that, regardless of the unknown details of the operations performed by the photogrammetry program, the visual characteristics are largely preserved. Nevertheless, some areas feature visual defects, caused by errors in 3D reconstruction due to a lack of distinctive feature points and the consequently erroneous texture map generation, in which colour uniformity is ruptured (Figure 12). However, these occur in areas of the model that are not covered in abundant detail; this should therefore be considered a minor inconvenience for the concrete heritage piece to be modelled, which is likely to be properly represented in all the pictures used to generate the textures. In that case, errors induced by stitching are likely to be less pronounced.
Concerning the texture images resulting from the photogrammetry, it is worth noticing that, while the colours overall retain the expected look, the hues corresponding to the colour patches of the chart undergo a smoothing with respect to the original based on their surroundings, aside from the aforementioned defects caused by 3D reconstruction being noisy on monochrome surfaces. This effect probably originates from blending operations during texture building (Section 3.3), and already hints at an irreversible process that affects the colour integrity of the original source. The metrics taken between the texture image and the source are analyzed in the next section, so that the effects of the blending can be assessed.

5.2. Quality Metric Analysis

For each of the experimental subjects, the CIEDE2000 difference is calculated between a reference image of the calibrated set and the resulting photogrammetry texture image. It has been calculated considering both the D50 and D65 CIE reference illuminants: the former is the most commonly used in CIELAB formulae, and the latter the most common mathematical reference white. Considering the colorimetric uncertainty of the acquisition light, it is worth assessing what differences arise when considering either one. As a correlational measure, the Pearson coefficient between the patches is also calculated. The concrete numerical results, commented upon in the following paragraphs, can be inspected in the tables in the Supplementary Materials; their averages and standard deviations are summarized in Table 3 and Table 4.
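The role of the reference white in the metric computation can be sketched as follows. For brevity, the simpler CIE76 ΔE*ab is used here in place of the full CIEDE2000 formula [27], and the patch values are hypothetical; the point is how the D50/D65 white enters the CIELAB conversion and how the Pearson coefficient is obtained:

```python
import numpy as np

# CIE reference whites (XYZ, normalized to Y = 1).
WHITES = {"D65": np.array([0.95047, 1.0, 1.08883]),
          "D50": np.array([0.96422, 1.0, 0.82521])}

def xyz_to_lab(xyz, white="D65"):
    """CIE XYZ -> CIELAB under a chosen reference white."""
    t = np.asarray(xyz, dtype=float) / WHITES[white]
    eps, kappa = 216 / 24389, 24389 / 27
    f = np.where(t > eps, np.cbrt(t), (kappa * t + 16) / 116)
    L = 116 * f[..., 1] - 16
    a = 500 * (f[..., 0] - f[..., 1])
    b = 200 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)

def delta_e_ab(lab1, lab2):
    """Simple CIE76 colour difference (the study itself uses CIEDE2000)."""
    return np.linalg.norm(lab1 - lab2, axis=-1)

# Hypothetical patch values (XYZ) for a reference image vs. a texture image.
ref = np.array([[0.2, 0.2, 0.2], [0.4, 0.5, 0.3], [0.7, 0.7, 0.6]])
tex = ref + 0.01
for white in ("D50", "D65"):
    de = delta_e_ab(xyz_to_lab(ref, white), xyz_to_lab(tex, white))
    r = np.corrcoef(ref.ravel(), tex.ravel())[0, 1]   # Pearson coefficient
    print(white, de.round(2), round(r, 4))
```

Because the Lab coordinates depend on the chosen white, the same pair of patches yields slightly different ΔE values under D50 and D65, which is why both are reported.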
Consistent with the fact that wide RGB is the space with the biggest gamut, and therefore includes the biggest variety of values, the CIEDE2000 measurements for all wide RGB models are clearly the most stable. In some patches, exaggerated difference outliers (up to ΔE = 63 in some cases) can sporadically be found; nevertheless, in wide RGB they are the least frequent. When inspecting the Pearson calculations, non-correlations are also less common in this space. sRGB models show the lowest quality of all, with the greatest variation of colour differences, followed by Adobe RGB with more stability. This confirms that using a bigger gamut not only respects a greater variety of hues, but also avoids miscalculations and erroneous handling of the colour information.
It is also important to observe that the LED and Fluor sets bear the CIEDE2000 measurements of smallest magnitude, mostly between 0 and 6, with outliers up to 10 in some cases. ΔE = 3 is commonly taken as the acceptability threshold at which two colours start to be perceived as different; a CIEDE2000 difference of up to 6 therefore represents an only minimally perceivable difference. The wide RGB measurements in these sets are also free of exaggerated outliers. This hints at the fact that laboratory illumination of this kind, with neutral colour temperature and low luminosity, is suitable for minimizing colour information loss. A low temperature, as in the Mixed set, produces higher differences, and a high luminosity, as in the Sun set, appears to induce more variability in the quality measurements. In the latter, the high illumination also implies that bright surfaces already lie at the edge of saturation. This variability is confirmed by the exaggerated colour differences perceived in the visual inspection, probably induced by miscolourings due to saturation.
Regarding the debayering techniques, the Direct debayering tends to induce a bigger difference than the Discard and Interpolated strategies, which stay in the same order of difference. This supports the stated goal of keeping control over how the information is processed: the Direct debayering technique, whilst quick, undoubtedly constitutes a black box; although the demosaicing algorithm is known, it may incorporate additional adjustments and transformations, even if some parameters can be adjusted.
The computational cost of photogrammetric reconstruction depends on the size of the input images, such that a common strategy to speed up computations is to reduce the input image size at the cost of detail. In such a case, the Discard strategy, in which the image size is reduced because only measured colour values are used, is arguably preferable to first creating a full-sized image with an interpolation-based approach and then down-sampling it to the desired size, both in terms of efficiency and of original colour preservation.
However, when a more detailed model is needed, the Interpolation debayering can be employed: even though the interpolated pixels are "fabricated", the resulting difference is imperceptible.
The Discard debayering therefore offers quick, light calculation and the best results, without the need to fabricate new pixels. Only if an extreme amount of detail is needed in the final models should Interpolation be used, taking advantage of its quadruple image size without incurring excessive aberrant colour fabrication.
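The Discard strategy, and the bilinear Interpolation strategy sketched for the green channel, can be illustrated on a synthetic RGGB mosaic. The actual pipeline uses RawPy [22], so this is only an illustrative approximation of the two manual approaches:

```python
import numpy as np

def debayer_discard(raw):
    """Discard strategy: a half-resolution RGB image built only from the
    measured values of each 2x2 RGGB cell (the two greens are averaged)."""
    r = raw[0::2, 0::2]
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0
    b = raw[1::2, 1::2]
    return np.stack([r, g, b], axis=-1)

def debayer_interp_green(raw):
    """Interpolation strategy, sketched for the green channel only:
    missing greens are bilinearly averaged from their 4 neighbours."""
    h, w = raw.shape
    g = np.zeros((h, w), dtype=float)
    measured = np.zeros((h, w), dtype=bool)
    measured[0::2, 1::2] = True          # green sites of an RGGB mosaic
    measured[1::2, 0::2] = True
    g[measured] = raw[measured]
    # Reflect-padding maps border neighbours back onto measured green sites,
    # so every missing site has 4 valid neighbours to average.
    pad = np.pad(g, 1, mode="reflect")
    interp = (pad[:-2, 1:-1] + pad[2:, 1:-1]
              + pad[1:-1, :-2] + pad[1:-1, 2:]) / 4.0
    g[~measured] = interp[~measured]
    return g

raw = np.full((4, 4), 5.0)          # flat synthetic mosaic
half = debayer_discard(raw)         # half resolution, measured values only
full_g = debayer_interp_green(raw)  # full resolution, half the greens fabricated
```

Both routines keep every transformation explicit, in contrast to the black-box Direct pipeline; the Discard output is a quarter the pixel count, which is exactly why it suits size-reduced photogrammetry runs.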
Table 3 and Table 4 show the mean and standard deviation of the discussed CIEDE2000 measurements, taking all patches for each space in all scenarios into account. They also reflect how wide RGB bears the smallest average difference between source and texture image in the vast majority of cases, with better quality in the Fluor and LED scenarios. It also shows the least variability of all (CIEDE2000 between 2 and 4, which implies a non-noticeable or barely noticeable colour difference).
The Pearson correlation (see the Supplementary Materials) between the original patches and the hues of the texture maps is in the vast majority of cases above 95%. Some sparse weakly correlated (around 30% or 70%) or uncorrelated (0% or negative) patches can be found, especially in the Sun set or in sRGB models, where colour differences are bigger and colour textures are ruptured by mispixelling effects (a mix of the calibration assignment and the subsequent smoothing of the photogrammetry). However, wide RGB models show the largest amount of extremely positive correlations (>96% for Mixed, >98% for Fluor, LED and Sun, with sparse outliers in each), indicating that the integrity of the colour information is respected.

6. Conclusions

In this paper, an empirical study has been performed in order to deduce the optimum operational conditions for preserving colour information integrity in cultural heritage photogrammetry. Since colour is an important characteristic of material cultural heritage pieces, its preservation during documentation is a crucial aspect. Colour information can be affected by external factors, such as the illumination or its digitalization; it is therefore especially important to know thoroughly how the operational conditions and the employed operations can affect it, in order to proceed adequately.
It is desirable to maintain a degree of control over how the information is handled, thereby avoiding operational black boxes.
A variety of conditions has been explored, such as the scene illumination, the raw-to-RGB transformation parameters (debayering), and the presentation colour space. A custom multidimensional calibration process with state-of-the-art results [8] has been employed so that any colorimetric bias is removed; the image batches used for photogrammetry are thus homogenized and represent faithful colour information. The texture generation stage is the only fully automated step in the process, where the blending of colours from multiple input images is not easily traceable, so its effects are evaluated last.
Wide RGB has proved to be the most effective colour space at avoiding miscolouring in the final images (caused by hue saturation at the gamut borders). It is also the space that works best for removing colorimetric bias during calibration, maintaining texture, and delivering the most stable and highest preservation accuracy. Combined with controlled illumination under LED or fluorescent lamps, illuminants of moderate brightness compared to natural sources and with temperatures around the central section of the spectrum, the highest quality is achieved.
It must also be ensured that the reference colours of the colour chart surround, in colour space, the colour values of the captured object. The calibration will then function properly, homogenizing the pictures without provoking miscolourings due to value extrapolation, as seen for the table in the shown Sun and LED sets. The choice of illumination and chart is important [8] and must be made carefully depending on the characteristics of the object.
Another consideration is the debayering strategy, i.e., the transformation from raw colour values to RGB that imitates the camera's acquisition block. By relying on a simple manual process that discards the pixels not covered by the Bayer filter, or on a simple bilinear interpolation, the integrity of the information is better preserved than when relying on automated black-box processes. Any subsequent visual enhancement of the raw-to-RGB information is performed within the calibration step instead of during the debayering; any additional operations that might further modify the colour are therefore unnecessary.
With the conclusions exposed above, guidelines for the optimum handling of colour information have been established. Reproducing these steps and conditions in any cultural heritage digitalization process, after studying the characteristics of the object, will ensure minimal loss or fabrication of colour.
Future lines of research will extend these guidelines to the proper reproduction of other physical characteristics of the pieces, such as 3D details. Furthermore, an interesting aspect of the presented process is its scalability: future improvements in colour reproduction technology are compatible with it, since the acquired raw data can be transformed into any other existing RGB space, including wider spaces that will most likely be standardized in the future. This will enable already-made models to be updated over time with newer technologies that improve their quality and the documentation experience.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/heritage6080300/s1.

Author Contributions

Conceptualization: M.A.B.-Á., S.B., R.S. and J.M.M.; methodology: M.A.B.-Á. and S.B.; software: M.A.B.-Á. and S.B.; validation: M.A.B.-Á. and S.B.; formal analysis: M.A.B.-Á. and S.B.; investigation: M.A.B.-Á. and S.B.; resources: S.B. and R.S.; data curation: S.B.; writing—original draft preparation: M.A.B.-Á. and S.B.; writing—review and editing: M.A.B.-Á., S.B., and J.M.M.; visualization: M.A.B.-Á. and S.B.; supervision: R.S. and J.M.M.; project administration: R.S.; funding acquisition: R.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Austrian Science Fund (FWF) grant number P 33721-G.

Conflicts of Interest

The authors declare no conflict of interest.

Note

1. This work was performed within the project "Etruscan Mirrors in Austria (EtMirA)", Austrian Science Fund, grant no. P 33721-G.

References

  1. Arnold, D. Computer Graphics and Cultural Heritage: From One-Way Inspiration to Symbiosis, Part 1. IEEE Comput. Graph. Appl. 2014, 34, 76–86. [Google Scholar] [CrossRef]
  2. Torres, J.C.; López, L.; Romo, C.; Arroyo, G.; Cano, P.; Lamolda, F.; Villafranca, M.M. Using a Cultural Heritage Information System for the documentation of the restoration process. In Proceedings of the 2013 Digital Heritage International Congress (DigitalHeritage), Marseille, France, 28 October–1 November 2013; Volume 2, pp. 249–256. [Google Scholar] [CrossRef]
  3. Nespeca, R. Towards a 3D digital model for management and fruition of Ducal Palace at Urbino. An integrated survey with mobile mapping. SCIRES-IT 2018, 8, 1–14. [Google Scholar] [CrossRef]
  4. Liyu, F.; Chenchen, H.; Yi, S. Application modes of virtual restoration and reconstruction technology in protection and presentation for cultural heritage in China. In Proceedings of the 2013 Digital Heritage International Congress (DigitalHeritage), Marseille, France, 28 October–1 November 2013; Volume 2, pp. 353–356. [Google Scholar] [CrossRef]
  5. He, N.; Li, J.; Wang, K.; Guo, L. A Study on the Expression of Jingchu Cultural Heritage. In Proceedings of the 2021 International Conference on Culture-oriented Science & Technology (ICCST), Beijing, China, 18–21 November 2021; pp. 409–412. [Google Scholar] [CrossRef]
  6. McCarthy, J. Multi-image photogrammetry as a practical tool for cultural heritage survey and community engagement. J. Archaeol. Sci. 2014, 43, 175–185. [Google Scholar] [CrossRef]
  7. Brown, M.S. Understanding the In-Camera Image Processing Pipeline for Computer Vision; IEEE CVPR 2016; National University of Singapore: Singapore, 2016. [Google Scholar]
  8. Barbero-Álvarez, M.A.; Rodrigo, J.A.; Menéndez, J.M. Self-Designed Colour Chart and a Multi-Dimensional Calibration Approach for Cultural Heritage Preventive Preservation. IEEE Access 2021, 9, 138371–138384. [Google Scholar] [CrossRef]
  9. Boochs, F.; Bentkowska-Kafel, A.; Degrigny, C.; Karaszewski, M.; Karmacharya, A.; Kato, Z.; Marcello, P.; Sitnik, R.; Treméau, A.; Tsiafaki, D.; et al. Colour and Space in Cultural Heritage: Interdisciplinary Approaches to Documentation of Material Culture. Int. J. Herit. Digit. Era 2014, 3, 713–730. [Google Scholar] [CrossRef]
  10. Russ, J.C.; Neal, F.B. The Image Processing Handbook, 7th ed.; CRC Press, Inc.: Boca Raton, FL, USA, 2015. [Google Scholar]
  11. Emmel, P.; Hersch, R. Colour calibration for colour reproduction. In Proceedings of the 2000 IEEE International Symposium on Circuits and Systems (ISCAS), Geneva, Switzerland, 28–31 May 2000; Volume 5, pp. 105–108. [Google Scholar] [CrossRef]
  12. Ohta, N. Colorimetry, Fundamentals and Applications; Wiley–IST Series in Imaging Science and Technology; John Wiley and Sons, Ltd.: Hoboken, NJ, USA, 2005. [Google Scholar]
  13. Bandara, R. A Music Keyboard with Gesture Controlled Effects Based on Computer Vision. Ph.D. Thesis, University of Sri Jayewardenepura, Nugegoda, Sri Lanka, 2011. [Google Scholar]
  14. Canon. White Paper New Generation 2/3-inch 4K UHD Long Zoom EFP Lenses; Canon Europe: Hillingdon, UK, 2017. [Google Scholar]
  15. Park, H.W.; Choi, J.W.; Choi, J.Y.; Joo, K.K.; Kim, N.R. Investigation of the Hue-Wavelength Response of a CMOS RGB-Based Image Sensor. Sensors 2022, 22, 9497. [Google Scholar] [CrossRef] [PubMed]
  16. Dong, X.; Xu, W.; Miao, Z.; Ma, L.; Zhang, C.; Yang, J.; Jin, Z.; Teoh, A.B.J.; Shen, J. Abandoning the Bayer-Filter to See in the Dark. arXiv 2022, arXiv:2203.04042. [Google Scholar]
  17. Johnston-Feller, R. Color Science in the Examination of Museum Objects: Nondestructive Procedures; Getty Publications: Los Angeles, CA, USA, 2001. [Google Scholar]
  18. Molada, A.; Marqués-Mateu, A.; Lerma, J. Correct use of color for Cultural Heritage documentation. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, IV-2/W6, 107–113. [Google Scholar] [CrossRef]
  19. Wang, K.A.; Liao, Y.C.; Tsai, M.T.; Chan, P.C. Research and Practice of Cultural Heritage Promotion: The Case Study of Value Add Application for Folklore Artifacts. In Proceedings of the 2012 International Symposium on Computer, Consumer and Control, Taichung, Taiwan, 4–6 June 2012; pp. 610–613. [Google Scholar] [CrossRef]
  20. Sun, W.; Li, H.G.; Xu, X. Research on Key Technologies of Three-dimensional Digital Reconstruction of Cultural Heritage in Historical and Cultural Blocks. In Proceedings of the 2021 International Conference on Computer Technology and Media Convergence Design (CTMCD), Sanya, China, 23–25 April 2021; pp. 222–226. [Google Scholar] [CrossRef]
  21. Karaszewski, M.; Lech, K.; Bunsch, E.; Sitnik, R. In the Pursuit of Perfect 3D Digitization of Surfaces of Paintings: Geometry and Color Optimization. In Proceedings of the Digital Heritage. Progress in Cultural Heritage: Documentation, Preservation, and Protection, Limassol, Cyprus, 3–8 November 2014; Ioannides, M., Magnenat-Thalmann, N., Fink, E., Žarnić, R., Yen, A.Y., Quak, E., Eds.; Springer International Publishing: Cham, Switzerland, 2014; pp. 25–34. [Google Scholar]
  22. RawPy. RawPy-API. 2023. Available online: https://letmaik.github.io/rawpy/api/ (accessed on 2 July 2023).
  23. Hirakawa, K.; Parks, T. Adaptive Homogeneity-Directed Demosaicing Algorithm. IEEE Trans. Image Process. 2005, 14, 360–369. [Google Scholar] [CrossRef] [PubMed]
  24. Griwodz, C.; Gasparini, S.; Calvet, L.; Gurdjos, P.; Castan, F.; Maujean, B.; Lillo, G.D.; Lanthony, Y. AliceVision Meshroom: An open-source 3D reconstruction pipeline. In Proceedings of the 12th ACM Multimedia Systems Conference, MMSys’21, Istanbul, Turkey, 28 September–1 October 2021; ACM Press: New York, NY, USA, 2021. [Google Scholar] [CrossRef]
  25. Burt, P.J.; Adelson, E.H. A Multiresolution Spline with Application to Image Mosaics. ACM Trans. Graph. 1983, 2, 217–236. [Google Scholar] [CrossRef]
  26. Lévy, B.; Petitjean, S.; Ray, N.; Maillot, J. Least Squares Conformal Maps for Automatic Texture Atlas Generation. ACM Trans. Graph. 2002, 21, 362–371. [Google Scholar] [CrossRef]
  27. Luo, M.; Cui, G.; Rigg, B. The development of the CIE 2000 colour-difference formula: CIEDE2000. Color Res. Appl. 2001, 26, 340–350. [Google Scholar] [CrossRef]
  28. ITU-R. Recommendation ITU-R BT.709-6. 2015. Available online: https://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.709-6-201506-I!!PDF-E.pdf (accessed on 2 July 2023).
  29. Adobe. Adobe RGB (1998) Color Image Encoding. 2005. Available online: https://www.adobe.com/digitalimag/adobergb.html (accessed on 2 July 2023).
  30. Pascale, D. A Review of RGB Color Spaces. 2003. Available online: https://babelcolor.com/index_htm_files/A%20review%20of%20RGB%20color%20spaces.pdf (accessed on 2 July 2023).
  31. Barbero-Álvarez, M.A.; Menéndez, J.M.; Rodrigo, J.A. An Adaptive Colour Calibration for Crowdsourced Images in Heritage Preservation Science. IEEE Access 2020, 8, 185093–185111. [Google Scholar] [CrossRef]
Figure 1. Conceptual view of a textured 3D model of a Cultural Heritage piece, consisting of a triangular 3D mesh with a colour texture mapped to its surface.
Figure 2. Process workflow of the presented algorithm. The yellow frame shows the stages where RGB data are controlled.
Figure 3. Representation of the CIE xy values corresponding to the pixels of one of the images under analysis. The blue triangle corresponds to sRGB gamut, and the red triangle to the gamut of wide RGB. (Left) Presentation-ready picture in sRGB rendered with the camera’s default colour profile; (Center) points in black show the distributions of the camera RGB data associated to the picture; (Right) the same data after being converted to sRGB. It is to note that after conversion, the colour values stay within the limits of the chosen standardized gamut.
Figure 4. Example of texture map: areas of the texture map (left) are referenced from the triangles of the 3D model (right).
Figure 5. Gamut coverage of the CIE xy plane of the digital RGB spaces considered in this work.
Figure 6. Photogrammetric image acquisition setups. (Top row) environment and lighting. (Bottom row) examples of acquired images (rendered with the camera’s default colour profile).
Figure 7. From left to right: calibrated sample image of the Mixed set after being transformed into sRGB, Adobe RGB, and wide RGB. It is clearly seen that the wide RGB enables the best neutralization of colour bias in the images.
Figure 8. (Left column) Sample images of test set Mixed and Fluor; (Right column) same images after undergoing the process after Direct debayering in wide RGB, ready for photogrammetry.
Figure 9. From left to right: calibrated sample image of the Fluor set after being transformed into sRGB, Adobe RGB, and wide RGB. It is to be noticed how a bigger gamut (as in wide RGB) permits the presence of a wider variety of hues in the image, while the saturation of values in the other two spaces provokes miscolouring due to saturation.
Figure 10. From left to right: calibrated sample image of the LED set after being transformed into sRGB, Adobe RGB, and wide RGB. Due to the illuminant source, the table is corrected as gray whilst the colours of high saturation—as the chart—are preserved.
Figure 11. From left to right: calibrated sample image of the Sun set after being transformed into sRGB, Adobe RGB, and wide RGB. The multidimensional calibration function based on the saturated colours in the original picture provokes artifacts, pixelings, and miscolouring of the hues in subspaces distant to the chart values, as in the background of the image.
Figure 12. (Left) example of model acquired with photogrammetry, of the Mixed scenario; (Right) detail of the same model where visual uniformity-rupturing defects can be seen scattered around the areas with less detailed coverage under acquisition, such as the colour chart.
Table 1. Table depicting the primary xy coordinates for different RGB digitalisations.
List of Primary Chromaticity Coordinates for Different RGB Digitalisations

| Space | R Primary (x, y) | G Primary (x, y) | B Primary (x, y) | Reference White (x, y) |
|---|---|---|---|---|
| sRGB | 0.64, 0.33 | 0.30, 0.60 | 0.15, 0.06 | 0.3127, 0.329 |
| Adobe RGB | 0.64, 0.33 | 0.21, 0.71 | 0.15, 0.06 | 0.314, 0.351 |
| Wide RGB | 0.73, 0.27 | 0.12, 0.83 | 0.15, 0.02 | 0.3457, 0.3585 |
Table 2. Experimental photogrammetry setups—overview.
| Setup ID | Sun | Fluor | LED | Mixed |
|---|---|---|---|---|
| Environment | laboratory | laboratory | laboratory | museum |
| Lighting | indirect sunlight | fluorescent room light | white LED | 2x halogen softboxes + fluorescent room light |
| Acquisition mode | moving camera | moving camera | moving camera | turntable |
| Lens | Nikkor | Nikkor | Nikkor | Tamron |
| Aperture | f/10 | f/10 | f/10 | f/16 |
| ISO | 500 | 640 | 640 | 1000 |
| Exposure | 1/40 s | 1/30 s | 1/10 s | 1/60 s |
Table 3. Table showing the average CIEDE2000 differences for all patches of the colour chart between the reference image and the photogrammetry image. The results are calculated for two illuminants, D65 and D50, for every debayering technique. The most compelling results are marked bold.
| Metric | Discard ΔE-D50 | Discard ΔE-D65 | Interp. ΔE-D50 | Interp. ΔE-D65 | Direct ΔE-D50 | Direct ΔE-D65 |
|---|---|---|---|---|---|---|
| Mixed-Adobe | 12.422 | 12.728 | 12.157 | 12.366 | 12.279 | 12.483 |
| Mixed-sRGB | 12.743 | 13.034 | 12.962 | 13.276 | 14.284 | 14.514 |
| Mixed-Wide | 11.313 | 11.296 | 11.237 | 11.247 | 11.033 | 11.049 |
| Fluor-Adobe | 4.207 | 4.648 | 4.215 | 4.640 | 4.610 | 4.861 |
| Fluor-sRGB | 5.242 | 5.493 | 5.344 | 5.562 | 8.420 | 8.441 |
| Fluor-Wide | 4.584 | 4.765 | 4.524 | 4.671 | 4.332 | 4.380 |
| LED-Adobe | 6.175 | 6.154 | 5.974 | 6.001 | 6.440 | 6.512 |
| LED-sRGB | 6.352 | 6.274 | 6.220 | 6.120 | 6.314 | 6.220 |
| LED-Wide | 5.723 | 5.518 | 5.480 | 5.331 | 5.982 | 5.822 |
| Sun-Adobe | 8.592 | 8.820 | 8.968 | 9.279 | 10.574 | 10.980 |
| Sun-sRGB | 8.640 | 9.051 | 9.538 | 9.845 | 13.525 | 13.640 |
| Sun-Wide | 8.109 | 8.230 | 8.127 | 8.411 | 8.849 | 8.961 |
Table 4. Standard deviations of the CIEDE2000 differences over all patches of the colour chart between the reference image and the photogrammetry image, computed for two illuminants (D50 and D65) and for every debayering technique. The most compelling results are marked bold.
| Metric | Discard ΔE-D50 | Discard ΔE-D65 | Interp. ΔE-D50 | Interp. ΔE-D65 | Direct ΔE-D50 | Direct ΔE-D65 |
|---|---|---|---|---|---|---|
| Mixed-Adobe | 5.831 | 5.914 | 5.816 | 5.864 | 4.248 | 4.272 |
| Mixed-sRGB | 4.990 | 5.623 | 4.906 | 5.664 | 4.811 | 5.381 |
| Mixed-Wide | 4.381 | 4.291 | 4.229 | 4.146 | 4.086 | 4.022 |
| Fluor-Adobe | 2.246 | 2.956 | 2.457 | 3.285 | 2.840 | 3.139 |
| Fluor-sRGB | 2.452 | 2.686 | 3.127 | 3.380 | 12.043 | 11.061 |
| Fluor-Wide | 2.597 | 2.591 | 2.900 | 2.928 | 3.034 | 2.668 |
| LED-Adobe | 2.825 | 2.80 | 3.169 | 3.288 | 2.692 | 3.014 |
| LED-sRGB | 3.182 | 3.032 | 3.320 | 2.960 | 2.734 | 2.604 |
| LED-Wide | 2.226 | 1.795 | 2.265 | 1.978 | 2.255 | 2.006 |
| Sun-Adobe | 4.338 | 4.582 | 4.286 | 4.341 | 6.745 | 7.635 |
| Sun-sRGB | 4.218 | 4.271 | 4.263 | 4.421 | 12.462 | 11.751 |
| Sun-Wide | 4.987 | 4.882 | 4.858 | 4.905 | 5.054 | 4.802 |
Share and Cite

MDPI and ACS Style

Barbero-Álvarez, M.A.; Brenner, S.; Sablatnig, R.; Menéndez, J.M. Preserving Colour Fidelity in Photogrammetry—An Empirically Grounded Study and Workflow for Cultural Heritage Preservation. Heritage 2023, 6, 5700-5718. https://doi.org/10.3390/heritage6080300