Article

Mars-On-Orbit Color Image Spectrum Model and Color Restoration

Hongfeng Long, Sainan Liu, Yuebo Ma, Junzhe Zeng, Kaili Lu and Rujin Zhao *
1 Institute of Optics and Electronics, Chinese Academy of Sciences, Chengdu 610209, China
2 National Laboratory on Adaptive Optics, Chengdu 610209, China
3 Key Laboratory of Science and Technology on Space Optoelectronic Precision Measurement, Chinese Academy of Sciences, Chengdu 610209, China
4 University of Chinese Academy of Sciences, Beijing 100149, China
* Author to whom correspondence should be addressed.
Aerospace 2025, 12(8), 696; https://doi.org/10.3390/aerospace12080696
Submission received: 3 June 2025 / Revised: 12 July 2025 / Accepted: 30 July 2025 / Published: 4 August 2025
(This article belongs to the Section Astronautics & Space Science)

Abstract

Deep space Color Remote Sensing Images (DCRSIs) are of great significance for reconstructing the three-dimensional appearance of celestial bodies. Deep space color restoration, as a means of ensuring the color fidelity of deep space images, therefore has significant research value. Existing deep space color restoration methods have gradually evolved into a joint restoration mode that integrates color images and spectrometer data to overcome the limitations of on-orbit calibration plates; however, there is limited research on theoretical models for this type of method. This article therefore begins with the physical process of deep space color imaging, establishes a color imaging spectrum model step by step, and proposes a new color restoration method for Mars remote sensing images. Experiments verify that the proposed method significantly reduces color deviation, achieving an average of 8.43 CIE DE 2000 color deviation units: a decrease of 2.63 (23.78%) compared with the least squares method and of 21.47 (71.81%) compared with the unrestored images. Hence, our method improves the accuracy of on-orbit color restoration of DCRSIs.

1. Introduction

Deep space color remote imaging is a crucial technological path for humans to explore geometric structures, surface characteristics, and compositions beyond the solar system and Earth's gravitational field [1,2,3]. The color information of a celestial body's surface, obtained through deep space color imaging, is an important source for studying its geometric structure, surface characteristics, and composition [4,5,6,7]. However, deep space images exhibit noticeable color deviations due to the unique nature of the deep space environment and the inconsistency between the camera's color response characteristics and those of the human eye. Color restoration is the most scientific and practical way to solve this color deviation problem [8,9,10].
Unlike ground color restoration [11,12,13], deep space color restoration faces two special challenges: (1) the absence of an atmosphere in the deep space environment makes ground calibration parameters unsuitable, resulting in color distortion, as seen in lunar and Mars exploration; (2) the lack of prior information about unknown environments makes existing methods based on learning or the dark channel prior inapplicable. It is therefore natural to collect data directly on-orbit and perceive the characteristics of the deep space environment to recover the actual colors.
The earliest color restoration suitable for deep space environments is the image-based method [14,15,16]. To overcome the imaging differences between the deep space environment and the ground, using a color standard plate is a common and reliable choice. The "Curiosity" Mars rover (2011) is equipped with three on-orbit color standard plates, and the "Perseverance" Mars rover (2020) carries more refined ones, as shown in Figure 1a,b. In 2020, China's "Zhurong" Mars rover also carried similar color standard plates, as shown in Figure 1c.
Based on such equipment, numerous image-based studies have proposed solutions. For example, Zhao Rujin et al. [20,21] designed a two-step color correction method that combines color restoration and white balance, achieving satisfactory results in ground verification, as shown in Figure 2a,b. Ron L. Levin and Gilbert V. Levin utilized an active light source to calibrate color intensity and applied this technique to the Viking and Pathfinder missions [22], as shown in Figure 2c,d. Shi Z et al. proposed a color restoration subnetwork and an enhancer subnetwork to correct image color deviation [23,24]. Liu W et al. then framed image color transfer and restoration as a reversible information hiding problem and proposed an internal correlation architecture [25]. The advantages of the image-based approach lie in its ease of implementation and good robustness. Its shortcoming lies in its dependence on the color fidelity of the standard plate: long-term space radiation degrades and distorts the plate's colors, and the plate itself consumes a significant mass budget.
In the absence of a standard color chart, accurately and consistently restoring color is a key issue. It is therefore necessary to introduce new information and establish a color restoration model suitable for deep space environments [26,27]. The reference spectroscopy approach to color correction compensates for the difference between the illumination spectrum of the extraterrestrial environment and that of the ground by utilizing spectral information obtained from multispectral cameras or spectrometers. In 2016, Park, Chulhee and Kang, Moon Gi proposed a color restoration method for RGBN multispectral filter array sensor images based on spectral decomposition [28]. In 2020, Wang H et al. proposed a novel color restoration method based on a normalized RGB color model using the Mars exploration multispectral camera [29]. Similar methods have been applied to full-waveform multispectral LiDAR [30]. The latest research comes from Ren Xin [31], who combined a mineral spectrometer with an RGB camera on "Tianwen-1" to achieve color correction and balance across Mars [32,33,34], as shown in Figure 3.
This approach offers significant advantages in color difference evaluation compared with purely image-based methods. However, it often requires a large number of empirical formulas and parameters, and its theoretical basis is relatively underexplored; there is little interaction between deep space color imaging and deep space spectroscopy at the level of the physical model.
In summary, existing on-orbit color restoration methods are primarily based on on-orbit color calibration plates, with a few methods combining camera and spectral information. However, the methods that combine spectral information lack an accurate model of the spectral effects in the imaging process: the impact of spectral information is often only described qualitatively, without a quantitative model. On-orbit color restoration for deep space exploration therefore remains a difficult problem in aerospace technology.
To address the lack of prior color information and of image spectrum models, this paper proposes a color image spectrum model that quantitatively describes the color imaging process and establishes a mathematical relationship between deep space environmental reflectance spectra and imaging colors. The main benefits can be summarized as follows:
  • Unlike models based on tristimulus functions, the color imaging spectrum model starts from the radiant energy spectrum of ambient light and quantitatively models the imaging chain, considering factors such as the optical lens, the Bayer filter array, and the quantum characteristics of the image sensor.
  • Unlike subjective indicators based on tristimulus functions, the color spectrum model proposed in this paper relies only on measurable physical quantities, making it more objective.
  • In solving the CCM, the least squares method is first used to estimate the CCM; a CIE DE 2000 color deviation weight matrix is then designed to optimize the CCM, effectively reducing color deviation and improving restoration accuracy.

2. Related Works

In the Introduction, we outlined the development of in-orbit color restoration technology: from methods based on in-orbit calibration plates, to methods based on image information, and gradually to methods that fuse spectral information. This section mainly discusses in-orbit color restoration technology that integrates spectral information.
Hong Wang et al. [29] proposed a new color calibration method in a brightness-independent chromaticity space to eliminate the influence of light source brightness and relative spectral distribution on color calibration. The core idea is to decompose the complex imaging mechanism into the product of a brightness component and a spectral component. By normalizing the RGB color space to eliminate differences in the brightness component, color calibration can then be achieved by simply computing a 4 × 4 matrix in the normalized RGB space. This method still relies on a color calibration board to obtain true color values but eliminates the influence of brightness during the experiment.
Chulhee Park et al. [28] proposed a color restoration method for an imaging system based on the MSFA image sensor with RGBN filters. The proposed color restoration method estimates the spectral intensity in the NIR band and recovers hue and color saturation by decomposing the visible band component and the NIR band component in each RGB color channel. However, this method still requires imaging of the standard color palette to obtain a color reference.
In summary, although existing in-orbit color restoration methods have many innovations in processing color information, they all rely on imaging data of standard color plates, and this dependence is one of the key factors restricting the development of in-orbit color restoration technology.

3. Materials and Methods

3.1. Color Image Spectrum Model

The physical process of color optical imaging can be described as follows: the sunlight reflected by the target object is received by the optical lens of the deep space camera and then transmitted through the Bayer color filter to illuminate the imaging sensor. Finally, the imaging sensor converts light energy into electrical signals through the photoelectric effect, thereby obtaining the target image. The details of the above processes are shown in Figure 4.
Three key physical processes are involved in Figure 4, including optical transmission, photoelectric conversion, and the Bayer color filter process.

3.1.1. Optical Transmission Model

The optical transmission model describes the process by which reflected light from the target passes through the optical lens and illuminates the image sensor.
In the color imaging spectrum model, we are not concerned with the specific location of the target in the image but rather with its grayscale value, which is the output of the color image sensor. Hence, from the perspective of energy transmission, we naturally simplify the complex effect of optical lenses on reflected light and convert it into an optical transmission model, as shown in (1).
$$\Phi_{lens}(\lambda) = \frac{\Omega_{lens}}{\Omega_0}\, \Phi_0(\lambda)\, m(\lambda) \quad (1)$$
where $\Phi_0(\lambda)$ is the emission radiance spectrum of the target, $\Phi_{lens}(\lambda)$ is the incident radiance spectrum at the image sensor, $m(\lambda)$ is the lens transmittance, and $\Omega_{lens}$ and $\Omega_0$ are the lens solid angle and the target emission solid angle, respectively.
In the optical transmission model, we use the radiance spectrum (such as $\Phi_0(\lambda)$) to represent the energy distribution of light, and the ratio $\Omega_{lens}/\Omega_0$ to model the lens's ability to focus and diverge light rays. $m(\lambda)$ describes the transmission of light through the lens material itself. We neglect the impact of the lens field of view on energy, which is relatively small.
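As a quick illustration, (1) is a simple wavelength-wise scaling of the target spectrum. The following minimal Python sketch applies it to a sampled spectrum; the array values and the solid angle ratio are illustrative placeholders, not mission parameters.

```python
import numpy as np

# Minimal sketch of the optical transmission model (1).
# The sampled spectrum and solid angle ratio below are placeholders.
wavelengths = np.arange(380.0, 781.0, 5.0)   # wavelength grid in nm
phi_0 = np.ones_like(wavelengths)            # target radiance spectrum Phi_0(lambda)
m_lens = np.full_like(wavelengths, 0.7)      # lens transmittance m(lambda)
omega_ratio = 1e-6                           # Omega_lens / Omega_0 (geometry-dependent)

# Eq. (1): spectrum incident on the image sensor after passing the lens
phi_lens = omega_ratio * phi_0 * m_lens
```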

3.1.2. Photoelectric Conversion Model

The photoelectric conversion model describes the process of converting light irradiated on an image sensor into electrical signals. This model primarily establishes the relationship between the energy distribution of light, image sensor parameters, and image gray-scale values, as shown in (2).
$$N_0(\lambda) = \frac{\Phi_{lens}(\lambda)\, t\, S_0}{E_\lambda} \quad (2)$$
where $N_0(\lambda)$ is the number of photons at wavelength $\lambda$, $S_0$ is the target luminous area, $t$ is the integration time, and $E_\lambda = hc/\lambda$ is the energy of a photon at wavelength $\lambda$ ($h$ is the Planck constant and $c$ is the speed of light).
In (2), we convert the distribution of light energy into a distribution of photon numbers because the photoelectric conversion efficiency of an image sensor is specified in terms of photon counts. The relationship between the gray-scale value $DN$ of the image and the number of photons is shown in (3).
$$DN = \int_{\lambda_1}^{\lambda_2} N_0(\lambda)\, QE(\lambda)\, G \, d\lambda \quad (3)$$
where $DN$ is the gray-scale value of the image, $QE(\lambda)$ is the quantum efficiency of the image sensor, and $G$ is the conversion factor.
In (3), $QE(\lambda)$ represents the conversion efficiency of the image sensor for light of different wavelengths, while $G$ represents its ability to convert photons into electrons.
Substituting (2) into (3), we obtain the relationship between the image gray-scale value $DN$ and the incident radiance spectrum of the image sensor $\Phi_{lens}(\lambda)$ (also referred to as the light energy distribution), as shown in (4).
$$DN = \int_{\lambda_1}^{\lambda_2} \frac{\Phi_{lens}(\lambda)\, t\, S_0}{E_\lambda}\, QE(\lambda)\, G \, d\lambda \quad (4)$$
However, (4) can only depict panchromatic imaging, since it ignores the spectral selection effect of the Bayer color filter; the model therefore needs further adjustment. Generally, $\lambda_1$ and $\lambda_2$ cover the entire visible spectrum (380–780 nm).
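To make the integral in (4) concrete, the sketch below evaluates it numerically on a uniformly sampled wavelength grid. The function name and the rectangle-rule integration are our own illustrative choices; only the physical constants are standard.

```python
import numpy as np

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s

def panchromatic_dn(wavelengths_nm, phi_lens, qe, t, s0, g):
    """Numerically evaluate (2)-(4): convert the incident radiance spectrum
    to a photon-number spectrum, weight it by quantum efficiency, and
    integrate over wavelength to get the gray-scale value DN."""
    lam = wavelengths_nm * 1e-9                  # nm -> m
    e_photon = H * C / lam                       # photon energy E_lambda, Eq. (2)
    n_photons = phi_lens * t * s0 / e_photon     # photon-number spectrum N_0(lambda)
    dlam = lam[1] - lam[0]                       # uniform grid assumed
    return g * np.sum(n_photons * qe) * dlam     # Eqs. (3)/(4), rectangle rule
```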

3.1.3. Bayer Filter Spectrum Model

The Bayer filter spectrum model describes the spectral selection effect of the Bayer color filter array. The basic mechanism of the array is that each filter has high transmittance for a specific wavelength range while reflecting or absorbing light of other wavelengths. We therefore establish a simple transmittance model (5) to describe this characteristic.
$$m_{bayer}(\lambda) = \begin{cases} m(\lambda), & \lambda_{low} < \lambda < \lambda_{upper} \\ 0, & \text{otherwise} \end{cases} \quad (5)$$
where $m(\lambda)$ is the lens transmittance (as in (1)), and $\lambda_{low}$ and $\lambda_{upper}$ are the lower and upper wavelength bounds of the passband of the corresponding primary color (red, green, or blue), respectively.

3.1.4. Brief Summary

Combining (5), (4), and (1), the color image spectrum model incorporating the Bayer color filter effect is obtained, as shown in (6).
$$DN_r = \int_{\lambda_{r1}}^{\lambda_{r2}} \frac{\Phi_{lens}(\lambda)\, t\, S_0}{E_\lambda}\, QE(\lambda)\, G \, d\lambda,\qquad DN_g = \int_{\lambda_{g1}}^{\lambda_{g2}} \frac{\Phi_{lens}(\lambda)\, t\, S_0}{E_\lambda}\, QE(\lambda)\, G \, d\lambda,\qquad DN_b = \int_{\lambda_{b1}}^{\lambda_{b2}} \frac{\Phi_{lens}(\lambda)\, t\, S_0}{E_\lambda}\, QE(\lambda)\, G \, d\lambda \quad (6)$$
where $(DN_r, DN_g, DN_b)$ are the color values output by the image sensor, and $(\lambda_{r1}, \lambda_{r2})$, $(\lambda_{g1}, \lambda_{g2})$, and $(\lambda_{b1}, \lambda_{b2})$ are the wavelength ranges of red, green, and blue light, respectively. These values can be found in the image detector's datasheet; for the CMV12000, $(\lambda_{r1}, \lambda_{r2}) = (580\ \mathrm{nm}, 800\ \mathrm{nm})$, $(\lambda_{g1}, \lambda_{g2}) = (450\ \mathrm{nm}, 610\ \mathrm{nm})$, and $(\lambda_{b1}, \lambda_{b2}) = (380\ \mathrm{nm}, 550\ \mathrm{nm})$.
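The per-channel version of the model only changes the integration limits. A sketch under the same assumptions as above, using the CMV12000 passbands quoted in the text:

```python
import numpy as np

H, C = 6.626e-34, 2.998e8

def channel_dn(band_nm, wavelengths_nm, phi_lens, qe, t, s0, g):
    """One line of (6): the Bayer filter model (5) zeroes the integrand
    outside the channel passband, so we integrate only inside it."""
    lo, hi = band_nm
    lam = wavelengths_nm * 1e-9
    mask = (wavelengths_nm >= lo) & (wavelengths_nm <= hi)
    integrand = phi_lens * t * s0 * qe * g / (H * C / lam)
    dlam = lam[1] - lam[0]                        # uniform grid assumed
    return np.sum(integrand[mask]) * dlam

# CMV12000 passbands quoted in the text (nm)
bands = {"r": (580, 800), "g": (450, 610), "b": (380, 550)}
# dn = {ch: channel_dn(b, wavelengths, phi_lens, qe[ch], t, s0, g)
#       for ch, b in bands.items()}
```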

3.2. Deep Space Color Restoration Combined with On-Orbit Spectrum

The basic principle of color restoration is to directly establish a mapping relationship between the standard color values and the output of the image sensor. Since the basic elements of color are the three primary colors, this mapping can be modeled as a 3 × 3 matrix, namely the color correction matrix (CCM).
$$\begin{bmatrix} r_c \\ g_c \\ b_c \end{bmatrix} = \begin{bmatrix} C_{11} & C_{12} & C_{13} \\ C_{21} & C_{22} & C_{23} \\ C_{31} & C_{32} & C_{33} \end{bmatrix} \begin{bmatrix} DN_r \\ DN_g \\ DN_b \end{bmatrix} = CCM \begin{bmatrix} DN_r \\ DN_g \\ DN_b \end{bmatrix} \quad (7)$$
where $(r_c, g_c, b_c)$ are the standard colors and $(DN_r, DN_g, DN_b)$ are the color values output by the image sensor.
On the ground, the core of color restoration is simply the accurate calculation of the CCM from known standard colors and the corresponding image sensor outputs. Both are readily available: the required data can be obtained by imaging standard color palettes.
However, in a deep space environment without standard color palettes, color restoration faces a serious new problem: how to obtain the true color values. We therefore propose a deep space color restoration method combined with on-orbit spectral data. Its core innovation lies in using the on-orbit spectrum and the image sensor characteristics to calculate the true color values of the deep space environment.
The overall process of the proposed method is shown in Figure 5.
In Figure 5, our proposed method consists of two key parts: (1) the deep space true color calculation module computes the deep space true color values $(r_c, g_c, b_c)$ from the environmental spectrum collected by the mineral spectrometer; (2) the color deviation iterative optimization algorithm obtains the optimal CCM from $(r_c, g_c, b_c)$ and the outputs of the image sensor. The details are as follows.

3.2.1. Deep Space True Color Calculation Module

Based on the color image spectrum model detailed in Section 3.1, the key factors determining color differences are the quantum efficiency $QE(\lambda)$, the emission radiance spectrum of the target $\Phi_0(\lambda)$, and the lens transmittance $m(\lambda)$. They represent the selective effects on the spectrum of the image sensor, the deep space environment, and the optical system, respectively.
Considering that in actual camera design the lens transmittance curve can be measured and adjusted, the model simplifies $m(\lambda)$ to a fixed value (typically 0.7) within the transparent waveband. By merging the wavelength-independent terms in (6), the final (8) is obtained as follows:
$$DN_r = \int_{600\,\mathrm{nm}}^{800\,\mathrm{nm}} \Phi_{lens}(\lambda)\, QE(\lambda)\, K \, d\lambda,\qquad DN_g = \int_{450\,\mathrm{nm}}^{650\,\mathrm{nm}} \Phi_{lens}(\lambda)\, QE(\lambda)\, K \, d\lambda,\qquad DN_b = \int_{340\,\mathrm{nm}}^{540\,\mathrm{nm}} \Phi_{lens}(\lambda)\, QE(\lambda)\, K \, d\lambda \quad (8)$$
where $K$ collects the merged wavelength-independent terms; the red channel $DN_r$ corresponds to 600–800 nm, the green channel $DN_g$ to 450–650 nm, and the blue channel $DN_b$ to 340–540 nm. The specific wavelength ranges can be adjusted appropriately according to the quantum efficiency curve of the image sensor.
The quantum efficiency $QE(\lambda)$ is usually obtained by experiment. Specifically, we scan the spectral response of the four channels (R, G1, G2, B) of the color image sensor with an integrating sphere, recording the integrating sphere irradiance, the sensor output $DN$, and the dark field gray-scale value for each channel at each wavelength. Taking the "Tianwen-1" medium-field camera as an example, the measured quantum efficiency is shown in Section 4.2.
The emission radiance spectrum of the target $\Phi_0(\lambda)$ can be collected directly on-orbit by the mineral spectrometer, as described in detail in Section 4.2.
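A compact sketch of the true color calculation module: given an on-orbit spectrum resampled to a uniform wavelength grid and the measured QE curves, the three band integrals of (8) yield the true color values. Variable names are illustrative assumptions; the band limits follow the text.

```python
import numpy as np

def true_color_values(wavelengths_nm, phi_lens, qe_r, qe_g, qe_b, k=1.0):
    """Evaluate (8): band-limited integrals of the on-orbit spectrum weighted
    by the measured quantum efficiency curves. k collects the
    wavelength-independent terms merged from (6)."""
    dlam = wavelengths_nm[1] - wavelengths_nm[0]   # uniform grid assumed

    def band_integral(qe, lo, hi):
        mask = (wavelengths_nm >= lo) & (wavelengths_nm <= hi)
        return k * np.sum(phi_lens[mask] * qe[mask]) * dlam

    r_c = band_integral(qe_r, 600, 800)   # red band of (8)
    g_c = band_integral(qe_g, 450, 650)   # green band of (8)
    b_c = band_integral(qe_b, 340, 540)   # blue band of (8)
    return r_c, g_c, b_c
```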

3.2.2. Color Deviation Iterative Optimization Algorithm

Due to the lack of atmospheric scattering in the deep space exploration environment, the imaging area contains unlit regions and deep space background whose sensor output consists only of background noise with no effective signal. We therefore set a weight matrix $W$ to remove these special pixels. We then designed a color deviation iterative optimization algorithm to further improve the restoration accuracy and obtain the optimal CCM, as shown in Figure 6.
The true color of the image should be as close as possible to the response of the "standard human eye". Therefore, the $N$ sets of chromaticity values to be corrected, $C_m$, are fitted to the corresponding standard chromaticity values $C_c$. Following the traditional ground color restoration method, the least squares method is first used to obtain a preliminary CCM, computed in the sRGB color space:
$$CCM = (C_m^T C_m)^{-1} C_m^T C_c$$
To ensure consistency between the computed color deviation and visual evaluation, we chose the CIE DE 2000 color deviation formula [35] as the color evaluation model. The preliminary CCM is applied to the samples to be calibrated, and the CIE DE 2000 formula is used to calculate the calibrated color deviation.
To further optimize the CCM, we normalize the obtained color deviations into a weight matrix $\omega$ and re-solve with the following formula until a satisfactory color deviation is obtained, yielding the optimal CCM:
$$CCM = (C_m^T \omega C_m)^{-1} C_m^T \omega C_c$$
Finally, the optimal CCM is applied to the entire image $I$:
$$I' = I \cdot CCM$$
This recovers the true color of the image; $I'$ denotes the corrected image that matches the standard color response of the human eye.
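The following Python sketch summarizes the whole optimization loop under stated assumptions: colors are N x 3 arrays of sRGB values in [0, 1], the CIE DE 2000 deviations come from scikit-image, and the weights are taken proportional to the normalized per-sample deviation (one plausible reading of the normalization step). It is an illustrative implementation, not the authors' flight code.

```python
import numpy as np
from skimage.color import rgb2lab, deltaE_ciede2000

def solve_optimal_ccm(c_m, c_c, n_iter=5):
    """c_m: N x 3 sensor colors to be corrected; c_c: N x 3 true colors.
    Step 1: ordinary least squares CCM. Step 2: iteratively reweight each
    sample by its normalized CIE DE 2000 deviation and re-solve the
    weighted least squares problem. Row-vector convention is used, so this
    CCM is the transpose of the column-vector form in the text."""
    ccm = np.linalg.lstsq(c_m, c_c, rcond=None)[0]   # least squares init
    for _ in range(n_iter):
        fit = np.clip(c_m @ ccm, 0.0, 1.0)
        de = deltaE_ciede2000(rgb2lab(c_c.reshape(-1, 1, 3)),
                              rgb2lab(fit.reshape(-1, 1, 3))).ravel()
        w = np.diag(de / de.sum())                   # normalized deviation weights
        ccm = np.linalg.inv(c_m.T @ w @ c_m) @ c_m.T @ w @ c_c
    return ccm

# Applying the optimal CCM to a whole image I (H x W x 3), per the text:
# restored = (I.reshape(-1, 3) @ ccm).reshape(I.shape)
```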

4. Results

4.1. Evaluation Index

We use the CIE DE 2000 color deviation formula as the evaluation index. It is an effective computational method for quantitatively describing color deviation and ensures the visual consistency of the color deviation evaluation.
The formula corrects for the non-uniformity of the LAB color space and is the most recent color deviation metric proposed by the CIE; it has shown accurate prediction performance on all current international visual experimental datasets, both new and old.
$$\Delta E_{00} = \sqrt{\left(\frac{\Delta L'}{K_L S_L}\right)^2 + \left(\frac{\Delta C'_{ab}}{K_C S_C}\right)^2 + \left(\frac{\Delta H'_{ab}}{K_H S_H}\right)^2 + R_T \left(\frac{\Delta C'_{ab}}{K_C S_C}\right)\left(\frac{\Delta H'_{ab}}{K_H S_H}\right)}$$
where $\Delta L'$ is the difference in lightness, $\Delta C'_{ab}$ is the difference in chroma, $\Delta H'_{ab}$ is the difference in hue, $K_L$, $K_C$, $K_H$ are parametric weighting factors, and $S_L$, $S_C$, $S_H$ are weighting functions. The specific calculation is described in [36].
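For reproducibility, the full CIE DE 2000 formula need not be hand-coded; a standard implementation such as scikit-image's can serve as the evaluation index. A minimal example with made-up sRGB values:

```python
import numpy as np
from skimage.color import rgb2lab, deltaE_ciede2000

# Illustrative sRGB values in [0, 1]; not data from the paper.
reference = np.array([[[0.35, 0.24, 0.11]]])   # "true" color
measured  = np.array([[[0.37, 0.25, 0.08]]])   # sensor output after correction
de00 = deltaE_ciede2000(rgb2lab(reference), rgb2lab(measured))
print(float(de00))   # color deviation in CIE DE 2000 units
```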

4.2. Principle Verification

The principle verification aims to confirm the correctness of the color imaging spectrum model. Specifically, the imaging color of the target is calculated from the known quantum efficiency curve and target reflection spectrum, and the theoretical color value is compared with the model output to judge the correctness of the principle. The theoretical target reflection spectrum is based on the reflection spectrum data provided by the standard color plate, while the quantum efficiency curve requires experimental measurement. The measurement method and results are given below.
The quantum efficiency test platform of the deep space exploration camera is shown in Figure 7a. The quantum efficiency test consists of the following four steps:
  • Placing the camera in front of the integrating sphere.
  • Covering the camera lens with a filter.
  • Setting the appropriate exposure time.
  • Scanning the fixed visible light band with the integrating sphere and filter to test the output response of the four channels.
The test wavelength range is 350 nm to 850 nm, with 11 wavelength points at an interval of 50 nm. The integrating sphere irradiance, the light-response grayscale value of each camera pixel, and the dark field grayscale value are recorded in each wavelength band.
The quantum efficiency values of the R, G1, G2, and B channels of the camera's visible band obtained through the test are shown in Table 1. The values range from 0 to 100%, and G1 and G2 denote the two green channels of the Bayer color filter array.
Figure 7b shows the quantum efficiency curves obtained from the experiment, i.e., the quantum efficiency functions R($\lambda$), G($\lambda$), and B($\lambda$) of the RGB channels, where G($\lambda$) is calculated as the average of G1($\lambda$) and G2($\lambda$). The three channel curves differ significantly from the "standard human eye" CIE 1931 color matching functions, which can lead to significant color deviation in images of deep space environments, further underscoring the necessity of on-orbit color restoration.
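For reference, the sketch below turns the 11 tabulated test points of Table 1 into continuous channel curves, averaging G1 and G2 as described in the text; the linear interpolation between test points is our assumption, since the paper does not state the interpolation scheme.

```python
import numpy as np

# Quantum efficiency test points from Table 1 (percent), 350-850 nm at 50 nm steps.
lam_nm = np.arange(350, 851, 50)
qe_r  = np.array([0.56, 0.38, 2.03, 4.9, 9.27, 68.87, 81.96, 62.82, 62.09, 0.54, 0.02])
qe_g1 = np.array([1.5, 0.61, 12.38, 63.38, 91.61, 28.56, 15.76, 28.93, 38.16, 0.47, 0.01])
qe_g2 = np.array([1.5, 0.62, 12.55, 63.56, 91.15, 27.96, 15.47, 28.37, 37.44, 0.46, 0.01])
qe_b  = np.array([1.41, 1.34, 61.34, 50.82, 14.81, 7.97, 9.65, 13.95, 14.24, 0.39, 0.01])
qe_g  = (qe_g1 + qe_g2) / 2.0   # G(lambda) = average of G1 and G2, as in the text

def qe_curve(wavelength_nm, samples):
    """Linearly interpolate a channel's QE between the 50 nm test points
    (interpolation scheme assumed, not specified in the paper)."""
    return np.interp(wavelength_nm, lam_nm, samples)
```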
To verify the accuracy of the theoretical calculation model, we selected three color blocks that closely match the color of Mars and compared the theoretically calculated color display with the results of camera shooting, validating the rationality of the proposed color imaging spectrum model in ground experiments.
In Figure 8, the chromaticity values of each color block are as follows: (a1) [88, 62, 27]; (b1) [88, 60, 21]; (a2) [124, 83, 39]; (b2) [123, 82, 28]; (a3) [189, 135, 61]; (b3) [190, 128, 49]. Here, (a1), (a2), and (a3) are the theoretical color values of the color blocks themselves, while (b1), (b2), and (b3) are the color values calculated from the quantum efficiency curve in Table 1 and the color block reflection radiance spectra [35]. The chromaticity values calculated by the model are very close to the captured ones, demonstrating the model's accuracy and its ability to establish the correspondence between pixels and spectra.

4.3. Ground Verification Test

4.3.1. Experimental Design

To verify the correctness of the proposed method, we conducted simulation experiments and ground tests. The experimental platform is shown in Figure 9. The setup consists of a 24-patch standard color chart ("Color-Checker CLASSIC" [37]), a D65 standard light source lightbox [38], a camera, and a spectro-radiometer. These devices were used for ground verification of on-orbit joint color restoration.
The Color-Checker is placed under the standard D65 light source as the reference; image data of each color patch are acquired, and the chromaticity values are obtained in the camera's sRGB color space.
In traditional ground restoration methods, the standard values of the color patches are determined by the manufacturer and may not be applicable in different scenarios. This paper instead utilizes a spectro-radiometer to obtain the reflection spectrum of each color patch when irradiated by the standard D65 light source at a 45-degree angle (within a 2-degree tolerance).

4.3.2. Test Results

Through the ground verification tests, we obtained the color response of each color patch. Figure 10 presents a comparison of results obtained before and after restoration using the Color-Checker. Sub-figures (a–d) correspond to the "standard human eye", pre-restoration [20], post-restoration using the least squares method [21], and the result after obtaining the optimal CCM with our method, respectively. Before restoration, the response of each color patch differed significantly from that of the human eye, exhibiting a yellowish and brighter overall tone.
The comparison methods include the "standard human eye", pre-restoration, and post-restoration using the least squares method. The improvement achieved by the traditional ground restoration method is not significant; in contrast, the overall effect of the optimal CCM obtained by our method is dramatically improved, especially for color patches 1–18, i.e., the patches in the first three rows of Figure 11. Those results are already very close to human visual perception.
Table 2 presents the color deviation results, and Figure 11 provides a comparison chart of the color deviation for each Color-Checker patch. After restoration, the color deviation of most color patches is reduced. Using the color deviation iteration method, a single iteration improves the color deviation by about 2 CIE DE 2000 units. After applying the optimal CCM, the average color deviation is reduced to 11.36 units: a decrease of roughly four units (26.14%) compared with the least squares method and roughly six units (35.61%) compared with the original system.

4.4. On-Orbit Verification and Results

Following the method flow described in Section 3, we used measured data from the "Tianwen-1" Mars mission for on-orbit color restoration and compared it with the traditional ground color restoration method. Figure 12 shows a set of spectra obtained by the mineral spectrometer on-orbit, corresponding to the on-orbit Mars remote sensing image shown in Figure 13.
In Figure 14, each row shows the results of the same correction method applied to different images, and each column shows the results of different correction methods applied to the same image. Figure 14(a1–d1) shows the pre-restoration results; Figure 14(a2–d2) the traditional ground restoration results; and Figure 14(a3–d3) our proposed results. Intuitively, our proposed method is more balanced overall, without yellowing or reddening.
Next, we quantitatively analyze and compare the performance of the methods. Table 3 shows the comparison of the on-orbit restoration color deviation.
Due to the complexity of on-orbit environmental lighting, images taken on-orbit exhibit a larger color deviation than those taken on the ground. Nevertheless, after correction by our proposed method, the average color deviation reaches 8.43: an improvement of 2.63 (23.78%) over the least squares method and of 21.47 (71.81%) over the pre-restoration images. This confirms that the proposed calculation method based on real-time on-orbit data exhibits good on-orbit adaptability and reduces color deviations more effectively than ground restoration methods.

5. Discussion

The objective of this research is to tackle the challenge of obtaining accurate color values in deep space environments without in-orbit calibration plates, so as to achieve effective color recovery for color cameras operating in deep space. We started with a principle analysis, then developed the color imaging spectrum model, and finally demonstrated the validity and accuracy of the proposed method through principle verification, ground validation, and in-orbit testing.
1. Principle analysis
From the physical mechanism of color imaging, the spectrum perceived by the image sensor determines the color of the image. Therefore, it is sensible to recover the color information of targets in deep space environments using spectral data. The spectral information perceived by image sensors is determined by two main factors: the target’s reflection spectrum and the camera’s quantum efficiency. The reflection spectrum can be measured using a spectrometer, while the quantum efficiency can be assessed with ground testing equipment. Therefore, it is feasible to restore the true color values in deep space environments through spectral information.
2. Modeling
The model building follows the physical process of color imaging. We first list the transmission path of light in order, from the light source to the point where the image sensor receives it. We then analyze each stage step by step, identify the stages that change the spectral information, and model them in detail. Stages unrelated to spectral information are simplified. This results in the optical transmission model, the photoelectric conversion model, and the Bayer filter model.
3. Verification
To verify the correctness of our proposed model, we conducted a comprehensive discussion in three stages: principle verification, ground verification, and in-orbit verification.
(3.1) The principle verification is presented in Section 4.2. Its purpose is to validate the correctness of the model principle using the measured camera quantum efficiency curve and the theoretical reflection spectra of the color blocks. The first row of Figure 8 displays the theoretical imaging values of three color blocks, while the second row shows the values calculated by the model. Both the numerical and subjective differences are small, indicating that the proposed model is theoretically sound.
(3.2) Ground verification and in-orbit verification are described in Section 4.3 and Section 4.4, respectively. These two experiments verify the applicability of our method in ground and deep space environments. The average color deviation achieved by our method in the ground validation was 11.36; during in-orbit verification, the average color deviation was reduced to 8.43, which is superior to existing methods.

6. Conclusions

Our contribution can be summarized in two parts:
  • We proposed the color image spectrum model to quantitatively describe the color imaging process in deep space environments, solving the problem of obtaining true color values in the absence of a color palette and providing a theoretical basis for color recovery of color cameras by fusing spectral information.
  • We proposed an on-orbit joint color restoration method for deep space exploration color cameras, which addresses the color bias of remote sensing images caused by the complex operating environment, poor lighting conditions, and the unavailability of color restoration boards in on-orbit conditions.
This method uses spectral data provided by the on-orbit payload spectrometer to calibrate on-orbit colors. For data fitting, the least squares method and the color deviation iteration method are used to solve for the optimal CCM. In ground and on-orbit testing, the method significantly reduced color deviations, achieving an average color deviation of 8.43, which is 2.63 (23.78%) lower than that of the least squares method. This makes the camera's observations more consistent with the visual perception of the "standard human eye" and improves the accuracy and on-orbit adaptability of DCRSI color restoration.

Author Contributions

Conceptualization, R.Z.; methodology, H.L. and S.L.; software, S.L. and J.Z.; validation, H.L.; formal analysis, H.L.; investigation, Y.M. and K.L.; resources, K.L. and R.Z.; data curation, J.Z.; writing—original draft preparation, H.L.; writing—review and editing, Y.M.; visualization, S.L.; supervision, R.Z.; project administration, R.Z.; funding acquisition, R.Z., H.L. and K.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by the Sichuan Outstanding Youth Science and Technology Talent Project 2022JDJQ0027, and Sichuan Science and Technology Program under Grant No. 2025ZNSFSC1504 and No. 2024ZNSFSC1443.

Data Availability Statement

The data presented in this study are available on request from the corresponding author due to the sensitivity of data collection equipment and the audit requirements of research institutions.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Han, F.; Liu, Q.; Wang, H.; Ren, Z.; Zhou, F.; Kang, C. Deep-Space Background Low-Light Image Enhancement Method Based on Multi-Image Fusion. Appl. Sci. 2025, 15, 4837. [Google Scholar] [CrossRef]
  2. Fan, M.; Lu, W.; Niu, W.; Peng, X.; Yang, Z. A large-scale invariant matching method based on DeepSpace-ScaleNet for small celestial body exploration. Remote Sens. 2022, 14, 6339. [Google Scholar] [CrossRef]
  3. Li, G.; Xi, B.; He, Y.; Zheng, T.; Li, Y.; Xue, C.; Chanussot, J. Diamond-Unet: A novel semantic segmentation network based on U-Net network and transformer for deep space rock images. IEEE Geosci. Remote. Sens. Lett. 2024, 21, 8002205. [Google Scholar] [CrossRef]
  4. Chi, W. Prospects of Global Space Science Breakthroughs and China’s Contributions. Bull. Chin. Acad. Sci. 2022, 37, 1050–1065. [Google Scholar]
  5. Xu, L.; Li, H.; Pei, Z.; Zou, Y.; Wang, C. A brief introduction to the international lunar research station program and the interstellar express mission. Chin. J. Space Sci. 2022, 42, 511–513. [Google Scholar] [CrossRef]
  6. Suo, J.; Long, H.; Ma, Y.; Zhang, Y.; Liang, Z.; Yan, C.; Zhao, R. Resource-Exploration-Oriented Lunar Rocks Monocular Detection and 3D Pose Estimation. Aerospace 2024, 12, 4. [Google Scholar] [CrossRef]
  7. Liang, Z.; Long, H.; Zhu, Z.; Cao, Z.; Yi, J.; Ma, Y.; Liu, E.; Zhao, R. High-Precision Disparity Estimation for Lunar Scene Using Optimized Census Transform and Superpixel Refinement. Remote Sens. 2024, 16, 3930. [Google Scholar] [CrossRef]
  8. Zhang, Z.; Feng, J.; Chang, L.; Deng, L.; Li, D.; Si, C. SpaceLight: A Framework for Enhanced On-Orbit Navigation Imagery. Aerospace 2024, 11, 503. [Google Scholar] [CrossRef]
  9. Frosio, T.; Menaa, N.; Bertreix, P.; Rimlinger, M.; Theis, C. A novel technique for the optimization and reduction of gamma spectroscopy geometry uncertainties. Appl. Radiat. Isot. 2020, 156, 108953. [Google Scholar] [CrossRef]
  10. Peng, X.; Liu, E.H.; Tian, S.L.; Fang, L.; Zhang, H. Study of high-precision velocimetry technique based on absorption spectrum for deep space exploration. Acta Astronaut. 2022, 199, 327–336. [Google Scholar] [CrossRef]
  11. Han, M.; Lyu, Z.; Qiu, T.; Xu, M. A review on intelligence dehazing and color restoration for underwater images. IEEE Trans. Syst. Man Cybern. Syst. 2018, 50, 1820–1832. [Google Scholar] [CrossRef]
  12. Li, X.; Lu, D.; Pan, Y. Color restoration and image retrieval for Dunhuang fresco preservation. IEEE MultiMedia 2000, 7, 38–42. [Google Scholar] [CrossRef]
  13. Charrière, R.; Hébert, M.; Trémeau, A.; Destouches, N. Color calibration of an RGB camera mounted in front of a microscope with strong color distortion. Appl. Opt. 2013, 52, 5262–5271. [Google Scholar] [CrossRef]
  14. Florides, G.A.; Christodoulides, P. The color of the Moon in visible light through a review of published photographs. A paradox? Int. J. Cult. Herit. 2021, 6, 48–62. [Google Scholar]
  15. Yerramreddy, D.R.; Marasani, J.; Gowtham, P.S.V.; Don, S. Analysis of Image Restoration Techniques on Lunar Surface Images. In Proceedings of the 2023 Innovations in Power and Advanced Computing Technologies (i-PACT), Kuala Lumpur, Malaysia, 8–10 December 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 1–6. [Google Scholar]
  16. McCord, T.B. Color differences on the lunar surface. J. Geophys. Res. 1969, 74, 3131–3142. [Google Scholar] [CrossRef]
  17. Bell, J.F., III; Godber, A.; McNair, S.; Caplinger, M.; Maki, J.; Lemmon, M.; Van Beek, J.; Malin, M.; Wellington, D.; Kinch, K.; et al. The Mars Science Laboratory Curiosity rover Mastcam instruments: Preflight and in-flight calibration, validation, and data archiving. Earth Space Sci. 2017, 4, 396–452. [Google Scholar]
  18. Fries, M.D.; Lee, C.; Bhartia, R.; Razzell Hollis, J.; Beegle, L.W.; Uckert, K.; Graff, T.G.; Abbey, W.; Bailey, Z.; Berger, E.L.; et al. The SHERLOC calibration target on the Mars 2020 Perseverance rover: Design, operations, outreach, and future human exploration functions. Space Sci. Rev. 2022, 218, 46. [Google Scholar] [CrossRef]
  19. Zhang, Q.; Liu, D.; Liu, J.; Guo, L.; Xue, B.; Yang, J.; Yang, B.; Wang, X.; Huang, H.; Liu, B.; et al. A reflectance calibration method for Multispectral Camera (MSCam) on the Zhurong rover. Icarus 2022, 387, 115208. [Google Scholar] [CrossRef]
  20. Zhao, R.J.; Liu, E.H.; Wang, J.; Yu, G.B. A method of color correction for Chang’E-3 satellite camera topography image. J. Astronaut. 2016, 37, 341–347. [Google Scholar] [CrossRef]
  21. Ren, X.; Li, C.-L.; Liu, J.-J.; Wang, F.-F.; Yang, J.-F.; Liu, E.-H.; Xue, B.; Zhao, R.-J. A method and results of color calibration for the Chang’e-3 terrain camera and panoramic camera. Res. Astron. Astrophys. 2014, 14, 1557. [Google Scholar] [CrossRef]
  22. Levin, R.L.; Levin, G.V. Solving the color calibration problem of Martian lander images. In Proceedings of the Instruments, Methods, and Missions for Astrobiology VII, San Diego, CA, USA, 3–8 August 2003; SPIE: Bellingham, WA, USA, 2004; Volume 5163, pp. 158–170. [Google Scholar]
  23. Shi, Z.; Liu, C.; Ren, W.; Shuangli, D.; Zhao, M. Convolutional neural networks for sand dust image color restoration and visibility enhancement. Chin. J. Image Graph 2022, 27, 1493–1508. [Google Scholar] [CrossRef]
  24. Shi, Y.; Wang, B.; Wu, X.; Zhu, M. Unsupervised low-light image enhancement by extracting structural similarity and color consistency. IEEE Signal Process. Lett. 2022, 29, 997–1001. [Google Scholar] [CrossRef]
  25. Duan, J.; Zhang, E. An anti-counterfeiting method for printed image by digital halftoning method. In Proceedings of the 2012 5th International Congress on Image and Signal Processing, Agadir, Morocco, 28–30 June 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 562–566. [Google Scholar]
  26. Liu, S.; Long, H.; Ma, Y.; Qiao, L.; Liang, Z.; Cao, Z.; Zhao, Y.; Yi, J.; Zhu, Z.; Tang, Y.; et al. Deep-space exploration camera on-orbit white balance calibration based on correlated color temperature interpolation. In Proceedings of the International Conference on Image, Signal Processing, and Pattern Recognition (ISPP 2024), Guangzhou, China, 1–3 March 2024; SPIE: Bellingham, WA, USA, 2024; Volume 13180, pp. 575–585. [Google Scholar]
  27. Rodgers, D.H.; Beauchamp, P.M.; Soderblom, L.A.; Brown, R.H.; Chen, G.S.; Lee, M.; Sandel, B.R.; Thomas, D.A.; Benoit, R.T.; Yelle, R.V. Advanced technologies demonstrated by the miniature integrated camera and spectrometer (MICAS) aboard deep space 1. Space Sci. Rev. 2007, 129, 309–326. [Google Scholar] [CrossRef]
  28. Park, C.; Kang, M.G. Color restoration of RGBN multispectral filter array sensor images based on spectral decomposition. Sensors 2016, 16, 719. [Google Scholar] [CrossRef] [PubMed]
  29. Wang, H.; Yang, J.; Xue, B.; Yan, X.; Tao, J. A novel color calibration method of multi-spectral camera based on normalized RGB color model. Results Phys. 2020, 19, 103498. [Google Scholar] [CrossRef]
  30. Wang, B.; Song, S.; Gong, W.; Cao, X.; He, D.; Chen, Z.; Lin, X.; Li, F.; Sun, J. Color restoration for full-waveform multispectral LiDAR data. Remote Sens. 2020, 12, 593. [Google Scholar] [CrossRef]
  31. Ren, X.; Zhang, X.; Chen, W.; Yan, W.; Zeng, X.; Tan, X.; Gao, X.; Fu, Q.; Guo, L.; Zhang, Q.; et al. A new approach to color correction and equalization for generating mars global color image mosaics from Tianwen-1 MoRIC images. ISPRS J. Photogramm. Remote Sens. 2025, 225, 291–301. [Google Scholar] [CrossRef]
  32. He, Z.; Xu, R.; Li, C.; Yuan, L.; Liu, C.; Lv, G.; Jin, J.; Xie, J.; Kong, C.; Li, F.; et al. Mars mineralogical spectrometer (MMS) on the Tianwen-1 mission. Space Sci. Rev. 2021, 217, 27. [Google Scholar] [CrossRef]
  33. Liu, B.; Ren, X.; Liu, D.; Liu, J.; Zhang, Q.; Huang, H.; Xu, R.; Wang, R.; Liu, C.; He, Z.; et al. Ground Validation Experiment and Spectral Detection Capability Evaluation of Mars Mineralogical Spectrometer (MMS) Aboard HX-1 Orbiter. Space Sci. Rev. 2022, 218, 1. [Google Scholar] [CrossRef]
  34. Caporale, A.G.; Vingiani, S.; Palladino, M.; El-Nakhel, C.; Duri, L.G.; Pannico, A.; Rouphael, Y.; De Pascale, S.; Adamo, P. Geo-mineralogical characterisation of Mars simulant MMS-1 and appraisal of substrate physico-chemical properties and crop performance obtained with variable green compost amendment rates. Sci. Total Environ. 2020, 720, 137543. [Google Scholar] [CrossRef]
  35. Bell III, J.; Wolff, M.; Malin, M.; Calvin, W.; Cantor, B.; Caplinger, M.; Clancy, R.; Edgett, K.; Edwards, L.; Fahle, J.; et al. Mars reconnaissance orbiter Mars color imager (MARCI): Instrument description, calibration, and performance. J. Geophys. Res. Planets 2009, 114. [Google Scholar] [CrossRef]
  36. Luo, M.R. CIE 2000 color difference formula: CIEDE2000. In Proceedings of the 9th Congress of the International Colour Association, Rochester, NY, USA, 24–29 June 2001; SPIE: Bellingham, WA, USA, 2002; Volume 4421, pp. 554–559. [Google Scholar]
  37. Banić, N.; Koščević, K.; Subašić, M.; Lončarić, S. The past and the present of the color checker dataset misuse. In Proceedings of the 2019 11th International Symposium on Image and Signal Processing and Analysis (ISPA), Dubrovnik, Croatia, 23–25 September 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 366–371. [Google Scholar]
  38. Berger, A.; Strocka, D. Quantitative assessment of artificial light sources for the best fit to standard illuminant D65. Appl. Opt. 1973, 12, 338–348. [Google Scholar] [CrossRef] [PubMed]
Figure 1. On-orbit color standard plates in previous deep space missions [17,18,19]: (a) the SHERLOC calibration target; (b) the Mastcam calibration target; (c) the MSCam RGB image of the calibration target.
Figure 2. On-orbit color restoration methods based on image information [20,22]: (a) Chang'e-3 terrain camera image without color restoration; (b) color restoration result of [21]; (c) Viking 1 Lander first color photo 12A006 [22]; (d) color restoration result of [22].
Figure 3. On-orbit color restoration method combining an RGB camera and a spectrometer: (a) Tianwen-1 MoRIC Level 2C global mosaic map without color restoration [31]; (b) Tianwen-1 global mosaic image after color correction and global equalization [31].
Figure 4. The physical process of color optical imaging.
Figure 5. Schematic diagram of the principle of the method.
Figure 6. Flow chart of the color deviation iterative optimization algorithm.
Figure 7. Quantum efficiency of the "Tianwen-1" medium-field camera: (a) quantum efficiency test platform; (b) quantum efficiency curves.
Figure 8. Comparison between theoretical calculation and camera shooting: (a1,a2,a3) results of the theoretical calculation; (b1,b2,b3) results of the camera shooting.
Figure 9. (a) Ground test verification platform; (b) Color-Checker (patches are numbered 1 to 24 from left to right and top to bottom).
Figure 10. Comparison of results before and after Color-Checker restoration using our method: sub-figures (a–d) correspond to the "standard human eye", pre-restoration [20], post-restoration using the least squares method [21], and our proposed method, respectively.
Figure 11. Statistical chart of the color deviation of each color patch.
Figure 12. Mars spectrum obtained by the mineral spectrometer [32]: (a) Mars Mineralogical Spectrometer (MMS) on the Tianwen-1 mission; (b) Mars spectrum.
Figure 13. On-orbit color restoration of the remote sensing image: (a) pre-restoration; (b) traditional ground restoration; (c) our proposed method.
Figure 14. Comparison of four locations under pre-restoration, traditional ground restoration, and on-orbit restoration: (a1–a3) pre-restoration, traditional ground restoration, and on-orbit results in South Crater; (b1–b3) the same in Malea Planum; (c1–c3) the same in Huygens Crater; (d1–d3) the same in Arabia Terra.
Table 1. Quantum efficiency of each color channel.

λ (nm) | QE R (%) | QE G1 (%) | QE G2 (%) | QE B (%)
350 | 0.56 | 1.5 | 1.5 | 1.41
400 | 0.38 | 0.61 | 0.62 | 1.34
450 | 2.03 | 12.38 | 12.55 | 61.34
500 | 4.9 | 63.38 | 63.56 | 50.82
550 | 9.27 | 91.61 | 91.15 | 14.81
600 | 68.87 | 28.56 | 27.96 | 7.97
650 | 81.96 | 15.76 | 15.47 | 9.65
700 | 62.82 | 28.93 | 28.37 | 13.95
750 | 62.09 | 38.16 | 37.44 | 14.24
800 | 0.54 | 0.47 | 0.46 | 0.39
850 | 0.02 | 0.01 | 0.01 | 0.01
Table 2. Comparison of CIE DE 2000 color deviation results.

Patch | Pre-Restoration | Least Squares Method | First Iteration | Our Method
Swatch 1 | 6.46 | 5.78 | 8.23 | 2.73
Swatch 2 | 4.76 | 2.93 | 6.54 | 1.68
Swatch 3 | 5.18 | 3.15 | 2.77 | 2.96
Swatch 4 | 9.54 | 9.37 | 7.67 | 4.74
Swatch 5 | 16.12 | 5.55 | 2.03 | 1.86
Swatch 6 | 10.83 | 7.21 | 6.85 | 6.73
Swatch 7 | 2.36 | 3.05 | 7.7 | 5.09
Swatch 8 | 14.47 | 6.69 | 5.91 | 2.34
Swatch 9 | 11.27 | 7.03 | 4.67 | 2.85
Swatch 10 | 38.35 | 37.93 | 36.21 | 23.56
Swatch 11 | 2.91 | 2.3 | 3.35 | 3.74
Swatch 12 | 7.88 | 8.79 | 7.18 | 7.25
Swatch 13 | 14.25 | 11.14 | 9.87 | 7.71
Swatch 14 | 17.79 | 13.44 | 13.22 | 12.45
Swatch 15 | 10.77 | 9.26 | 8.3 | 7.01
Swatch 16 | 10.64 | 8.94 | 6.76 | 5.13
Swatch 17 | 11.27 | 8.89 | 6.48 | 3.55
Swatch 18 | 7.62 | 1.15 | 1.96 | 6.02
Swatch 19 | 39.88 | 40.14 | 33.53 | 33.23
Swatch 20 | 38.34 | 38.69 | 32.76 | 32.54
Swatch 21 | 38.4 | 38.73 | 29.93 | 25.32
Swatch 22 | 34.74 | 35.11 | 22.15 | 30.35
Swatch 23 | 28.47 | 28.97 | 10.1 | 21.38
Swatch 24 | 28.28 | 28.96 | 20.2 | 22.36
Average | 17.11 | 15.13 | 12.28 | 11.36
Table 3. Comparison of the color restoration effects in different regions.

Mars Area | Pre-Restoration | Traditional Ground Method | Our Proposed Method
South Crater | 30.17 | 11.08 | 8.41
Malea Planum | 29.91 | 11.41 | 8.57
Huygens Crater | 28.61 | 10.45 | 8.22
Arabia Terra | 30.91 | 11.3 | 8.54
Average color deviation | 29.9 | 11.06 | 8.43