Challenges for In-Flight Calibration of Thermal Infrared Instruments for Earth Observation

Abstract: Satellite instruments operating in the thermal infrared wavelength range (>3 µm) provide information for applications such as land surface temperature (LST), sea surface temperature (SST), land surface emissivity, land classification, soil composition, volcanology, fire radiative power, cloud masking, aerosols, and trace gases. All these instruments depend on blackbody (BB) calibration sources to provide the traceability of the radiometric calibration to SI (Système International d'Unités). A key issue for flight BB sources is maintaining the traceability of the radiometric calibration from ground to orbit. For example, the temperature of the BB is measured by a number of precision thermometers that are calibrated against a reference Standard Platinum Resistance Thermometer (SPRT) to provide traceability to the International Temperature Scale of 1990 (ITS-90). However, once calibrated, the thermometer system is subject to drifts caused by on-ground testing and the launch and space environments. At best, the uncertainties due to thermometer ageing can only be estimated, as there is no direct method for recalibration. Comparisons with other satellite sensors are useful for placing an upper limit on calibration drifts but do not themselves provide a traceable link to the SI. In this paper, we describe some of the technology developments in progress to enhance the traceability of flight calibration systems in the thermal infrared, including phase change cells for use as reference standards, thermometer readout electronics, and the implementation of novel coatings.


Introduction
The Planck radiation law predicts that for temperatures covering the typical range of earth scenes, 180 to 350 K, the peak of the radiation distribution occurs within the range 3-20 µm. The radiances also cover a wide dynamic range, particularly at wavelengths <5 µm. This makes observations in the thermal infrared range particularly useful for measurements of the Earth's surface temperatures and for atmospheric sounding. Thermal InfraRed (TIR) measurements from satellite instruments serve a range of applications, from global climate change monitoring and improved weather forecasting through assimilation of data into Numerical Weather Prediction (NWP) [1] to monitoring urban pollution. Table 1 presents a summary of applications from hyperspectral TIR imagers [2].
Table 1. Applications of high resolution, hyper-spectral thermal infrared instruments [2].

[Table 1 not recovered from the source; its columns were "Domain" and "Application", and its entries included volcanoes and earthquakes.]

There are many TIR instruments on-orbit and planned to cover these and other applications. It is not practical to provide an exhaustive list in this paper. Instead, the World Meteorological Organisation Observing Systems Capability Analysis and Review tool (WMO OSCAR) provides a comprehensive searchable list of sensors [3].
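The claim that Earth-scene emission peaks fall within the 3-20 µm window follows from Wien's displacement law, λ_max = b/T. A minimal sketch (the constant is the standard CODATA value, not a figure from this paper):

```python
# Wien's displacement law: the Planck spectral radiance (per unit
# wavelength) peaks at lambda_max = b / T.
WIEN_B_UM_K = 2897.771955  # Wien displacement constant b, in µm·K

def peak_wavelength_um(temperature_k: float) -> float:
    """Wavelength (µm) of peak blackbody emission at the given temperature."""
    return WIEN_B_UM_K / temperature_k

# Typical earth-scene temperatures: the peaks fall well within 3-20 µm
for t in (180.0, 270.0, 350.0):
    print(f"T = {t:5.1f} K -> peak at {peak_wavelength_um(t):5.2f} um")
```

For 180-350 K the peaks span roughly 8.3-16.1 µm, consistent with the window quoted above.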
A particular use of thermal infrared instruments has been to measure global sea-surface temperatures (SST) for climate change monitoring. Sea surface temperatures are an important indicator of climate change [4]. Satellite instruments are particularly useful because they can provide global coverage within a few days and remain operational for several years. Polar orbiting filter radiometers such as the Advanced Very High-Resolution Radiometer (AVHRR), the Along Track Scanning Radiometer (ATSR), the Moderate Resolution Imaging Spectroradiometer (MODIS), the Visible Infrared Imaging Radiometer Suite (VIIRS) and more recently the Sea and Land Surface Temperature Radiometer (SLSTR) have provided a near continuous record of global sea-surface temperatures since the 1970s.
For satellite measurements of SSTs to contribute to long-term climate records it is essential that the measurements are consistent and are traced back to the same temperature standards that were used for the historical records. For temperature, these measurements need to be expressed in terms of the ITS-90, which provides a practical approximation to the SI unit of temperature, the kelvin. For climate change monitoring, the satellite instrument must be capable of observing atmospheric and surface temperature trends as small as 0.1 K decade −1 [5] which requires an uncertainty in the radiometric calibration <0.06 K (k = 2) [6].
Remote Sens. 2020, 12, 1832
For SLSTR the main objective is to measure sea surface temperature to an uncertainty of <0.3 K (k = 3) [7]. To achieve this level of uncertainty, top-of-atmosphere measurements of the earth's brightness temperature at 3.7 µm, 10.8 µm and 12 µm must be made to an uncertainty <0.1 K (k = 3). For such observational requirements, the instrument system design had to ensure that the instrument can be calibrated such that any remaining errors can be measured to a known uncertainty, either before launch or in-orbit. This process began with the initial assessments of the mission performance, placing limits on the uncertainties of individual components (e.g., calibration sources) so that the requirements could be achieved [8]. This in turn informed the design of the on-board calibration systems, the design of the data processing chain, and the pre- and post-launch calibration plan [9].
Although the calibration principle for SLSTR is relatively straightforward, the actual process from raw data to calibrated radiances/brightness temperatures involves several processing stages and requires several inputs. For example, for SLSTR, calibration coefficients are computed for each spectral band (5 thermal bands), each detector element (2 × 2 array) and each earth view (separate for nadir and oblique views). These coefficients are derived from data recorded on-orbit and parameters contained in the instrument Characterisation and Calibration Database (CCDB). Each effect contributes to the overall uncertainty of the instrument calibration budget.
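The underlying calibration principle, a linear fit through the two on-board blackbody views that brackets the earth scene, can be sketched as follows. The counts and radiances are invented illustrative numbers, not actual SLSTR values, and the function name is hypothetical:

```python
def two_point_calibration(c_cold, c_hot, l_cold, l_hot):
    """Derive (gain, offset) so that radiance L = gain * counts + offset,
    from the detector counts and known radiances of two blackbody views."""
    gain = (l_hot - l_cold) / (c_hot - c_cold)
    offset = l_cold - gain * c_cold
    return gain, offset

# Illustrative numbers only: counts and radiances for the cold and hot
# blackbody views (not actual SLSTR values).
gain, offset = two_point_calibration(c_cold=1200.0, c_hot=5200.0,
                                     l_cold=4.0, l_hot=10.0)
l_scene = gain * 3200.0 + offset  # calibrate an earth-view sample
print(gain, offset, l_scene)
```

In practice these coefficients are recomputed continuously from the blackbody views of every scan, per band, detector element and view.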
An analysis of the Level-1 processing chain has been performed to build up the uncertainty budget of the SLSTR instrument; Smith et al. [10] describe the analysis in more depth. Starting from the instrument calibration model, all uncertainty effects contributing to the calibrated Level-1 data are traced through to their original sources. As with most thermal infrared instruments, the primary effects are due to detector noise, uncertainties in the calibration source thermometry and emissivity, instrument spectral response, and non-linearity effects. Data processing effects are also considered, for example where averages of several measurements from calibration sources are used to reduce random noise effects. The contributions of the primary effects to the uncertainty for a scene temperature of 270 K are presented in Table 2. The combined uncertainty for a scene brightness temperature of 270 K is within the required performance budget of <0.1 K at k = 3. As expected, the main contributor is the on-board calibration sources, which we explore further in the next section.
* The radiometric noise is considered a purely random effect that is treated separately from the correlated effects, since the latter are not reduced by averaging. Only correlated effects are considered in the combined uncertainty estimates.
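The combination of the correlated (systematic) effects into a single budget is conventionally a root-sum-square over independent components. The component names and magnitudes below are placeholders for illustration, not the values of Table 2:

```python
import math

def combine_systematic(components_k):
    """Root-sum-square combination of independent systematic
    uncertainty components, each expressed at k = 1 in kelvin."""
    return math.sqrt(sum(u * u for u in components_k))

# Hypothetical k=1 contributions (K) at a 270 K scene, illustration only.
budget = {"BB thermometry": 0.015, "BB emissivity": 0.010,
          "non-linearity": 0.008, "spectral response": 0.005}
u_combined = combine_systematic(budget.values())
print(f"combined k=1: {u_combined*1000:.1f} mK, "
      f"k=3: {3*u_combined*1000:.1f} mK")
```

With these placeholder values the expanded (k = 3) uncertainty comes out near 61 mK, i.e., within a <0.1 K (k = 3) requirement; purely random noise would be handled separately, as noted above.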

Calibration Systems for IR Instruments
Thermal infrared instruments are notoriously difficult to calibrate. Calibrations are required for field of view, wavelength selection and radiance response. The field of view and wavelength selection are relatively stable over time by design and can be calibrated pre-flight and may be adjusted on-orbit [11], but radiance signals at the instrument detector include the thermal self-emission of the instrument. This self-emission combined with the temperature sensitivity and degradation over time of components such as the infrared detectors and other electronic parts causes an instrument's radiometric gain and offset to vary continuously. Consequently, the instrument must be re-calibrated repeatedly in orbit.
For IR instruments such as ATSR and SLSTR, traceability to SI is achieved via internal blackbody calibration sources, Figure 1. For an ideal blackbody source, the Planck radiation law gives the spectral radiance L(λ, T) (W·m−2·µm−1·sr−1) from the temperature and wavelength:

L(λ, T) = (2hc²/λ⁵) · 1/[exp(hc/λkT) − 1]

where h is the Planck constant, c the speed of light in vacuum, k the Boltzmann constant, λ the wavelength and T the thermodynamic temperature.

SLSTR utilises two cavity blackbody sources with emissivity >0.998 based on the design originally used for ATSR [12]. One blackbody 'floats' at the optical enclosure temperature of 260 K and the other is heated to 302 K to provide two calibration points. Platinum Resistance Thermometers (PRTs) embedded in the base of each cavity are used to measure the temperature. The blackbody sources are 'viewed' by the instrument every scan, thereby providing a continuous calibration to minimise effects due to the orbital and long-term temperature variations of the instrument optics.

A thorough analysis of the SLSTR blackbody radiometric calibration budget was performed as part of the subsystem design and development process [13,14]. The uncertainty estimates for the two blackbody sources and their contributing effects are presented in Tables 3 and 4. Note that many of the contributing effects are based on manufacturers' data and analysis (so-called Type B estimates [15]). This is especially significant for the estimates of ground and orbit degradation effects, Table 4. Although the End-of-Life (EOL) budgets meet the requirement for SST measurement, these are predicted estimates and do not guarantee traceability to ITS-90.

Validation of the Radiometric Calibration
Pre-launch calibration testing at instrument level against accurate reference sources, under a controlled thermal-vacuum environment, is essential for thermal infrared instruments. Without these tests, it is not possible to demonstrate the radiometric calibration of TIR instruments within the required uncertainty levels.
The principle for SLSTR, following the experience of (A)ATSR, is to validate the on-board calibration systems and processing under flight-representative thermal vacuum conditions, with the instrument fully operational in its flight configuration, in a purpose-built calibration facility [9]. SLSTR-A was tested between March and May 2015 and SLSTR-B between October 2016 and February 2017. Measurements were performed at different thermal environment conditions and for different instrument configurations. Data are processed using the same algorithms and characterisation data that are used for flight. Furthermore, raw data generated during the tests are used to validate the flight L0-L1 processing chains. The results of the calibration tests demonstrated that the SLSTR calibration was within requirements over the expected dynamic range of the instrument. They also revealed an issue with SLSTR-A that requires a further processing step to account for a scan-dependent offset error [16].
After launch, establishing the absolute calibration to uncertainties <0.1 K becomes far more challenging. Vicarious calibration methods, such as those used for instruments operating in the visible to short-wave infrared range [17-19], are restricted by a number of factors, including knowledge of the surface emissivity, non-uniformity of the surface, the contribution of the atmosphere to the measured signal, and temporal variations over very short timescales due to the effects of surface winds, cloud shadow, and solar elevation. Hence, post-launch activities are currently limited to monitoring of instrument parameters, satellite-to-satellite intercomparisons, and the validation of the retrieved surface temperatures by in situ measurements.
On-orbit monitoring of critical parameters, in particular the radiometric noise, gains and offsets derived from the on-board calibration sources, the black-body and instrument temperatures provide a useful indication of the instrument's stability. While monitoring is used to provide an assessment of some components of the on-orbit calibration uncertainty budget, for example radiometric noise and temperature gradients, with no on-orbit reference standard such as a phase change cell, it is not possible to quantify and then correct for any on-orbit changes.
Currently, the only method available to evaluate the absolute calibration is via comparisons with other satellite sensors operating in the same wavelength range. An example is comparisons of SLSTR with respect to IASI on MetOp [20]. Here, Simultaneous Nadir Observations (SNOs) are performed by comparing IASI brightness temperatures (BTs) spectrally averaged over the spectral responses of the SLSTR bands with SLSTR BTs spatially aggregated over the IASI footprint. These comparisons have been performed for both SLSTR-A and SLSTR-B and show that SLSTR and IASI agree within 0.1 K for homogeneous scenes. However, there are limitations to the approach:

1. The spectral range of IASI does not cover the full range of SLSTR bands, so only the two channels at 10.8 µm and 12 µm can be compared.
2. Differences between the Sentinel-3 and MetOp orbits mean that SNOs are only available at high latitudes. This limits the range of scene temperatures over which the comparisons can be performed, typically 200 to 280 K; high temperature scenes are not included.
3. The presence of clouds and sea ice within the IASI footprint increases the noise in the measurements, which can be several K.
4. Because of the temporal variability of the IR measurements, the matchups have to be performed within a 5-min window.
5. Agreement does not signify accuracy: both sensors could be in error.

Another example of direct comparisons has been the Sentinel-3 tandem phase, where the S3A and S3B satellites flew on the same ground track 30 s apart for 4 months. This provided a unique opportunity to compare the radiometric calibrations of all SLSTR channels. The results from this phase are still under assessment but have been useful for verifying that the radiometric calibrations of the two sensors are consistent with each other [21]. Nevertheless, this does not demonstrate the absolute radiometric accuracy of either.
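The spectral-averaging step used in the SNO comparisons can be sketched as an SRF-weighted mean of the hyperspectral sounder measurement over an imager band. Strictly, radiances rather than brightness temperatures should be averaged and then converted; the wavelength grid, BT values and response below are invented for illustration:

```python
def band_average(wavelengths, values, srf):
    """Spectral-response-weighted average of a hyperspectral quantity
    over one imager band. All three sequences share the same grid;
    a simple weighted mean is adequate for this sketch."""
    num = sum(s * v for s, v in zip(srf, values))
    den = sum(srf)
    return num / den

# Toy spectrum and response around the 10.8 µm band (illustrative only)
wl  = [10.4, 10.6, 10.8, 11.0, 11.2]           # µm
bt  = [269.8, 270.1, 270.0, 269.9, 270.2]      # sounder-like BTs, K
srf = [0.2, 0.9, 1.0, 0.9, 0.2]                # band spectral response
print(f"band-averaged BT: {band_average(wl, bt, srf):.2f} K")
```

The matching spatial step aggregates the imager pixels falling within the sounder footprint before the two band-level BTs are differenced.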

Towards On-Orbit Traceability
Despite the efforts described in the previous sections, there are currently no operational IR sensors with the means to provide a calibration reference that is traceable to ITS-90.
One solution is to employ phase change cells within the calibration system. Under laboratory conditions, low uncertainty calibration is achieved using phase changes of pure materials, e.g., triple point, freezing point, and melting point, as these are extremely reproducible and have well characterised temperatures specified by ITS-90. Key fixed points are the triple points of mercury (−38.8344 °C) and water (0.01 °C), and the melting point of gallium (29.7646 °C). Typically, the temperature of on-board satellite BB calibrators can be changed in flight, so if Phase Change Cells (PCCs) are embedded in a BB structure in close proximity to the temperature sensors and the BB is warmed or cooled, then during the phase change the material will absorb or release heat without a significant change in temperature, and the thermometer output will show a 'plateau' at the phase change temperature. As this temperature is known, the thermometer can be calibrated in situ. Such miniature PCCs for use within blackbody sources have been developed and tested under the CLARREO project [22].
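The plateau-based in-situ calibration can be sketched as follows: find the samples where the thermometer reading is nearly constant during a melt, and compare their mean to the ITS-90 gallium point. The synthetic data, slope tolerance and function name are illustrative assumptions, not any flight algorithm:

```python
GALLIUM_MELT_C = 29.7646  # ITS-90 gallium melting point, °C

def plateau_offset(times_s, readings_c, slope_tol_c_per_s=1e-5):
    """Estimate a thermometer's calibration offset from a melt plateau.

    Keeps consecutive samples whose local slope is below a tolerance
    (the plateau), then compares their mean to the ITS-90 fixed point.
    Returns (measured_plateau_c, offset_c).
    """
    plateau = [readings_c[i] for i in range(1, len(readings_c))
               if abs((readings_c[i] - readings_c[i - 1]) /
                      (times_s[i] - times_s[i - 1])) < slope_tol_c_per_s]
    measured = sum(plateau) / len(plateau)
    return measured, measured - GALLIUM_MELT_C

# Synthetic melt: warming ramp, a flat plateau that the thermometer
# reads 12 mK high, then a ramp once the gallium has fully melted.
t = list(range(0, 600, 60))
r = [29.60, 29.70, 29.7766, 29.7766, 29.7766,
     29.7766, 29.7766, 29.85, 29.95, 30.05]
measured, offset = plateau_offset(t, r)
print(f"plateau {measured:.4f} C, thermometer offset {offset*1000:+.1f} mK")
```

Once the offset is known, it can be applied as a correction to the thermometer readings until the next melt, restoring the link to ITS-90 in orbit.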
A joint study with Rutherford Appleton Laboratory Space (RAL Space), the National Physical Laboratory (NPL) and Surrey NanoSystems (SNS), funded by the United Kingdom Space Agency (UKSA), has sought to improve the on-orbit traceability of satellite IR instruments [23]. The aim of the Next Generation InfraRed Source (NGenIRS) project was to demonstrate the performance of novel readout electronics, carbon-nanotube black coatings and a phase-change cell within an existing SLSTR-like blackbody cavity geometry, with the goal of reaching Technology Readiness Level (TRL) 5/6, Figure 2. The NGenIRS prototype BB cavity (BBC) is based on the design heritage and lessons learned from the S-3 SLSTR BB calibration systems but has been updated to incorporate the thermometry electronics, which wrap around the outside of the BBC; the Vantablack® S-VIS coating, providing an emissivity of 0.998; and a PCC mounted to the BBC base. Platinum resistance thermometers (PRTs) are used to monitor the temperature of the BBC, and two heaters located on the BBC base and wall are used to achieve the specific temperatures required for calibration.
The PCCs developed for NGenIRS use less than 2 g of gallium. Early attempts encountered excessive undercool, but the team has now achieved repeatable phase changes and a simple freezing procedure (requiring only 2-3 °C of undercool). The prototype cells have achieved melt durations of 3-12 h with a repeatability of <5 mK, independent of orientation. Furthermore, methods have been developed to ensure the long-term robustness of the PCCs, making them suitable for high levels of vibration and long-term use (many years) without leakage or contamination due to reaction between the gallium and the container, Figure 3 [24].
The PRTs in the cavity were measured by an embedded thermometer readout system consisting of semi-rigid printed circuit boards (PCBs), each made up of interleaved "thermometer sensor nodes" which incorporate all of the main mechanical and electronic elements. The circuit design provides high equivalent thermometric accuracy and low noise characteristics, is extremely compact and lightweight, consumes little power and is mechanically very robust. It also provides an extremely simple interface to the host spacecraft and requires no external circuitry beyond what is installed on the BBC.
The BBC was fully assembled and underwent a full thermal vacuum (TVAC) test in representative thermal conditions, defined using the SLSTR BB test process and thermal environment as a baseline. The test was split into three phases: TVAC cycling, PRT calibration, and characterisation of the BBC and PCC. The test successfully verified the performance of the critical functions of the three key technologies.
Tests under flight thermal vacuum conditions have verified that the phase change transition can be measured by the cavity thermometers to an uncertainty <20 mK (k = 3), Figure 4.

Conclusions
To realize the potential of satellite instruments in the TIR wavelength range for providing climate quality data records, it is essential that they be calibrated, with uncertainties <0.1 K (k = 3), to standards that are traceable to ITS-90 and consistent between different instruments. Instrument design is fundamental to achieving traceable measurements. This includes control of internal and external stray light sources to a level at which they do not dominate the uncertainty budget. The instrument traceability chain needs to be developed during the instrument design phase to identify all calibration effects, including those from data processing as well as from on-board calibration sources.
The primary route to SI-traceability in the TIR range is through the use of on-board blackbody sources. Here the measured radiance is linked to the measured temperature via the Planck function. A limitation of the blackbody sources flying on all current TIR instrument missions is that the uncertainty is based on pre-flight measurements of the thermometry system and on estimates of the long-term drift derived from analysis and predictions. Currently there are no direct methods to verify that the drift of the thermometry calibration is within the required limits.
Pre-launch calibration is essential to validate the end-to-end calibration model, particularly at thermal infrared wavelengths. Because TIR instruments are highly sensitive to their thermal environment, testing must be done with the instrument in flight-representative thermal vacuum conditions, where the temperatures of all thermal surfaces are controlled and measured.
Post-launch validation activities are limited to comparisons with other satellite sensors operating in the same wavelength range. However, these are currently limited by the uncertainties introduced by the comparison methods and by the uncertainty in the reference sensor's own radiometric calibration.
Recent developments of miniature phase change cells provide a route to maintaining the traceability of the on-orbit calibration of the blackbody thermometers to the uncertainties <20 mK (k = 3) necessary to achieve the requirements for climate change monitoring [4-6]. By implementing a PCC in the blackbody cavity, we are able to provide a very robust, repeatable fixed temperature reference that can be used to correct for on-orbit drifts, which is currently not possible in operational systems. This allows the radiometric budget to be maintained throughout the lifetime of the mission without relying on pre-launch estimates of drift.
Further work is in progress to raise the TRL of the technology with the goal of implementing in future flight missions.
