The radiometric resolution of a satellite sensor refers to the smallest increment in spectral radiance that the imaging sensor can detect. The fewer bits used for signal discretization, the larger the quantization error in the measured radiance. In satellite inter-calibration, a difference in radiometric resolution between a reference and a target sensor can induce a calibration bias if not properly accounted for. The effect is greater for sensors with a quadratic count response, such as the Geostationary Meteorological Satellite-5 (GMS-5) visible imager, where the quantization difference can introduce non-linearity into the inter-comparison datasets, thereby affecting the cross-calibration slope and offset. This paper describes a simulation approach to highlight the importance of considering radiometric quantization in cross-calibration and presents a correction method for mitigating its impact. The method, when applied to the cross-calibration of the GMS-5 and Terra Moderate Resolution Imaging Spectroradiometer (MODIS) sensors, improved the absolute calibration accuracy of the GMS-5 imager. This was validated via radiometric inter-comparison of GMS-5 and Multifunction Transport Satellite-2 (MTSAT-2) imager top-of-atmosphere (TOA) measurements over deep convective clouds (DCC) and Badain Desert invariant targets. With the proposed correction method, the radiometric bias between GMS-5 and MTSAT-2 was reduced from 1.9% to 0.5% for DCC, and from 7.7% to 2.3% for Badain.
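The core effect described above can be illustrated with a minimal numerical sketch. The bit depths, signal range, and the idealized response L = c² below are illustrative assumptions, not the actual GMS-5 or MODIS characteristics: under a quadratic count response, a fixed quantization step in counts maps to a radiance error that grows with signal level, which is why a bit-depth difference distorts inter-comparison datasets.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(x, bits):
    """Round a signal in [0, 1] to the nearest level of a `bits`-bit digitizer."""
    levels = 2**bits - 1
    return np.round(x * levels) / levels

# "True" normalized TOA radiances observed by both sensors (assumed uniform)
radiance = rng.uniform(0.0, 1.0, 100_000)

# Target sensor: coarse 6-bit counts with a quadratic response, L = c**2
counts = quantize(np.sqrt(radiance), bits=6)
recovered = counts**2  # radiance recovered through the quadratic response

# Quantization-induced radiance error, split by scene brightness
err = recovered - radiance
err_lo = np.sqrt(np.mean(err[radiance < 0.1] ** 2))
err_hi = np.sqrt(np.mean(err[radiance > 0.9] ** 2))
print(f"RMS radiance error: dark scenes {err_lo:.4f}, bright scenes {err_hi:.4f}")
```

The bright-scene RMS error comes out several times larger than the dark-scene error, even though the count-domain quantization step is constant; a linear fit against a finely quantized reference therefore picks up a scene-dependent distortion rather than a uniform noise floor.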
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.