Global land surface temperature (LST) data derived from satellite-based infrared radiance measurements are highly valuable for various applications in climate research. Although in situ validation of satellite LST data sets is challenging, it is essential for obtaining quantitative information on their accuracy. In the standardised approach to multi-sensor validation presented here for the first time, LST data sets obtained with state-of-the-art retrieval algorithms from several sensors (AATSR, GOES, MODIS, and SEVIRI) are matched spatially and temporally, in a consistent manner, with multiple years of in situ data from globally distributed stations representing various land cover types. Commonality of treatment is essential for the approach: all satellite data sets are projected to the same spatial grid and transformed into a common harmonised format, thereby allowing comparison with in situ data to be undertaken with the same methodology and data processing. The large database of standardised satellite LST provided by the European Space Agency's GlobTemperature project makes LST studies and applications that were previously difficult to perform more feasible and easier to implement. The satellite data sets are validated over either three or ten years, depending on data availability. Average accuracies over the whole time span are generally within ±2.0 K at night and within ±4.0 K during the day. Time series analyses over individual stations reveal seasonal cycles, which, depending on the station, stem from surface anisotropy, topography, or heterogeneous land cover. The results demonstrate the maturity of the LST products, but also highlight the need to carefully consider their temporal and spatial properties when using them for scientific purposes.
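The spatiotemporal matchup described above can be illustrated with a minimal sketch: each in situ record is paired with the satellite observation at the nearest common-grid cell whose acquisition time falls within a tolerance window. The grid resolution, the 15-minute window, and the toy coordinates are illustrative assumptions, not values from the study.

```python
from datetime import datetime, timedelta

# Assumed matchup parameters (not taken from the paper)
GRID_RES = 0.05                  # common grid spacing in degrees
TIME_TOL = timedelta(minutes=15) # maximum allowed time difference

def snap_to_grid(lat, lon, res=GRID_RES):
    """Map a coordinate onto the common satellite grid cell."""
    return (round(lat / res) * res, round(lon / res) * res)

def match(sat_obs, station_lat, station_lon, t_insitu):
    """Return (satellite LST, time difference) for the closest-in-time
    observation at the station's grid cell, or None if no observation
    falls within the tolerance window."""
    cell = snap_to_grid(station_lat, station_lon)
    candidates = [(abs(t - t_insitu), lst)
                  for (c, t, lst) in sat_obs
                  if c == cell and abs(t - t_insitu) <= TIME_TOL]
    if not candidates:
        return None
    dt, lst = min(candidates)
    return lst, dt

# Toy satellite records: (grid cell, acquisition time, LST in K)
sat_obs = [
    (snap_to_grid(38.72, -9.14), datetime(2010, 7, 1, 10, 58), 310.2),
    (snap_to_grid(38.72, -9.14), datetime(2010, 7, 1, 11, 13), 311.0),
]

# In situ measurement at 11:00 UTC matches the 10:58 overpass (2 min apart)
result = match(sat_obs, 38.72, -9.14, datetime(2010, 7, 1, 11, 0))
```

Snapping both the station and the satellite pixels to the same grid is what makes the comparison consistent across sensors with different native resolutions, mirroring the common-grid harmonisation described in the abstract.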
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.