Since 2009, three low-frequency microwave sensors have been launched into space with the capability of globally monitoring sea surface salinity (SSS). The European Space Agency's (ESA's) Microwave Imaging Radiometer using Aperture Synthesis (MIRAS), onboard the Soil Moisture and Ocean Salinity (SMOS) mission, and the National Aeronautics and Space Administration's (NASA's) Aquarius and Soil Moisture Active Passive (SMAP) missions use L-band radiometry to measure SSS. There are notable differences in the instrumental approaches, as well as in the retrieval algorithms. We compare the salinity retrieved from these three spaceborne sensors to in situ observations from the Argo network of profiling floats, and we analyze some possible causes for the differences. We present comparisons of the long-term global spatial distribution, the temporal variability for a set of regions of interest, and the statistical distributions. We analyze some of the possible causes for the differences between the various satellite SSS products by reprocessing the retrievals from Aquarius brightness temperatures, changing the model for the seawater dielectric constant and the ancillary sea surface temperature product. We quantify the impact of these changes on the differences in SSS between Aquarius and SMOS. We also identify the impact of the corrections for atmospheric effects recently modified in the Aquarius SSS retrievals. All three satellites exhibit SSS errors with a strong dependence on sea surface temperature, but this dependence varies significantly with the sensor. We show that these differences are due first and foremost to the dielectric constant model, then to the atmospheric corrections, and to a lesser extent to the ancillary sea surface temperature product.
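The satellite-versus-Argo comparison described above amounts to computing matchup statistics (bias and RMSD) of the satellite-minus-in-situ salinity differences, binned by sea surface temperature to expose the SST-dependent errors. The following is a minimal illustrative sketch of such a binned analysis using entirely synthetic matchup data with a hypothetical cold-water bias; the numbers, noise levels, and bias shape are assumptions for demonstration only and do not represent any mission's actual retrievals.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic collocated matchups (illustrative only, not real mission data):
# Argo SSS as the reference, and a satellite SSS with an SST-dependent
# error mimicking the temperature-dependent biases discussed above.
n = 5000
sst = rng.uniform(0.0, 30.0, n)        # sea surface temperature, deg C
sss_argo = rng.normal(35.0, 1.0, n)    # reference salinity, psu
bias_model = 0.02 * (15.0 - sst)       # hypothetical cold-water bias, psu
sss_sat = sss_argo + bias_model + rng.normal(0.0, 0.2, n)  # noisy retrieval

# Bin the satellite-minus-Argo differences by SST; report bias and RMSD.
diff = sss_sat - sss_argo
edges = np.arange(0.0, 31.0, 5.0)
for lo, hi in zip(edges[:-1], edges[1:]):
    sel = (sst >= lo) & (sst < hi)
    bias = diff[sel].mean()
    rmsd = np.sqrt((diff[sel] ** 2).mean())
    print(f"SST {lo:4.1f}-{hi:4.1f} C: bias={bias:+.3f} psu, rmsd={rmsd:.3f} psu")
```

In a real analysis the synthetic arrays would be replaced by collocated satellite and Argo observations, and the same binning could be repeated per sensor and per reprocessing (e.g., with a different dielectric constant model) to attribute the SST-dependent differences.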
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.