For the elimination of radio-frequency interference (RFI) in a passive microwave radiometer, the threshold level is generally calculated from the mean value and standard deviation. However, the presence of RFI can itself inflate this threshold level, introducing serious errors into the retrieved brightness temperature. In this paper, we propose a method to detect and mitigate RFI contamination by deriving the threshold level from statistical criteria based on a spectrogram technique. Mean and skewness spectrograms are created from a brightness-temperature spectrogram by shifting a 2-D window, so as to discriminate the symmetric distribution that characterizes natural thermal emission. From the bins of the mean spectrogram that remain after the RFI-flagged bins in the skewness spectrogram are eliminated, for data captured at 0.1-s intervals, a symmetric distribution is constructed by mirroring the left side of the distribution while the assumed centre position of the distribution is varied. The kurtosis of each candidate symmetric distribution is computed in turn, and the retrieved brightness temperature is taken as the centre value whose kurtosis is closest to three. The performance is evaluated using experimental data: across the tested RFI levels and cases, the maximum error and root-mean-square error (RMSE) in the retrieved brightness temperature are found to be less than approximately 3 K and 1.7 K, respectively, for a window of 100 × 100 time–frequency bins.
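The two stages of the method can be illustrated with a short sketch: sliding-window mean/skewness spectrograms, followed by a left-side mirroring search for the centre whose kurtosis is closest to the Gaussian value of three. This is a minimal illustration, not the paper's implementation; the window handling (non-overlapping blocks), the candidate-centre search range, and the function names are assumptions introduced here.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def sliding_stats(tb, win=100):
    """Mean and skewness spectrograms from a brightness-temperature
    spectrogram `tb` (time x frequency), computed over win x win windows.
    Simplification: non-overlapping blocks instead of a shifted window."""
    nt, nf = tb.shape
    mean_sg = np.empty((nt // win, nf // win))
    skew_sg = np.empty_like(mean_sg)
    for i in range(nt // win):
        for j in range(nf // win):
            block = tb[i * win:(i + 1) * win, j * win:(j + 1) * win].ravel()
            mean_sg[i, j] = block.mean()
            skew_sg[i, j] = skew(block)  # RFI skews the distribution rightward
    return mean_sg, skew_sg

def retrieve_tb(samples, n_steps=50):
    """Mirror the left side of the sample distribution about a trial
    centre and keep the centre whose Pearson kurtosis is closest to 3
    (i.e. closest to a Gaussian).  Search range is an assumption."""
    lo, hi = np.percentile(samples, [20, 80])
    best_c, best_err = None, np.inf
    for c in np.linspace(lo, hi, n_steps):
        left = samples[samples <= c]
        if left.size < 10:
            continue
        mirrored = np.concatenate([left, 2 * c - left])  # symmetric about c
        k = kurtosis(mirrored, fisher=False)             # Gaussian -> 3
        if abs(k - 3.0) < best_err:
            best_err, best_c = abs(k - 3.0), c
    return best_c
```

For natural-emission samples contaminated by additive RFI (which only raises brightness temperatures), the left side of the distribution is largely RFI-free, so the mirrored distribution is closest to Gaussian when the trial centre coincides with the true mean.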
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.