The local oscillators (LOs) of the Soil Moisture and Ocean Salinity (SMOS) mission payload shift the operating frequency of the 72 receivers to an intermediate frequency suitable for signal processing. Temperature variations of the LOs produce phase errors in the visibilities, which result in a blurring of the reconstructed brightness temperature (Tb) image.
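As a rough illustration of the mechanism, the sketch below models a per-receiver LO phase that drifts with temperature; an interferometric baseline between receivers k and l then sees the differential error exp(j(θk − θl)) on its visibility. All names and numbers (receiver count aside) are assumptions for illustration, not values from the paper:

```python
import numpy as np

# Hypothetical sketch: per-receiver LO phase drifts corrupting visibilities.
# The linear temperature-to-phase coefficient and temperature excursions
# below are assumed values, not SMOS calibration data.

rng = np.random.default_rng(0)
n_rx = 72                                   # number of receivers
true_vis = np.exp(1j * rng.uniform(0, 2 * np.pi, (n_rx, n_rx)))  # toy "true" visibilities

phase_per_kelvin = np.deg2rad(2.0)          # assumed LO phase sensitivity [rad/K]
delta_T = rng.normal(0.0, 0.2, n_rx)        # temperature excursion since last calibration [K]
theta = phase_per_kelvin * delta_T          # per-receiver LO phase error [rad]

# A baseline (k, l) sees the differential phase error theta_k - theta_l.
err = np.exp(1j * (theta[:, None] - theta[None, :]))
meas_vis = true_vis * err

phase_err = np.angle(meas_vis * np.conj(true_vis))
print("max baseline phase error [deg]:", np.rad2deg(np.abs(phase_err).max()))
```

The longer the interval since the last LO calibration, the larger the accumulated temperature excursion, and hence the larger the differential phase error on each baseline.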
At the end of the commissioning phase, it was decided to calibrate the LOs every 10 min, pending a more in-depth analysis. During short periods, the LO calibration was performed every 2 min to assess the impact of a higher calibration rate on the quality of the data.
By means of a decimation experiment, the relative errors of the 6- and 10-min calibration-interval data sets are estimated using the 2-min data as a reference. A noticeable systematic across- and along-track pattern with an amplitude of ±0.3 K is observed in the Tb differences between the 10- and 2-min data sets, whereas it is clearly reduced in the differences between the 6- and 2-min data sets.
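The comparison behind these numbers can be sketched with synthetic data: the same scene is "processed" twice, the coarser calibration adds a slow systematic ripple, and the difference against the reference is characterized statistically. The ripple amplitude, noise levels, and sample count below are assumed toy values, not SMOS products:

```python
import numpy as np

# Illustrative decimation comparison on synthetic data (all values assumed):
# identical measurements, differing only in a systematic error attributed to
# the coarser LO calibration interval, compared against the 2-min reference.

rng = np.random.default_rng(1)
n = 1000                                    # samples along the orbit
tb_2min = 150.0 + rng.normal(0.0, 2.0, n)   # toy reference Tb [K]

# Assume the coarser calibration adds a slow latitude-dependent ripple
# (amplitude 0.3 K, as a stand-in for the observed pattern) plus small noise.
lat = np.linspace(-60, 60, n)
tb_10min = tb_2min + 0.3 * np.sin(2 * np.pi * lat / 30) + rng.normal(0, 0.05, n)

diff = tb_10min - tb_2min
print(f"bias = {diff.mean():+.3f} K, std = {diff.std():.3f} K, "
      f"peak-to-peak of systematic ~ {diff.max() - diff.min():.3f} K")
```

Because both data sets are built from the very same measurements, the random noise cancels in the difference and the remaining structure is the systematic error of interest.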
Scatter plots of the SMOS Tb differences between the 10- and 2-min and between the 6- and 2-min data sets at H-pol and V-pol are shown in Fig. 1 as a function of latitude (which is proportional to time) for the corresponding portion of the orbit.
A simulation experiment confirms that such systematic patterns are due to the visibility phase errors induced by the LO calibration rate. The analysis was performed with the MIRAS Testing Software (MTS), an independent software package that fully processes SMOS raw data up to brightness temperature images. With the decimation scheme, the thermal noise (or radiometric sensitivity) of the measurements is correlated across the three cases (2, 6, and 10 min), since the measurements are exactly the same and only the LO phase calibration processing differs.
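The core of such a simulation can be reduced to a one-dimensional sketch: the brightness distribution is recovered from visibility samples by an inverse Fourier transform, so phase errors applied at visibility level re-emerge as spatial structures in the image. This is a minimal illustration of the principle, not the MTS processing chain; the scene, phase-error statistics, and 1-D FFT inversion are all assumptions:

```python
import numpy as np

# Minimal 1-D sketch (assumptions throughout): visibilities are idealized as
# the Fourier transform of the scene; phase errors on the visibilities
# translate into systematic spatial structure in the reconstructed image.

rng = np.random.default_rng(2)
n = 128
tb_true = np.full(n, 100.0)
tb_true[40:60] = 150.0                       # toy scene [K]

vis = np.fft.fft(tb_true)                    # idealized visibility samples

theta = rng.normal(0.0, np.deg2rad(1.0), n)  # assumed visibility phase errors [rad]
tb_err = np.fft.ifft(vis * np.exp(1j * theta)).real

residual = tb_err - tb_true                  # systematic image-domain error
print("rms systematic Tb error [K]:", residual.std().round(3))
```

For a fixed set of phase errors the residual is deterministic, i.e. it appears as a repeatable spatial pattern rather than additional random noise, which is consistent with the structured across- and along-track signatures described above.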
Moreover, the impact of the LO phase calibration rate on the brightness temperatures has been analyzed in the antenna frame using real data (a half-orbit over the Pacific), as shown in Fig. 4. The figure clearly shows that LO phase errors at visibility level translate into systematic brightness temperature errors, which appear as spatial structures in the antenna frame.
In summary, the visibility phase errors induce noticeable systematic patterns in the spatial Tb distribution, which propagate into the sea surface salinity (SSS) retrievals. Overall, the SSS error increase (relative to the 2-min SSS data) is about 0.39 psu for the 10-min data set and 0.14 psu for the 6-min data set.
Work published in: Gabarró, C., González-Gambau, V., Corbella, I., Torres, F., Martínez, J., Portabella, M., and Font, J., "Impact of the Local Oscillator Calibration Rate on the SMOS Measurements and Retrieved Salinities," IEEE Transactions on Geoscience and Remote Sensing, doi: 10.1109/TGRS.2012.2233744, in press.