Abstract:
The Aquarius/SAC-D mission was launched on June 10, 2011 from Vandenberg Air Force Base. Aquarius consists of an L-band radiometer and scatterometer intended to provide global maps of sea surface salinity. One of the main mission objectives is to provide monthly global salinity maps for climate studies of ocean circulation, surface evaporation and precipitation, air/sea interactions, and other processes. Therefore, it is critical that any spatial or temporal systematic biases be characterized and corrected. One of the main mission requirements is to measure salinity with an accuracy of 0.2 psu on monthly time scales, which requires a brightness temperature stability of about 0.1 K, a challenging requirement for the radiometer. A secondary use of the Aquarius data is for soil moisture applications, which requires brightness temperature stability at the warm end of the brightness temperature dynamic range. Soon after launch, time-variable drifts were observed in the Aquarius sea surface salinity data relative to in-situ data from ARGO and ocean models. These drifts could arise from a number of sources, including components of the retrieval algorithm, such as the correction for direct and reflected galactic emission, or from the instrument brightness temperature calibration. If they arise from the brightness temperature calibration, they could have both gain and offset components. The nature of the drifts must be understood before a suitable correction can be implemented. This paper describes the approach used to detect and characterize the components of the drift attributable to the brightness temperature calibration, using on-Earth reference targets that are independent of the ocean model.