The time-interleaved RF-ADC relies on several background calibration loops to maximize its performance over process variation and to maintain that performance over voltage and temperature variation. These background calibration algorithms require an input signal to be present for a period of time to converge to optimal performance. When the input signal is absent, the calibration diverges; as a result, when the input signal is restored, the calibration takes some time to re-converge.
Practical applications with intermittent inputs, such as burst-mode signals in cable applications and TDD wireless communications, or unpredictable signals in radar applications, degrade the accuracy of the background calibration if not handled properly.
A magnitude detector is built into each RF-ADC channel to monitor the RF-ADC data and detect when the signal magnitude falls below a threshold. The on/off state of the signal is presented to the programmable logic and can be used, through the real-time port, to freeze or unfreeze the background calibration blocks.
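The detect-and-freeze behavior described above can be illustrated with a minimal software model. This is a hypothetical sketch, not the hardware implementation: the struct fields, the hold-off count, and the simple absolute-value magnitude check are all assumptions made for illustration. It shows the essential idea of asserting a freeze flag only after the magnitude has stayed below the threshold for a number of consecutive samples, and releasing it as soon as the signal returns.

```c
#include <stdint.h>

/* Hypothetical model of a per-channel magnitude detector.
 * All names and fields are illustrative, not the actual hardware registers. */
typedef struct {
    int32_t  threshold;     /* magnitude threshold, in ADC codes */
    uint32_t hold_samples;  /* consecutive under-threshold samples before freezing */
    uint32_t under_count;   /* running count of under-threshold samples */
    int      freeze;        /* 1 = background calibration frozen */
} mag_detector;

/* Process one ADC sample and return the current freeze flag. */
int mag_detector_step(mag_detector *d, int32_t sample)
{
    int32_t mag = (sample < 0) ? -sample : sample;  /* simple |x| magnitude */

    if (mag < d->threshold) {
        if (d->under_count < d->hold_samples)
            d->under_count++;
        if (d->under_count >= d->hold_samples)
            d->freeze = 1;   /* signal absent: freeze background calibration */
    } else {
        d->under_count = 0;  /* signal present: resume background calibration */
        d->freeze = 0;
    }
    return d->freeze;
}
```

In real designs the freeze flag would drive the real-time port of the calibration blocks; the hold-off count prevents a brief dip in signal level from toggling the calibration state.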