# Cut ADC Skewing Errors

Today's high-speed digitizers, such as those in oscilloscopes, may employ more than one ADC (analog-to-digital converter) per channel and interleave them. Even if they don't, many four-channel oscilloscopes can interleave two channels, providing twice the sampling rate on one or two channels compared with what's attainable when three or four channels are active.

When two ADCs are interleaved, they sample on alternate edges of the clock, yielding an effective sample rate twice that of a single ADC. Thus, two ADCs can produce twice as much data as one. Interleaving two ADCs isn't, however, so easy. The sample clock's duty cycle must be close to a perfect 50 percent to minimize distortion. Clock jitter and ADC gain mismatch also add distortion to the sampled signal. Digital techniques can correct much of the timing skew and clock jitter, but some distortion remains, particularly from mismatched ADC gains.
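The interleaving itself is just a time demultiplex: each ADC runs at half the combined rate, offset by half a cycle, and the two streams are woven back together. A minimal sketch (the rates and tone frequency are illustrative values, not from any specific instrument):

```python
import numpy as np

fs = 2.0e9                        # target interleaved sample rate (illustrative)
t = np.arange(16) / fs            # 16 interleaved sample instants

# Each ADC runs at fs/2; ADC 1's clock lags ADC 0's by one period of fs.
t_adc0 = t[0::2]                  # ADC 0 takes samples 0, 2, 4, ...
t_adc1 = t[1::2]                  # ADC 1 takes samples 1, 3, 5, ...

sig = lambda t: np.sin(2 * np.pi * 300e6 * t)   # a 300 MHz test tone
samples0 = sig(t_adc0)
samples1 = sig(t_adc1)

# Weave the two half-rate streams back into one record at rate fs:
combined = np.empty(t.size)
combined[0::2] = samples0
combined[1::2] = samples1
assert np.allclose(combined, sig(t))   # identical to one ideal ADC at fs
```

With ideal, perfectly matched converters the combined record is indistinguishable from a single ADC sampling at the full rate; the trouble described below starts when the two converters' gains or timing differ.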

A recent article in EDN, "Wideband error correction elevates time-interleaved ADCs," by Per Loewenborg of Signal Processing Devices, addresses the issue.

The figure below (Figure 2 in the article) shows the problem. The signal to be sampled is the blue sine wave. Because of the mismatched ADCs, an aliased distortion signal (red trace) is created, which adds to and distorts the original signal.

Loewenborg's solution to the problem is to process the sampled signal in the frequency domain, that is, to run a Fourier transform on the signal. The result reveals spurs -- unwanted frequencies -- that digital techniques can then filter out.
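The article's wideband method also corrects frequency-dependent mismatch and timing skew; as a much simpler toy illustration of the idea, a pure (frequency-flat) gain mismatch can be removed by equalizing the RMS power of the two sub-streams. The function name and parameters here are hypothetical, not from the article:

```python
import numpy as np

def equalize_interleaved_gains(x):
    # Rescale the odd-indexed ADC's samples so both sub-streams carry
    # equal RMS power. This cancels a frequency-flat gain mismatch only;
    # a wideband correction must also handle frequency-dependent
    # mismatch and timing skew.
    rms_even = np.sqrt(np.mean(x[0::2] ** 2))
    rms_odd = np.sqrt(np.mean(x[1::2] ** 2))
    y = np.asarray(x, dtype=float).copy()
    y[1::2] *= rms_even / rms_odd
    return y

# Demo with a toy 2% mismatch (illustrative values):
fs, fin = 1000.0, 101.0
n = np.arange(4096)
captured = np.where(n % 2 == 0, 1.00, 0.98) * np.sin(2 * np.pi * fin * n / fs)
corrected = equalize_interleaved_gains(captured)

def spur_level(x):
    # Magnitude of the windowed spectrum at the mismatch image, fs/2 - fin.
    spec = np.abs(np.fft.rfft(x * np.hanning(x.size)))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    return spec[np.argmin(np.abs(freqs - (fs / 2 - fin)))]
```

After equalization the image spur drops well below its original level, which is the effect the article's correction achieves across a wide band rather than for a single tone.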

Loewenborg concludes:

The large benefit here is that the sampling frequency has in fact been doubled compared with the state-of-the-art single-core 14-bit ADC. Thus, one can draw the conclusion that digital time-interleaved ADC mismatch error correction makes the SFDR (spurious-free dynamic range)-performance of the time-interleaved ADC array to correspond to that of a single-core ADC.

Take a look at the EDN article. Does Loewenborg's theory seem practical? Share your thoughts.