by Philip Pratt, Texas Instruments
Previously, analog-to-digital converters (ADCs) at high input frequencies were of limited usefulness due to their distortion and noise performance. Today, however, ADCs can provide an effective number of bits (ENOB) of nearly 9.5 at radio frequencies (RF) of 1GHz with signal bandwidths greater than 200MHz. Such performance at high frequencies eliminates a mixer stage, simplifying receiver design and improving overall system performance.
Until now, the only way to capture up to ~200MHz of bandwidth and achieve ENOB values greater than nine bits was to use a receive architecture that down-converted the RF to an intermediate frequency (IF) of less than 750MHz. While this is a fairly high frequency for an ADC, it is not high enough to eliminate a mixing stage in some applications. Eliminating an IF mixing stage helps in many ways: mixing stages add power consumption, heat, board space, cost, noise and distortion. Figure 1 offers a glimpse of ENOB versus input frequency for today’s best converters capable of capturing at least 200MHz (values estimated from device datasheets).
Challenges of VSWR
In a receiver, RF components must be carefully impedance-matched to maintain acceptable voltage standing wave ratio (VSWR) values, so that transmission-line reflections do not cost too much signal. This is true whether driving a mixing stage or an ADC. If the impedances of the driving circuit and load circuit cannot be sufficiently matched at high frequency, the driving circuit may not be able to drive the load and still maintain an acceptable signal level at the load. Ideally, the VSWR should be a ratio of 1:1; that is, all of the power transmitted to the load is absorbed by the load, with no reflection of energy.
In reality, this never happens. Most RF designers hope to obtain VSWR values <1.5:1 in actual board design. The higher the ratio, the more power the driver needs in order to maintain the same effective signal level at the load. Distortion out of the amplifier usually increases with signal level, so compensating for a bad VSWR with higher drive levels from the driving amplifier can degrade the whole signal chain.
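As a rough illustration of the tradeoff (the values here are hypothetical, not taken from any particular design), the standard transmission-line relations convert a VSWR figure into a reflection coefficient and a mismatch loss:

```python
import math

def reflection_coefficient(vswr):
    """Magnitude of the reflection coefficient |Gamma| for a given VSWR."""
    return (vswr - 1.0) / (vswr + 1.0)

def mismatch_loss_db(vswr):
    """Power lost to reflection at the load, in dB, for a given VSWR."""
    gamma = reflection_coefficient(vswr)
    return -10.0 * math.log10(1.0 - gamma ** 2)

for vswr in (1.0, 1.5, 2.0, 3.0):
    print(f"VSWR {vswr:.1f}:1 -> |Gamma| = {reflection_coefficient(vswr):.3f}, "
          f"mismatch loss = {mismatch_loss_db(vswr):.2f} dB")
```

At 1.5:1 the mismatch loss is only about 0.18dB, but it grows quickly as the match degrades, and that lost signal is what the driving amplifier has to make up with extra drive level.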
Typically at high frequency, an ADC’s inputs are driven ac-coupled through a transformer or balun after the signal has passed through a band-pass filter (to reject signals that would alias in from unwanted Nyquist zones). A computer simulation of the load impedance presented by the ADC should include the balun, the band-pass filter, the source impedance of the driving amplifier, and board parasitics. Such a simulation is a good start to understanding the impedance of the total load at the frequency of interest.
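Part of that planning is simply knowing which Nyquist zone the RF input occupies and where it lands after sampling. A minimal sketch (the sample rate below is hypothetical, chosen only for illustration):

```python
def nyquist_zone(f_in, f_s):
    """1-indexed Nyquist zone (each f_s/2 wide) that the input frequency falls in."""
    return int(f_in // (f_s / 2.0)) + 1

def aliased_frequency(f_in, f_s):
    """Frequency at which f_in appears in the first Nyquist zone after sampling at f_s."""
    f = f_in % f_s
    return f if f <= f_s / 2.0 else f_s - f

f_s = 500e6    # hypothetical 500-MSPS sample rate
f_in = 1200e6  # 1200-MHz RF input
print(nyquist_zone(f_in, f_s))             # 5 -> fifth Nyquist zone
print(aliased_frequency(f_in, f_s) / 1e6)  # 200.0 -> appears at 200 MHz
```

The band-pass filter’s job is to keep energy from the other Nyquist zones from folding onto that same 200MHz after sampling.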
The ADC’s input circuit should present a high impedance so that the parallel combination of the external circuitry and the internal circuitry does not result in an impedance too low to drive efficiently. High input impedance provides maximum flexibility to the circuitry outside the ADC.
If the ADC does not have an integrated high-impedance input buffer, it becomes nearly impossible to drive it at high frequencies while maintaining reasonable distortion from the driver amplifier. Without an integrated input buffer, the ADC input impedance at high frequency is often unacceptably low, and the amplifier cannot drive it without degraded performance or extremely high power consumption. The input impedance is low without an integrated buffer because the driving circuit most likely sees the ADC’s sampling capacitor, which has low impedance at high frequencies. For this reason, it is common for an ADC designed for use at these frequencies to have an integrated input buffer. Even with the buffer, the task is still not trivial. At high frequencies, the parasitic capacitance of the ADC’s package can become a significant component in the overall input impedance.
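To see why an unbuffered sampling capacitor is so difficult to drive, consider the impedance of a few picofarads at RF. The capacitance values below are illustrative only, not taken from any specific device:

```python
import math

def cap_impedance_ohms(c_farads, f_hz):
    """Impedance magnitude of a capacitor: |Z| = 1 / (2*pi*f*C)."""
    return 1.0 / (2.0 * math.pi * f_hz * c_farads)

f = 1e9  # 1-GHz analog input
print(cap_impedance_ohms(4e-12, f))  # ~40 ohms for a 4-pF unbuffered sampling capacitor
print(cap_impedance_ohms(1e-12, f))  # ~159 ohms for 1 pF of package parasitics
```

An impedance of ~40 ohms appearing in parallel with the external termination is a heavy load at 1GHz, and even 1pF of package parasitics is already in the same order of magnitude as a typical termination resistance.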
Take a look at Figure 2, the equivalent input circuit of an example ADC with a wide input bandwidth. The measured –3dB full power bandwidth (FPBW) of the ADC is ~2.3GHz (the frequency at which the input signal is attenuated by 3dB before being digitized). Notice that before the input signal reaches the die, it passes through parasitic capacitance and inductance in the package, bond wire, and bond pad. The ADC’s track-and-hold and sampling capacitor are located past the buffer inside the ADC. The frequency characteristics of the sampling circuits affect FPBW, but the buffer isolates them from affecting the VSWR of the analog inputs. This is important to point out: the ADC’s FPBW specification alone is not sufficient to answer the question of whether the ADC can be driven at the frequency of interest.
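As a rough feel for what the FPBW number implies, the attenuation at a given input frequency can be estimated by treating the input network as a single-pole roll-off. That is a simplification of a real multi-element input network, but it gives the order of magnitude:

```python
import math

def single_pole_attenuation_db(f_in_hz, f_3db_hz):
    """Attenuation (dB) of a single-pole response at f_in, given its -3dB corner."""
    return 10.0 * math.log10(1.0 + (f_in_hz / f_3db_hz) ** 2)

fpbw = 2.3e9  # measured -3dB full power bandwidth of the example ADC
for f_in in (1.0e9, 1.2e9, 2.3e9):
    print(f"{f_in / 1e9:.1f} GHz input: ~{single_pole_attenuation_db(f_in, fpbw):.2f} dB of attenuation")
```

At 1.2GHz this estimate is only about 1dB of attenuation, which is why a 2.3GHz-FPBW part is comfortable at these input frequencies, even though FPBW alone says nothing about VSWR or distortion.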
Challenges of Clock Jitter
One requirement that is not eliminated by removing a mixing stage is the high-quality oscillator. Instead of driving a mixer with an oscillator, you drive the ADC clock input; the ADC essentially performs the mixing by means of the clock and sampling circuit. The higher the analog input frequency, the tighter the requirement for total combined jitter between the clock and the ADC. To give an order of magnitude: if an ADC has internal aperture jitter of ~150fs, the RMS jitter at the ADC clock input pins must be <50fs to achieve 60dBc signal-to-noise ratio (SNR) at a 1200MHz analog input frequency. The clock jitter requirement is a function of the analog input frequency, not the clock frequency. A high-quality oscillator, and likely a band-pass filter in the clock path, are required to achieve such low RMS jitter. How to determine the required jitter is detailed in “Using high-IF sampling A/D converters beyond baseband frequencies” by Charles Sanna [1], as well as in the ADS54RF63 datasheet [2].
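The budgeting behind numbers like these usually starts from the classic jitter-limited SNR approximation, SNR = -20·log10(2π·f_IN·t_jitter), which considers only jitter and ignores thermal and quantization noise (device datasheet figures fold in those additional terms). A minimal sketch:

```python
import math

def jitter_limited_snr_db(f_in_hz, t_jitter_s):
    """SNR limit (dBc) imposed by total RMS jitter when sampling a tone at f_in."""
    return -20.0 * math.log10(2.0 * math.pi * f_in_hz * t_jitter_s)

def total_jitter_for_snr(f_in_hz, snr_db):
    """Total RMS jitter (seconds) allowed for a target jitter-limited SNR."""
    return 1.0 / (2.0 * math.pi * f_in_hz * 10.0 ** (snr_db / 20.0))

f_in = 1200e6                                    # 1200-MHz analog input
print(total_jitter_for_snr(f_in, 60.0) * 1e15)   # ~133 fs of total jitter for 60 dBc
print(jitter_limited_snr_db(f_in, 200e-15))      # ~56 dBc if total jitter grows to 200 fs
```

Aperture jitter and clock-path jitter combine as a root-sum-square, so the budget available for the external clock is the square root of the total allowance squared minus the aperture jitter squared; both the oscillator and the ADC contribute.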
Distortion, Noise, and EVM
ENOB is a function of both noise and distortion and is an excellent way to summarize the overall quality of a converter. Sometimes one or the other dominates the ENOB calculation. Depending on the bandwidth of the signal of interest, some systems can achieve the desired ADC noise floor through the processing gain of decimation. Distortion can lead to noise in the data link, so it is important to look at the achievable spurious-free dynamic range (SFDR) at the frequency of interest. SFDR is a measure of the worst spur, usually a distortion component, and cannot be easily improved without adding the complexity of dithering circuits or digital linearization techniques, which are limited in effectiveness. A wideband modulated signal will also modulate onto the ADC’s distortion. Often this falls across the signal of interest in the frequency domain (due to aliasing), adding to the in-band noise. If the distortion is low, its modulated version will be low in magnitude and add little in-band noise. If the ADC is receiving a wideband modulated signal and the distortion is not adequate, the error vector magnitude (EVM) will be degraded and the constellation points harder to detect, ultimately leading to a higher bit error rate for the data link.
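Two standard relationships connect these quantities: ENOB follows from SINAD as ENOB = (SINAD - 1.76)/6.02, and filtering and decimating to a narrower bandwidth buys processing gain of 10·log10(f_S/(2·BW)). The sketch below uses illustrative numbers only:

```python
import math

def enob_from_sinad(sinad_db):
    """Effective number of bits from SINAD (dB), assuming a full-scale sine-wave input."""
    return (sinad_db - 1.76) / 6.02

def processing_gain_db(f_s_hz, bw_hz):
    """SNR improvement from reducing the full Nyquist band (f_s/2) to bw_hz."""
    return 10.0 * math.log10(f_s_hz / (2.0 * bw_hz))

print(enob_from_sinad(59.0))             # ~9.5 bits for 59 dB of SINAD
print(processing_gain_db(500e6, 100e6))  # ~4 dB of gain, hypothetical rates
```

Note that processing gain helps only the noise floor; a distortion spur that aliases into the band of interest is not reduced by decimation, which is why SFDR has to be examined separately.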
As technology and design know-how continue to improve, the performance at high IF and clock rates should continue to increase.
References
1. “Using high-IF sampling A/D converters beyond baseband frequencies,” by Charles Sanna, Texas Instruments, EETimes Europe, October 5, 2007: https://www.eetimes.eu/201804147
2. To download datasheets, order samples or evaluation modules, visit: www.ti.com/ads5474.
About the Author
Philip M. Pratt is a Systems Engineer for High-Speed ADCs at Texas Instruments. Philip received a BSEE from the University of Tennessee, Knoxville, and an MSEE from Florida Institute of Technology, Melbourne.