Advances in digital subscriber line (DSL) data communication are opening a new world of Internet access for home users, one where they can speed to a remote server at data rates of 512 kbits/s to 8 Mbits/s. This should make DSL modems immensely popular, particularly since such high data rates are achieved over a common household telephone-wire pair, though the real "magic" is in the DSP algorithms and the data coding schemes.
However, as with almost any system, DSL still requires good fundamental analog amplifier functions to get power onto the wire and to pick off the small signals received at the other end. A review of the requirements placed upon the amplifiers and some of the challenges that need to be addressed will take some of the mystery out of DSL modem design.
The data information is processed through a circuit block called an AFE, for analog front end. Transmitted and received information is typically band-limited to the frequencies of interest for the particular type of modem (e.g., full-rate ADSL, G.Lite, HDSL2). The interface to the line is typically through a transformer for isolation and differential signal drive. A differential high-speed power amplifier stage drives the primary coil of the transformer to put the signal onto the phone line. Two back-termination resistors (Rbt) between the amplifier (driver) and the transformer are used for impedance matching. Proper selection of these resistors makes the impedance of the modem as seen from the line equal to the characteristic impedance of the line: 100 ohms or 135 ohms are typical.
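As a back-of-the-envelope sketch of that impedance match, the back-termination values follow directly from the line impedance and the transformer turns ratio. The function name and the 1:2 ratio below are illustrative assumptions, not from any particular design:

```python
def back_termination(z_line, turns_ratio):
    """Reflect the line impedance through the coupling transformer and
    split it between the two back-termination resistors (Rbt) of a
    differential driver. turns_ratio is the line:driver ratio n, so the
    line impedance appears at the primary divided by n**2."""
    z_reflected = z_line / turns_ratio ** 2
    r_bt_each = z_reflected / 2.0  # one Rbt in series with each output
    return z_reflected, r_bt_each

# A 100-ohm line seen through a 1:2 step-up transformer looks like
# 25 ohms at the primary, so each Rbt is 12.5 ohms.
z_ref, r_bt = back_termination(100.0, 2.0)
```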
The received signal is also coupled through the transformer and imposes a differential signal across the Rbt resistors. These signals are amplified by another pair of high-speed, low-noise amplifiers for the line receiver function. The configuration and scaling of the gain-setting resistors provide a first-order cancellation of the simultaneously occurring transmitted signal (echo cancellation). This configuration is called a 4-to-2 wire hybrid.
The toughest of the standards to implement is full-rate ADSL, used for downstream communication from the telephone company central office. For this standard, a full 100 mW of power needs to be put on the line in 256 carrier tones spaced 4 kHz apart, covering a bandwidth of more than 1 MHz.
The coding scheme used for DSL puts some interesting demands on the line-driver design. Excessive harmonic distortion in any of the tones can contaminate the frequency space used by other tones. The total harmonic distortion and any intermodulation distortion products of the driver amplifiers have to be more than 70 dB below the signal level of the fundamental carrier frequency (dBc).
The simultaneous combination of all of these tones makes the actual signal that appears on the phone line look like noise. The frequency and phase response of the line loop, together with the randomness of the data patterns transmitted, will occasionally (0.0001 percent of the time) align and create a peak in signal level. These peaks are significantly larger than the nominal signal level on the line (3.82x for HDSL2 or 5.33x for ADSL). This factor is called the PAR, or peak-to-average ratio. This peak must be handled by the driver without clipping or current limiting, or transmission errors will occur. This characteristic plays a major part in selecting a driver with sufficient output current and determines the power supply voltages to be used for the driver.
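The PAR numbers above translate directly into driver requirements. The sketch below works out the peak voltage and current for the full-rate ADSL case (100 mW into a 100-ohm line, PAR of 5.33); the 1:2 transformer turns ratio is an illustrative assumption:

```python
import math

def peak_drive(p_line_mw, z_line, par, turns_ratio):
    """Estimate the peak line voltage for a given average power and PAR,
    then reflect the peak voltage and current through a 1:n step-up
    transformer back to the driver side."""
    v_rms = math.sqrt(p_line_mw / 1000.0 * z_line)  # rms line voltage
    v_pk = par * v_rms                              # worst-case peak
    i_pk = v_pk / z_line                            # peak line current
    # a step-up divides the voltage but multiplies the current at the primary
    return v_pk / turns_ratio, i_pk * turns_ratio

# Full-rate ADSL: 100 mW into 100 ohms with PAR = 5.33, through an
# assumed 1:2 transformer
v_pk_drv, i_pk_drv = peak_drive(100.0, 100.0, 5.33, 2.0)
```

The 100 mW average corresponds to only about 3.16 V rms on a 100-ohm line, but the 5.33 PAR pushes the unclipped peak to nearly 17 V at the line, which is why output current and supply voltage dominate driver selection.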
In order to help minimize distortion from the driver stage, a differential configuration is used. If the gain and phase shift with frequency of each amplifier are perfectly matched, all of the even-order harmonic products are canceled. This can be explained mathematically by taking the difference between the two amplifier outputs with each output voltage function expressed as a power series.
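A quick numerical check of this cancellation, assuming perfectly matched second- and third-order power-series coefficients in both halves (the coefficient values are arbitrary, chosen only for illustration):

```python
import numpy as np

# Each amplifier output modeled as a power series of its input; the two
# halves of the differential stage see equal and opposite inputs.
a1, a2, a3 = 1.0, 0.05, 0.02   # arbitrary illustrative coefficients

t = np.linspace(0.0, 1.0, 4096, endpoint=False)
x = np.sin(2 * np.pi * 8 * t)  # a single tone, landing in FFT bin 8

out_pos = a1 * x + a2 * x**2 + a3 * x**3
out_neg = a1 * (-x) + a2 * (-x)**2 + a3 * (-x)**3
diff = out_pos - out_neg       # the differential output

spectrum = np.abs(np.fft.rfft(diff)) / len(t)
fund, second, third = spectrum[8], spectrum[16], spectrum[24]
# Even-order products are identical in both halves and cancel in diff;
# odd-order products (fundamental, third harmonic) remain, doubled.
```

The second-harmonic bin comes out at numerical noise level, while the fundamental and third harmonic survive at twice their single-ended amplitude, which is exactly what the power-series subtraction predicts.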
In DSL applications, dc precision of the amplifiers used is not a critical factor. Current feedback amplifiers and some of the newer voltage feedback amplifiers designed specifically for high-speed operation are ideal choices for DSL line-driving applications. They offer a very wide bandwidth and slew at very fast rates. These two factors combine to preserve the gain and phase shift of the transmitted line signal. All that is needed for a driver is power transistors in the output stage to provide the necessary current to achieve adequate line signal power.
In a G.Lite train-up sequence, the frequencies used range from 30 kHz to 552 kHz. The modem uses the lower band of frequencies (up to 140 kHz). During the train-up sequence, both the modem and the central office put out large-amplitude carriers covering the entire spectrum to characterize the line and determine which carriers are best to use for error-free data transfer. Any distortion from the driver will contaminate other carriers and possibly eliminate them from use for that connection, directly reducing the possible data rate for a given loop length (reach).
Once transmission begins, the intelligence at each end of the line adjusts the power level and frequency content of the signal to what is adequate for the connection. Higher frequencies are usually attenuated more over the line than lower frequencies. Distortion that interferes with the lower frequencies can cause trouble in finding an adequate number of carriers to use over a long line and result in no connection.
Varying driver requirements
Depending on the DSL standard, the operating requirements of the driver vary significantly. The most important operating conditions are supply voltage, peak output current and total power dissipation. The turns ratio of the line-coupling transformer controls the supply voltage and the peak output current. In general, with increasing turns ratio, peak driver current rises but operating supply voltage drops, leaving the power dissipation in the driver fairly constant.
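That constancy can be seen with a few lines of arithmetic. The peak values below are illustrative ADSL-class numbers (consistent with 100 mW and a 5.33 PAR on a 100-ohm line), not datasheet figures:

```python
# Illustrative ADSL-class peak values at the line:
v_pk_line, i_pk_line = 16.9, 0.169  # volts, amps

points = []
for n in (1.0, 1.5, 2.0):
    v_drv = v_pk_line / n   # a higher turns ratio lowers driver voltage...
    i_drv = i_pk_line * n   # ...but raises driver current in proportion
    points.append((n, v_drv, i_drv))
    # the volt-ampere product, and hence the dissipation it implies,
    # stays fixed regardless of the turns ratio chosen
    assert abs(v_drv * i_drv - v_pk_line * i_pk_line) < 1e-9
```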
A single-port modem card is not too demanding in terms of power requirements and heat generation. However, incorporating many ports on a single card can be a real eye-opener. The combined requirements of many channels can lead to major design challenges. For example, a 16-port ADSL card will require a 24-V supply with nearly 6-A peak current capability and will dissipate nearly 30 W if all ports are active simultaneously. The line power requirements for a DSL application are a given, but how they are achieved can be tailored to minimize power dissipation and heat. Three basic parameters can be optimized: the quiescent operating current of the amplifiers, the supply voltage and the pc-board layout for spreading heat.
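The per-port figures below are simply back-solved from the card totals quoted above (roughly 6 A and 30 W across 16 ports); they are a sanity check, not measured values:

```python
# Scaling the single-port numbers up to a 16-port ADSL card
ports = 16
peak_i_per_port = 0.375        # A; assumed so that 16 ports need ~6 A
diss_per_port = 30.0 / ports   # W; back-solved from the ~30 W card total

total_peak_i = ports * peak_i_per_port  # peak draw from the 24-V supply
```

Each port thus accounts for a little under 2 W of dissipation, which is why quiescent-current control and board-level heat spreading matter so much at high port counts.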
Several of the newer power amplifiers provide pins through which the quiescent current can be controlled by selecting an external resistor. A dual amplifier such as the LT1795 has internal current mirrors that scale the current through an on-chip diode to provide bias to the amplifiers. Once the supply voltage is known, a single resistor is chosen to set the quiescent current of each amplifier anywhere from 0 to 30 mA. Running the amplifiers at the maximum current wastes power when the line is not in use and can add significantly to the total power dissipation. With a plus/minus 15-V supply and 60 mA of supply current, the quiescent power dissipation alone is 1.8 W.
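That 1.8-W figure is just total rail-to-rail voltage times total quiescent current, as the minimal sketch below shows:

```python
def quiescent_power(v_supply_total, i_q_total):
    """Idle dissipation of the driver: total rail-to-rail supply
    voltage times total quiescent current, before any signal power."""
    return v_supply_total * i_q_total

# A dual amplifier on +/-15 V (30 V rail to rail) idling at 60 mA
# dissipates 1.8 W of heat before a single bit is transmitted.
p_q = quiescent_power(30.0, 0.060)
```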
Activity-based power-dissipation management reduces power dissipation in a line driver when it is not in use. Many amplifiers have a shutdown input line that completely disables them and drops the supply current to microamp levels. When enabled in preparation for another transmission, the LT1795 comes alive quickly and begins processing the input signal in 2 microseconds. This short time is insignificant in comparison to the line training-up interval. However, with complete shutdown, the output stage is also turned off. This removes the dc bias for the back termination resistors, which interferes with the receiver circuitry. To keep the receiver active and listening to the line, a partial shutdown is preferred. Retaining just enough current through the amplifier to keep the output stage biased keeps the termination resistors connected, yet still reduces the idle operating current by a factor of 10 from normal operating conditions.
Choosing the right supply voltage levels for the driver is very important for providing both an unclipped peak voltage swing and setting the power dissipation of the driver. The total supply voltage required includes the saturation voltage of the amplifiers, how close the amplifier output swings to the supply voltage rails, the signal loss across the Rbt resistors and the transformer turns ratio and insertion loss. These factors are added to the peak-to-peak voltage swing required on the line for a particular DSL standard.
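A rough supply budget can be assembled from those terms. Every number below is an illustrative assumption (the 1:2 turns ratio, 1.5 V of headroom per rail, a lossless transformer), and the simple model charges the matched back-termination resistors with half the output swing:

```python
def supply_span(v_pp_line, turns_ratio, headroom_v, loss_factor=1.0):
    """Estimate the total supply span (V+ minus V-) each amplifier of a
    differential driver needs. Matched back-termination resistors drop
    half the output swing, and the two amplifiers split the differential
    signal between them. loss_factor >= 1 covers transformer insertion
    loss; headroom_v is the assumed swing-to-rail margin per rail."""
    v_pp_primary = v_pp_line / turns_ratio * loss_factor
    v_pp_amp_pair = 2.0 * v_pp_primary  # matched Rbt doubles the swing
    v_pp_each = v_pp_amp_pair / 2.0     # each amplifier drives half of it
    return v_pp_each + 2.0 * headroom_v

# Roughly 33.7 Vpp of ADSL line signal (a 5.33 PAR on a 3.16-Vrms
# signal), an assumed 1:2 transformer and 1.5 V of headroom per rail:
span = supply_span(33.7, 2.0, 1.5)  # just under 20 V total
```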
The most significant way to reduce the driver supply voltage is to let the transformer provide signal amplification through a higher turns ratio. However, the limit to this approach is that the more the transformer steps up the transmitted signal, the more it also reduces the received signal. Another, more obvious, approach is to modify the power supply output voltage to match just what is required.
The line-driver function requires more attention than the receiver function does; for the receiver, low noise is the primary selection parameter. A little fundamental analog know-how and proper selection of the amplifiers make the magic of DSL a reality.
See related chart