By Randy Stephens, Texas Instruments, CommsDesign.com
ADSL technology has been a resounding success, and the number of subscribers is expected to continue growing over the next several years. But with the consumer quest for more speed continuing, many believe that VDSL is the next step in the evolution of wired broadband.
Like ADSL, VDSL must communicate with the outside world through analog signals. And, just like ADSL, the analog interface issues facing VDSL are complex, irrespective of which line coding scheme (QAM or DMT) finally emerges as the standard.
In this article, we'll take a look at the analog interface issues that designers must grapple with when developing a VDSL front-end. During the discussion, we'll look at bit error rate (BER) requirements, noise, distortion, and more. To kick it off, however, let's start by examining VDSL line driver requirements.
Defining the Line Driver Requirements
VDSL breaks the spectrum into five different bands, as shown in Figure 1. Compared to the lop-sided ADSL spectrum, this band plan makes symmetrical data rates easy to obtain.
Granted, line conditions and the service providers will dictate the actual line
rate utilization, but the capability of achieving symmetrical data rates exists.
Additionally, each manufacturer may utilize two, three, four, or all five bands if desired. This makes it easy to tailor the data rates to the service options a telephone company wants to offer.
Figure 1: T1.424 upstream and downstream bands.
Regardless of the data pump, the data converters, and any filtering, the signals ultimately have to be transmitted in the time domain using analog amplifiers. An example circuit configuration employing a 1:1 transformer ratio is shown in Figure 2.
Figure 2: VDSL line driver circuit with +14.5-dBm line output and a 1:1 transformer.
The circuit shown in Figure 2 uses traditional termination techniques for simplicity. This configuration has been widely used in ADSL and has proven very effective thanks to its differential nature. Its advantages include doubling the voltage appearing across the transformer, reduction of even-order harmonics, and balanced signaling maintained throughout the system.
Power Level Concerns
The VDSL draft standard for T1.424 states that
the maximum power level on the line shall be no more than +14.5 dBm for the
central-office (CO) deployment for both upstream and downstream signals while
using 100-ohm line. For the cabinet deployment, the downstream is limited to
+11.5 dBm and +14.5 dBm for the upstream. By contrast, ADSL allows +13 dBm for
the upstream and +20 dBm for the downstream. The ETSI standard dictates a
maximum of +11.5 dBm for all conditions while using 135-ohm line.
One nice feature of VDSL is that the standards allow the system to transmit at the same -40 dBm/Hz power spectral density (PSD) level as ADSL within the ADSL frequency range of 25 kHz to 1.1 MHz. Above 1.1 MHz, the limit is generally -60 dBm/Hz, but some masks allow as much as -50 dBm/Hz at some points in the spectrum up to 12 MHz. The benefit is that VDSL can achieve the same data rates at the same distances as ADSL while retaining the capability of substantial rate improvements for shorter loop lengths.
The first task is to translate these power levels into voltages and currents. For T1.424 and the 100-ohm load, +14.5 dBm implies a voltage of 1.68 Vrms and +11.5 dBm implies 1.19 Vrms. For the ETSI standard, +11.5 dBm implies a voltage of 1.38 Vrms. Using these values, the next step is to examine what the line driver amplifier must provide in terms of peak voltage and current.
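The dBm-to-voltage arithmetic above (P = Vrms^2/R, with dBm referenced to 1 mW) can be sketched in a few lines of Python; the helper name is mine, not from any standard:

```python
import math

def dbm_to_vrms(p_dbm: float, r_ohms: float) -> float:
    """Convert a line power in dBm (referenced to 1 mW) to Vrms across r_ohms."""
    p_watts = 10 ** (p_dbm / 10) / 1000
    return math.sqrt(p_watts * r_ohms)

# The standard limits discussed above:
print(round(dbm_to_vrms(14.5, 100), 2))  # 1.68 (T1.424, 100-ohm line)
print(round(dbm_to_vrms(11.5, 100), 2))  # 1.19
print(round(dbm_to_vrms(11.5, 135), 2))  # 1.38 (ETSI, 135-ohm line)
```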
But the values shown thus far have been root-mean-square (rms) values. Just
like ADSL, when multiple tones are produced in the time domain, a large peak can
and will occur. To keep the bit error rate (BER) below the standard's 1 x 10^-7 level, clipping must be held to a minimum.
To define how much clipping is allowed, a crest factor (CF), or peak-to-average ratio (PAR), is typically specified for the system. The proper PAR
is frequently debated and each manufacturer appears to have a different value
for PAR based on code implementation incorporating PAR reduction. Typical values
for PAR range from 15 dB to as much as 18 dB for DMT systems.
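To see what a given PAR implies for the amplifier, the peak voltage is simply the rms value scaled by 10^(PAR/20). A minimal sketch (the helper name is illustrative):

```python
def peak_voltage(v_rms: float, par_db: float) -> float:
    """Peak voltage implied by an rms level and a crest factor (PAR) in dB."""
    return v_rms * 10 ** (par_db / 20)

# 1.68 Vrms (+14.5 dBm on 100 ohms) with a 16.9-dB PAR:
print(round(peak_voltage(1.68, 16.9), 1))  # 11.8
```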
BER and Transformer Ratio Requirements
For the analog line driver
design, the peak voltages and currents must be utilized to ensure meeting the
BER requirements. Therefore, the power supplies must be chosen to support the
peak output voltages, plus the amplifier's own internal headroom, plus some
safety margin. For most amplifiers this headroom is at least 2 volts.
Remember that this is a dynamic headroom and not a static or DC headroom.
Additionally, the amount of current being pushed by the amplifier is the maximum
when driving the peak voltage. This implies that an amplifier's data sheet may
not show everything required to select the proper amplifier.
Another factor for the line driver amplifier output requirement is the
transformer ratio. A low ratio, such as 1:1, allows the maximum amount of
receive signal into the system. It also places the lowest demands on the noise
requirements of the system.
Due to the very high frequencies involved with VDSL, coupled with the attenuation characteristics of twisted-pair copper, having the highest receive signal possible (i.e., the lowest transformer ratio) will typically achieve the best receive data rates. As shown in Figure 2 above, a 1:1 transformer, with a PAR of 16.9 dB, producing +14.5 dBm of line power requires the line driver to produce an output voltage of 11.8 Vp (23.6 Vpp) and an output current of 118 mA peak. The rms requirements are a very modest 1.68 Vrms and 16.8 mArms because the effective load the transformer reflects back to the amplifier is equal to the 100-ohm line impedance.
On the other hand, using a large transformer ratio, such as 1:2, allows the
use of lower power supplies for the line driver. Lower power supplies can lead
to lower power dissipation and consumption. But the noise demands are now harder
to achieve and the receive signal is now reduced by the same ratio. In addition,
the output current of the line driver increases by the same transformer ratio.
Using the same example as before, the line driver now needs to produce an
output voltage of 5.9 Vp (11.8 Vpp) and an output current of 235 mA peak. Remember that this has to be produced by the line driver at frequencies up to 12 MHz. Controlling this amount of current with low distortion can become very difficult.
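The transformer-ratio trade-off described above can be sketched numerically. This assumes an ideal transformer (amplifier-side voltage scales by 1/N, current by N) and ignores termination details and transformer losses, so it reproduces the article's figures only approximately:

```python
def amp_requirements(v_line_peak: float, r_line: float, n: int):
    """Peak amplifier voltage (V) and current (A) for a 1:n transformer.

    Assumes an ideal transformer: the amplifier-side voltage scales by
    1/n and the amplifier-side current by n; back-termination details
    and transformer losses are ignored.
    """
    i_line_peak = v_line_peak / r_line
    return v_line_peak / n, i_line_peak * n

v1, i1 = amp_requirements(11.8, 100, 1)  # 11.8 Vp, 0.118 A (118 mA) peak
v2, i2 = amp_requirements(11.8, 100, 2)  # 5.9 Vp, 0.236 A (the text's ~235 mA)
```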
Table 1 shows the different standards' impact on the line driver
amplifier. ADSL levels are shown for comparison. Note that these numbers do not
take into account the transformer losses, which can typically be between 0.2 dB
and 0.5 dB across the entire spectrum.
Table 1: Comparison of Line Driver Amplifier Output Requirements
When examining the amplifier output requirements shown in Table 1, note that the insertion loss must be added to the target line power. For example, with a 0.5-dB insertion loss, the amplifier would need to create an equivalent +15-dBm power level at the output of the amplifier. This forces the
amplifier to produce 12.45 Vp and 124.5 mA peak for a 1:1 transformer in order to yield +14.5 dBm on the line (given the 0.5-dB insertion loss through the transformer).
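The insertion-loss adjustment works out as follows; a small Python sketch (the helper is hypothetical, and the 16.9-dB PAR matches the earlier example):

```python
import math

def amp_output_for_line_power(line_dbm: float, loss_db: float,
                              r_ohms: float, par_db: float):
    """Peak amplifier voltage (V) and current (A) needed so the stated
    power still reaches the line after the transformer insertion loss
    (a 1:1 transformer is assumed)."""
    v_rms = math.sqrt(10 ** ((line_dbm + loss_db) / 10) / 1000 * r_ohms)
    v_peak = v_rms * 10 ** (par_db / 20)
    return v_peak, v_peak / r_ohms

v_p, i_p = amp_output_for_line_power(14.5, 0.5, 100, 16.9)
# v_p is roughly 12.45 V and i_p roughly 0.1245 A, as in the text
```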
Noise Concerns
Line driver noise is another key issue with the
system. An amplifier will produce noise across the entire frequency spectrum,
not just within the transmit band. This noise will appear in the receive bands, and it is up to the hybrid to keep the signals and noise coming out of the line driver from getting into the receive path.
But due to uncontrolled line conditions, the hybrid typically is only able to
reject the transmit signal and noise by 6 to 20 dB. The basic rule of thumb is
to assume that there is only 6 dB of rejection, but this can be as bad as 0 dB
in some scenarios. This means that any noise generated by the line driver will
couple through the receive path with only 6 dB of rejection.
It is desirable for the system to meet a -140-dBm/Hz line noise goal. This implies the total differential output noise of the amplifier must be no more than -140 dBm/Hz + 20log(2/N), where N is the transformer ratio in the traditional configuration shown in Figure 3.
Figure 3: Noise contribution in a typical line driver
In order to meet the noise goal of -140 dBm/Hz in a 100-ohm system, the differential output noise of the line driver must be no more than 63 nV/√Hz for a 1:1 transformer and 31.6 nV/√Hz for a 1:2 transformer. This is not necessarily an easy number to achieve, as the output noise is directly influenced by the gain of the amplifier, the amplifier's voltage noise, the amplifier's current noises, and the resistor values [7].
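The noise budget of -140 dBm/Hz + 20log(2/N) can be checked numerically; a sketch in Python (function name is mine):

```python
import math

def allowed_output_noise(line_goal_dbm_hz: float, r_ohms: float, n: int) -> float:
    """Maximum differential output noise density (V/sqrt(Hz)) of the driver.

    Applies the budget discussed above: line goal + 20*log10(2/N),
    where N is the transformer ratio of the traditional configuration.
    """
    budget_dbm_hz = line_goal_dbm_hz + 20 * math.log10(2 / n)
    p_w_per_hz = 10 ** (budget_dbm_hz / 10) / 1000
    return math.sqrt(p_w_per_hz * r_ohms)

print(round(allowed_output_noise(-140, 100, 1) * 1e9, 1))  # 63.2 (nV/sqrt(Hz))
print(round(allowed_output_noise(-140, 100, 2) * 1e9, 1))  # 31.6
```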
Due to the speeds involved with the transmit signals, current feedback (CFB) amplifiers are the line drivers of choice. Their slew rates are typically well over 2000 V/μs, they do not have the gain-bandwidth limitation of voltage feedback (VFB) amplifiers, and they have relatively low noise when used at higher gains.
Dealing with Distortion
The next thing that must be considered for
the line driver amplifier is distortion. Multi-tone power ratio (MTPR) has been
around since ADSL's beginning. In theory this is a better requirement than simple harmonic distortion, since there are hundreds to thousands of tones being transmitted simultaneously.
Think of MTPR as the same as a third-order intermodulation distortion (IMD3)
test but to the extreme. The test is performed by transmitting all tones in the
transmit band(s) except one: the missing tone. The amount of energy that distortion places into the missing bin is measured, and the difference is taken as MTPR. Keep in mind that an amplifier only amplifies what
is placed at its input. Thus, if there is distortion coming from the codec, the
amplifier will amplify this distortion accordingly.
The MTPR requirement was defined from the original ADSL standard as being
equal to (3B + 20) dB, where B is the number of bits of the system. Currently, many VDSL systems appear to use 10 bits. But it is expected that this will increase to at least 12 bits in the near term, and eventually to 14 bits as the need for faster data rates comes into play. A 12-bit system implies an MTPR of 56 dB, while a 14-bit system requires 62 dB.
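The 3B + 20 rule is trivial to encode; a quick sanity check (helper name is mine):

```python
def mtpr_requirement_db(bits: int) -> int:
    """MTPR target from the original ADSL rule of thumb: 3B + 20 dB."""
    return 3 * bits + 20

print(mtpr_requirement_db(12))  # 56
print(mtpr_requirement_db(14))  # 62
```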
But the problem with this 3B + 20 requirement is that it has been proven time and time again that only 52 to 54 dB of MTPR yields good enough performance in an ADSL system. Most ADSL designs strive to use a value of 15 bits for B, resulting in a minimum requirement of 65 dB, even though 52 dB worked fine in the system. This implies that MTPR is not a de facto standard, but rather a figure
of merit. The higher the MTPR, the better the line driver's linearity and the
potential to produce a clean signal.
Receive Band Spillover
Another distortion test that makes more
sense for the line driver is receive band spillover. The only problem is that this kind of specification is not typically found in manufacturers' data sheets.
Figure 4 illustrates the concept of this test for a downstream
amplifier. Essentially this test produces all tones in the transmit band(s),
just like the MTPR test. Although discrete tones such as the ones used for MTPR can be used, that makes for an extremely harsh test and is not very realistic in the actual system environment. Instead, a modulated test signal, such as the one used for the training sequence (also called the showtime signal), is best.
distortion produced in the receive band(s) is then examined.
Figure 4: Upstream receive band spillover distortion.
If the line driver is producing distortion in the receive band, this
effectively raises the noise floor of the receive band(s). Just as it is
important for the output noise of the line driver to be very low due to the poor
hybrid rejection, the receive band distortion is equally important. The net
effect of this distortion is a reduction in receive data rates and reach.
Achieving a receive band distortion level of better than -68 dBc is highly
desirable for VDSL. In contrast, for ADSL, where the receive band is typically 25 kHz to 138 kHz, the receive band distortion should be better than -90 dBc to achieve good receive data rates and long line reach.
One may think that -68 dBc is not very good, especially compared to ADSL. But considering the 10X wider bandwidth (12 MHz vs. 1.1 MHz) and up to 10X the number of tones being utilized, achieving this level of receive band distortion is a
challenge for the line driver. Couple this with the requirement for low noise,
potentially high signal gains, and the need to drive peak currents in excess of
100 mA, and the design task becomes even more challenging. Again, a CFB
amplifier is a perfect choice for meeting the distortion requirements.
Learning from ADSL Mistakes
VDSL draws upon the development and teachings of ADSL. One thing that has been a major concern for ADSL is power consumption in the central office (CO), especially when the CO has to deliver +20 dBm onto the line with up to 72 lines on a single PCB. To combat this, ADSL now utilizes synthesized impedance, or active impedance. This technique effectively reduces the series resistor value needed for back termination and for receiving the incoming signal from the line [8].
The advantage of active impedance is that the signal loss across
this resistor is considerably reduced. The line driver's output voltage can then
be reduced along with the power supply voltage. This reduces power consumption
by as much as 50 percent.
The drawback to this impedance technique is that the receive signal is
significantly reduced. For VDSL, where the receive signal frequencies are very
high and have the most attenuation from the line, this is very risky. If active
termination is to be used the synthesis factor, the ratio of the original
resistor value to the new resistor value, should be kept reasonably small. ADSL
systems today typically utilize a synthesis factor from four to as high as 10.
In contrast, VDSL should strive to use a synthesis factor of no more than two or
three except for special circumstances or for short reach systems.
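As a rough, idealized illustration of why a large synthesis factor saves voltage swing (this simplified model is my own, not taken from the article): traditional termination forces the amplifier to swing roughly twice the line-side voltage because half is dropped across the matched series resistors, while a synthesis factor k shrinks the physical resistor to 1/k of the matched value, cutting the required swing to (1 + 1/k) times the line voltage.

```python
def swing_ratio(k: float) -> float:
    """Required amplifier swing relative to the line voltage, idealized.

    Traditional termination (k = 1) needs 2x the line voltage; a
    synthesis factor k shrinks the physical series resistor to 1/k of
    the matched value, so only (1 + 1/k)x is needed.
    """
    return 1 + 1 / k

print(swing_ratio(1))  # 2.0
print(swing_ratio(4))  # 1.25
```

This is why a factor of four can cut the supply voltage, and hence power, substantially, but it comes at the cost of the attenuated receive signal described above.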
Wrapping Up
VDSL is still in a state of flux, with modulation schemes still being sorted out. Regardless of which modulation scheme wins out, the fundamental issues defining the line driver will remain essentially the same.
The need for low total output noise, very low distortion (especially in the receive bands), the ability to drive large currents at up to 12 MHz, and low power are key to the line driver. Using a low transformer ratio is imperative to achieve the ultimate in receive performance, which in turn requires large power supply voltages, typically as high as +/-15 V.
Even using some amount of active termination to help conserve power will
require the use of large power supplies to achieve the high receive data rates
and long reach required for a VDSL system. This is not an easy thing to achieve,
but it can certainly be accomplished.
References
1. T1E1.4/2002-031R1, "Very-high-bit-rate Digital Subscriber Line (VDSL) Metallic Interface, Part 1: Functional Requirements and Common Specifications," Vancouver, BC, Canada, February 18-21, 2002.
2. T1E1.4/2002-011R3, "Very-high-bit-rate Digital Subscriber Line (VDSL) Metallic Interface, Part 2: Technical Specification of a Single-Carrier Modulation (SCM) Transceiver," Greensboro, NC, November 05-09, 2001.
3. T1E1.4/2002-031R2, "Very-high-bit-rate Digital Subscriber Line (VDSL) Metallic Interface, Part 3: Technical Specification of a Multi-Carrier Modulation Transceiver," Vancouver, BC, Canada, February 2001.
4. ETSI TS 101 270-2 V1.1.5 (2000-12), "Transmission and Multiplexing (TM); Access transmission systems on metallic access cables; Very high speed Digital Subscriber Line (VDSL); Part 2: Transceiver specification."
5. ITU-T G.993.1, "Series G: Transmission Systems and Media, Digital Systems and Networks; Digital sections and digital line system -- Access networks; Very-high-speed Digital Subscriber Line Foundation -- For Consent," October.
6. ITU-T Temporary Document BB-R16, Study Group 15, "G.hs ter: Draft text of G.994.1 for submission to TSB for AAP processing," June 2002.
7. Texas Instruments literature number SLVA043, "Noise Analysis in Operational Amplifier Circuits."
8. Texas Instruments literature number SLOA100, "Active Output Impedance for ADSL Line Drivers."
About the Author
Randy Stephens is a member of the technical staff with Texas Instruments. For the past five years he has been in charge of new product definition and applications for high-speed amplifier products, concentrating on xDSL systems. He graduated from Rochester Institute of Technology in 1994 with a BS degree in EE. Before joining TI, he was an analog design engineer with Trek Inc. for four years. Randy can be reached at firstname.lastname@example.org.