The engineer confronting a 10-Gbit/second telecommunications electrical hardware design faces several challenges. The most common is creating a broadband circuit with adequate frequency and phase response while meeting all the ITU-specified jitter parameters.
The frequency response of a circuit, which includes devices and transmission media, determines how much signal attenuation will occur and how much the transition times will be degraded. The phase response of a circuit determines the amount of intersymbol interference, which in turn results in jitter. Together, the frequency and phase responses affect the data-eye opening, which determines the timing margin for the interface.
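As a rough rule of thumb, a first-order channel supports a 10-to-90 percent transition time of about 0.35 divided by its 3-dB bandwidth. The short Python sketch below applies that approximation to a few illustrative bandwidths (the specific values are examples, not figures from the text):

```python
# Rough 10-90% rise time supported by a channel, using the
# first-order approximation t_r ~= 0.35 / f_3dB.
def rise_time_ps(f_3db_ghz: float) -> float:
    """Return the approximate 10-90% rise time in picoseconds."""
    return 0.35 / f_3db_ghz * 1e3  # 0.35/GHz gives ns; scale to ps

for bw in (5.0, 10.0, 15.0):  # illustrative 3-dB bandwidths, GHz
    print(f"{bw:5.1f}-GHz channel -> ~{rise_time_ps(bw):5.1f} ps rise time")
```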
Very little timing margin is available even for a 622-MHz interface once all the factors that reduce the margin are considered. Therefore, the frequency and phase response of a circuit become critical factors in the timing margin of high-frequency interfaces.
The typical frequency response required for a successful circuit extends from a few kilohertz to approximately 15 GHz. The low end of the spectrum is defined by the lowest frequency component of the data; the high end is determined by the transition times of the signals.
The highest fundamental frequency component of OC-192/STM-64 data is approximately 5 GHz. This translates to approximately a 15-GHz frequency component for the transition times, because at least the third harmonic is required to transmit a digital signal successfully over a transmission line from a source to a load. The third harmonic is what sharpens the transition times. Without it, a digital waveform is essentially sinusoidal, and the slow transitions consume all of the margin for the timing interface of digital signals.
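A quick back-of-the-envelope check, assuming the standard OC-192 line rate of 9.95328 Gbit/s (the fastest-toggling pattern, 1010..., has a fundamental at half the bit rate):

```python
# Fundamental and third harmonic of the fastest OC-192 data pattern.
BIT_RATE = 9.95328e9          # OC-192/STM-64 line rate, bits/s
fundamental = BIT_RATE / 2    # a 1010... pattern toggles at half the bit rate
third_harmonic = 3 * fundamental

print(f"fundamental:  {fundamental / 1e9:.2f} GHz")     # ~4.98 GHz
print(f"3rd harmonic: {third_harmonic / 1e9:.2f} GHz")  # ~14.93 GHz
```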
It is good practice to keep the OC-192/STM-64 serial data interface as short as possible, within a 50-ohm transmission-line environment, on a printed-circuit board. The clock can be recovered from the data by a clock and data recovery (CDR) device, and the data can be demultiplexed to a lower frequency and transmitted over longer pc-board transmission lines for interfacing to a processor via a parallel bus.
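For example, demultiplexing by 16 brings the stream down to the OC-12 rate of 622.08 Mbit/s, which is far more forgiving of long pc-board traces. A quick check of the common ratios:

```python
# OC-192/STM-64 line rate divided down by common demux ratios.
OC192_RATE = 9.95328e9  # bits per second

for ratio in (4, 16):   # typical demultiplexing ratios
    lane = OC192_RATE / ratio
    print(f"1:{ratio:<2} demux -> {lane / 1e6:.2f} Mbit/s per lane")
```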
Active circuitry can be used to reduce timing jitter, which in turn improves the timing margin. A phase-nulling circuit accomplishes this by reducing the jitter present on a clock signal. Jitter reduction not only improves the timing margin; it also improves the chances of meeting the ITU jitter requirements.
The phase noise present on the reference clock into the multiplexer is critical to jitter performance. A high-phase-noise reference clock can significantly degrade system performance, since the reference clock is multiplied up to the OC-192 rate. The clock outputs of CMOS or ECL ASICs are poor in terms of phase noise, and a phase-nulling circuit is necessary to clean them up.
So what is phase nulling and why is it necessary? Phase nulling eliminates high-frequency phase noise, or jitter. It becomes necessary when the reference clock comes from a source that produces signals with a significant amount of jitter; an ASIC implemented in CMOS, TTL or ECL technology is such a source. A phase-nulling circuit can be used to remove some of the jitter from the reference clock to a multiplexer, since that signal requires extremely low jitter.
Any phase noise (jitter) below the bandwidth of the phase-nulling circuitry is passed through, while any above it is attenuated. The loop bandwidth must therefore be minimized to reduce the jitter optimally. It should be set just above the point where the output from the phase-nulling circuitry locks to the input, meaning the output signal is phase- and frequency-locked to the input signal.
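Modeling the clean-up loop as a first-order low-pass filter on input phase noise illustrates this behavior: jitter well below the loop bandwidth passes at 0 dB, while jitter above it rolls off. The 100-kHz loop bandwidth in this sketch is a hypothetical value chosen for illustration:

```python
import math

def jitter_transfer_db(f_hz: float, loop_bw_hz: float) -> float:
    """|H(f)| in dB for a first-order PLL model: H(f) = 1/(1 + j*f/f_c)."""
    mag = 1.0 / math.hypot(1.0, f_hz / loop_bw_hz)
    return 20.0 * math.log10(mag)

LOOP_BW = 100e3  # hypothetical clean-up loop bandwidth, 100 kHz
for f in (1e3, 10e3, 100e3, 1e6, 10e6):
    print(f"{f / 1e3:8.0f} kHz -> {jitter_transfer_db(f, LOOP_BW):6.1f} dB")
```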
To implement a phase-nulling circuit, it is first necessary to compare the phase and frequency of the noisy signal with a stable signal, such as a low-phase-noise clock. The stable signal must then be phase- and frequency-locked to the noisy signal. Once this is accomplished, the stable signal can be used as a low-phase-noise reference clock.
Giga provides a 10-Gbit/s multiplexer (GD16585) that can be used with a dual-PLL circuit implementation for phase nulling.
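As a rough illustration of the concept (not of the GD16585's actual implementation), the toy discrete-time model below steers a clean oscillator's phase with a first-order loop so that it tracks only the slow phase movement of a noisy reference; the fast jitter is rejected. All constants are illustrative:

```python
import math
import random

LOOP_GAIN = 2 * math.pi * 100e3  # rad/s, ~100-kHz loop bandwidth (illustrative)
STEP = 1e-7                      # simulation time step, 100 ns

clean_phase = 0.0                # phase of the low-noise (VCXO-like) oscillator
sum_in, sum_out, n = 0.0, 0.0, 100_000
for _ in range(n):
    noisy_phase = random.gauss(0.0, 0.3)     # fast, white phase noise, radians
    error = noisy_phase - clean_phase        # phase-detector output
    clean_phase += LOOP_GAIN * STEP * error  # loop filter + oscillator pull

    sum_in += noisy_phase ** 2
    sum_out += clean_phase ** 2

print(f"input jitter:  {math.sqrt(sum_in / n):.3f} rad rms")
print(f"output jitter: {math.sqrt(sum_out / n):.3f} rad rms")  # much smaller
```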
Now that some of the key design issues have been covered, the jitter parameters an engineer must design to should be defined before discussing the test setup and methodology for verifying OC-192/STM-64 application hardware.
The ITU-T-specified jitter parameters are related to the optical interface at the equipment level rather than to specific component characteristics. Although simple in definition, they can be difficult to evaluate at the component level. As an inherent part of component jitter characterization, Giga seeks to emulate as closely as possible the environment found in a communication system. Definitions of jitter tolerance, jitter transfer and jitter generation follow, along with test results from Giga's 10-Gbit/s CDR/demux and mux devices (the GD16544 and GD16555B, respectively).
The jitter tolerance of receiving equipment is defined as the sinusoidal peak-to-peak phase modulation that causes a 1-dB optical penalty. When characterizing a CDR component, white noise is added to the input signal; varying the signal-to-noise ratio yields an input sensitivity curve for the CDR. (The GD16544 meets the jitter-tolerance requirements specified by the ITU and Bellcore.)
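The tolerance stimulus is a signal whose edge positions are sinusoidally modulated in time. A minimal sketch of how such edge times can be generated, with illustrative amplitude and modulation frequency:

```python
import math

BIT_RATE = 9.95328e9  # OC-192 line rate, bits/s
UI = 1.0 / BIT_RATE   # unit interval, seconds

def jittered_edge(n: int, amp_ui_pp: float, f_mod_hz: float) -> float:
    """Edge time of bit n with sinusoidal phase modulation applied.

    amp_ui_pp is the peak-to-peak jitter amplitude in unit intervals.
    """
    t_nominal = n * UI
    deviation = (amp_ui_pp / 2) * UI * math.sin(2 * math.pi * f_mod_hz * t_nominal)
    return t_nominal + deviation

# e.g. 1.5 UI p-p at 100 kHz, a point on the low-frequency part of the mask
for n in (0, 1, 2, 1000):
    print(f"bit {n:4d}: edge at {jittered_edge(n, 1.5, 100e3) * 1e12:.3f} ps")
```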
The jitter-transfer function is defined as the ratio of jitter on the output relative to the jitter applied at the input, as a function of frequency. The input sinusoidal jitter should conform to the amplitude-vs.-frequency profile of the jitter-tolerance mask: 1.5 unit intervals peak-to-peak (UI p-p) up to 400 kHz and 0.15 UI p-p above 4 MHz. Jitter transfer has two distinct characteristics: the jitter gain (or jitter peaking), defined as the highest ratio above 0 dB, and the jitter-transfer bandwidth.
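Note that the two specified corner points are exactly a decade apart in both frequency and amplitude, so a 1/f segment joins them; the helper below encodes the mask under that interpolation assumption:

```python
def tolerance_mask_ui_pp(f_hz: float) -> float:
    """Sinusoidal input-jitter amplitude (UI p-p) vs. frequency."""
    if f_hz <= 400e3:
        return 1.5                # flat low-frequency region
    if f_hz >= 4e6:
        return 0.15               # flat high-frequency region
    return 1.5 * (400e3 / f_hz)   # 1/f slope joining the two corners

for f in (100e3, 400e3, 1e6, 4e6, 10e6):
    print(f"{f / 1e3:7.0f} kHz -> {tolerance_mask_ui_pp(f):.3f} UI p-p")
```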
In systems where several PLLs form the path from input to output, only the jitter gain is of importance to the CDR circuit and to the clock generator of the output device. The overall jitter transfer bandwidth is controlled by a low-bandwidth loop, typically using a VCXO.
In systems where the CDR is the only clock-generating element, the jitter-transfer bandwidth, together with the jitter tolerance, must conform to ITU-T specifications. The GD16555 meets the jitter-transfer requirements specified by the ITU and Bellcore.
The 10-Gbit/s output of the Giga GD16555 mux exhibits jitter generation of 0.03 UI p-p over the ITU-specified frequency range of 50 kHz to 80 MHz.
Jitter generation
Jitter generation, and how to measure it, is a much-misunderstood animal. This parameter is actually specified at both the component and the network-interface level in the GR-1377-Core specification. At the component level, the specification limits jitter generation over a frequency range of 50 kHz to 80 MHz to 0.1 UI p-p, without specifying a measurement time interval.
Section 5.6.1 specifies that jitter at network interfaces, measured over an interval of one minute, should not exceed 1.5 UI p-p from 10 kHz to 80 MHz and 0.15 UI p-p from 4 MHz to 80 MHz. This requirement applies at the network level and may be measured in the time domain, since a measurement time interval is specified.
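To put these unit-interval figures in absolute terms at the 9.95328-Gbit/s line rate, where one UI is roughly 100.5 ps:

```python
BIT_RATE = 9.95328e9     # OC-192/STM-64 line rate, bits/s
UI_PS = 1e12 / BIT_RATE  # one unit interval in picoseconds, ~100.5 ps

specs_ui_pp = {
    "GR-1377 component-level limit": 0.10,
    "network-level wideband limit":  1.50,
    "network-level high-band limit": 0.15,
    "GD16555 measured output":       0.03,
}
for name, ui in specs_ui_pp.items():
    print(f"{name:31s} {ui:4.2f} UI p-p = {ui * UI_PS:6.1f} ps p-p")
```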