There are several common pitfalls to look out for when putting together a 10-Gbit/second system. Here are some real-life examples:
Consider an OC-192 application in which a CDR/demultiplexer and a multiplexer shared a 155.52-MHz reference clock from a VCO. The VCO's output clock contained 200 ps of peak-to-peak jitter. The clock was supplied as the reference to both the 10-Gbit mux and the CDR/demux, where it was multiplied by 64 to generate the 9.95328-GHz line clock.
The result: The CDR/demux could not lock to the incoming 10-Gbit/s data stream, and the mux produced nothing but noise at its data and clock outputs. The designer commented, "I thought that 200 ps of jitter for a 155-MHz clock was pretty good; after all, the period is 6.43 ns." The circuit worked as expected, within ITU limits, when the noisy VCO was replaced with a low-phase-noise device that provided a reference clock with approximately 1 ps of peak-to-peak jitter.
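The arithmetic behind this pitfall is worth making explicit. A minimal sketch, assuming ideal frequency multiplication (within the PLL loop bandwidth, the reference's time-domain jitter passes through to the output clock, so the absolute jitter in picoseconds stays roughly constant while the unit interval shrinks by the multiplication factor):

```python
# Why 200 ps p-p of reference jitter breaks a 10-Gbit/s link:
# the same absolute jitter is a tiny fraction of the 155-MHz period
# but roughly two full unit intervals at the multiplied line rate.
import math

REF_HZ = 155.52e6          # OC-3-rate reference clock
MULT = 64                  # multiplication factor in the mux / CDR-demux
LINE_HZ = REF_HZ * MULT    # 9.95328-GHz OC-192 line rate

ref_period_ps = 1e12 / REF_HZ   # ~6430 ps (the designer's 6.43 ns)
ui_ps = 1e12 / LINE_HZ          # ~100.5-ps unit interval

jitter_ps = 200.0               # noisy-VCO case
print(f"Reference period: {ref_period_ps:.1f} ps")
print(f"Line-rate UI:     {ui_ps:.2f} ps")
print(f"200 ps p-p = {jitter_ps / ref_period_ps:.3f} UI at the reference")
print(f"           = {jitter_ps / ui_ps:.2f} UI at 9.953 GHz")  # ~2 UI: the eye is closed

# Phase-noise view: multiplying frequency by N raises phase noise by 20*log10(N) dB.
print(f"Multiplication penalty: {20 * math.log10(MULT):.1f} dB")  # ~36.1 dB
```

The same calculation shows why the 1-ps reference worked: 1 ps is about 0.01 UI at the line rate, leaving the eye essentially untouched.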
In another design, the 622-Mbit/s interfaces between the 10-Gbit CDR/demux and one ASIC and between the mux and another ASIC caused the design to fail because high-speed-transmission design techniques were overlooked. The designer paid careful attention to the 10-Gbit signal paths but was sloppy with the lower speed 622-Mbit/s signals.
The result: A combination of pc-board mismatches produced a poor transmission medium for the signals. The transmission lines were not properly designed; they were partly differential and partly single-ended from the source to the load. In addition, a connector in the 622-Mbit/s signal interconnect lacked good ground shielding and was not 50 ohms. The 622-Mbit/s signals were distorted, and bit errors occurred. The remedy was to use a high-frequency 50-ohm connector, rated for a frequency response of more than 2 GHz, together with well-designed 50-ohm differential stripline transmission lines on the pc board. The differential traces were also isolated from other signals in the layout.
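The damage done by the mismatched connector can be estimated with the standard voltage reflection coefficient, Γ = (Z − Z0)/(Z + Z0). A hedged sketch follows; the article does not report the connector's actual impedance, so the 75-ohm value below is purely illustrative:

```python
# Reflections from an impedance discontinuity on a 50-ohm line,
# such as the non-50-ohm connector in the 622-Mbit/s interconnect.
import math

def reflection_coefficient(z_load, z0=50.0):
    """Voltage reflection coefficient at an impedance discontinuity."""
    return (z_load - z0) / (z_load + z0)

z_conn = 75.0   # hypothetical mismatched connector impedance, ohms
gamma = reflection_coefficient(z_conn)
return_loss_db = -20 * math.log10(abs(gamma))

print(f"Reflection coefficient: {gamma:.2f}")              # 0.20
print(f"Return loss:            {return_loss_db:.1f} dB")  # ~14.0 dB

# At 622 Mbit/s the unit interval is ~1.6 ns; energy reflected at the
# connector and at the differential/single-ended transitions arrives
# within the bit period, distorting edges and causing bit errors.
```

With 20% of the incident voltage reflected at every such discontinuity, it is easy to see how the accumulated mismatches closed the 622-Mbit/s eye even though the 10-Gbit paths were clean.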