In recent years, both the performance and the price of measurement equipment have increased dramatically. While a 5 GHz oscilloscope was state of the art in 2002, bandwidths today have reached 30 GHz and beyond.
The price of high-performance instruments has risen along with their capabilities, a consequence of the research and development that goes into them. Given the small market and high development costs associated with leading-edge equipment, one must ask: why not simply focus on the low- to mid-range market, where volumes are much larger and product development costs more reasonable?
The answer is twofold: first, leading-edge technologies would not be possible without the tools to develop them; second, as technology improves, the high-end instruments of today become the mid-range of tomorrow.
Beyond the higher transfer rates driven by increasing processing power, communications applications are beginning to employ coherent optical modulation methods to increase bandwidth on long-haul links. Back in 2002, when USB 2.0 was being developed, a 5 GHz oscilloscope was leading edge. USB 2.0 had a maximum transfer rate of 480 Mb/s, and 2 GHz of bandwidth was enough to measure it.
The latest version, USB 3.0, operates at 5 Gb/s and requires a 13 GHz oscilloscope to measure. Equivalent-time oscilloscopes, which have been available for a long time, provide the necessary bandwidth, but USB 3.0, like many newer standards, employs spread-spectrum clocking (SSC), which slowly varies the bit rate in order to spread the signal energy out in frequency.
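The spectral effect of SSC can be illustrated with a minimal NumPy sketch; the modulation depth and sweep rate below are illustrative assumptions, not USB 3.0's actual parameters. It compares a fixed-frequency clock to one whose frequency is slowly swept:

```python
import numpy as np

n = 1 << 16
t = np.arange(n)

# Fixed-rate clock vs. a clock whose frequency is slowly swept by
# +/-0.5% (illustrative SSC depth and rate, not the USB 3.0 values).
f0 = 0.125                              # clock frequency, cycles/sample
sweep = 0.005 * f0 * np.sin(2 * np.pi * 1e-4 * t)
fixed = np.sin(2 * np.pi * f0 * t)
ssc = np.sin(2 * np.pi * np.cumsum(f0 + sweep))

# Sweeping spreads the energy over nearby bins: total power stays the
# same, but the spectral peak drops substantially.
peak_fixed = np.max(np.abs(np.fft.rfft(fixed)))
peak_ssc = np.max(np.abs(np.fft.rfft(ssc)))
```

The lower peak is exactly what SSC is for: the same energy, spread across a band of frequencies to ease EMI compliance.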
This spreading makes it virtually impossible to measure the signal without a real-time oscilloscope (one whose sampling rate is at least twice its bandwidth). In addition, USB 3.0 signal compliance testing requires the test instrument to provide equalization in order to produce a measurable eye. Equalization also requires real-time sampling, because it must process data from consecutive bits in real time.
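A feed-forward equalizer of the kind compliance software applies can be sketched as a short FIR filter over consecutive samples; the channel response, tap weights, and bit pattern below are hypothetical, chosen only to show why a contiguous record of consecutive bits is needed:

```python
import numpy as np

# Hypothetical lossy channel: each bit leaks 40% of its amplitude into
# the following bit interval (inter-symbol interference).
bits = np.array([1, -1, 1, 1, -1, -1, 1, -1], dtype=float)
channel = np.array([1.0, 0.4])          # assumed channel response
received = np.convolve(bits, channel)[:bits.size]

# A 2-tap feed-forward equalizer that cancels most of the trailing ISI.
# Each output sample mixes consecutive input samples, which is why the
# equalizer needs an unbroken real-time record of the waveform.
taps = np.array([1.0, -0.4])
equalized = np.convolve(received, taps)[:bits.size]

# Slicing the equalized samples recovers the transmitted bits, with a
# larger worst-case margin (a wider eye) than the raw capture.
decisions = np.sign(equalized)
```

An equivalent-time scope, which assembles its record from samples of many different repetitions, cannot supply the consecutive-bit data this filter consumes.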
Coherent optical modulation encodes information in the phase of the optical carrier (laser light). Because it is not practical to phase-lock one laser to another, as would be required to detect the optical carrier the way a radio receiver in a mobile phone does, the optical local oscillator is instead set very near the carrier wavelength and the resulting signal is demodulated using DSP techniques. A real-time oscilloscope is essential for this type of detection because it provides the real-time data record the processing requires. Optical signals using this type of modulation currently operate at 56 Gbaud in experimental setups, requiring a minimum of 30 GHz of instrument bandwidth to measure.
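The DSP side of this "intradyne" scheme can be sketched in a few lines. The QPSK data, the size of the residual frequency offset, and the fourth-power estimator below are illustrative assumptions, not a description of any particular product's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical QPSK stream: the data rides on the optical phase.
symbols = np.exp(1j * (np.pi / 4 + np.pi / 2 * rng.integers(0, 4, 4096)))

# Intradyne detection: the free-running local oscillator sits near, but
# is not locked to, the carrier, leaving a residual frequency offset.
offset = 2 * np.pi * 1e-3               # radians per sample (assumed)
n = np.arange(symbols.size)
received = symbols * np.exp(1j * offset * n)

# DSP recovery: raising QPSK to the 4th power strips the data
# modulation, so the average phase step of s**4 reveals 4x the offset.
quad = received ** 4
est = np.angle(np.mean(quad[1:] * np.conj(quad[:-1]))) / 4
corrected = received * np.exp(-1j * est * n)
```

Note that the estimator averages over the whole record, which is why a long, gap-free real-time capture is essential.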
In addition to measuring coherent optical modulation signals, high bandwidth oscilloscopes are used as digitizers in the development of techniques and algorithms for next generation optical transmission systems.
Indeed, 100 Gigabit Ethernet was developed using digital oscilloscopes as A/D converters combined with software post-processing. With this arrangement, different coding, decoding, and compensation methods could be tested and their performance verified in terms of bit error rate. Once the ideal architecture was found, hardware could then be designed.
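The essence of this workflow, capture once, then try candidate receivers and measure bit error rate entirely in software, can be sketched as follows; the NRZ signal, noise levels, and threshold decoder are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical captured record: the scope digitizes a noisy NRZ signal,
# and every "receiver" decision is made afterwards in software.
tx_bits = rng.integers(0, 2, 100_000)
clean = 2.0 * tx_bits - 1.0             # ideal +/-1 V levels

def ber(noise_sigma):
    """Slice a noisy capture at 0 V and count bit errors against the
    known transmitted pattern (the core of offline BER testing)."""
    waveform = clean + rng.normal(0.0, noise_sigma, clean.size)
    rx_bits = (waveform > 0).astype(int)
    return np.mean(rx_bits != tx_bits)

# Candidate link conditions are compared in post-processing, before any
# receiver hardware exists; more noise yields a measurably higher BER.
ber_good = ber(0.3)
ber_bad = ber(0.6)
```

Swapping in a different decoder or compensation scheme is just a software change, which is what made the oscilloscope-as-digitizer approach so productive.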
While the high speeds of serial data interfaces challenge oscilloscope measurement technology, the increasingly complex protocols that run on these interfaces demand analysis and debugging tools that not only operate at high transfer rates but also provide the deepest possible set of software tools.
Protocol analyzer companies such as LeCroy work closely with system vendors to bring the latest electrical interfaces into their software tools, which allow analysis of the complete protocol stack. The task is complicated by the need to support a number of high-speed lanes in parallel. Features such as real-time triggering on specific bus commands require complex hardware/software integration within the instrument.
While the cost of leading-edge performance is high, owing to the development investment required and the low volumes (far fewer projects operate at the leading edge of technology), the availability of such instruments is essential.