This article covers the two main loss effects related to cables (the skin effect and dielectric losses) and presents a simple cable modeling method for use in standard SPICE simulators.
Cables appear in many high-frequency board designs and can become a critical element in the signal path, especially for signals above 500 MHz. If they are not modeled as part of the system, cables can cause unexpected performance degradation and costly delays in debugging and correction. Even so, cables are notoriously difficult to model correctly.
A simple ideal transmission line model may not capture this element well, because a cable is difficult to model consistently in both the frequency and time domains.
A cable's nonideal dispersive effects can degrade system performance, and they show up in drivers, buffers, and comparators. In a driver or buffer, the low-frequency dribble-up primarily degrades propagation delay versus pulse-width dispersion; it also degrades minimum pulse time and rise time. In a comparator, the low-frequency dribble-up primarily degrades delay versus pulse width and propagation delay versus overdrive, and it also degrades the minimum pulse width.
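The two loss mechanisms scale differently with frequency: skin-effect attenuation grows roughly as the square root of frequency, while dielectric attenuation grows linearly with it. A minimal sketch of that frequency dependence is below; the coefficients `k_skin` and `k_diel` are hypothetical illustrative values, not taken from any particular cable's datasheet.

```python
import math

def cable_attenuation_db(f_hz, length_m, k_skin=3e-6, k_diel=2e-11):
    """Total cable attenuation in dB at frequency f_hz.

    Skin-effect loss scales as sqrt(f); dielectric loss scales
    linearly with f. k_skin (dB/m per sqrt(Hz)) and k_diel
    (dB/m per Hz) are hypothetical example coefficients.
    """
    alpha_db_per_m = k_skin * math.sqrt(f_hz) + k_diel * f_hz
    return alpha_db_per_m * length_m

# Crossover frequency where dielectric loss overtakes skin effect:
# k_skin * sqrt(f) = k_diel * f  =>  f = (k_skin / k_diel)**2
f_crossover = (3e-6 / 2e-11) ** 2  # Hz
```

Below the crossover the cable is skin-effect dominated; above it, dielectric loss dominates. That changing balance is why a single fixed-loss transmission line model fails to match real cable behavior across a wide band.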
To continue reading this article on EDN.com, follow the jump.