DesignCon 2017, opening in Santa Clara this week (Jan 31 – Feb 2), will showcase an impressive catalog of high-speed design tools and models for complex SoCs, as well as hands-on techniques for ensuring signal integrity while minimizing crosstalk and jitter. To suggest that the DesignCon presentations, with their emphasis on modeling and high-speed digital circuitry, never paid much attention to “analog” design would be a serious misrepresentation. After all, what are we looking at as we try to preserve the “signal integrity” of a gigabits-per-second data stream?
DesignCon’s technical sessions typically focus on knotty engineering problems such as overcoming chip and module package parasitics, PCB absorption and reflections, high-speed serial clocking, techniques for measuring and simulating signal impairments, test and measurement methodologies, and the analysis of interconnects. The opening day keynote by Zoltan Cendes, founder of Ansoft Corp. — “Turning Signal Integrity Simulation Inside Out” — effectively sets the focus knob for the entire conference.
What may be a bit different this year is the number of tutorials and panel discussions devoted to power management, among the most challenging technologies for ensuring signal integrity. Tuesday's conference sessions, in fact, will feature a three-hour tutorial on power distribution, a late-afternoon panel discussion on DC-DC converters, and several application-focused presentations on power management for automobiles and for servers. Focused presentations will look at power delivery networks, bypass capacitors (their role in suppressing noise and jitter), distributed low-dropout regulators (LDOs), and power delivery for 10- and 7-nm chips. These sessions will review the state of the art and identify the technology challenges moving forward.
Bring your notebooks
From a power management perspective, the most interesting panel sessions will be devoted to switching regulators. “Why We Love (Hate) DC-DC Converters?” the panel asks. Point-of-load converters using pulse-width-modulated (PWM) supplies are getting smaller and more energy-efficient year by year. But they're noisy, and there's not much we can do about that, the DC-DC converter panelists agree. Their experience comes from IBM, Intel, Cisco, and Oracle.
DC-DC converters offer almost perfect energy transfer efficiencies, says Tony O'Brien of Cisco: easily 97%, compared with 85%, the state of the art in 2007. Power density approaches 1,000 watts in a ¼-brick module (2.30 x 1.45 x 0.42 in). Working from the supposition that higher switching frequencies are the key to smaller, denser point-of-load (POL) power modules, O'Brien cites 400 kHz as today's typical frequency. The panelists look to gallium nitride (GaN) for faster switching, but cite 5 MHz as the likely ceiling for switching power supplies using GaN.
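The power-density claim is easy to sanity-check from the figures quoted above. A minimal back-of-the-envelope calculation (all numbers are the ones cited in the text):

```python
# Sanity-check the quoted quarter-brick power density.
length_in, width_in, height_in = 2.30, 1.45, 0.42  # quarter-brick module, inches
power_w = 1000.0                                    # quoted output power, watts

volume_in3 = length_in * width_in * height_in       # ~1.40 cubic inches
density = power_w / volume_in3                      # watts per cubic inch

print(f"volume = {volume_in3:.2f} in^3, density = {density:.0f} W/in^3")
```

That works out to roughly 714 W per cubic inch of module volume, which puts the “almost 1000 watts in a ¼-brick” claim in perspective.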
The “hate” side of the DC-DC converter story is that they are noisy. And the smaller they become, the more switching noise they inject into the microprocessor's circuitry.
Digital systems designers have a love-hate relationship with DC-DC converters. On one hand, converters offer higher power densities for large current consumers. On the other hand, they are noisy, and newer power module design techniques only increase susceptibility.
(Source: IBM, DesignCon 2016 presentation)
Cisco Systems’ O’Brien identifies the power consumption architectures engineers will need to design for. In servers, which have been a driver for DC-DC converter development, we're seeing systems with 380V distribution buses and local (in-rack) 48V-to-1V converters.
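The 48V-to-1V figure hints at why these converters are hard to build as a single stage. A quick sketch, assuming the ideal (lossless) buck relation D = Vout / Vin and the 400 kHz switching frequency O'Brien cites (real designs typically use intermediate-bus or transformer-based stages instead):

```python
# Why direct 48V-to-1V conversion is demanding for a single buck stage:
# the ideal duty cycle D = Vout / Vin leaves a vanishingly small on-time
# per switching period. (Idealized model; losses are ignored.)
v_in, v_out = 48.0, 1.0
f_sw = 400e3                      # switching frequency cited in the text, Hz

duty = v_out / v_in               # ~2.1% duty cycle
period_ns = 1e9 / f_sw            # 2500 ns switching period
t_on_ns = duty * period_ns        # ~52 ns on-time

print(f"D = {duty:.4f}, period = {period_ns:.0f} ns, t_on = {t_on_ns:.1f} ns")
```

A roughly 52 ns on-time leaves little margin for switching transitions, which is one reason rack-level designs break the conversion into stages.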
Panelist Jordan Keuseman of IBM sees the use of the PMBus as among the most promising additions to the DC-DC converter story. Inserting a microcontroller or state machine into the control loop of a DC-DC converter does not increase its energy transfer efficiency, nor improve its response time. The insertion does allow data centers (and other large power distribution networks) to monitor and control the phase and frequency of multiphase regulators in software.
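One concrete piece of that monitoring story: PMBus telemetry values (output current, temperature, input voltage) travel in the standard's LINEAR11 format, a 16-bit word holding a 5-bit two's-complement exponent and an 11-bit two's-complement mantissa. A minimal decoder sketch (the function name and the sample word are illustrative, not taken from any vendor library):

```python
def decode_linear11(raw: int) -> float:
    """Decode a 16-bit PMBus LINEAR11 word.

    Bits 15:11 hold a 5-bit two's-complement exponent N;
    bits 10:0 hold an 11-bit two's-complement mantissa Y.
    The real-world value is Y * 2**N.
    """
    exponent = (raw >> 11) & 0x1F
    if exponent > 0x0F:          # sign-extend the 5-bit exponent
        exponent -= 0x20
    mantissa = raw & 0x7FF
    if mantissa > 0x3FF:         # sign-extend the 11-bit mantissa
        mantissa -= 0x800
    return mantissa * 2.0 ** exponent

# Example word: 0xE8C8 encodes N = -3, Y = 200, i.e. 200 / 8 = 25.0
# (a plausible output-current reading of 25.0 A).
print(decode_linear11(0xE8C8))
```

Reading such words over the bus and feeding them to data-center management software is what makes the phase and frequency of multiphase regulators controllable without touching the analog loop itself.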
Keuseman dismisses the usual argument that analog engineers are not trained or well suited to write code, but notes that so-called “digital loops” retain many analog elements. (We still need transformers for high-voltage power distribution, for example.) Power delivery misbehavior must be discovered and corrected before the power components leave the factory. Keuseman also points to wide-bandgap semiconductors (GaN and silicon carbide) as a possible means of improving DC-DC converter performance, but the “revolution” widely touted five years ago is barely underway today.
The di/dt is very hard to characterize without lab measurements, so DC-DC converters are typically selected for worst-case scenarios to ensure they hold their specs. Worst-case conditions are rare, Keuseman believes. And firmware updates can introduce instabilities, which makes the DesignCon case for LOTS of testing and simulation.
Voltage regulators in servers
Madhavan Swaminathan and Anto Davis, researchers at Georgia Tech (http://c3ps.gatech.edu/), will use the DesignCon meeting to examine the power delivery mechanisms for large processors and SoCs. In servers and PCs, power is supplied by a multi-phase buck regulator whose outputs are paralleled to deliver (in some cases) up to 200A. This architecture is changing, as the CPU substrate shares space with what Intel calls the “IVR” (integrated voltage regulator), Georgia Tech explains.
The ongoing application of Moore’s Law, the researchers insist, enables greater functionality in smaller packaging. But this forces the placement of voltage regulator modules (and prepackaged voltage regulator chips) on a shared substrate (with embedded passives) to shorten the power delivery path. Stacking techniques, the authors suggest, will be perfected as the technology develops. But delivering “clean power” to the CPU will be the major challenge: as stacking positions switching regulators on top of the SoCs, switching noise will likely be injected directly into the CPU circuits.
Moore’s Law shows system volume shrinking and functionality increasing. With circuitry imploding in size, providing CLEAN POWER has become a major challenge.
(Source: Georgia Tech, Center for Co-Design of Chips, Packages, and Systems)
Legacy technology puts a prepackaged buck converter and the packaged processor it feeds on the same substrate (with embedded inductors and capacitors), says Georgia Tech. With the buck regulator horizontally feeding the CPU power rails, it becomes possible to “tune” the circuit by lengthening or shortening the space between the buck regulator and the SoC. The advantage, the authors suggest, is “reasonable” signal and power integrity. The level of integration is low, but the solution is what the authors call “non-disruptive.”
Newer technology, the authors say, combines the IVR (with a buck converter) and a prepackaged SoC (with an LDO in package) in a SIP package (with embedded capacitors and inductors). The power delivery path from buck converter to SoC is horizontal, on the board. This topology allows slow/fast power phases and acceptable efficiency, with good signal and power integrity. Integration is modest, but “non-disruptive.”
In development are specialized integrated voltage regulators (IVRs) for computer applications. These use SIP packaging (with embedded capacitors and inductors, shrunken by the application of high switching frequencies), combining buck converter, SoC, and LDO (bare die in their own shared package). While the power path between the buck converter and the SoC/LDO combo is horizontal, both are mounted on the IVR substrate with a vertical conduction path. This offers high performance, high efficiency, modularity, and excellent signal and power integrity. But it will require lower-cost materials and better thermal management. Integration is ambitious, and the solution can be disruptive.
The most ambitious topology, already implemented in some Intel CPUs, packages the SoC, the LDOs, and the buck converter together, integrated vertically on the IVR (which also features embedded passives). This is said to offer very high performance, but the modules (so far) are practically hand-crafted. The implementations thus far resist modularity, scalability, and thermal management. Technology challenges moving forward include the development of high-frequency multi-phase buck converters and stacked power stages for high-input-voltage conversion.
Current research on power delivery architectures focuses on IVR-SIP solutions with both buck converters and LDOs on the same substrate.
(Source: Georgia Tech, Center for Co-Design of Chips, Packages, and Systems)
In Intel’s FIVR (Fully Integrated Voltage Regulator) implementation, a high-current buck regulator drives a battery of parallel LDOs with a very fast clock and specialized in-package inductors. A 93% efficiency is cited for 1.7V-to-1.05V conversion at a 140 MHz switching frequency. Georgia Tech’s own implementation obtained 91.1% efficiency with a 100 MHz clock. Efficiency, the researchers noted, is process-dependent. (A 65nm IVR would offer somewhat higher efficiency than a 130nm CMOS implementation.)
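The cited FIVR figures can be put in concrete terms with a little arithmetic. A quick check, assuming the ideal buck relation D = Vout / Vin for the duty cycle (the voltage, frequency, and efficiency numbers are the ones quoted above):

```python
# Quick arithmetic on the cited FIVR figures. The duty cycle assumes
# the ideal (lossless) buck relation D = Vout / Vin.
v_in, v_out = 1.7, 1.05
f_sw = 140e6                       # cited switching frequency, Hz
efficiency = 0.93                  # cited conversion efficiency

duty = v_out / v_in                # ~0.62 ideal duty cycle
period_ns = 1e9 / f_sw             # ~7.1 ns switching period
loss_per_watt = 1 / efficiency - 1 # ~75 mW dissipated per watt delivered

print(f"D = {duty:.2f}, period = {period_ns:.1f} ns, "
      f"loss = {loss_per_watt * 1000:.0f} mW per delivered watt")
```

A 7 ns switching period is what the in-package inductors make feasible, and roughly 75 mW of heat per delivered watt is the price of the remaining 7% inefficiency, heat that must be removed from inside the CPU package.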