Power has pushed performance out of the spotlight. Yes, speed still matters, and, especially in mobile applications, we haven’t yet reached that desktop level of “good enough.” So there’s more work to do to make mobile devices faster, even as more duties are piled on.
But people won’t buy a phone that they have to recharge every couple of hours, no matter how fast it is. So it’s fair to say that power has to be consulted before performance gets to have its way. And the search for power savings has now reached every level of the design hierarchy. It used to be pushed down to the circuit designers, but low-level techniques, like the use of multi-VT transistors and clock gating, bring only moderate gains in the power struggle.
The real game is now at the system level, and this involves software as well as hardware. In particular, the ability to shut off parts of the circuit when not in use has become an important consideration. In fact, there’s been something of an attitude shift in some quarters: instead of starting with an “everything on” default, and turning things off when you can, start with an “everything off” assumption and turn on only what you need.
Regardless of which of these angles you take, you end up with power “islands” on the chip that can be on or off, and control of the power state can be set either by hardware or software (either low-level firmware or even an application). This reflects an endpoint in the ongoing trajectory away from monolithic power. And it adds new verification challenges, especially at the emulation stage, when an SoC is being validated before the actual chip circuitry may be complete.
Moving to the islands
Chip designers are taught some basic assumptions when designing fundamental circuits. Key amongst them is that VCC (or VDD, depending on whether your pedigree includes bipolar junction transistors) and ground are connected. This is particularly important for digital design. Analog designers are used to having to pay attention to every detail short of the phase of the moon, but digital design is all about abstraction. And abstraction involves assumptions.
The assumption that power and ground are connected is critical to logic synthesis and design at the behavioral level. If you break that assumption without telling anyone, diodes that should block electrons stop blocking, and currents that you never expected can wreak analog havoc on the otherwise clean, Boolean design. With a monolithic power structure, you can assume that if power and ground aren’t connected, nothing will work, and no one will be surprised.
So the move away from a single power/ground setup came cautiously, and was first motivated less by power and more by the need for different voltages in different parts of the circuit. I/Os in particular needed to support higher legacy voltages even as the core voltages dropped. Power did play into the decision as higher voltages were allocated to blocks that were in the critical path, with relaxed voltages for other parts of the chip.
Interfacing different domains with different voltages presents obvious risks. It’s not just a matter of paying attention to what happens when a block is on or off; even with everything running, you have to protect against “leakage” (or “floodage”) from one power domain to another. Level-shifting and isolation go together to make sure that all the different power levels can cohabitate.
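These two protections can be pictured abstractly. The following is a minimal Python sketch, with hypothetical names, of what each cell contributes at a logical level; real isolation cells and level shifters are, of course, standard-cell circuits, not software.

```python
# Illustrative model of cross-domain interfacing; all names are hypothetical.

def isolate(value, source_powered, clamp=0):
    """Isolation cell: pass the driver's value while its domain is up,
    otherwise clamp to a known-safe level so the receiver never sees
    a floating or undefined input."""
    return value if source_powered else clamp

def level_shift(value, v_src, v_dst):
    """Level shifter: a logic '1' driven at 0.9 V must be re-driven as a
    full-swing '1' at 1.8 V (and vice versa). The Boolean value is
    unchanged; the cell exists purely for the electrical translation."""
    return value

# A signal crossing from a switchable 0.9 V island into an always-on
# 1.8 V domain needs both cells, in order: isolate first, then shift.
def cross_domain(value, source_powered, v_src=0.9, v_dst=1.8):
    return level_shift(isolate(value, source_powered), v_src, v_dst)
```

With the source island off, `cross_domain` returns the clamp value no matter what the dead driver happens to be doing, which is exactly the guarantee the isolation cell provides to the live side.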
But the next step in this move away from uniform power was the idea of powering blocks down. Here you may or may not be talking about different voltage levels; you may well have multiple blocks at the same voltage, but they might not be on at the same time. In these cases, level shifting isn’t an issue, but isolation is critical.
Figure 1. The move from monolithic power to multiple power islands. Many islands now share the same voltage, but must be independent so they can be powered down.
Of course, these analog considerations depart far from the clean assumptions that digital designers have enjoyed. Rather than trying to turn digital designers into analog designers, tools have evolved to manage the power structure, including the creation of not one but two different file formats for expressing power intent – the UPF and CPF formats. These allow designers to define the islands at a higher level; the file can then direct tools to insert isolation cells and level shifters automatically as necessary, restoring a higher level of abstraction for the designer.
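The essence of what such a tool does with a power-intent description can be sketched in a few lines. This is not UPF or CPF syntax, just a hypothetical, simplified model of the two insertion rules: a switchable source domain needs isolation, and a voltage mismatch needs a level shifter.

```python
# Hypothetical stand-in for the island definitions a power-intent file carries.
islands = {
    "cpu": {"voltage": 0.9, "switchable": True},
    "noc": {"voltage": 0.9, "switchable": False},
    "io":  {"voltage": 1.8, "switchable": False},
}

def cells_needed(src, dst):
    """Return the interface cells a tool would insert on a net driven
    from island `src` into island `dst`."""
    cells = []
    if islands[src]["switchable"]:
        cells.append("isolation")       # the driver can power down
    if islands[src]["voltage"] != islands[dst]["voltage"]:
        cells.append("level_shifter")   # the supplies differ
    return cells
```

A crossing from the switchable "cpu" island to the 1.8 V "io" domain gets both cells; two always-on domains at the same voltage get neither. The designer states the islands once, and the per-net decisions fall out mechanically.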
Given a design with power islands, it’s no longer enough simply to verify that the logic works. A thorough verification plan must include the powering up and down of various blocks. If an application allows this to happen at the software level, then even more verification is needed to test out the various combinations of island power that might occur.
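The size of that combination space is worth spelling out: with n independently switchable islands there are 2^n static power states before transition ordering is even considered. A hypothetical generator for enumerating them:

```python
from itertools import product

def power_states(islands):
    """Yield every on/off combination of the named islands as a dict
    mapping island name to its powered state."""
    for bits in product([False, True], repeat=len(islands)):
        yield dict(zip(islands, bits))

# 3 switchable islands yield 2**3 = 8 static states, all-off to all-on.
states = list(power_states(["cpu", "gpu", "modem"]))
```

Even a modest handful of islands produces enough combinations that a verification plan has to enumerate them systematically rather than rely on a few hand-picked scenarios.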
It’s also not enough to test that things work in the static powered-up and -down states. The transition between on and off provides a particularly rich opportunity for unanticipated electrical mischief. It’s easy (relatively) to figure out what will happen under static conditions; it’s much harder to figure out all the intermediate circuit states that occur while powering up and powering down.
A thorough analysis of the entire power subsystem requires low-level circuit simulation or the use of tools specifically designed for power verification. They provide the sign-off-level assurance that power islands play well together. But such analysis, due to its precise and exhaustive nature, takes time and requires some simplifications in order to make the problem tractable.
When software has control
It would be beneficial to test the robustness of the design as software turns the various pieces on and off. But even simulation at the logic level – abstracted above the circuit level – is prohibitively slow when running software. Simply booting Linux can take over 20 years – this isn’t just a productivity issue, it’s completely infeasible. So being able to simulate the execution of software while validating the power subsystem at the analog level lies even further outside the realm of the reasonable.
The standard solution for verifying software as it executes on an SoC is to use emulators. The many orders of magnitude improvement in effective clock rate mean that, for instance, Linux can boot in a matter of minutes instead of years. Given this as a viable means of bringing software into the verification flow, it then becomes a matter of deciding the appropriate level of abstraction of the power characteristics, since an emulator can’t measure analog behavior.
Different emulators use different logic processing fabrics; some custom, some with FPGAs. The custom chips are expensive to design, and so must last many generations of emulator. None of them has the ability to implement actual power islands in the emulated design.
Even FPGAs, which may have some capability for power compartmentalization, can’t implement true, functional islands. So the ability to emulate a circuit while literally powering parts of it up or down simply doesn’t exist with today’s emulators.
But high-value verification can still be performed by moving the power abstraction level up a bit more. While it may not be possible to check the voltages and currents, a more abstract requirement is that the islands are mutually simpatico even at a logic level. This can be emulated without actually powering anything on or off.
When an island is powered up, all of its pieces are (hopefully) in well-defined states that are determined by the RTL, and this is how you would typically emulate your design. When a block is powered down, however, you don’t know what the state of the circuitry is. The expectation is that the I/O of the block – those registers on the periphery – will isolate the core of the powered-off island from the rest of the chip. While the effectiveness of that isolation has to be proven using other tools, you can still use emulation to check that undefined states in the block I/O will not create problems in the surrounding logic.
Power transitions can also be evaluated. The difference is that, when power is down, you assume that isolation is in effect; during the power transition, you can’t assume that, since you’re somewhere between powered up and powered down. So undefined states in the core of the block may be visible to the neighboring circuits.
In the simulation world, the ubiquitous answer to the “unknown state” is “X.” But, in reality, there is no “X” state in an emulator. Every node will be at a level; you just may not know what that level is. So the way to address this is to “corrupt” the states of the core logic and/or the periphery to model the different power states.
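One way to picture this: wherever a simulator would propagate “X”, the emulation flow forces a concrete but arbitrary 0 or 1, and the run is repeated with different choices to gain coverage of the unknown. A hypothetical sketch of that substitution:

```python
import random

def corrupt(registers, seed):
    """Replace every 'unknown' register value with a concrete pseudo-random
    bit. An emulator has no X state: every node sits at some level, so the
    unknown is modeled by trying arbitrary defined levels."""
    rng = random.Random(seed)
    return {name: (rng.randint(0, 1) if value == "X" else value)
            for name, value in registers.items()}

# Peripheral registers of a powered-down island: two unknowns, one known.
periphery = {"req": "X", "ack": "X", "data_valid": 0}

# Rerunning with several seeds exercises different concrete stand-ins
# for the same unknown state.
runs = [corrupt(periphery, seed) for seed in range(4)]
```

Seeding makes each corruption pattern reproducible, so a failure seen with one set of stand-in values can be replayed and debugged.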
Figure 2 shows how this can be managed for an island that’s powered down. Because the internal core logic of the powered-down block is assumed to be effectively isolated from the rest of the chip, only the peripheral registers matter; their values are then corrupted (represented by “C” in the figure). For a given application, the designer may have a more precise sense of what the powered-down values will be, so he or she should have some control over how the corruption is implemented.
Figure 2. A powered-down island assumes an isolated interior and corrupted periphery.
Figure 3 shows a similar circuit while powering up or down. The abiding assumption here is that nothing is known, neither in the periphery nor the core of the block. But the worst case would be where the periphery is still active, sending on the results of the logic in the core, which is in an indeterminate state.
Figure 3. An island in power transition has an active periphery and a corrupted interior.
The solution here is to corrupt the core register values while leaving the periphery in an active state. As with the prior case, it must be possible to override the corrupted assignments when more is known about what the actual values will be.
With this capability in place, a verification engineer can now run multiple power cycles, using different corrupt values, to build confidence that the circuit is robust in the face of these changes. But even more powerful is the fact that an emulator with this capability can simply run the target software, and, at the direction of a UPF file, all power island changes and the isolation tests will automatically occur, further building confidence in the quality of the design.
The existence of these powered-down islands is only going to increase, especially as “powered off” becomes the default instead of “powered on.” Using power-aware emulation to verify the logic-level robustness of an SoC in the face of power changes, before actual circuits may be in place, is likely to become as commonplace as logic verification itself.
About the author
Ludovic Larzul is one of the founders of EVE and the vice president of engineering. He has close to 20 years of engineering and management experience in the EDA industry. Under his leadership, EVE developed seven generations of emulation platforms, capturing a leadership position in hardware/software co-verification. Ludovic previously served as principal engineer in the emulation division of Mentor Graphics and worked for Alcatel on telecommunication switches. He holds a Diplome d'Ingenieur de l'Institut de Recherche et d'Enseignement Superieur aux Techniques de l'Electronique (Ecole Polytechnique de Nantes).
If you found this article to be of interest, visit EDA Designline, where you will find the latest design, technology, product, and news articles covering all aspects of Electronic Design Automation (EDA).