Scalable FPGA-based verification has become a serious alternative to big-box emulation.
Did you see the news?
Aldec is adopting Xilinx Virtex UltraScale devices in its seventh-generation Hardware Emulation Solution, HES-7, heralding a great leap in the capability of FPGA-based verification. There's more detail in this press release, from which you can also download the technical specification.
Now read on...
Articles often start with introductory statements of the blindingly obvious, so here's mine: "Today's SoCs are big and complex and it takes a long time to verify them." Ok, that's understood. Now, what can we do about it?
Well, emulation would sure help shorten those verification run times, wouldn't it? Just think what you could achieve by running your simulations thousands of times faster. Hmmmm... it's a pity about the price tag on that emulator, though (they're not called "big-box" emulators for nothing).
What if you could get that valuable emulator speed-up, but without the capital and operating expense associated with the big-box? What if that came from a company with 30 years of experience in verification EDA, rather than just another FPGA board-maker with a couple of scripts, a transactor, and a big marketing budget?
Well, this column is here to tell you that the scale of the latest FPGA technology is making that possible; also that scalable FPGA-based verification has become a serious alternative to big-box emulation.
Putting transactor lipstick on a prototype doesn't make it an emulator
Historically, some verification teams have tried to compensate for a lack of emulation by using their FPGA-based prototypes as a substitute. In fact, while running public workshops based on the FPGA-based Prototyping Methodology Manual (FPMM) in 2011/12, I remember presenting research findings showing that over half of prototypers at that time considered that their FPGAs were used for "verification." At the time, perhaps we should have called their activity "FPGA-based verification." Digging deeper, however, it might have been better described as "validation" rather than verification. What's the difference between verification and validation? The source of the best explanation may be lost in time, but I first heard it at a public presentation by ARM, when the speaker said that validation asks "Did we build the right thing?" while verification asks "Did we build the thing right?" Do you see the difference?
FPGA-based prototypes are excellent not only for validation, but also as platforms for early software development and in-field trials; however, they are not really verification platforms (i.e., a prototype is not the same as an emulator).
There are significant technical differences between using FPGAs for prototyping and using them for emulation, as summarized in Table 1, and dual-purpose FPGA systems that are good for both tasks have been rare indeed.
Table 1: Typical differences between FPGA usage in prototyping and emulation.
This doesn't mean that FPGAs cannot be used for emulation or for accelerating our simulation runs; however, attempting to re-purpose an existing collection of FPGA prototype boards as an emulator is not likely to impress anybody.
Are dual-purpose FPGA platforms a realistic expectation?
Dual-use FPGA hardware requires considerable design effort and planning, but the reward is a platform that can take its place at the heart of a prototype or a verification environment equally well. For example, by adding significant amounts of tightly coupled memory to the platform, Aldec removes the dependency on internal FPGA memory or add-on cards; this memory is then useful for deep buffering of instrumentation data in prototyping mode, or for modeling different kinds of SoC memory in emulation mode.
Considering the extra expertise and infrastructure needed for interfacing with simulators, we should probably expect most progress on dual-use platforms to be made by those coming from the verification camp, rather than by those specializing only in FPGA hardware.
The aim is to harness the FPGA hardware into the verification environment, which means connecting simulators on a host machine to FPGAs on a board. That harnessing task is a worthy subject for a whole new article (watch this space), but readers may not be aware that Aldec has been successfully doing exactly that for years -- using FPGAs in multiple generations of its Hardware Emulation Solution platforms (that's where the HES name comes from).
Big just got bigger
With the HES7XV12000 board, Aldec already offers the largest-capacity single-board FPGA platform commercially available and in use today, but by upgrading its six XC7V2000T FPGAs one-for-one with Xilinx's latest Virtex UltraScale-440 devices (VU440, for short), the resulting boards are quite simply the largest and most capable FPGA platforms ever made available. Aldec's engineers assure me that developing a new FPGA platform isn't really quite that simple, but you get the picture.
Each VU440 device is published as containing resources for implementing 50M ASIC gates -- it says so right there in the datasheet, and that figure is demonstrably achievable for certain designs. On the other hand, several factors eat into it when mapping a real SoC or ASIC design. For example, the Synopsys-Xilinx FPMM (see page 101) advises staying well within the upper limit of resource usage in order to maximize ease of use and reduce place-and-route (P&R) runtimes. We should also expect the incoming RTL design style to be FPGA-hostile, and Murphy says that it won't conveniently partition into six equal-sized blocks of LUTs, RAMs, and flip-flops. Allowing for all this, users can still confidently expect each HES-7 with Xilinx UltraScale devices to implement 160M ASIC gates of SoC design. Wow. Go back and read that again. 160 MILLION GATES on one board. Many designs could even push past that ease-of-use guideline, but it's better to be safe than sorry, don't you think?
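To see where that 160M figure sits relative to the raw silicon, here's a back-of-envelope sketch. The 50M-gates-per-VU440 figure and the six-FPGA board configuration come from the discussion above; the utilization factor is an assumption chosen to illustrate the FPMM-style derating for P&R headroom and uneven partitioning, not a number Aldec or Xilinx publishes.

```python
# Back-of-envelope capacity estimate for a six-FPGA HES-7 board.
# GATES_PER_VU440 and FPGAS_PER_BOARD come from the article; UTILIZATION is
# an assumed derating factor reflecting the FPMM advice to stay well under
# 100% resource usage and the unevenness of real-world partitioning.

GATES_PER_VU440 = 50_000_000   # published per-device capacity (ASIC gates)
FPGAS_PER_BOARD = 6            # VU440 devices per HES-7 board
UTILIZATION = 0.533            # assumed usable fraction after derating

raw = GATES_PER_VU440 * FPGAS_PER_BOARD
usable = int(raw * UTILIZATION)

print(f"Raw capacity:    {raw / 1e6:.0f}M gates")     # 300M gates
print(f"Usable estimate: {usable / 1e6:.0f}M gates")  # ~160M gates
```

In other words, the 160M-gate expectation is roughly half of the 300M-gate raw total, which is consistent with the conservative resource-usage guidance quoted above.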