Gates, lies and common sense
It is comforting to know some things never change. The U.S. remains firmly anchored to the British measuring system, blithely ignoring the fact that the British switched to the metric system a long time ago. Madonna remains Hollywood's foremost narcissist. And, judging by the latest claims of hardware-assisted verification providers, the gate as a metric of design complexity continues to cover a range as wide as the Nile delta in the rainy season.
The gate-counting war started with the launch of programmable logic devices, when the PLD industry switched from the fairly unambiguous design gates to equivalent ASIC gates, on to FPGA gates and, recently, to system gates. From the viewpoint of the PLD industry, the move is sensible. After all, a modern FPGA encompasses a lot more than just standard logic resources. For instance, all major FPGA architectures now include memory of some sort as a fundamental part of their fabric.
However, the lack of an official standard, or even an implicit consensus, on the metric opens the door to manipulation. It so happens that the very same design may count as 200,000 design gates, or 400,000 equivalent ASIC gates, or post a staggering 2,000,000 FPGA gates, or reach a breathtaking summit of 6,000,000 system gates. What do you make of it? Well, you may be confused, but the vendors of FPGA-based hardware-assisted verification tools certainly are not, and they make a lot out of it.
If you feel compelled, just dig into the Web sites of a few of these vendors, or pick up and browse one of their datasheets. You will be offered verification solutions supporting 12 million gates for $99,000 or so, making them the "lowest dollar-per-gate" and thus the cheapest emulation systems on the market. Never mind that those are the minuscule and misleading system gates. Too bad that the very same system may indeed map "only" 400,000 design gates, and at $99,000 be anything but inexpensive.
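The arithmetic behind the "lowest dollar-per-gate" claim is worth spelling out. A minimal sketch, using the article's own example figures for one $99,000 system (the four gate counts are taken from the specifications cited later in this column; the 1:2:10:30 ratio between metrics is an observation, not a standard):

```python
# The same $99,000 system, with its capacity counted four different ways.
# Gate counts are the example figures quoted in this column.
price_usd = 99_000

gate_counts = {
    "design gates":            400_000,
    "equivalent ASIC gates":   800_000,
    "FPGA gates":            4_000_000,
    "system gates":         12_000_000,
}

# Dollar-per-gate swings by a factor of 30 depending on the metric chosen.
for metric, gates in gate_counts.items():
    print(f"{metric:>22}: ${price_usd / gates:.5f} per gate")
```

At $0.2475 per design gate versus $0.00825 per system gate, the "cheapest system on the market" claim rests entirely on which gate you count.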
But let's give credit where it is due.
No matter how you measure them, the Virtex-II devices, the latest generation of FPGAs unveiled by Xilinx, are remarkable components. Fully loaded with a vast amount of logic, including heaps of wide multipliers, specialized multiplexers, and extensive and versatile memory banks, they also boast profuse routing resources. In fact, all rapid-prototyping systems have adopted them.
In a twist of irony, the very device that may have added confusion to the appraisal of design complexity may also resolve it, thanks to its universal adoption.
There is now a simple, practical way to compare the design capacity of two emulation solutions based on Virtex-II components. By listing the type and quantity of the Virtex-II devices allocated to mapping the design under test, possibly augmented by one or more external memory banks, you can truthfully and reliably evaluate two or more emulation systems.
Two systems, each based on two Virtex-II XC2V-6000 FPGAs, will have the same design capacity, regardless of tagged specifications that may claim support for 400,000 design gates, or 800,000 equivalent ASIC gates, or 4,000,000 FPGA gates, or 12,000,000 system gates. For example, the ZeBu ZV-6000, the universal verification platform introduced at the Design Automation Conference 2002 by EVE, maps designs into two XC2V-6000 FF1517 FPGAs plus 128 Mbits of SRAM and sells for $49,000, all inclusive.
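The comparison method amounts to matching bills of FPGAs rather than gate claims. A minimal sketch (the two "systems" below are hypothetical examples; XC2V6000 is a real Virtex-II part):

```python
# Compare emulation systems by the FPGAs that map the design under test,
# not by vendor gate counts. Each system is a tally of device type -> quantity.
from collections import Counter

system_a = Counter({"XC2V6000": 2})  # hypothetical vendor A: two XC2V-6000s
system_b = Counter({"XC2V6000": 2})  # hypothetical vendor B: same parts

# Identical bills of FPGAs mean identical design capacity,
# whatever the datasheets claim in design, ASIC, FPGA or system gates.
print(system_a == system_b)  # prints True
```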
At last, you'll have a comparative measure of just how much logic functionality will fit into an emulation system, which should help you make an informed choice of the verification solution appropriate for your application.
Lauro Rizzatti is marketing vice president at startup Emulation and Verification Engineering (EVE).