System-on-Chip (SoC) designs continue to increase in size and complexity. At the same time, market windows are shrinking and today's electronic markets are extremely sensitive to time-to-market pressures. All of this is putting tremendous demands on SoC design and verification teams. Indeed, it is now widely accepted that verification accounts for around 70 percent of the total SoC development cycle. Thus, anything that decreases verification costs, speeds verification runs, and allows verification to be deployed earlier in the development cycle is of extreme interest.
This article begins with an introduction to the major elements that comprise a typical SoC design and verification environment. Also considered are the advantages and disadvantages of conventional verification solutions, including software simulation, hardware-assisted acceleration and emulation, and the use of FPGA-based prototype boards. The article goes on to describe an innovative and affordable way in which standard FPGA-based prototype boards are turned into full-blown desktop emulators. This proposed approach enables a paradigm shift that could dramatically increase the verification efficiency of off-the-shelf and custom-designed FPGA-based prototype boards by automating their existing in-circuit emulation capabilities and adding new co-emulation and co-simulation capabilities.
Typical SoC design and verification environment
First, let us consider the front-end portion of a typical SoC design and verification environment. At a minimum, this will comprise some form of design capture, functional verification in the form of software simulation, and logic synthesis as illustrated in Figure 1. Furthermore, the majority of such design environments today also include SpringSoft's Verdi Automated Debug System. The Verdi system allows users to analyze and debug the results from their software simulator and automatically correlates any gate-level results to the corresponding register transfer level (RTL) source code. (The Verdi system also allows users to analyze and debug results from hardware accelerators, emulators, and FPGA-based prototype boards as discussed later in this article.)
Figure 1. Minimalist SoC front-end design and verification environment.
One of the problems with any form of functional verification is the amount of data that needs to be collected and stored. In the case of a software simulator, for example, monitoring large numbers of signals can significantly slow the simulation, while a long simulation run can generate enormous quantities of data. Thus, many design and verification environments also include SpringSoft's Siloti Visibility Automation System, which eliminates the overhead associated with recording data for all of the signals in a design. The Siloti system identifies the minimal set of signals that must be recorded during the simulation run; it then uses these recorded signals to regenerate any unrecorded signal data "on-the-fly" as required.
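The principle behind this kind of visibility automation can be sketched as follows: if only the "essential" signals (register outputs and primary inputs, say) are recorded, every combinational signal between them can be regenerated later by re-evaluating the intervening logic. The toy netlist, signal names, and evaluation scheme below are invented for illustration; the real Siloti analysis is far more sophisticated:

```python
# Toy netlist: each combinational net maps to (function, input nets).
# Register outputs and primary inputs are the "essential" signals --
# the only ones recorded during the simulation run.
comb_nets = {
    "sum":   (lambda a, b: a ^ b, ["in_a", "in_b"]),
    "carry": (lambda a, b: a & b, ["in_a", "in_b"]),
    "out":   (lambda s, q: s | q, ["sum", "reg_q"]),
}
essential = {"in_a", "in_b", "reg_q"}  # recorded at simulation time

def expand(recorded):
    """Regenerate every unrecorded (combinational) value for one
    clock cycle from the recorded essential signals."""
    values = dict(recorded)
    def evaluate(net):
        if net in values:
            return values[net]
        fn, inputs = comb_nets[net]
        values[net] = fn(*(evaluate(i) for i in inputs))
        return values[net]
    for net in comb_nets:
        evaluate(net)
    return values

# One cycle's worth of recorded data reconstructs everything else.
cycle = {"in_a": 1, "in_b": 1, "reg_q": 0}
full = expand(cycle)
print(full["sum"], full["carry"], full["out"])  # 0 1 0
```

The trade-off is storage versus post-processing time: recording three signals instead of six halves the dump size here, and the gap widens dramatically in real designs, where combinational nets vastly outnumber registers.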
The main advantage of software simulation is total visibility into the design. The main disadvantage is speed: even when running on a powerful, high-end workstation and employing the Siloti visibility automation technology, a software simulation of one of today's large SoC designs will struggle to exceed a few Hz (that is, a few cycles of the design's main system clock for each second of real time). This means that software simulation is typically applicable only to small portions of the design, or to a few tens of clock cycles of the full-chip design. But fully verifying a modern SoC requires hundreds of thousands or millions of clock cycles, in which case some form of hardware-assisted verification is required, as illustrated in Figure 2.
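To put these speeds in perspective, consider the wall-clock time needed to execute a given number of system clock cycles. The 10 Hz and 1 MHz figures below are illustrative assumptions, not measured values:

```python
def wall_clock_hours(cycles, effective_hz):
    """Wall-clock time, in hours, to execute `cycles` design clock
    cycles at an effective speed of `effective_hz` cycles per second."""
    return cycles / effective_hz / 3600.0

cycles = 1_000_000  # a modest full-chip test sequence

sim_hours = wall_clock_hours(cycles, 10)         # ~10 Hz software simulation
hw_hours = wall_clock_hours(cycles, 1_000_000)   # ~1 MHz hardware-assisted run

print(f"software simulation: {sim_hours:.1f} hours")        # ~27.8 hours
print(f"hardware-assisted:   {hw_hours * 3600:.1f} seconds")  # ~1 second
```

A million cycles at 10 Hz is more than a day of wall-clock time for a single test, which is why full-chip regression at the software-simulation level quickly becomes impractical.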
Figure 2. Minimalist SoC front-end design and verification environment augmented with some form of hardware-assisted verification.

Conventional hardware-assisted verification solutions
There are a wide variety of hardware-assisted verification solutions with different capabilities, strengths, and weaknesses. There are also various ways in which the different systems can be used to solve different classes of problems; these include in-circuit emulation, transaction-based co-emulation, and HDL co-simulation.
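As a rough illustration of the transaction-based co-emulation idea, a "transactor" on the hardware side expands each high-level transaction sent from the workstation into the cycle-by-cycle pin activity the design under test actually sees, so the slow workstation-to-hardware link carries compact transactions rather than individual pin values. The bus protocol and signal names below are invented for illustration:

```python
def write_transaction(addr, data):
    """Transactor: expand one high-level 'write' transaction into the
    cycle-by-cycle pin values driven onto a simple two-phase bus."""
    return [
        {"valid": 1, "is_addr": 1, "bus": addr},  # address phase
        {"valid": 1, "is_addr": 0, "bus": data},  # data phase
        {"valid": 0, "is_addr": 0, "bus": 0},     # idle cycle
    ]

class BusSlave:
    """Toy design under test: latches the address, then stores the data."""
    def __init__(self):
        self.mem = {}
        self.addr = None

    def clock(self, pins):
        if pins["valid"]:
            if pins["is_addr"]:
                self.addr = pins["bus"]
            else:
                self.mem[self.addr] = pins["bus"]

# The workstation sends one transaction; the transactor drives the pins.
dut = BusSlave()
for pins in write_transaction(0x40, 0xAB):
    dut.clock(pins)
print(hex(dut.mem[0x40]))  # 0xab
```

In a real co-emulation setup the transactor runs in the hardware alongside the design, and standards such as Accellera's SCE-MI define the interface between it and the testbench software; the point of the sketch is simply that one message across the link produces many clock cycles of stimulus.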
Generally speaking, conventional hardware-assisted verification solutions are understood to include only hardware accelerators and/or emulators. FPGA-based prototype boards are not usually considered a viable alternative because they cannot link with a workstation and do not provide the level of design visibility required for debugging.
Conventional hardware acceleration and/or emulation systems are dedicated systems constructed using either special-purpose, custom-designed chips or standard FPGAs packaged in a special-purpose, custom-designed system. The objective of these systems is to operate as closely as possible to the way a software simulator does, including such factors as visibility and debug capabilities. Using special software that takes full advantage of their custom architectures, these systems provide large capacity with relatively fast compile times (the time in which the design is mapped into the hardware). They also provide reasonably good visibility (observability and controllability) into the design. However, these systems are hugely expensive and difficult to deploy broadly to multiple users, projects, and sites. And once such a system has been adopted, it is difficult to upgrade to the next generation: new versions of the custom-designed chips and systems take a long time to develop, and the cost of transition is high.
Turning to FPGA-based prototype boards
As an alternative to hardware accelerators and emulators, many design houses use FPGA-based prototype boards, which may be purchased "off-the-shelf" or custom-designed by the SoC verification team. In the example illustrated below, the design is compiled (synthesized) on the workstation; mapped, placed, and routed; and the resulting FPGA configuration file (or files if the system contains multiple FPGAs) is downloaded into the prototype board. The typical usage model is for the SoC design (or portions thereof in the case of block-level verification) to be used in-circuit; that is, to be driven with real-world input/output (I/O) signals as illustrated in Figure 3. In addition to being driven by – and driving – the external system, the real-world I/O signals may also be captured for subsequent analysis using tools like a logic analyzer.
Figure 3. High-level representation of a conventional FPGA-based prototype board environment operating in an in-circuit mode.
The main advantages of conventional FPGA-based prototype boards are their high performance and their relatively low cost, which allows them to be provided to multiple users and projects and deployed to multiple sites. These boards can also leverage the latest-generation FPGA technologies, and users can quickly and easily transition to new generations of the boards. The main disadvantages are that the boards are difficult to set up, that there is no link with the workstation to support co-emulation and/or co-simulation, and that they provide very limited visibility into the design and lack sophisticated debug capabilities.