Verification remains the single biggest challenge in the design of system-on-chip (SoC) devices and reusable IP blocks. As designs continue to grow in size and complexity, new techniques emerge that must be linked by an effective methodology for significant adoption and deployment. The SoC industry needs a reuse-oriented, coverage-driven verification methodology built on the rich semantic support of a standard language.
This is the second in a series of four articles outlining a reference verification methodology enabled by the SystemVerilog hardware design and verification language standard. This methodology is documented in a comprehensive book, the Verification Methodology Manual (VMM) for SystemVerilog, jointly authored by ARM and Synopsys. This article summarizes some of the key recommendations of the VMM for SystemVerilog for building a scalable, predictable, and reusable verification environment that enables users to take full advantage of assertions, reusability, testbench automation, coverage, formal analysis, and other advanced verification technologies.
The purpose of the VMM for SystemVerilog is twofold. First, it is intended to educate users about the best practices shown to be effective in assembling a repeatable, productive, and robust verification methodology. This allows users to take advantage of the same language capabilities, tool capabilities, and methodology used by verification experts. Second, it enables verification tool vendors to deliver the documentation, SystemVerilog code examples, and boilerplate code that let users adopt this methodology quickly and conveniently with a minimum of custom code development.
Layered testbench architecture
To provide a common verification environment that facilitates reuse and extension and takes full advantage of automation, a layered testbench architecture is required. This approach supports both top-down and bottom-up verification within a project and also makes it easier to share common components between projects. The VMM for SystemVerilog testbench architecture comprises five layers around the design-under-test (DUT), as shown in Figure 1.
Figure 1 A multi-layered testbench fosters verification reuse.
The layered testbench is the heart of the verification environment:
- The lowest layer is the signal layer that connects the testbench to the RTL design. It consists of interface, clocking, and modport constructs.
- The command layer contains lower-level driver and monitor components, as well as the assertions (properties) that check design intent. This layer provides a transaction-level interface to the layer above and drives the physical pins via the signal layer.
- The functional layer contains higher-level driver and monitor components, as well as the self-checking structure that determines whether tests pass or fail. Additional checking, for example protocol checkers, can span the command and functional layers.
- The scenario layer uses generators to produce streams or sequences of transactions that are applied to the functional layer. The generators have a set of weights, constraints or scenarios specified by the test layer. The randomness of constrained-random testing is introduced within this layer.
- Finally, the test layer is where the tests are located. The tests can define new sequences of transactions using the scenario layer, synchronize multiple transaction streams, generate sequences by interacting directly with the functional or command layers, or supply directed stimulus directly to the command layer.
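To make the signal and scenario layers concrete, the sketch below shows a hypothetical bus interface with a clocking block and modport (the signal layer's constructs), alongside a constrained-random transaction class of the kind a scenario-layer generator would randomize. The names (`simple_bus`, `bus_tx`) and the constraints are illustrative assumptions, not code from the VMM itself.

```systemverilog
// Signal layer: an interface bundling the DUT pins.
// The clocking block synchronizes testbench sampling and driving;
// the modport limits the testbench to that synchronous view.
interface simple_bus (input logic clk);
  logic        valid;
  logic [31:0] addr;
  logic [31:0] data;
  logic        ready;

  clocking cb @(posedge clk);
    output valid, addr, data;
    input  ready;
  endclocking

  modport tb (clocking cb);
endinterface

// Scenario layer: a transaction whose random fields and constraints
// are what the test layer's weights and constraints act upon.
class bus_tx;
  rand bit [31:0] addr;
  rand bit [31:0] data;
  rand enum {READ, WRITE} kind;

  constraint word_aligned  { addr[1:0] == 2'b00; }          // aligned accesses only
  constraint mostly_writes { kind dist { WRITE := 8, READ := 2 }; }
endclass
```

A generator would repeatedly call `randomize()` on such a transaction and pass the result down to the functional layer, while a test can tighten or override the constraints without touching the generator itself.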
Although this layered testbench is designed primarily for constrained-random stimulus generation, it supports manual directed tests as well. The upper-left portion of Figure 1 shows a path running directly from the tests to the driver, bypassing the generator entirely. This allows a verification engineer to generate transactions directly without setting up constrained-random scenarios.
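A directed test along this bypass path builds transactions itself and hands them straight to the driver. In the VMM, transactors communicate through vmm_channel objects; the minimal sketch below substitutes a plain SystemVerilog mailbox, and all names (`pkt`, `drv_mbx`, `directed_test`) are assumptions for illustration.

```systemverilog
// A directed test that bypasses the generator: the test constructs
// the transaction itself and pushes it into the driver's mailbox.
class pkt;
  bit [31:0] addr;
  bit [31:0] data;
endclass

program directed_test;
  mailbox #(pkt) drv_mbx = new();  // consumed by the command-layer driver

  initial begin
    pkt t = new();
    t.addr = 32'h0000_1000;
    t.data = 32'hDEAD_BEEF;
    drv_mbx.put(t);  // straight to the driver, no generator involved
  end
endprogram
```

Because the directed test and the generator feed the driver through the same transaction-level channel, directed and constrained-random stimulus can share all of the checking and coverage infrastructure below them.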