Functional verification remains a major bottleneck in the design flow. More than 50 percent of all chips fail at first silicon, and functional errors account for roughly 70 percent of those failures. Simulation vendors respond by telling users to do more simulation, build bigger simulation farms, and use emulation. Analysis of past verification crises demonstrates that these answers did not, and will not, properly address the problem.
Significant advances in verification productivity have occurred recently, particularly in testbench development, making it quicker and easier to develop new tests. These advances also reduce the time spent maintaining existing tests, which can now absorb design changes without significant modification.
However, these advances have also added to the load on overworked simulators. Companies can now generate thousands of vector sets but have neither the time nor the resources to run them all. Hence, simulator vendors attempt to identify the “best” vector sets, differentiated by the coverage they provide.
Design has changed, but today’s verification techniques have not kept pace. With the introduction of languages such as SystemC, and in particular transaction-level modeling, designers can think and work at higher levels of abstraction. Increased IP usage also focuses more attention on system-level effects, the interaction of blocks, and the architecture of the system. Verification must reflect these changes in the design methodology, but this will require the convergence and maturation of three independent technologies, as described later in this article.
The verification continuum
We should view verification as a complete continuum that parallels the design flow [1], rather than as a single function. Each task in that continuum verifies one facet of the system, such as functionality, architecture, performance, timing, or implementation.
Consider the simplified design flow shown in Figure 1. At the top of the flow, the system is designed and the algorithms are developed. Most companies do this on paper, although small pieces may be modeled in UML, Matlab, or similar languages capable of the necessary levels of abstraction. No distinction is made between hardware and software, and no consideration is given to how the algorithms will be implemented.
If models existed for everything at this level, you would want to verify that the algorithms were correct and that the interaction between the blocks produced the desired functions on the primary outputs. System simulators are nothing new, but because suitable models are rarely available at this level, functional verification is deferred until later in the flow.
Figure 1 — Facets of design and verification
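To make this concrete, the following is a minimal sketch, in plain C++, of what algorithm-level functional verification could look like: a candidate model is checked against an independently written golden reference on its primary outputs, before any hardware/software decision has been made. The windowed-sum function and all names are illustrative assumptions, not drawn from any particular design.

```cpp
#include <cassert>
#include <vector>

// Candidate model: windowed sum computed incrementally (the "clever"
// version a designer might actually want to implement).
std::vector<int> window_sum_model(const std::vector<int>& in, int w) {
    std::vector<int> out;
    int acc = 0;
    for (int i = 0; i < (int)in.size(); ++i) {
        acc += in[i];
        if (i >= w) acc -= in[i - w];          // drop the oldest sample
        if (i >= w - 1) out.push_back(acc);
    }
    return out;
}

// Golden reference: the same function written the obvious, slow way,
// so the two implementations fail independently.
std::vector<int> window_sum_ref(const std::vector<int>& in, int w) {
    std::vector<int> out;
    for (int i = w - 1; i < (int)in.size(); ++i) {
        int acc = 0;
        for (int j = i - w + 1; j <= i; ++j) acc += in[j];
        out.push_back(acc);
    }
    return out;
}

int main() {
    // Any mismatch here flags an algorithmic error, not an
    // implementation one.
    std::vector<int> stimulus = {3, 1, 4, 1, 5, 9, 2, 6};
    assert(window_sum_model(stimulus, 3) == window_sum_ref(stimulus, 3));
    return 0;
}
```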
The next stage in the design process sets the solution’s basic architecture by deciding which functionality to implement in hardware and which in software. Much of the intellectual property (IP) to be used is selected, particularly any platforms and essential operating systems, making it possible to forecast a fairly accurate picture of system performance.
The industry-standard taxonomy [2] defines the model used at this stage as an abstract-behavioral model, which describes the function and timing of a component without describing its implementation. The models’ interfaces are token-passing in nature, but the tokens carry real data, and accurate functionality is performed on them.
The industry often calls these tokens “transactions.” Working at this level enables exploration of load factors, congestion, resource utilization, and other system aspects. While abstract hardware/software co-simulation has been available for some time, few people currently perform this kind of performance verification; a lack of models forces them to defer it until later in the flow.
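As an illustration of the kind of abstract-behavioral model the taxonomy describes, the following SystemC sketch passes transactions carrying real data through a token-passing channel and measures their latency. The module names, timing values, and fifo depth are all assumptions made for the example, not figures from the article.

```cpp
#include <systemc.h>
#include <iostream>

// A transaction: a token that also carries real data, plus a
// timestamp so the consumer can measure end-to-end latency.
struct Transaction {
    int     payload;
    sc_time issued;
};

// sc_fifo requires operator<< for user-defined payload types.
inline std::ostream& operator<<(std::ostream& os, const Transaction& t) {
    return os << "txn(" << t.payload << ")";
}

SC_MODULE(Producer) {
    sc_fifo_out<Transaction> out;
    SC_CTOR(Producer) { SC_THREAD(run); }
    void run() {
        for (int i = 0; i < 8; ++i) {
            out.write({i, sc_time_stamp()});
            wait(10, SC_NS);                 // assumed issue rate
        }
    }
};

SC_MODULE(Consumer) {
    sc_fifo_in<Transaction> in;
    SC_CTOR(Consumer) { SC_THREAD(run); }
    void run() {
        for (;;) {
            Transaction t = in.read();
            wait(25, SC_NS);                 // assumed service time
            std::cout << t << " latency "
                      << sc_time_stamp() - t.issued << std::endl;
        }
    }
};

int sc_main(int, char*[]) {
    sc_fifo<Transaction> chan(4);            // finite depth exposes congestion
    Producer p("producer");
    Consumer c("consumer");
    p.out(chan);
    c.in(chan);
    sc_start(500, SC_NS);
    return 0;
}
```

Because the producer issues faster than the consumer can service and the channel has finite depth, the model immediately exhibits back-pressure and rising latency, which is exactly the kind of system aspect explored at this level.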
In the hardware space, the design process then considers micro-architectural decisions, such as the amount of parallelism to use, pipelining, and resource sharing. These decisions affect the area, power, latency, and throughput of the solution.
Once implementation detail is added to the refined abstract-behavioral model, an RTL model emerges. This is where most companies start the verification process, including system-level functional verification, performance verification, and implementation verification. For these types of verification, however, RTL models contain superfluous detail, and the result is wasted effort.
The introduction of languages such as SystemC [3] has enabled the development of behavioral models, allowing functional verification to be performed at a higher level of abstraction. However, if the industry is to accept and use this abstraction for functional verification, other changes are required.
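As a sketch of what such a behavioral model might look like, the following SystemC fragment describes a block purely by its function, with no clocks, registers, or pipeline stages, alongside a small self-checking testbench. The accumulator and all names here are illustrative assumptions, not an example from the article.

```cpp
#include <systemc.h>

// Behavioral model: the block's function alone, with none of the
// implementation detail an RTL model would carry.
SC_MODULE(Accumulator) {
    sc_fifo_in<int>  in;
    sc_fifo_out<int> out;
    SC_CTOR(Accumulator) { SC_THREAD(run); }
    void run() {
        int sum = 0;
        for (;;) {
            sum += in.read();   // consume, update, produce
            out.write(sum);
        }
    }
};

// Self-checking testbench: drives stimulus and compares the running
// sum against the intended behavior.
SC_MODULE(Test) {
    sc_fifo_out<int> stim;
    sc_fifo_in<int>  resp;
    SC_CTOR(Test) { SC_THREAD(run); }
    void run() {
        int expected = 0;
        for (int i = 1; i <= 4; ++i) {
            stim.write(i);
            expected += i;
            sc_assert(resp.read() == expected);
        }
        sc_stop();
    }
};

int sc_main(int, char*[]) {
    sc_fifo<int> to_dut(4), from_dut(4);
    Accumulator dut("dut");
    Test tb("tb");
    dut.in(to_dut);
    dut.out(from_dut);
    tb.stim(to_dut);
    tb.resp(from_dut);
    sc_start();
    return 0;
}
```

A model at this level simulates orders of magnitude faster than RTL, which is what makes it attractive for the system-level functional verification discussed above.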