Analyze the device specifications
All projects begin with several forms of specifications. Marketing contributes the customer requirements. Management defines the resource, schedule, and build-versus-buy constraints of the project. The systems engineers, the hardware and software engineers, and the verification team each provide the implementation and verification specifications that will guide the project.
The biggest risk to the project is failing to anticipate the moving target. Some teams attempt to capture all their requirements up front and then move through the project sequentially, the so-called waterfall method. In reality, the process is quite iterative. The risk comes from failing to anticipate the impact of iteration and to manage the frequency and burden of those iterations.
Scope the verification objectives
The most common failure at the scoping stage of the project is failing to require a comprehensive verification discussion that involves everyone. Scoping and documenting the verification objectives is the only way to discern whether the specifications were well conceived and understood.
It is terribly inefficient, and unpredictable, to plan the details of a verification project after it is underway. Management must allocate sufficient time up front to get the verification goals accurate and complete. Propagating requirements changes is another matter and is discussed in the next section.
All stakeholders must participate in scoping verification objectives: management, marketing, systems designers, hardware designers, software designers and the verification team. It is only through team dialog that the full project requirements emerge. Marketing requirements are never complete because they only address market-critical factors.
Management resource and schedule constraints are usually aggressive, by definition, and must be relaxed as the project is fully scoped. Systems engineers bring the technical perspective closest to the marketing requirements. Hardware and software engineers provide valuable insight about implications of implementation.
And the verification team is the choreographer of the discussion, continuously challenging the team to see and fill the gaps and to identify unverifiable choices, a key design-for-verification objective. If any of these points of view is missing, the gaps and risks will remain in the project, inevitably leading to quality concerns, cost overruns, or schedule slips.
Figure 2: Verification team planning
Identify the feature set of the design
Good measurement techniques have been an elusive part of verification. Metrics are the means by which verification progress is measured. Historically, engineers have designed tests, implemented checkers, and used code coverage. Test lists have become obsolete as a metric because the tests have grown too numerous to specify or implement.
Although directed tests are sometimes easier to write, they simply don't scale to the size of today's systems on chip. Checkers aren't appropriate as metrics because they detect erroneous system behavior rather than record that good behavior has been observed. And code coverage is only loosely correlated with behavior because the device context of each line of code is not recorded.
Functional coverage has been identified as the most accurate verification metric by leading electronics teams. As part of a coverage-driven verification (CDV) methodology, a team is able to measure how much real verification it has completed, as opposed to how many (mostly redundant) simulation cycles it has executed.
The specification language of functional coverage is designed to match the requirements specification captured during the scoping process. The assertion language in a coverage specification can also be used to capture implementation assumptions, part of assertion-based verification (ABV).
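To make the idea of capturing an implementation assumption concrete, here is a minimal sketch in Python. Real projects would express this in an assertion language such as SVA or PSL inside the coverage specification; this analogy, with an invented arbiter example, only illustrates encoding an assumption as a check that runs alongside the design.

```python
# Minimal sketch of an implementation assumption expressed as a runtime check.
# The arbiter example and the function name are hypothetical, for illustration.
def check_grant_onehot(grants: list) -> bool:
    """Assumption: the arbiter never asserts more than one grant at a time."""
    return sum(grants) <= 1

assert check_grant_onehot([0, 1, 0])      # legal: exactly one grant
assert not check_grant_onehot([1, 1, 0])  # violation of the assumption is caught
```

In assertion-based verification, each such check both flags violations and, when monitored as a coverage point, records that the assumed condition was actually exercised.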
Total coverage is the only way to obtain a complete picture of the verification status of a project. Functional coverage correlates directly to the features. Assertion coverage relates both to functional coverage and to implementation integrity. Hardware and software code coverage tell us how well we have exercised the design.
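The "total coverage" picture above can be thought of as an aggregation over the individual coverage sources. The following sketch is tool-neutral and the point names and numbers are invented; it only illustrates rolling functional, assertion, and code coverage up into comparable percentages.

```python
from dataclasses import dataclass

@dataclass
class CoverageSource:
    """One source of coverage data: observation counts per named point."""
    name: str
    hits: dict   # coverage point name -> times observed
    goal: int    # observations required to call a point covered

    def percent(self) -> float:
        covered = sum(1 for h in self.hits.values() if h >= self.goal)
        return 100.0 * covered / len(self.hits)

# Hypothetical point names and counts, for illustration only.
functional = CoverageSource("functional", {"pkt_short": 12, "pkt_long": 0, "retry": 3}, goal=1)
assertion  = CoverageSource("assertion",  {"fifo_no_overflow": 40, "grant_onehot": 0}, goal=1)
code       = CoverageSource("code",       {"line_block_a": 9, "line_block_b": 2}, goal=1)

for src in (functional, assertion, code):
    print(f"{src.name}: {src.percent():.0f}% of points covered")
```

A design choice worth noting: each source keeps its own goal per point, so a team can demand repeated observations of rare events rather than a single lucky hit.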
Design detailed coverage models
The scoping process results in a list of the features to be verified. The specification must describe those features in a way that can be measured. Coverage, derived from the design features, defines the metrics of verification.
There are two types of metrics that result from scoping: explicit specification metrics and explicit implementation metrics. Explicit specification metrics are chosen by the engineer from the specification. Explicit implementation metrics are chosen by the engineer from the RTL implementation, once the RTL is available.
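The distinction between the two metric types can be sketched with an invented bus-interface example. Everything here (attribute names, values, the cross) is hypothetical; the point is that specification metrics come straight from the requirements document, while implementation metrics wait for the RTL.

```python
# Explicit specification metrics: chosen from the requirements specification.
spec_metrics = {
    "burst_length": [1, 4, 8, 16],       # spec requires all burst lengths be exercised
    "transfer_type": ["read", "write"],  # both directions required
}

# Explicit implementation metrics: chosen from the RTL once it exists.
impl_metrics = {
    "tx_fifo_depth": ["empty", "mid", "full"],      # internal FIFO corner states
    "arbiter_state": ["idle", "grant0", "grant1"],  # internal state machine states
}

# Crossing two specification attributes yields the full list of points to fill.
cross = [(b, t) for b in spec_metrics["burst_length"]
                for t in spec_metrics["transfer_type"]]
print(len(cross))  # 8 burst/type combinations to observe
```

The cross shows why review matters: every attribute added to a cross multiplies the point count, which is exactly the succinctness tension discussed next.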
The typical failure at this stage of planning is lack of review of the coverage models. They need to be complete enough to represent the full list of features to be verified. Yet they also need to be succinct enough that the tools and the team can actually accomplish the task within the desired timeframe. Judgment is crucial because it's a lot better to have verified 100% of the most critical functionality than to have critical functionality lost in a coverage model that is too detailed.
For a more complete treatment of this topic, refer to the book Functional Verification Coverage Measurement and Analysis by Andrew Piziali (Springer, ISBN 1-4020-8025-5).
Select aggregate metrics to track progress
Management is all about planning and executing. Tracking progress is the only way to know your team is executing. The old management adage "you get the behavior that you measure" is apropos. The challenge is selecting the few metrics that are representative of progress and will expose problems and risks.
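As a hedged illustration of tracking a small set of aggregate metrics, the sketch below uses invented weekly snapshot numbers. A derived trend such as coverage velocity is often more revealing than the raw percentage, because a falling velocity exposes a stalling effort early.

```python
# Illustrative weekly snapshots; all numbers are invented.
# Each snapshot holds a few aggregate metrics chosen to expose risk.
snapshots = [
    {"week": 1, "coverage_pct": 22.0, "open_bugs": 14, "bugs_found": 14},
    {"week": 2, "coverage_pct": 41.0, "open_bugs": 19, "bugs_found": 11},
    {"week": 3, "coverage_pct": 47.0, "open_bugs": 17, "bugs_found": 4},
]

def coverage_velocity(snaps):
    """Coverage percentage gained per week between consecutive snapshots."""
    return [b["coverage_pct"] - a["coverage_pct"] for a, b in zip(snaps, snaps[1:])]

print(coverage_velocity(snapshots))  # [19.0, 6.0] -> progress is slowing
```

Keeping the snapshot to a handful of fields mirrors the advice above: a few representative metrics, tracked consistently, beat an exhaustive dashboard nobody reads.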