There has been a lot of talk in the industry about the usefulness of assertions as part of a complete verification methodology. But there is something bigger going on here that many vendors are missing: the value that properties can contribute to fundamental aspects of the design flow.
By combining synthesis with data-logging techniques, properties can be turned into full on-chip diagnostic, error-logging or usage-monitoring systems. Property languages are a natural starting point for defining these capabilities. This paper explores the expanded role of properties in both the verification and design domains, with examples from a tool called DiaLite from Temento Systems [1].
Assertions have emerged over the past couple of years as a significant mechanism for improving the functional verification of complex systems. Assertions are not new, dating back to the 1950s in the software world, but their value to the hardware community has been demonstrated only recently.
An assertion is the execution of a property, where the property formally describes functionality or temporal relationships that must, or must not, be exhibited by an implementation. In other words, it is a fragment of an executable specification that can be checked against the result of the implementation effort.
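To make the idea concrete, a property can be viewed as an executable check over a simulation trace. The sketch below is written in Python rather than PSL or SVA, purely for illustration; the signal names `req`/`ack` and the 3-cycle bound are assumptions, not taken from the paper:

```python
# Hypothetical property: every 'req' pulse must be followed by an 'ack'
# within max_latency cycles. A trace is a list of per-cycle signal samples.

def check_req_ack(trace, max_latency=3):
    """Return the cycle of the first violating request,
    or None if the property holds on this trace."""
    for cycle, sample in enumerate(trace):
        if sample.get("req"):
            window = trace[cycle + 1 : cycle + 1 + max_latency]
            if not any(s.get("ack") for s in window):
                return cycle  # property violated at this request
    return None

trace = [{"req": 1}, {}, {"ack": 1}, {"req": 1}, {}, {}, {}]
print(check_req_ack(trace))  # → 3: the second request is never acknowledged
```

A real assertion language expresses the same intent declaratively, and a simulator or formal tool evaluates it on every cycle.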
Several mechanisms have appeared for creating these assertions. They include OVL [2], a library of relatively simple assertions, and two new languages: the Property Specification Language (PSL) [3] and SystemVerilog Assertions (SVA), the subset of the SystemVerilog [4] language dealing with assertions.
Assertions have been demonstrated to be successful in several conference papers [5] and books [6]. Most of these early success stories were reported by large processor companies, who were the early adopters of the technology, and these stories demonstrate the benefit that they received even without the advantages of the new languages.
Not only do assertions increase product quality, they also add observability into a design, making it easier to detect and diagnose problems when they are discovered. In short, what is not to like about assertions?
Assertions in a verification flow
When properties are used in a verification flow they have one purpose: ensuring that the functionality and timing they specify are always adhered to by the implementation. These properties can be exercised in two primary ways: through a traditional simulator, or with a formal property checker.
The beauty of a formal property checker is that it does not require a testbench: it examines all possible behaviors of the design and attempts to prove that the property always holds. Once this has been done, there is no condition under which the property can ever be violated.
In some cases the tool will instead identify a way the property can be violated, and provide a testcase that exercises the failure. The problem can then be investigated and resolved. Formal property checkers cannot always complete this analysis, however, and for some properties neither success nor failure can be proven.
In these cases, the verification team has to fall back on a dynamic execution environment, such as simulation. The implementation and the properties are subjected to a set of testcases to see whether they ever disagree. The problem is that no matter how many testcases are run, the simulator can never prove that the property will not fail in the product, and running unlimited test cases on large systems takes too long.
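The contrast between the two approaches can be illustrated on a toy design. This is a hedged sketch of the idea only, not how a real property checker works internally: a bounded "formal" check enumerates every input sequence up to some depth, while a simulator merely samples a few of them:

```python
from itertools import product

def dut(inputs):
    """Toy design under test: a saturating 2-bit counter."""
    count = 0
    states = []
    for inc in inputs:
        if inc:
            count = min(count + 1, 3)
        states.append(count)
    return states

def prove_bounded(prop, depth):
    """Exhaustively check prop over ALL input sequences up to depth,
    in the spirit of bounded model checking."""
    for n in range(1, depth + 1):
        for inputs in product([0, 1], repeat=n):
            if not prop(dut(list(inputs))):
                return list(inputs)  # counterexample found
    return None  # property holds within the bound

never_overflows = lambda states: all(s <= 3 for s in states)
print(prove_bounded(never_overflows, depth=6))  # → None: no counterexample
```

A simulator would run only a handful of the 126 sequences covered here; on a real design the space is astronomically larger, which is exactly why formal proof, where it completes, is so valuable.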
Design for debug
Bug rates are known to go up. In a DAC 2003 paper [7], Intel reported the detected bug rates in each generation of the IA-32 architecture: total detected bugs grew by 3 to 4X over each successive product generation, and they predicted that the bug rate for the next design iteration would top 25,000 bugs.
Each and every one of those bugs has to be detected, tracked to its root cause, corrected and re-verified. Anything that can be done to reduce the time spent on each bug will clearly save enormous amounts of total effort.
For many companies, the verification process does not finish with simulation, especially if significant parts of the design are going into one or more FPGAs. FPGAs can provide real-time exposure to the actual data streams the design is likely to see, and enable the discovery of bugs that occur minutes or hours into operation, something that is not possible with simulation.
But FPGAs have their own problems when compared to simulators, such as the lack of visibility into what is happening deep inside the design. Some FPGA vendors provide ways to access internal signals [8] that can be viewed on a logic analyzer, but this is far from an ideal situation. Others allow you to capture a few events through a JTAG port, but the data rates here are so limited that real-time operation is not possible.
So, what is a designer to do when they want to diagnose problems deep inside a design? The answer is to build the logic analyzer into the design itself, including triggers, data storage and a mechanism to stream the data out for analysis in a traditional debug environment [9]. A number of companies have been working to provide the intellectual property (IP) necessary to perform this operation. Together with software analysis companies and test equipment companies, they recently formed the Design for Debug consortium [10] to bring some standardization to the field.
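The core of such an embedded analyzer is a trigger plus a circular capture buffer. The Python model below is a behavioral sketch of the concept only; the class name, buffer depth and interface are invented for illustration and are not DiaLite's or any vendor's API:

```python
from collections import deque

class EmbeddedAnalyzer:
    """Behavioral model of an on-chip logic analyzer: sample continuously,
    then freeze a window of pre- and post-trigger history for readout."""

    def __init__(self, depth=8, post_trigger=4):
        self.buffer = deque(maxlen=depth)  # rolling pre-trigger history
        self.post_trigger = post_trigger
        self.captured = None               # frozen window, once triggered
        self._remaining = None

    def sample(self, value, trigger=False):
        self.buffer.append(value)
        if trigger and self._remaining is None:
            self._remaining = self.post_trigger  # arm post-trigger countdown
        elif self._remaining is not None and self.captured is None:
            self._remaining -= 1
            if self._remaining == 0:
                # In hardware this window would be streamed off-chip
                # for analysis in a traditional debug environment.
                self.captured = list(self.buffer)

analyzer = EmbeddedAnalyzer()
for cycle in range(20):
    analyzer.sample(cycle, trigger=(cycle == 10))
print(analyzer.captured)  # → [7, 8, 9, 10, 11, 12, 13, 14]
```

In a real implementation the trigger condition would itself be a synthesized property, which is exactly where assertion languages and on-chip debug meet.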
With the right software, assertions that were defined in the verification flow can also become part of this in-hardware verification. Synthesizing assertions is not easy, and care has to be taken that they do not blow up in size to the point where they take more space in the FPGA than the design itself.
An IBM engineer named Rent [11] observed a relationship between the number of pins required by a block and the size of the block: pin count grows only as a fractional power of gate count. Given the constraints of silicon, pins have therefore always been a constraining factor, which means there is normally spare logic capacity on the FPGA that can be put to good use.
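Rent's observation is usually written as P = k * G^r, where P is the pin count, G the gate count, and k and r empirical constants with r < 1. A quick illustration, using typical textbook constant values rather than figures from this paper, shows how slowly pin demand grows relative to logic:

```python
def rent_pins(gates, k=2.5, r=0.6):
    """Rent's rule: estimated pin count for a block of `gates` gates.
    k and r are illustrative textbook values, not measured constants."""
    return k * gates ** r

# A 100x increase in logic needs only a ~16x increase in pins.
for gates in (1_000, 10_000, 100_000):
    print(f"{gates:>7} gates -> ~{round(rent_pins(gates))} pins")
```

Because an FPGA's package fixes its pin count, a design that fills the available I/O typically leaves logic resources unused, and that slack is what on-chip diagnostic structures can occupy for free.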