The device was verified in parallel by two teams. One team went the conventional, simulation-based route while the other team verified design blocks with formal methods. The simulation team used tests automatically generated and analyzed by in-house testbench tools. The formal verification team used OneSpin's formal verification tool 360MV* with formal coverage analysis, provided by the EDA partner. At first, we defined the properties in the tool’s proprietary language ITL, but later we migrated to SVA, because this is a standardized language, supported by both 360MV and other formal tools, as well as by simulators. We will return to this aspect in a later section of this article.
It was not possible to check the entire design with formal methods because – at the time – we lacked sufficient trained people for this job. Thus, we formally checked only a selection of blocks, based on the following criteria:
• Block complexity
The more complex a block is, the harder it is to verify thoroughly using simulation. Complex blocks are therefore the most promising candidates for finding additional bugs with formal verification.
• Block importance
The formal verification approach developed in the HERKULES project ensures 100 percent coverage of the complete design behavior. Naturally, we used this formal verification and formal coverage analysis approach particularly on the most important blocks, that is, blocks which implement the main functionality, to ensure a deployable device.
• Block suitability
We used formal verification on blocks whose complexity would not exceed the capacity of state-of-the-art formal tools, using workarounds for long checks. For example, the frame length of the input data stream is in the range of several thousand clock cycles, and this translates directly into the cycle count of the verification window. To achieve short proof times, we split some of the properties into parts with matching characteristics. Later, we checked the correctness of the split by means of techniques developed in the HERKULES project.
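The splitting idea above can be sketched in SVA. This is an illustrative sketch only: the signal names (`frame_start`, `hdr_done`, `crc_ok`, `clk`, `rst_n`) and the lengths are hypothetical, not taken from the actual design. The key point is that an end-to-end property spanning a whole frame is replaced by two properties joined at an intermediate state, each with a much shorter verification window.

```systemverilog
// Illustrative lengths; the real frame spans several thousand cycles.
localparam int HDR_LEN     = 16;
localparam int PAYLOAD_LEN = 4080;

// Monolithic form: the verification window covers the whole frame,
// which makes the proof expensive.
property p_frame_end_to_end;
  @(posedge clk) disable iff (!rst_n)
    frame_start |-> ##(HDR_LEN + PAYLOAD_LEN) crc_ok;
endproperty

// Split form: two properties joined at an intermediate state
// (hdr_done), each with a short window.
property p_frame_part1;
  @(posedge clk) disable iff (!rst_n)
    frame_start |-> ##HDR_LEN hdr_done;
endproperty

property p_frame_part2;
  @(posedge clk) disable iff (!rst_n)
    hdr_done |-> ##PAYLOAD_LEN crc_ok;
endproperty

assert property (p_frame_part1);
assert property (p_frame_part2);
```

The split is only sound if the two parts together imply the original property; checking that implication is the separate correctness step mentioned above.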
Formal verification typically was applied to blocks that were already verified with simulation. The status of most of the blocks was “verification completed.” Why? Because we wanted to measure the additional quality achieved by formal verification in comparison with the traditional simulation approach. And the results were a real surprise.
*OneSpin 360 MV is an integrated family of formal assertion-based verification (ABV) tools for ASIC, SoC and FPGA designs.
Thank you for your inquiry. Some remarks related to your questions:
The two-team approach, of course, was only possible because we had some funding from the German government for this project.
Yes, we also tried to re-simulate bugs that had been found with formal verification.
The typical procedure, when a bug was found in the formal domain, was to contact the block designer. The block designer took the stimuli of the counterexample, tried to reproduce the bug in simulation to understand what was happening, and corrected it. Then we redid the formal verification to check that the bug had really disappeared and that no new bug had been introduced by the fix.
We tried to run the SVA assertions in simulation as well, but they did not run out of the box. The reason is that formal and simulation tools currently support different SVA subsets and interpret some SVA constructs differently (see article). So it was quite a hard job to migrate formal properties to simulation. Moreover, a migration in that direction seems to me to be of an academic nature: given the formal approach's claim to be exhaustive, simulation can only detect bugs that have already been found by formal verification.
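One well-known example of the kind of subset difference meant here is strong (liveness) properties. The signal names (`req`, `ack`, `clk`) and the bound `MAX_LAT` below are purely illustrative, not from the actual design:

```systemverilog
// Strong form: a formal tool can prove that every request is
// eventually acknowledged. A simulator cannot establish this; at
// best it reports the property as unfinished if the test ends
// before ack arrives.
assert property (@(posedge clk) req |-> s_eventually ack);

// For simulation, such a property is typically rewritten with an
// explicit bound, which changes what is actually being checked.
assert property (@(posedge clk) req |-> ##[1:MAX_LAT] ack);
```

Differences like this mean that a property set tuned for a formal tool often needs rework before it runs meaningfully in a simulator.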
Nice to see two teams running in parallel to really assess the benefits of dynamic and static verification (case studies usually apply one verification method first and then the other, and conclude only from the bugs missed by a method; that is not a real comparison!).
You ran formal verification on a design that had already been verified and debugged via simulation-based verification. But did you try the simulation-based verification environment on a design that had only been formally verified?
By the way, did you run the SVA assertions dynamically as well? Were they able to catch the bugs without running static formal analysis?