Introduction

This article is about our experience in applying formal verification techniques to an ASIC design in a large communication system.
When we, in the Alcatel-Lucent IC design group in Nürnberg, Germany, first encountered formal verification methods three years ago, our goal was to assess whether formal analysis has the potential to become an effective part of our standard functional verification flow. During our assessment, we had a lot of surprises. We found bugs in both legacy code and recent RTL code, and we learned a lot about specification gaps. Later on, we tried to leverage the complementary strengths of simulation-based verification and formal analysis, but discovered that this is more difficult than we expected. In this article, we share our experiences and best practices.
Formal verification techniques have been in industrial use for more than a decade now, but were considered "emerging" methods for most of that time. They were viewed as an approach requiring high levels of expertise, with limited applicability to industrial designs. This was the prevailing view in many IC design departments, and also in the design team at Alcatel-Lucent in Nürnberg, until 2007. This view changed significantly with our participation in the HERKULES project, supported by the German government.
Over the past three years, this project teamed chip-design engineers at leading system providers and research institutes with developers of commercial formal verification tools. Our goal was to develop a right-first-time formal verification approach for digital and mixed-signal designs – and to ensure that it is widely applicable to the development of automotive and telecommunication IP, which must comply with very high quality standards. Alcatel-Lucent participated as an application partner, using newly developed techniques and methodologies in normal design work, and cooperating very closely with the tool developers in order to evaluate and improve usability and concepts.
This article is structured as follows. First, we give a brief description of the ASIC's functionality. In the next section, we talk about the design aspects that we tackled with formal verification. Then, we compare simulation and formal verification with respect to effort, capabilities, and required skills, based on our experience from the application project. Finally, we describe how simulation-based and formal verification methods can be brought together in a combined verification flow using SystemVerilog Assertions (SVA).
Functionality of the XCON ASIC

The device to be verified was a telecommunication ASIC for optical transport network (OTN) cross-connection in the range 160 Gbps to 1360 Gbps. Figure 1 illustrates the layout of the device.
Figure 1: The layout of the XCON
Input data to be cross-connected are logically organized in frames of a specific length. This organization is reflected in the circuitry by data-stream processing units, each of which receives a frame counter value indicating the current position within the stream.
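As a sketch of this scheme (the module name, signal names, and the frame length of 16 words are ours for illustration, not taken from the actual XCON design), a per-stream processing unit might use the shared frame counter like this:

```systemverilog
// Hypothetical sketch -- names and FRAME_LEN are illustrative only,
// not taken from the actual XCON design.
module stream_unit #(parameter int FRAME_LEN = 16) (
  input  logic                         clk,
  input  logic                         rst_n,
  input  logic [$clog2(FRAME_LEN)-1:0] frame_cnt,  // position within the frame
  input  logic [7:0]                   data_in,
  output logic [7:0]                   data_out
);
  // Example of position-dependent processing: pass the frame header
  // (position 0) through unchanged, transform the payload otherwise.
  always_ff @(posedge clk or negedge rst_n) begin
    if (!rst_n)              data_out <= '0;
    else if (frame_cnt == 0) data_out <= data_in;         // header word
    else                     data_out <= data_in ^ 8'hFF; // payload (placeholder)
  end

  // SVA check: the frame counter seen by this unit must wrap
  // back to 0 after the last word of a frame.
  assert property (@(posedge clk) disable iff (!rst_n)
    frame_cnt == FRAME_LEN-1 |=> frame_cnt == 0);
endmodule
```

The embedded assertion is the kind of local, structural property that lends itself well to formal analysis, since it constrains one counter interface rather than end-to-end behavior.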
Reader comments

Nice to see two teams running in parallel to really assess the benefits of dynamic and static verification. (Case studies usually apply one verification method first and then the other, and draw conclusions only from the bugs missed by each method, which is not a real comparison!)

You ran formal verification on the design that had only been verified and debugged via simulation-based verification. But did you try the simulation-based verification environment on a design that had only been formally verified? By the way, did you run the SVA assertions dynamically as well? Were they able to catch the bugs without running static formal analysis?

Author's reply

Thank you for your inquiry. Some remarks related to your questions:

The two-teams approach, of course, was only possible because we had some funding from the German government for this.

Yes, we also tried to re-simulate bugs that had been found with formal verification. The typical procedure when a bug was found in the formal domain was to contact the block designer. The block designer took the stimuli of the counterexample, tried to reproduce the bug in simulation to understand what was happening, and corrected it. Then we redid the formal verification to check that the bug had really disappeared and that no new bug had been introduced by the fix.

We also tried to run the SVA assertions in simulation, but they did not run out of the box. The reason is that formal and simulation tools currently support different SVA subsets and interpret some SVA constructs differently (see the article). So it was quite a hard job to migrate formal properties to simulation. Moreover, a migration in that direction seems to me to be of an academic nature: given the formal approach's claim to be exhaustive, simulation can only rediscover bugs that formal analysis has already found.
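As an illustration of the kind of formal/simulation SVA portability gap discussed above (the property below is a generic example, not one of the actual XCON properties), consider an unbounded liveness property:

```systemverilog
// Generic illustration of a formal/simulation portability gap,
// not an actual XCON property.
// Liveness: every request is eventually acknowledged.
property req_gets_ack;
  @(posedge clk) disable iff (!rst_n)
  req |-> s_eventually ack;
endproperty

assert property (req_gets_ack);
```

A formal tool can prove such a property exhaustively, whereas a simulator can at best report that the obligation was still pending when the run ended, and at the time some simulators did not accept `s_eventually` at all. To port such a property to simulation, one typically has to weaken it to a bounded form, e.g. `req |-> ##[1:MAX_LATENCY] ack` for some assumed latency bound `MAX_LATENCY`.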