It is well known that verification is a major drain on time and effort in today’s design process. For some time now, advocates of formal verification and property checking have promoted this technology as the timely solution to the ballooning verification pain. Indeed, the arguments are compelling: there is no need to develop a standalone testbench or to generate stimulus to cover corner cases, and once a property is proven, the result is exhaustive. All one needs to achieve these benefits is to write a set of properties and run a property checker to prove or disprove that the design complies with them. For these reasons, in the early to mid 2000s, EDA companies large and small dedicated much effort and money to developing and capturing a segment of this new market.
Over the years that followed, as many companies and engineers learned through firsthand experience, there are major obstacles to overcome before the formal verification argument becomes a reality in practice. In my view, there are two main challenges: (1) writing assertions is complicated, and (2) debugging property failures can be significantly more difficult than debugging failures in a human-developed testbench.
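To see why even a simple temporal assertion is easy to get wrong, consider the classic property "every request is granted within three cycles." The sketch below is a hypothetical Python monitor over a recorded trace, not SystemVerilog syntax; the signal names `req` and `gnt` and the trace format are invented for illustration.

```python
# Toy monitor for the temporal property "every request is granted
# within `window` cycles", evaluated over a recorded trace.

def check_req_gnt(trace, window=3):
    """Return the cycle of the first violation, or None if the
    property holds. `trace` is a list of dicts, one per clock cycle."""
    pending = []  # cycles of outstanding, un-granted requests
    for cycle, signals in enumerate(trace):
        if signals["gnt"]:
            # Assume a grant satisfies the oldest outstanding request.
            if pending:
                pending.pop(0)
        if signals["req"]:
            pending.append(cycle)
        # Any request outstanding for `window` or more cycles fails.
        if pending and cycle - pending[0] >= window:
            return cycle
    return None

# A trace where the request raised at cycle 1 is never granted in time.
trace = [
    {"req": 1, "gnt": 0},
    {"req": 1, "gnt": 1},
    {"req": 0, "gnt": 0},
    {"req": 0, "gnt": 0},
    {"req": 0, "gnt": 0},
]
print(check_req_gnt(trace))  # → 4
```

Even this small monitor embeds debatable decisions: does a grant in the same cycle as a request satisfy that request or an older pending one? Can requests overlap at all? Each answer changes the property's meaning, which is exactly the kind of subtlety that makes assertion writing hard.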
As a pragmatic believer in formal verification and property checking, I am convinced that these challenges can be overcome to reap the benefits of improved efficiency. Thus, I set out to study the main challenges and devise a practical solution.
The challenge of writing assertions can be overcome gradually through a unified effort. The first step is to ensure that management is on board and willing to give formal verification a serious shot. For this, engineering and financial resources must be allocated, and key players must be given a mandate to improve verification efficiency. Because of the differences between “traditional” simulation-based verification and formal verification, the engineering staff must also be educated in both temporal assertions and the technology underlying formal verification. This requires training, attending seminars and workshops, and, most importantly, hands-on experience and time. In due time, in-house expertise will develop, more assertion champions will emerge, and sophisticated formal verification processes will mushroom across the company.
Unfortunately, the challenge of debugging assertions cannot be overcome by user sophistication alone. An expert user who knows how the EDA tools work may well debug formal verification problems faster than a novice. However, debugging different design blocks and properties developed by different engineers requires a different context of insight and knowledge. More specifically, to debug a formal verification failure, the engineer must be able to distinguish between bugs in the design, errors in the assertions, and problems with the constraints or assumptions. Only once the problem is narrowed down to one (or more) of these three components can the low-level root-cause analysis begin. In other words, what makes debugging failures hard is that it requires advanced knowledge of the tools and assertions while also requiring intimate knowledge of the designs and blocks being verified.
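The three-way triage above can be sketched as a short decision procedure. The Python below is a hedged illustration of the reasoning, not the interface of any real tool: the trace format, the `triage` function, and the one-bit-buffer example are all invented for this sketch, and it assumes the counterexample was produced because some assertion failed on the trace.

```python
# Toy triage for a counterexample trace produced by a property checker:
# decide whether the failure points at the constraints, the assertion,
# or the design itself.

def triage(trace, assumptions, reference_model):
    """Classify a failing counterexample into one of three buckets."""
    # 1. If the stimulus violates an environment assumption, the
    #    counterexample is spurious: the constraints are too loose.
    if not all(assume(trace) for assume in assumptions):
        return "constraint problem"
    # 2. If the design's behaviour on this trace matches a trusted
    #    reference, the design is fine: the assertion itself is wrong.
    if reference_model(trace):
        return "assertion error"
    # 3. Otherwise the trace exposes a genuine design bug.
    return "design bug"

# Example: a one-bit buffer whose output b should mirror its input a.
legal_input = lambda t: all(c["a"] in (0, 1) for c in t)
mirrors = lambda t: all(c["b"] == c["a"] for c in t)

print(triage([{"a": 1, "b": 0}], [legal_input], mirrors))  # design bug
print(triage([{"a": 2, "b": 0}], [legal_input], mirrors))  # constraint problem
print(triage([{"a": 1, "b": 1}], [legal_input], mirrors))  # assertion error
```

The hard part in practice is that the "trusted reference" in step 2 is usually the engineer's own understanding of the intended behavior, which is precisely the design knowledge the text says the debugger must possess.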
Through my research I devised a simple methodology that yields a significant reduction in debugging time, specifically for a property-checking verification flow. A key component of the proposed flow is OnPoint, an automated debugging tool developed by Vennsa Technologies. An automated debugger is a tool that diagnoses verification failures and determines their root cause automatically. Unlike traditional debugging tools based on waveform viewers, automated debuggers do not need any manual guidance. The automated debugger provides insight to the user at both a high and a low level. Figure 1 illustrates the different components of the methodology.
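To give a flavor of what "determining the root cause automatically" can mean, here is a deliberately tiny Python sketch of one classic idea: for each gate in a failing design, ask whether freely correcting that gate's output could repair every failing test; gates that can are reported as suspects. This is only an illustration of the principle and is not OnPoint's actual (industrial, SAT-based) algorithm; the two-gate circuit and all names are invented.

```python
# Toy suspect localization for a gate-level design with one failing test.
# Spec: out = (x AND y) XOR z.  The implementation below wrongly uses OR.

GATES = {
    # name: (function, input names), listed in evaluation order
    "n1":  (lambda a, b: a & b, ("x", "y")),
    "out": (lambda a, b: a | b, ("n1", "z")),  # bug: should be XOR
}

def simulate(inputs, override=None):
    """Evaluate the circuit; `override` forces named gate outputs."""
    values = dict(inputs)
    for name, (fn, ins) in GATES.items():
        v = fn(*(values[i] for i in ins))
        if override and name in override:
            v = override[name]  # inject a candidate "fix" at this gate
        values[name] = v
    return values["out"]

def suspects(failing_tests):
    """Gates whose output, if freely corrected, repair every test."""
    found = []
    for gate in GATES:
        if all(any(simulate(inp, {gate: v}) == expected for v in (0, 1))
               for inp, expected in failing_tests):
            found.append(gate)
    return found

tests = [({"x": 1, "y": 1, "z": 1}, 0)]  # OR yields 1, but spec says 0
print(suspects(tests))  # → ['out']
```

Note that the AND gate `n1` is cleared automatically: with `z = 1`, no value of `n1` can make the buggy OR output 0, so only the faulty gate remains a suspect. In general, adding more failing tests shrinks the suspect set, which is how such tools narrow the root cause without manual waveform tracing.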
Writing assertions is not a simple task if we really want them to achieve functional coverage. It takes a lot of effort and design insight, which accounts for a major chunk of engineering time in verification. That being the case, the problem this tool addresses comes at a later stage, when we already have robust assertions and a clean flow using them; otherwise we end up spending a huge part of our time just fixing mistakes in our own property descriptions. And we have not even accounted for cases that are prone to races. The problem with deploying assertions widely is that writing one effectively requires good design and tool knowledge, and when that happens you need an even more knowledgeable person to debug any failures. So this tool is trying to bridge that gap, I suppose; should be interesting.
Great article! Very interesting to read about your methodology for studying and comparing the new approach against the former one. The reduction in test time from 36 to 21 hours is a great benefit. This is a field where it will always be necessary to improve and reduce turnaround times, as time to market can sometimes be crucial.