The tremendous growth of the semiconductor industry over the past 40 years is attributed in part to advances in the EDA industry that caters to chip design companies. Although most design steps have been automated, one significant aspect that remains primitive is register transfer level (RTL) debugging. With verification complexity expected to increase by as much as 675 percent by 2015, according to Dr. Aart de Geus of Synopsys, the manual debugging effort will grow with it.
Design debugging will continue to add tremendous cost and risk to an electronics design industry facing shrinking time-to-market deadlines. Recent articles and surveys report that RTL debugging consumes more than 50 percent of the verification effort, due in large measure to the increasing size and complexity of designs. Another factor is that engineers may not readily possess all the knowledge needed to debug problems quickly. For instance, some develop expertise in specific blocks, while others become familiar with a broader but less detailed view of the design. Furthermore, engineers must often work with unfamiliar code or with third-party intellectual property (IP), a challenging task indeed.
Today, verification has become the major bottleneck in design closure, a task that is not complete until the bugs are removed and the design is correct. Inevitably, the tedious debugging step has become a core part of this process.
Significant technology advances in verification over the past decade have targeted the discovery of bugs. For example, constrained random stimulus generation and intelligent testbenches have become more efficient with verification methodologies such as the Universal Verification Methodology (UVM) and the use of SystemVerilog. A wide offering of linting, clock domain crossing (CDC), property checking and other advanced verification tools has also improved the efficiency of discovering bugs. And yet, once the existence of a bug is confirmed, the verification engineer has little automation at his or her disposal to help localize the root cause of failures.
Most existing debugging methodologies are based on manually driven, primitive GUI solutions that package together waveform viewers, source code navigation tools, break-point insertion and basic what-if procedures. All of these require expert knowledge of the design to drive the tools manually, a practice that demands disproportionate time and effort from the user.
In essence, until recently, the state of affairs in debugging resembled that of logic synthesis in the 1970s and early 1980s. In those days, design was performed by hand in GUI editors, the engineer placing gate after gate until the specification was painstakingly implemented. Even then, it was evident that automation was the only viable avenue to sustain the pace of productivity and control costs.
As history has shown, automation is the only exit from today's vast debugging labyrinth, freeing engineers to spend time doing what they do best. And that is certainly not the repetitive click of a mouse button.
With more than 20 years of research behind them, debugging solutions that automatically localize errors and present them to the engineer have become a reality. Fewer clicks, more automation and better use of engineering resources. It is about time!
Albert Einstein said, "Intellectuals solve problems; geniuses prevent them." It has been my experience that many of the defects found in chips would not have been there if some common-sense design practices had been employed. Automation is not the only way out of the verification debug problem; the problem can be reduced by preventing defects from entering the design in the first place. For example, do not mix interrupt bits and read/write bits in the same register; that will avoid testbench and system integration defects. Do not remove unused, low-overhead portions of a block: leave them alone and you will not break the IP, not break the testbench, and not break system software. And there are many more practices that I've collected and employed.
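The register-mixing hazard mentioned above can be sketched in a toy Python model (all register names, bit positions and semantics here are invented for illustration): when a write-1-to-clear interrupt flag shares a register with ordinary read/write configuration bits, a routine read-modify-write of the configuration can silently clear a pending interrupt.

```python
# Hypothetical sketch: a register that mixes a write-1-to-clear interrupt
# flag (bit 7) with read/write config bits (bits 0-3) -- the practice the
# comment warns against. Names and bit layout are invented for illustration.

IRQ_PENDING = 1 << 7   # write-1-to-clear interrupt flag
CFG_MASK    = 0x0F     # ordinary read/write configuration bits

class MixedReg:
    """One register holding both kinds of bits."""
    def __init__(self):
        self.value = 0

    def hw_raise_irq(self):
        # Hardware sets the pending flag on an interrupt event.
        self.value |= IRQ_PENDING

    def write(self, data):
        # Writing a 1 to the interrupt bit clears it (write-1-to-clear);
        # the config bits take the written value directly.
        if data & IRQ_PENDING:
            self.value &= ~IRQ_PENDING
        self.value = (self.value & ~CFG_MASK) | (data & CFG_MASK)

    def read(self):
        return self.value

reg = MixedReg()
reg.hw_raise_irq()              # hardware flags an interrupt
reg.write(reg.read() | 0x3)     # driver read-modify-writes two config bits
# The read returned IRQ_PENDING set, so writing that value back cleared
# the interrupt: the event is silently lost.
assert reg.read() & IRQ_PENDING == 0
assert reg.read() & CFG_MASK == 0x3
```

Keeping the interrupt flag in its own register makes the read-modify-write of the configuration harmless, which is the point of the design practice.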
To understand the issue better we can look at two aspects of this verification problem. First, as you mentioned, debugging requires intimate knowledge of the micro-architecture. That isn't an easy one to solve. There are tools that provide GUI or schematic views, but they have inherent limitations in ease of use and performance. This problem is getting more severe with IP reuse and legacy designs to deal with.
A more severe aspect is what happens if the checker in the testbench is wrong. That is, the testcase falsely passes because, say, the header of a packet was never checked. This is something most tools today cannot detect. As the saying goes: one writes a testbench to verify the RTL, but who verifies the testbench itself?
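The false-pass scenario described above can be illustrated with a minimal, hypothetical checker sketch in Python (the packet fields and checker functions are invented for illustration): a scoreboard that compares only payloads lets a corrupted header through, so the test reports success on a real bug.

```python
# Hypothetical sketch of a buggy testbench checker: it compares only the
# payload, so a design that corrupts the header still "passes".

def weak_check(expected, actual):
    # BUG in the checker itself: the header is never compared.
    return expected["payload"] == actual["payload"]

def strict_check(expected, actual):
    # A correct checker compares every field of the packet.
    return (expected["header"] == actual["header"]
            and expected["payload"] == actual["payload"])

expected = {"header": 0xA5, "payload": b"\x01\x02"}
actual   = {"header": 0x00, "payload": b"\x01\x02"}  # DUT corrupted the header

assert weak_check(expected, actual)        # false pass: bug goes unnoticed
assert not strict_check(expected, actual)  # the real failure is caught
```

The danger is that nothing in the simulation flags the weak checker; only reviewing or verifying the testbench itself reveals that the header was never compared.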
I wholeheartedly agree. A tremendous number of systems, solutions, whatever you want to call them, require the user to spend half or more of the time debugging the tool rather than the real problem at hand. Isolating the real problem and presenting it to the user will improve efficiency by double digits for sure. I hope more solution providers take note and build this sort of technology and practice into their products.