Not to be confused with MTV, the Music Television channel, the Microprocessor Test and Verification (MTV 2010) workshop is an annual gathering of leading experts in test and verification held in December in Austin, Texas. Last month’s workshop was no exception, with approximately 70 attendees from industry and academia, including strong representation from companies such as AMD, Apple, ARM, Cadence, Freescale, IBM, Intel, Mentor Graphics, Oracle and Synopsys, among others.
Debug was one of the hot themes of MTV this year, and it was addressed in two key sessions. First, an invited paper session introduced the latest research in automated debug. Papers presented by groups from the University of Tokyo, the University of Toronto and Bremen University appealed to an electronic design automation (EDA) industry intent on automating away the nagging debug problem.
The second was a highly anticipated industrial panel session entitled “Verification is a problem, but is debug the root cause?” The panel united industry heavyweights from all major vendors, including Alex Wakefield, principal engineer at Synopsys; Bindesh Patel, technology manager at SpringSoft; Harry Foster, chief verification scientist at Mentor Graphics; and Michael Stellfox, distinguished engineer at Cadence.
As the Chief Technology Officer of Vennsa Technologies, a company focused on debug efficiency, I was honored to moderate the panel. Panelists were carefully selected not only to represent EDA companies, but also to provide a broad view on the state of debug across our industry.
The first question posed was: “Is debug the biggest problem in verification today?” Anticipating controversy and differing points of view, the audience braced itself for a back-and-forth debate among the panelists, only to find that the panel’s verdict was unanimous. In their view, debug is the largest time and resource drain in the verification process today.
The panel presented slide after slide of data from independent surveys showing that debug consumes approximately 30 to 35 percent of the entire design process. In other words, engineers spend one third of their time understanding why certain failures occur, finding the root cause and rectifying the problems.
This represents a paradigm shift from how we currently view verification, where traditionally, the bottleneck is thought to be discovering bugs, not fixing them. With the advent of new verification tools and methodologies, it seems that on a day-to-day basis, more failures are discovered than the engineers can cope with. As one panelist opined, the entire hardware design process has become “debug limited.”
Reasons for the increase in debug effort stem from the growing design size, escalating testbench complexity, challenges presented by new languages and methodologies, as well as more bugs being found at a faster rate.
When the panelists were asked, “Which is the most challenging today: RTL design debug, post-silicon debug or testbench debug?”, answers varied based on their experience. While some said that all three are hard in their own right, others noted that testbench debug has become significantly harder in recent years.
Another interesting discussion thread was whether the debug and verification effort could be reduced “if the bugs were not inserted in the first place,” suggesting more usage of high-level synthesis and verified design intellectual property (IP).
The panel offered two points of view. On one hand, these approaches do not eliminate the debug task but push it to a higher level of abstraction or to the component interfaces. On the other hand, studies have indeed shown that both techniques reduce the number of bugs found in designs.