DVCon 2007 was a success, and its sponsoring consortium, Accellera, will be very pleased with the results. The conference was celebrating its twentieth anniversary. Born in the days of the VHDL/Verilog war as the VHDL Users Group meeting, it has transformed itself into the premier design and verification conference in the industry. Yet, as with all conferences, some things went better than others. So here are the highlights and lowlights of the conference.
Registration was higher than ever, with over 700 attendees who registered for either the full conference or one-day attendance; the exhibit floor was sold out, and the program was interesting. This year the program committee, headed by Tom Fitzpatrick, added a new track to the program to provide academic researchers with a forum in which to highlight research efforts that may evolve into EDA tools for design and verification. The 32 papers presented dealt mostly with verification issues, an indication that the problem is yet to be solved. Over 250 registrants attended a tutorial on Wednesday, with the SystemC TLM tutorial completely sold out. All four leading EDA companies sponsored at least one of the events, and OSCI, once again, collocated its North American Users Group event with DVCon.
Thursday's keynote speech by Moshe Gavrielov, Executive Vice President and General Manager of Cadence's Verification Division, showed how a skilled executive can perform insightful marketing research using readily available information. Moshe used The Economist, USA Today, and a few other publications to gather data that justify the need for project management tools in EDA. Unfortunately, it seems that work like his is rare in EDA. The industry continues to grow by evolving established methods instead of innovating into new areas. Verification is an example. The major problem with verification is that it is rewarded negatively. All the metrics used so far are negative: the design is good when no more bugs are found, the designers did a poor job when many bugs are found, and the verification engineers did a poor job when they do not find bugs. Notice how nothing is positive. As a manager you cannot say when the design is free of bugs. You can only say that for a given amount of time no bugs were found. It is time to stop playing cops and robbers and really embrace the concept that quality is built in, not legislated in.
The lunch panel on Thursday, moderated by Richard Goering, addressed one of the most contentious topics in our industry: low power standards. What most impressed me was that no one from the audience asked any questions. Is it a case of "nobody cares"? I do not think so. Rather, I think the problem is confined to the few companies doing leading-edge designs, and those people have the full support of their EDA vendors, are often beta sites for new technology, and produce working silicon with methods as unique as each design requires. Standards are most useful when they address a wide range of users, and the EDA industry is finding out that the majority of designs are not done at leading-edge process technologies. Although publicly disagreeing, the technologists and managers involved in the development of the standard continue to privately assure every editor they can find that the matter will be resolved and that they will avoid the VHDL vs. Verilog debacle. Have you noticed that this latest controversy pits Cadence against the rest of the industry, just like in the days of VHDL vs. Verilog? And have you noticed that, in spite of the public protestations by Cadence that CPF is in the hands of Si2, there is no one from Si2 defending the format, and all the advocates of CPF on panels and in contributed papers are from Cadence?
The Troublemaker's Panel was the lowlight of the conference. Even as Chair of the conference, I did not succeed in managing this event, and so DVCon continued to show the less productive, more divisive, gossip-hungry side of our industry. We must do better, and the invited panelists must do better. This was a very poorly prepared panel. With the exception of Rajeev Madhavan, who did have his facts together, and Jue-Hsien Chern, pressed into action at the very last minute when circumstances derailed Joe Sawicki's plans to attend, the other panelists had no justification for being unprepared.
Maybe they, too, see this panel as irrelevant, intended only to entertain at the expense of content. In particular, it was not clear why John Cooley invited Bret Cline, aside from the fact that Bret was in charge of the usual poor-taste joke one finds in Cooley's panels.
Ted Vucurevich showed an incredible misunderstanding of what a standard is and of the standardization process within the IEEE and the EDA industry in general. This is remarkable coming from the CTO of the leading EDA company, especially since he must have expected questions about Cadence's behavior with respect to openness and standards in EDA. John Chilton, Senior Vice President of Marketing and Strategic Development at Synopsys, had an off day as a public speaker. I think someone in his position should be able to string together two sentences without saying "you know" in one of them.
And finally, Gary Smith was really unprepared. He might be excused, since he did remark before the panel started that the quality of the questions submitted in advance was exceptionally poor, but I would have liked a correct answer from him when he was asked about the difference between the market study his new company plans to sell and EDAC's MSS report. Instead, we got an advertisement for his report. Gary meant to say that his company analyzes the data and edits it where necessary using their experience, but he used the term "scrub the data," which implies a willingness to edit and modify the data to fit one's perception of how things should be. Is this the root of the "ESL problem" in the last few years of the Dataquest report?
Actually, there are a few differences between the reports. The one easily explained during the panel is that the MSS is compiled almost exclusively from data provided by EDAC member companies, while Gary's report is compiled from data available to Gary and his team, including, I am willing to bet, the MSS data.
This year, as always, the panel dealt too much with back-end subjects at a conference dedicated to front-end issues. The result is that if it is not a comedy act, the panel fails. We did not get comedy, just gossip, unprepared panelists (how could, for example, Atul Sharan, CEO of Clearshape, not remember the Blaze DFM/Aprio deal?), and a bit of marketing and self-promotion.