By Michael Santarini, EE Times
Like jumbo shrimp and military intelligence, the term "EDA quality" is an oxymoron for most working
engineers. Electronic design automation companies are forever trying to catch up
with Moore's Law by writing quality code to address the complex nuances of chip
design. But for many tool makers today, the quality summit remains as elusive a
goal as Mount Everest's peak once was for climbers.
Yet EDA vendors believe that tool quality is steadily improving despite some
inherent challenges that make these tools harder to develop than software for
most other industries. The big three in design automation-Synopsys, Cadence and
Mentor-have quality assurance programs in place to swat the bugs before their
customers get an itch, and smaller operations have their eye on quality as well
(see story, below).
A Carnegie-Mellon Software Engineering Institute study published over a year
ago found EDA software wanting compared with software written for most other
industries. On a scale from one to 10 in terms of quality, EDA's best half dozen
tools scored only a five and most products came in at three. In reviewing the
study, Giora Ben-Yaacov, Synopsys Inc.'s quality architect, said that no EDA
tools ranked at six or above ("excellent"), let alone at nine ("world class").
Ben-Yaacov estimated that just 20 percent of tools scored a "good" rating,
while the remaining 80 percent were in the "poor" to "needs improvement"
categories. In all other software industries, 50 percent ranked "poor" to "needs
improvement," 35 percent ranked as "good," 10 percent were "excellent" and 5
percent were deemed "world class." The highest quality was found in NASA space
shuttle software, where quality is literally mission-critical. Synopsys'
analysis was reported at the International Symposium on Quality Electronic
Design, held earlier this year.
The estimates are not vigorously disputed by Ben-Yaacov's quality assurance
peers at other EDA companies and certainly not by tool users, whose struggle
with tool quality is documented daily at online user groups like
www.deepchip.com on the EE Times Network.
"One of the advantages of other software companies, like a Microsoft or an
Intuit, is that they have the advantage-and pressure-of selling literally
millions of units. Those numbers and the kind of people that are using that
software force them over time to drive up the quality of their software
products," said Mark Noneman, worldwide vice president of quality at Cadence
Design Systems Inc.
EDA companies, said Noneman, sell far fewer units and in many cases their
tools solve much more difficult problems than commercial software. Moreover, the
industry is constantly chasing a state of the art that is ever-evolving.
The quality of a given tool largely scales with the difficulty of the problem
it is trying to address, said Mike Stabenfeldt, a former Cadence and Avanti
architect who now runs his own shop, Stabie-Soft Inc. "Quality also depends on
whether you are developing a tool to address a new, hard problem or are
addressing a problem that is well-understood," said Stabenfeldt. "Obviously, you
are more likely to make mistakes if you are moving into new territory."
The back story
Gary Smith, chief EDA analyst at market research firm Gartner Dataquest,
sheds some light on the industry's quality problem. Smith said there are only
roughly 5,000 R&D engineers in the entire commercial EDA industry, and they
are tasked not only with building new tools but also with maintaining or
updating multiple versions of existing ones as silicon problems arise. And with
each revision comes the likelihood that new functionality will add bugs that
affect tool quality.
Most launches every year, however, are not fresh technology but improvements
of known tools-a better simulator, fancier static timing analyzer, faster place
and route product. Indeed, the tools that tend to have the most quality problems
are either brand new or more than eight years old, experts said.
EDA tools, especially those addressing new issues that pop up with
bleeding-edge processes, have commonly been released in rough shape. The price
of the new functionality is bugs and sloppy graphical user interfaces.
Nevertheless, power users have been more than happy to beta test a new tool, or
the latest version of an existing one, if it provided some edge in getting
designs done faster or solving a problem not otherwise addressed.
But the free ride may be over. "Customers are less tolerant of poor quality
today in new releases than they were in the past," said Steve Aho, the
engineering council chair at Mentor Graphics Corp.
Dataquest's Smith agreed that EDA vendors now must wring the bugs out of
their tools by the time they reach the second year in production. That's because
there is usually a two-year lag between a tool's initial release and its uptake
by mainstream designers, he said, and mainstream users have minimal CAD
organizations to help with workarounds and tool interoperability.
If new tools are difficult to debug, products that are eight years old or
more are difficult to maintain, EDA vendors said. Synopsys' Ben-Yaacov said
older EDA technologies were originally developed in the C language, while newer
tools are created in C++ or Java. Finding C programmers to add new functionality
to old tools is difficult and the process often introduces new errors. As a
result, he said, maintaining and updating old standbys is more costly than
maintaining newer tools. This is one reason, but not the only one, why EDA
companies phase out "end-of-life" products in favor of newer tools.
Mentor's Aho said that certain techniques can help in updating older
technologies. Most of them "center around maintaining architectural integrity of
the product and must be applied on an ongoing basis," said Aho, noting that
Mentor, the oldest big company in the EDA space, has successfully updated its
Falcon line of tools, which came out in the early 1990s.
It takes teams
The EDA community understands that to raise the quality level, vendors must
equip tool architecture teams with the right mix of design engineers
knowledgeable about hardware design flows, software developers and algorithm
geniuses. They must get these groups to work together cohesively for a timely
introduction of tools to address silicon problems that are just beginning to be fully understood.
On top of that, vendors need to ensure that quality is at an acceptable level
at the time of a tool's release and that it steadily improves during the life of
the product as more functionality is brought in to keep pace with silicon
problems. "When most people think about quality, they think about the reliability
of the software, that is, bugs, and the ease of use," said Cadence's Noneman.
"But there are other things to consider." Noneman, whose 10-person group
oversees quality assurance for R&D developments, said Cadence uses a
software development model called Furps that had its genesis at Hewlett-Packard
Co. Furps, he said, stands for functionality, usability, reliability,
performance and supportability.
"Functionality: does it perform the function required? Usability: ease of use,
and does it fulfill the need in terms of interfacing with other products or
data?" he said. "Reliability: is it free of bugs? Performance involves speed
and, more importantly in EDA, capacity. And supportability: Can it be installed
and configured; does it have the programmability, like scripting languages, needed for automation?"
Noneman's group at Cadence relies on Furps to set standards and criteria for
product releases, as well as to help product groups develop better ways of working.
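The article doesn't describe how Cadence actually encodes Furps criteria, but the idea of gating a release on per-category minimums can be sketched in a few lines. The category names come from the article; the thresholds, function name and scoring scheme below are purely illustrative assumptions.

```python
# Hypothetical release gate inspired by the Furps model (functionality,
# usability, reliability, performance, supportability). The five categories
# are from the article; the thresholds are invented for illustration.

FURPS_CRITERIA = {
    "functionality": 0.90,   # fraction of required features passing tests
    "usability": 0.80,       # e.g. task-completion rate in usability studies
    "reliability": 0.95,     # e.g. fraction of regression tests passing
    "performance": 0.85,     # e.g. runtime and capacity targets met
    "supportability": 0.80,  # e.g. install, config and scripting hooks covered
}

def release_ready(scores: dict) -> tuple:
    """Return (ok, failures): ok is True only when every Furps
    category meets its minimum threshold; missing scores count as 0."""
    failures = [name for name, minimum in FURPS_CRITERIA.items()
                if scores.get(name, 0.0) < minimum]
    return (not failures, failures)

ok, failing = release_ready({"functionality": 0.92, "usability": 0.81,
                             "reliability": 0.97, "performance": 0.90,
                             "supportability": 0.75})
# supportability misses its 0.80 floor, so ok is False and
# failing == ["supportability"]: the release would be held.
```

The point of such a gate is that a strong score in one category (say, raw performance) cannot buy a pass for a weak one (say, supportability), which matches the spirit of Noneman's insistence that quality is more than bug counts.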
At Synopsys, Ben-Yaacov's group is roughly the same size and has a similar
function. He said that one key for developing quality products is ensuring that
R&D teams are well-rounded. These teams, said Ben-Yaacov, typically comprise
software engineers who know the latest software development techniques, design
engineers who bring practicality and usability into the mix, and algorithm specialists.
"These PhD algorithm specialists often are the best in the world at solving a
very specific emerging hardware issue," he said. "Many have written their
dissertations on the algorithms they are developing within EDA companies. But
typically, they are not good at creating production software, nor do they
completely understand how the algorithm they are developing would be applied in
the field by engineers. It is important to have all three types in your
engineering development team-they all complement each other."
'Good business sense'
Aho said each development team at Mentor has quality assurance members to
ensure that products meet strict standards for customer release. Development
teams meet regularly as a software quality assurance forum, where attendees
share information, best practices, techniques and questions, he said. This way,
said Aho, development engineers get to leverage the efforts of their peers.
"Improving product quality is just good business sense," Aho said. "We have
put strict quality standards in place that govern when products, or even product
updates, can be shipped. All products at Mentor are required to meet this
corporate-defined quality standard prior to release." In addition to product-,
design flow- and release-specific quality goals, he said, the quality standard
also includes criteria for the software development process, and legal and regulatory requirements.
Meanwhile, Dataquest's Smith pointed to the one-year subscription and the
three-year time-based tool licenses as another factor affecting quality. Terms
shorter than the perpetual license once standard in EDA make it less likely that
a customer will stay with a vendor that is not producing quality tools,
releasing timely bug fixes or honoring the promises of future functionality
made by the vendor's sales staff during contract negotiations.
Smith called the switch to a subscription model "a godsend" because it has
put pressure on EDA vendors to release tools of better quality sooner.
But the quality gurus at EDA's big-three companies insisted that the new
licensing models do not pressure the R&D end and thus do not affect quality.
Smith countered that while a tool's initial release may not feel the impact, the
shorter licenses in fact prompt vendors to raise quality to the point where
mainstream designers can use new products without too many headaches.
A half century after Edmund Hillary proved the impossible could be done,
scaling Everest has become commonplace. Perhaps EDA vendors, too, will one day
bypass the barriers to quality tools. Perhaps it won't take 50 years.