NEW ORLEANS The embedded design world is at an inflection point in the use of tools to complete embedded system-on-chip (SoC) designs, according to a Wednesday (June 12) panel at the 39th Design Automation Conference. Asked to consider whether tools should be united or kept separate in the design flow, the panel's designers, tool providers and academics voted "maybe."
"We know that every three or four years we cycle between configurable designs and application-specific standard products," said panel moderator Gary Smith, chief EDA analyst at Gartner Dataquest (San Jose, Calif.). "The best we can do is to apply the tools and methodologies that reflect the needs of the cycle we are currently in."
"The cycle we are entering seems to be one of reconfigurability," said John Fogelin, a fellow at Wind River Systems Inc., the market's leading embedded tool vendor. "What needs to happen in this economy is for the churn of new products to be slowed, and reconfigurability can do that. It does nobody any long-term good to throw away products as soon as they are designed, like in the cell phone industry."
Kurt Keutzer, EE professor at the University of California at Berkeley, concurred: "The beginnings of real hardware/software co-design are coming out of companies like Tensilica, and lest I be chastised by Gary Smith for bringing up the 'R' word, I do say that we are in the beginning of a 'revolutionary' change, where hardware and software development are done concurrently, albeit for a limited set of products."
Keutzer mentioned a few other startups "in which I do not necessarily have an interest" that are following Tensilica's steps, including Proceler, Catalytic Computer, and Ellipsis. Keutzer is a member of Tensilica's technical advisory board.
Grant Martin, a fellow at Cadence Berkeley Labs, said it doesn't matter whether tools are united in a design flow, since "everything is done in software. We need to get the industry thinking along the lines that both hardware design and software design are software-centric. We need to find the appropriate platform for designs and link, not modify, the appropriate abstractions to it."
"As a former hardware designer I can't agree with that," said Gartner's Smith. "You look inside a cell phone and there's a lot of analog and RF circuits that I would not call 'designed in software;' maybe firmware is the term we ought to use."
Terminology was the crux of the panel discussion for the most part. "Hardware/software co-design is not the right term," claimed respected EDA thinker Hugo de Man from IMEC (Leuven, Belgium), who was enticed to comment from the audience by moderator Smith. "We are designing systems that need to be programmable. By all means, do everything that doesn't need to change in hardware. At the end of the day, we just need good engineers to get the job done, whether in hardware or software," de Man said.
"There are four hardware engineers, 10 software engineers and one firmware engineer on a typical project," said panelist Rick Chapman of SuperH Inc. (Bristol, United Kingdom). "EDA vendors would just love to sell a tool to those software guys whether they need it or not, and the firmware guy is also susceptible to their enticements."
Chapman said that different communities of users and customers need to share needed information to keep adding value to a design. "Building a super tool is mission irrelevant," said Chapman. "Instead, models need to be derived from common views and verified at the source."
That struck a nerve with panelist Brian Bailey, a fellow at Mentor Graphics Consulting Group and the VSI Alliance's leading verification standards proponent. "We are addressing the wrong problem," Bailey said. "Design is less important than verification and we need to be thinking what changes we ought to make in the design process to make verification easier. Executable system specifications will become the testbenches."