At first blush, the 2009 EE Times Global Salary & Opinion Survey seems like yet another confirmation of the appalling reputation of formal verification. The math-focused methodology is at or near the bottom of the survey's lists of "most interesting" and "most promising" technologies. But as I see it, the survey data also contains more than a few seeds of good news for the few of us out there who care more about algorithms and proofs than applications and processors.
One benefit of my job as Mentor Graphics’ chief verification scientist is that I get to spend lots of time on the road talking to engineers. Still, unless I'm in the company of other verification-obsessed colleagues, I often feel like a party crasher. Invariably, when I first enthuse about mathematical proofs of algorithms, I notice people averting their eyes and shuffling their feet uncomfortably.
Yet, in my recent travels, I've also noticed that any initial reticence about the methodology quickly gives way to real interest in how to best apply it. This spring, I spent several weeks in Europe giving a series of seminars on assertion-based verification, one lynchpin of formal verification.
The 10-city tour included stops in Herzliya, Israel, and Oulu, Finland. The rooms were generally full and the discussions were animated. While I joke that the audiences were drawn to my glib wit and good looks, in fact, they came because engineers everywhere grok that three inexorable trends are pushing formal verification to the fore: the technology and tools have matured remarkably (see here); standards such as IEEE 1800 (see here) and IEEE 1850 (see here) now exist to express functional properties; and most importantly, the increasing use of SoCs and reusable IP presents no shortage of problems best addressed by formal proofs. I say more about each of these trends in an article I wrote for the DAC Knowledge Center in advance of the conference, held June 13-18 in Anaheim (see here).
Granted, I live and breathe verification methodologies in general, and I am particularly fond of formal verification. And the old saw does hold true: when you have a hammer, everything looks like a nail, or in my case, like an opportunity for model checking or mathematical reasoning. Still, on second glance at the survey, I see lots of reason for optimism.
Formal verification is second only to Linux on the list of technologies in which those surveyed expect to be involved in the near future, especially in China and India. That's no surprise when you consider the kinds of products being built by these two giants on the world stage. China and India dominate when it comes to designing and building consumer electronics, arguably the most dynamic tech sector. More than one third of Chinese engineers and one in five Indian engineers work on consumer electronics, according to the survey. Among engineers in North America, Europe, and Japan, just 7 percent, 8 percent, and 13 percent work in this sector, which includes high-end cameras, cell phones, tablets, and nearly every other IC-based device that generates media buzz, and more importantly, market growth. As just one example, Semico Research Corp. predicts the worldwide market for high-end cell phones will grow by 13 percent from 2009 to 2013 while the markets for desktop PCs and workstations will shrink.
I am one of those who are passionate about verification, and I'm nearly always disappointed to see that the exciting new verification tools work only with 100% digital ICs.
I’ve always worked with mixed-signal SOCs that have digital circuitry in the analog feedback loop. Placing such digital circuitry in a black box with the analog circuitry allows one to use digital verification tools, but it leaves holes in the verification plan. The highest-risk digital content lives in mixed-signal building blocks such as analog front-end sub-circuits whose digital controls are generated by signal processing the output of the A/D converter. For example:
* Automatic Gain Control
* Digital PLL (discrete incremental control over sampling frequency and phase)
* Adaptive Analog Filters
* Built-in Analog Self-Test
* Automatic Calibration and Offset Cancellation
A workaround is to create a "real" testbench to verify the mixed-signal SOC. In addition, create a "spoof" testbench which quarantines the mixed-signal content to make the SOC look like a 100% digital IC.
The real testbench includes assertions for verifying power-up sequencing, bias currents, reference voltages, and the validity of digital controls. It also tracks variables such as gain, frequency, phase, and other analog quantities versus time, so that assertions can be written against them.
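To make this concrete, here is a minimal SystemVerilog assertion sketch of the kind of checks I mean. The signal names (power_good, bias_en, vref_ok, agc_en, gain_code) and the timing windows are purely illustrative assumptions, not taken from a real design:

```systemverilog
// Hypothetical sketch -- signal names and timing windows are assumptions.

// Power-up sequencing: once power_good rises, the bias enable must
// assert within 16 clocks, and the reference voltage must be flagged
// good at some point after that.
property p_powerup_order;
  @(posedge clk) disable iff (rst)
    $rose(power_good) |-> ##[1:16] bias_en ##[1:$] vref_ok;
endproperty
assert property (p_powerup_order);

// Validity of a digital control: whenever the AGC loop is enabled,
// the gain control word must stay within its legal range.
property p_gain_in_range;
  @(posedge clk) disable iff (rst)
    agc_en |-> (gain_code inside {[0:63]});
endproperty
assert property (p_gain_in_range);
```

Checks like these cover the digital side of the loop; the analog quantities they gate (actual bias current, actual reference voltage) still need real-valued modeling in the real testbench.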
I haven't found a way to objectively measure and quantify the completeness of mixed-signal verification -- with or without assertions. I long for the day when verification tools are written to assist the mixed-signal verification engineer with the real SOC testbench.
I welcome further comments about mixed-signal verification. Has anyone found solutions to the problems I've described?
Contact me via my web page or LinkedIn profile if you have questions of your own or need help with RF, analog, and mixed-signal SOC verification.
R. Peruzzi Consulting, Inc.