Human beings progress through a predictable process of maturity with well-understood characteristics for each age. It’s so predictable, in fact, that stereotypes abound to describe those stages. Doesn’t it seem like at least half of the new television shows and movies exploit those stereotypes?
What is striking to me is that all technologies seem to follow the same pattern. Luckily, technologies race through those stages much more quickly than human beings. Normally, it does not take 20 or 30 years for technologies to reach maturity!
Plenty of data shows that the lifecycle of technologies is accelerating. Witness the successively shorter adoption cycles of the iPod, iPhone and iPad. An additional contrast between humans and technologies is the high “childhood” mortality rate of the latter. A healthy elimination process weeds out weaker technologies so that investment can focus on promising new concepts.
One example of this evolution process that I have witnessed during my career is the progression of functional design verification. This technology is anchored in functional simulation of designs described at an appropriate abstraction level, typically the register transfer level (RTL). It started with directed tests: sets of predefined stimuli designed to exercise specific functionality in the design.
It quickly became obvious to engineers that this approach did not scale well for the verification of complex designs because it relies on writing targeted vector sets and analyzing detailed simulation waveforms, a time-consuming and error-prone sequence of events.
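To make that limitation concrete, here is a minimal sketch in plain C of what a directed test boils down to: hand-picked stimulus paired with expected responses. The “design” is only a behavioral stand-in (in practice it would be RTL running in a simulator), and every name here is illustrative rather than taken from any particular flow.

```c
/* Hedged sketch of a directed test. The design under test is a behavioral
 * stand-in for a saturating 8-bit adder; the test applies a few hand-picked
 * stimuli with known expected results. Names are illustrative only. */
#include <stdio.h>
#include <stdint.h>

/* Stand-in for the design under test. */
static uint8_t sat_add8(uint8_t a, uint8_t b) {
    unsigned sum = (unsigned)a + (unsigned)b;
    return (sum > 0xFF) ? 0xFF : (uint8_t)sum;
}

int main(void) {
    /* Hand-written stimulus/response pairs targeting specific behavior. */
    struct { uint8_t a, b, expected; } vectors[] = {
        { 0x00, 0x00, 0x00 },   /* zero case               */
        { 0x10, 0x20, 0x30 },   /* typical case            */
        { 0xF0, 0x20, 0xFF },   /* overflow must saturate  */
    };
    const int num_vectors = (int)(sizeof vectors / sizeof vectors[0]);
    int errors = 0;
    for (int i = 0; i < num_vectors; i++) {
        uint8_t got = sat_add8(vectors[i].a, vectors[i].b);
        if (got != vectors[i].expected) {
            printf("FAIL: %02x + %02x -> %02x (expected %02x)\n",
                   vectors[i].a, vectors[i].b, got, vectors[i].expected);
            errors++;
        }
    }
    printf(errors ? "Directed test FAILED\n" : "Directed test PASSED\n");
    return errors;
}
```

Every additional behavior to be checked means another hand-written vector, which is exactly why the approach does not scale.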
Directed testing begat random testing, which addresses those key shortcomings. Random testing automates the generation of tests, thereby creating many more tests and exercising corner cases that an engineer did not envision in his or her directed tests. The analogy here is that the engineer builds the factory that creates the tests. Of course, purely random tests are not meaningful, so in the next evolution of the technology constraints were applied to the generators, and the approach became known as constrained-random testing.
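The “factory” metaphor can be sketched in a few lines of plain C, even though SystemVerilog is the usual vehicle in practice. The transaction fields, constraints and biases below are assumptions invented for the illustration; the point is simply that legal, biased randomness replaces hand-written vectors.

```c
/* Hedged sketch of the constrained-random idea: a generator ("the factory")
 * whose random choices are shaped by constraints. All names and constraints
 * are illustrative only. */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

typedef struct {
    uint32_t addr;   /* must be word-aligned and inside a legal window */
    uint32_t len;    /* burst length, constrained to 1..16             */
    int      write;  /* biased: roughly 70% writes, 30% reads          */
} bus_txn;

/* Generate one transaction; rejection sampling enforces the constraints. */
static bus_txn gen_txn(void) {
    bus_txn t;
    do {
        t.addr = (uint32_t)rand() & 0xFFFF;
    } while ((t.addr & 0x3) != 0 || t.addr >= 0x8000);  /* aligned, in range */
    t.len   = 1 + (uint32_t)rand() % 16;
    t.write = (rand() % 10) < 7;
    return t;
}

int main(void) {
    srand(42);  /* a fixed seed keeps the "random" test reproducible */
    for (int i = 0; i < 5; i++) {
        bus_txn t = gen_txn();
        printf("%s addr=0x%04x len=%u\n",
               t.write ? "WR" : "RD", (unsigned)t.addr, (unsigned)t.len);
    }
    return 0;
}
```

Change the seed, or let it vary per run, and the same small generator produces an endless stream of distinct but always-legal stimuli.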
Next came the realization that it was difficult to measure progress in verification: How does an engineer know when he or she has achieved the expected level of testing? From this emerged verification metrics, mostly assembled under the umbrella of coverage, to track the completeness of the verification suite. Of course, once an engineer starts measuring something, the next challenge is to maximize the result. This yielded coverage closure tools and methodologies, a segment that is evolving rapidly. Along the way, verification engineers also decided that reuse would be a great idea, and verification intellectual property (VIP) came to life.
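As a rough illustration of the coverage idea, the following C sketch counts how often each category of stimulus (each “bin”) has been observed during a run; coverage closure then means driving stimulus until every bin has been hit. The bins and the observed values are assumptions made up for the example, not drawn from any real verification plan.

```c
/* Hedged sketch of functional coverage: count hits per stimulus category
 * ("bin") and report what fraction of bins has been exercised. */
#include <stdio.h>
#include <stdint.h>

enum { BIN_SHORT, BIN_MEDIUM, BIN_LONG, NUM_BINS };
static unsigned hits[NUM_BINS];

/* Sample one observed burst length into its coverage bin. */
static void sample_len(uint32_t len) {
    if (len <= 2)       hits[BIN_SHORT]++;
    else if (len <= 8)  hits[BIN_MEDIUM]++;
    else                hits[BIN_LONG]++;
}

/* Percentage of bins hit at least once. */
static double coverage(void) {
    int covered = 0;
    for (int i = 0; i < NUM_BINS; i++)
        if (hits[i] > 0) covered++;
    return 100.0 * covered / NUM_BINS;
}

int main(void) {
    uint32_t observed[] = { 1, 4, 4, 7 };   /* lengths seen during a run */
    const int num_observed = (int)(sizeof observed / sizeof observed[0]);
    for (int i = 0; i < num_observed; i++)
        sample_len(observed[i]);
    printf("coverage = %.1f%%\n", coverage()); /* 66.7%: BIN_LONG never hit */
    return 0;
}
```

The gap the report exposes (no long burst was ever generated) is precisely what coverage closure tools and methodologies set out to eliminate.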
All of those technologies are essential parts of the Universal Verification Methodology (UVM), widely considered to be the state of the art for the development of IP- and subsystem-level verification environments.
This brief historical survey ignores the emergence of other important functional verification technologies, such as formal techniques and emulation, which today complement simulation in most advanced verification environments. Still, it illustrates the point that technologies have a lifecycle with maturity phases similar to those living organisms go through.
And what about System-on-Chip (SoC) verification? It is important to define the term SoC to start this discussion. A key characteristic of SoCs is that they are architected around one or more embedded processors, surrounded by on-chip memory, additional functional units, and standard interfaces, all connected together by an on-chip bus or fabric.
What differentiates verification at the SoC level from the IP level is the complexity but, most importantly, the presence of processors. UVM-based methodologies are not well suited to the verification of processor-based systems; they usually replace the processors with a Bus Functional Model (BFM).
The result is that full-chip SoC verification is often limited to checking the connectivity of the many blocks assembled and verifying the basic operation of the major blocks. I equate this cursory, targeted level of testing to early-stage directed testing of functional verification described above. The point is that SoC testing will evolve rapidly over the coming months and years to reach the level of maturity embodied in UVM methodologies at the IP level.
And, just as each new member of the iPod/iPhone/iPad family was adopted more quickly than the previous one, I expect SoC verification to mature more quickly by benefiting from the lessons learned during the development of UVM.
We will soon witness the emergence of SoC verification solutions that include randomization for the automated generation of complex test cases that run on the SoC’s embedded processors. Coverage views that better represent system behavior and support a more sophisticated SoC verification plan are also coming. Another capability on the horizon is reuse, allowing test cases to be leveraged from the IP block level up to the full chip or full system, and from simulation through emulation.
One such solution comes from Breker Verification Systems and is the reason why I am excited to be associated with this EDA company. Its solution automatically generates self-verifying C tests that run on the SoC’s embedded processors with no operating system or other production software required. These test cases are multi-threaded so that they exercise many parts of the SoC in parallel to stress-test the design before tapeout.
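The general shape of such a test can be hinted at with a short, host-runnable C sketch. To be clear, this is only an illustration of the multi-threaded, self-verifying idea, not Breker’s actual output: pthreads stand in for test activity spread across embedded cores (a bare-metal test would use a lightweight scheduler or one thread of execution per core), the memory pattern is invented, and all names are assumptions.

```c
/* Hedged, host-runnable sketch of a multi-threaded, self-verifying test:
 * several threads each drive a different data pattern, check their own
 * results, and the test reports a single pass/fail at the end. */
#include <stdio.h>
#include <stdint.h>
#include <pthread.h>

#define N_THREADS 4
#define N_OPS     1000

static int ids[N_THREADS];
static int thread_errors[N_THREADS];

/* Each thread exercises its own data pattern and self-checks it. */
static void *worker(void *arg) {
    int id = *(int *)arg;
    volatile uint32_t scratch[N_OPS];            /* stand-in for a memory region */
    for (int i = 0; i < N_OPS; i++)
        scratch[i] = ((uint32_t)id * 0x01010101u) ^ (uint32_t)i;
    for (int i = 0; i < N_OPS; i++)              /* read back and verify */
        if (scratch[i] != (((uint32_t)id * 0x01010101u) ^ (uint32_t)i))
            thread_errors[id]++;
    return NULL;
}

int main(void) {
    pthread_t tid[N_THREADS];
    int total_errors = 0;
    for (int i = 0; i < N_THREADS; i++) {
        ids[i] = i;
        pthread_create(&tid[i], NULL, worker, &ids[i]);
    }
    for (int i = 0; i < N_THREADS; i++) {
        pthread_join(tid[i], NULL);
        total_errors += thread_errors[i];
    }
    if (total_errors)
        printf("TEST FAILED (%d errors)\n", total_errors);
    else
        printf("TEST PASSED\n");
    return total_errors ? 1 : 0;
}
```

The self-checking part matters as much as the concurrency: because each thread verifies its own results, the test needs no external scoreboard and can report pass or fail on its own, whether it runs in simulation, in emulation or on silicon.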
SoC verification is growing up fast. It is climbing the rungs of the predictable maturity ladder and appears to be a welcome solution to the vexing problems associated with verifying that an SoC will work as intended.
About the author
Michel Courtoy began his career at Intel in design engineering and software engineering. He managed product marketing for layout verification software at Cadence Design Systems. As vice president of marketing for Silicon Perspective, Courtoy created the market for silicon virtual prototyping and was a key player in the company's acquisition by Cadence in 2001. He served as a vice president at Cadence before becoming CEO of Certess, leading the company through sales growth to a successful exit by acquisition. Courtoy holds a Bachelor of Science degree in electrical engineering from the Université Catholique de Louvain, Belgium; a Master of Science degree in electrical engineering from the University of California, San Diego; and an MBA from Santa Clara University in Santa Clara, Calif.