
Design for Verification: A Natural Next Step?

Re: Bug Avoidance
MikeBartley   1/8/2015 1:25:37 PM
Hi

Thank you for your comment.

TVS is currently verifying a design written in SystemC and all seems good - but they are the exception.

I agree it may help to eliminate SOME bugs, but you still need to get a good micro-architecture, design the algorithms, make sure the corner cases are handled, etc.

Regards

Mike

Re: Bug Avoidance
MikeBartley   1/8/2015 1:20:50 PM
Hi Bryan

Thank you for your comment. I like your choice of words: putting more effort into "Bug Avoidance" rather than always thinking about "Bug Hunting", and asking "How do we avoid this bug in the future?" rather than "Why didn't verification find it?"

Looking forward to your talk at Verification Futures http://goo.gl/CGxYSJ

Regards

Mike

Re: Bug Avoidance
tudor.timi   1/7/2015 4:08:50 PM
I'm not a designer, so don't take my word for it, but one idea for better bug avoidance might be high-level synthesis. Working at a higher level of abstraction means less code, which often means fewer bugs. I'm pretty sure the switch from gate-level design to RTL had the same benefit.
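As a rough, hypothetical illustration of that abstraction argument (the filter, the tap count, and all names below are invented for this sketch, not taken from the discussion): an algorithmic C++ description of the kind a high-level synthesis tool consumes stays very small, whereas hand-written RTL for the same block would also need pipeline registers, a state machine and handshaking, i.e. far more code in which to introduce bugs.

#include <cstdint>
#include <cstddef>
#include <iostream>

// Illustrative only: an 8-tap FIR accumulation written at the algorithmic
// level, roughly what one might hand to an HLS tool.
constexpr std::size_t kTaps = 8;

int32_t fir(const int16_t sample[kTaps], const int16_t coeff[kTaps]) {
    int32_t acc = 0;
    for (std::size_t i = 0; i < kTaps; ++i)
        acc += static_cast<int32_t>(sample[i]) * coeff[i];  // multiply-accumulate
    return acc;
}

int main() {
    const int16_t s[kTaps] = {1, 2, 3, 4, 5, 6, 7, 8};
    const int16_t c[kTaps] = {1, 1, 1, 1, 1, 1, 1, 1};
    std::cout << fir(s, c) << "\n";  // prints 36
}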

Bug Avoidance
bdickman   1/7/2015 10:28:41 AM
Thanks Mike for raising the profile of this important topic. 

I think that both EDA and product engineering teams have been highly focused on the "verification challenge" for some time now, and thanks to that focus we have seen some fantastic improvements in verification tooling and methodologies over the last decade or so. I remember what a revelation HVLs were, with their approach of constrained-random stimulus coupled with functional coverage; this, of course, is standard practice nowadays. And as you point out, there has also been a massive take-up of assertion-based verification and formal verification in that timeframe. The focus today is predominantly on "Bug Hunting" and "Bug Finding".
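For readers less familiar with those HVL concepts, here is a plain C++ sketch of the constrained-random-stimulus-plus-functional-coverage idea; the transaction fields, constraints and coverage bins are invented for illustration, and a real flow would express this in an HVL rather than plain C++.

#include <array>
#include <cstdint>
#include <cstddef>
#include <iostream>
#include <random>

// Illustrative transaction: a bus write with two constrained fields.
struct BusWrite {
    uint32_t addr;   // constrained to be word-aligned
    uint8_t  burst;  // constrained to one of the legal burst lengths
};

int main() {
    std::mt19937 rng(1);  // reproducible seed
    std::uniform_int_distribution<uint32_t> addr_dist(0, 0xFFFF);
    const std::array<uint8_t, 4> legal_bursts{1, 4, 8, 16};
    std::uniform_int_distribution<std::size_t> burst_pick(0, legal_bursts.size() - 1);

    std::array<unsigned, 4> burst_bins{};  // functional coverage: one bin per legal burst

    for (int i = 0; i < 1000; ++i) {
        BusWrite tx;
        tx.addr = addr_dist(rng) & ~0x3u;   // constraint: word aligned
        std::size_t pick = burst_pick(rng);
        tx.burst = legal_bursts[pick];
        ++burst_bins[pick];                 // sample coverage
        // drive_to_dut(tx);                // placeholder: stimulus would be driven to the DUT here
    }

    for (std::size_t b = 0; b < legal_bursts.size(); ++b)
        std::cout << "burst " << unsigned(legal_bursts[b])
                  << " covered " << burst_bins[b] << " times\n";
}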

However, the verification problem is not getting any easier, and one might conclude that such advancements are merely keeping pace with the ever-increasing design complexity of modern SoCs and of complex IP components such as processors and GPUs. I'm not saying there isn't a lot more advancement that can and needs to be made in verification tools and methods, particularly around more analytical approaches to coverage and new ways to measure the completeness of verification. Above all, we need better ways to make verification more efficient and effective, since supporting infrastructure such as compute farms and emulators cannot continue to scale with a purely quantitative approach to verification testing.

As you have identified, because verification has become such a complex task requiring highly specialized skills, there is a gap between the task of creating and capturing the design, and verifying that it works.

What I would like engineering teams to focus more on is "Bug Avoidance", i.e. how can we avoid putting bugs into the design in the first place, and how can we simplify and accelerate the verification effort? Verification is an intractable problem because the state space is intractable. Good tooling, best practices and highly skilled verification engineers apply their best efforts to demonstrating functional integrity. However, complex bugs will always escape verification and may result in rare but critical failures. When such cases arise we need to ask not only "Why did verification miss that bug, and how can we improve verification?", but also "Why is that bug in the design, and how can we avoid it in future designs?" Was it a copy-paste error? Something that could and should have been picked up with more robust reviewing? Or complexity in the design that has introduced unknown corner cases leading to emergent behaviours?

Since finding all bugs is an unrealistic expectation, let's also think about how we can reduce the number of bugs in the first place. 
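To make the copy-paste point above concrete, here is a hypothetical sketch (the register offsets, values and names are invented): replacing a run of near-identical pasted statements with one data-driven loop removes the whole class of bug where a single pasted line is left unedited, and leaves a small table that is easy to review.

#include <array>
#include <cstdint>
#include <iostream>

struct RegInit { uint32_t offset; uint32_t value; };

// Copy-paste-prone style (shown as comments): one forgotten edit hides a bug.
//   write_reg(0x00, 0x1);   // CTRL0
//   write_reg(0x04, 0x1);   // CTRL1
//   write_reg(0x08, 0x1);   // CTRL2 -- pasted from CTRL1, value never updated to 0x2
//
// Data-driven style: the intent lives in one table.
constexpr std::array<RegInit, 3> kInitTable{{
    {0x00, 0x1},  // CTRL0
    {0x04, 0x1},  // CTRL1
    {0x08, 0x2},  // CTRL2
}};

// Stand-in for a real register write.
void write_reg(uint32_t offset, uint32_t value) {
    std::cout << "write 0x" << std::hex << offset << " = 0x" << value << "\n";
}

int main() {
    for (const auto& r : kInitTable)
        write_reg(r.offset, r.value);
}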
