Great comments from everyone. I work with Arif at Synopsys, and we meet often with design teams from different companies. We are encouraged by the trend to consider test as part of the design (indeed, it is), and surprised to find that some teams still treat and execute it as a separate activity.
We agree that 'test' is essential, high value, and can be accommodated as an integral part of the design process starting with synthesis.
Hank, I totally agree that "test" is not something that is foisted onto the "design" -- it is an integral part of the design.
It is essential to get test engineering & design engineering talking to each other very early in the project, so there are no surprises or misunderstandings about test capability and cost. Designers also need to learn that the test engineer is their friend. When something is not working, you will be SO glad you have the fault coverage, the BIST, the test buses to bring out internal signals, the JTAG, etc.
I don't quite understand why the author so emphasized the point about inserting test logic at the RTL level, before synthesis. Haven't we all been doing this for at least the last 10 years? It's been a long time since I've seen anybody use "bolt-on methods" -- for exactly the reasons the author states.
While I agree with these comments, I think the slant is that "test" is something foisted onto the "design". As was said long ago, customers won't pay much if the chips aren't tested, and they won't pay if the manufacturing test cost is too high. So test is every bit as much a design constraint as functionality, area, power and delay. Design would be so much easier if we didn't have to worry about test, but it would be so much easier if we didn't have to worry about area, delay or power either.
The article does not mention the fact that the fundamental properties of DFT - controllability and observability - are also useful for debug, which is also a design requirement.
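To make the controllability/observability point concrete, here is a minimal sketch (my own toy example, not from the article): a tiny combinational circuit with an internal node m, where we count how many input vectors detect a stuck-at-0 fault on m when we can only watch the primary output, versus when a DFT test point lets us observe m directly. The circuit and function names are invented for illustration.

```python
from itertools import product

def simulate(a, b, c, fault_m=None):
    """Toy circuit: internal node m = a AND b, primary output y = m OR c.
    If fault_m is given, node m is forced to that value (stuck-at fault)."""
    m = a & b
    if fault_m is not None:
        m = fault_m  # inject the stuck-at fault on internal node m
    y = m | c
    return m, y

# Count input vectors that detect a stuck-at-0 fault on m,
# first at the primary output only, then at an added test point on m.
detect_at_output = 0
detect_at_testpoint = 0
for a, b, c in product([0, 1], repeat=3):
    m_good, y_good = simulate(a, b, c)
    m_bad, y_bad = simulate(a, b, c, fault_m=0)
    if y_good != y_bad:
        detect_at_output += 1      # fault visible at y
    if m_good != m_bad:
        detect_at_testpoint += 1   # fault visible at the test point

print(detect_at_output, detect_at_testpoint)  # prints: 1 2
```

Only one of the eight vectors (a=b=1, c=0) exposes the fault at the output, because c=1 masks it; the test point doubles the detecting vectors. The same extra visibility is exactly what helps during silicon debug.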
This is a thoughtful article concerning the increasing need for DFT in the author's design domain. It is part and parcel of a larger problem, though: the desire to perform hardware and software design simultaneously. Once called co-design, this activity was something of a chimera except in special cases. Modern design tool chains, however, can support co-design by providing a platform that can either simulate hardware or incorporate hardware-in-the-loop, for both the control electronics and the "plant", the hardware being controlled. This comes at a price: the tool chain isn't cheap, a fair amount of training is needed to use the tool set effectively, and there is no single tool chain that will bridge the spectrum from IC design through, say, turbine engine FA controllers. Still, tool chains do exist, and it seems more important from day to day to employ them on large and even medium-size designs.
I agree with the overall picture of DFT and design being two orthogonal objectives leading to complexity in the design cycle. It is not only the timeline or schedule: we pay a price in overall chip behavior too. For example, DFT makes the clock scheme even more complicated, leading to suboptimal clock tree synthesis, which means more power draw while the phone is operating, or more heat-up when the processor is running.