MOUNTAIN VIEW, Calif.--As the level of chip and component integration increases, so too does the difficulty of testing the ever more intricate products, and engineers are struggling to keep up.
“The industry needs high-volume test at low cost,” said Joe Macri, corporate vice president and CTO of AMD's client division, at UBM’s DesignCon in Santa Clara recently, noting that the high levels of interconnect were beginning to cause significant test problems.
“The way we’re integrating today is different,” he said, adding: “It’s not just about putting everything on a single die; at times it’s how you slice different parts of the system onto multiple die.”
Brad Davis, a Broadcom hardware design engineer and winner of UBM’s prestigious Test Engineer of the Year award, admitted that the testing dilemma was “a challenge.”
“Test companies are getting prototype test hardware out as firms are getting test chips out,” he said, noting that chip makers were having to work ever more closely with test houses in the early stages of product development.
When problems are discovered, both the chip company and the test house work together to see whether the issue is related to the testing equipment or the product itself, said Davis.
“It really requires a tight relationship with the test company,” he said.
Macri told EE Times that design for testability used to be something design engineers laughed at, but said it was probably becoming one of the more important skills in the design community today.
“If you don’t design for test, it might work, but you won’t be able to go into high volume with it, and if you can’t go into high volume, it basically didn’t work,” he explained.
The conundrum makes it all the more critical for design engineers to provide visibility hooks into what they’re doing, said Macri.
“In many cases we’re pulling pieces of the test programming into our silicon or into our own test software,” he added.
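The "visibility hooks" Macri describes are, in spirit, what scan-style design-for-test structures provide: a way for a tester to observe and control internal state that is otherwise unreachable from the chip's pins. The toy Python sketch below (our own illustration, not anything from AMD or Broadcom) shows the idea: a plain functional block hides its state, while a testable variant adds hypothetical `scan_in`/`scan_out` hooks so a tester can force and read it directly.

```python
# Toy illustration of design-for-test "visibility hooks".
# All class and method names here are invented for the example.

class Counter:
    """A simple functional block: an 8-bit counter."""

    def __init__(self):
        self.value = 0  # internal state, invisible from outside in real silicon

    def tick(self):
        self.value = (self.value + 1) % 256


class ScannableCounter(Counter):
    """The same block with scan-style test hooks added."""

    def scan_out(self):
        # Tester observes the hidden register directly.
        return self.value

    def scan_in(self, bits):
        # Tester forces the register to a known state before a test.
        self.value = bits % 256


# With the hooks, a tester can verify wrap-around behavior directly,
# instead of clocking the counter 255 times to reach that state.
dut = ScannableCounter()
dut.scan_in(255)
dut.tick()
assert dut.scan_out() == 0
```

The payoff mirrors Macri's point about volume: controllability and observability hooks turn long, slow functional sequences into short, deterministic tests, which is what makes high-volume production test economical.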
Check out the video interview below, and if you have any thoughts on how to tackle the test problem, let us know in the comments.