Santa Clara, Calif. -- As the EDA industry focuses on design-for-manufacturability (DFM), the older problem of design-for-test has almost been forgotten. But ICs built at 90 nanometers and below pose new and troubling challenges for DFT tools and techniques, according to providers who will take part in this week's International Test Conference here.
At those geometries, small delay defects become a major contributor to chip failures, but they can't be detected by conventional automatic test pattern generation (ATPG) tools. Low-power ICs--which will include most chips by 65 nm--demand new approaches to scan design and ATPG.
As manufacturing challenges grow, there's a drive to use test data for "yield learning." Test data run over many dice and wafers can provide valuable diagnostic information that helps foundries and designers ramp up yields. In this sense, DFT meets DFM and becomes a critical element in the struggle to mitigate process variability.
"There's a lot going on with DFM problems, and DFT has really taken a backseat," said Laurie Balch, research director at Gartner Dataquest. "There really haven't been a whole lot of innovative advancements in DFT."
"Today, test is a standalone entity," observed Michael Campbell, vice president of engineering at Qualcomm Inc. and an International Test Conference (ITC) panelist. "There is no easy way to go from the output of somebody's tester back to your layout, and see if a scan chain failure correlates statistically to a hot spot on your design using your DFM tools." A better interface among design, test and foundry data would let companies like Qualcomm optimize yields faster, Campbell said. Tools will help, but standards would offer a "bigger bang for the buck," he said.
Just as test data can help boost yields, information from design can minimize test problems. The key to improving test quality is "physically aware DFT," said Dwayne Burek, product director for Magma Design Automation's design implementation business unit. That means design implementation would take pains to avoid potential test problems, such as single vias. Test would be targeted to areas of the design where failures are most likely to occur.
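Magma hasn't detailed the mechanics, but one way to picture physically aware DFT is a fault list weighted by layout risk, so that nets depending on a single via get tested first. In the sketch below, the net names, via counts and scoring are invented for illustration.

```python
# Illustrative "physically aware" fault weighting: nets whose route
# depends on a single (nonredundant) via are likelier to fail, so
# their faults get priority. Net names and via counts are invented.
net_via_redundancy = {"net_a": 1, "net_b": 2, "net_c": 1, "net_d": 3}

fault_list = ([f"{net}/stuck-at-0" for net in net_via_redundancy]
              + [f"{net}/stuck-at-1" for net in net_via_redundancy])

def risk(fault):
    net = fault.split("/")[0]
    return 1.0 / net_via_redundancy[net]   # single via -> highest risk

for fault in sorted(fault_list, key=risk, reverse=True):
    print(f"{fault}: risk {risk(fault):.2f}")
```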
Learning about yields
One company addressing "yield learning" at ITC is LogicVision Inc. "At 90 nm and below, we're seeing a lot more performance-related defects," said Steve Pateras, senior director of strategic technology. "Test allows you to efficiently screen for these defects, and hopefully characterize them for you. So it's not only testing, it's diagnostic as well." Useful test data, Pateras said, includes transition defects for logic, and bit-level, timing-related failures for memories. It's important to look not just at failures on individual pins, but at performance-related issues inside the die, he noted. "If you're testing memory, you may find that a particular cell is underperforming, and that would lead you to put some design marginality on that cell," he said.
LogicVision's ETDiagnostics tool extracts subdie performance information from automatic test equipment. What's important from a yield perspective, however, is to accumulate data from a variety of die and wafer runs, analyze it and look for trends. To that end, LogicVision this week will roll out Yield Insight, a tool that analyzes data captured across multiple dice, wafers and lots, and identifies systematic yield issues. The initial release is geared toward memories.
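LogicVision hasn't published Yield Insight's internals; as a rough illustration of the kind of analysis involved, the sketch below pools hypothetical per-die failure records across wafers and flags signatures that recur on many wafers, the hallmark of a systematic issue. All record fields and the threshold are invented.

```python
# Illustrative only: aggregate per-die test failures across wafers to
# separate systematic yield issues from random defects. The record
# format and the three-wafer threshold are hypothetical.
from collections import Counter

# (wafer_id, die_x, die_y, failure_signature) from volume diagnostics
records = [
    ("W01", 3, 7, "bitcell_weak@RAM2"),
    ("W01", 4, 7, "bitcell_weak@RAM2"),
    ("W02", 3, 7, "bitcell_weak@RAM2"),
    ("W02", 9, 1, "scan_chain_break@chain5"),
    ("W03", 3, 7, "bitcell_weak@RAM2"),
]

by_signature = Counter(sig for _, _, _, sig in records)
wafers_hit = {sig: set() for sig in by_signature}
for wafer, _, _, sig in records:
    wafers_hit[sig].add(wafer)

# A signature seen on many wafers points to a systematic issue
# (design marginality or process), not a random particle defect.
for sig, count in by_signature.most_common():
    kind = "SYSTEMATIC?" if len(wafers_hit[sig]) >= 3 else "random"
    print(f"{sig}: {count} dice on {len(wafers_hit[sig])} wafers -> {kind}")
```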
Foundries are using Synopsys Inc.'s TetraMax ATPG program to diagnose vector mismatches, said Chris Allsup, marketing manager for test automation products at Synopsys. Now it's time to take the next step and pass that information on to yield-management systems, he said. Synopsys this week will announce a link between TetraMax and the Odyssey Yield Management System that will bring "volume diagnostics" to Odyssey. The combined flow can perform data mining, run cross-correlations and suggest ways to ramp up yields faster.
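Synopsys hasn't described which correlations Odyssey runs; a classic one in yield management is checking whether a failure mode clusters at the wafer edge, which points to a process-uniformity problem rather than a design flaw. The sketch below, with invented die coordinates and failure data, shows the idea.

```python
# Illustrative cross-correlation: does a failure mode cluster at the
# wafer edge? Die coordinates and failing dice are invented.
import math

WAFER_RADIUS = 10  # in die pitches from wafer center (hypothetical)

all_dice = [(x, y) for x in range(-10, 11) for y in range(-10, 11)
            if math.hypot(x, y) <= WAFER_RADIUS]
failing = [(9, 2), (8, -5), (-9, 3), (0, 1), (7, 7), (-8, -4)]

def at_edge(die):  # outer third of the wafer
    return math.hypot(*die) > WAFER_RADIUS * 2 / 3

edge_fails = sum(map(at_edge, failing))
edge_rate = edge_fails / sum(map(at_edge, all_dice))
center_rate = ((len(failing) - edge_fails)
               / (len(all_dice) - sum(map(at_edge, all_dice))))

# A much higher edge rate suggests etch/CMP uniformity, not design.
print(f"edge fail rate {edge_rate:.2%} vs. center {center_rate:.2%}")
```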
Test data is typically just thrown away, said Greg Aldridge, director of marketing for DFT products at Mentor Graphics Corp. But customers are starting to monitor failure information from scan tests, run it through diagnostics and collect data on what's causing the failure, he said. Mentor this week will add an "automation server" to its YieldAssist product that can provide a more automated way of handling massive amounts of data, he said.
"Among leading-edge customers, there is a very strong intersection between DFT, DFY [design-for-yield] and DFM," said Sanjiv Taneja, vice president for Encounter test at Cadence Design Systems Inc. "The first question is, how can I make my test generation yield-driven and lithography-driven, so I can detect defects early during the process ramp-up." Cadence's Encounter Diagnostics product provides volume diagnostics from test, Taneja said.
Using test data for yield learning is a complicated process, said Mike Kondrat, senior director for technical marketing at test provider Credence Systems Corp. "Today's test systems collect reams of data that needs to be sorted, analyzed and acted upon by the EDA software, which takes time," he said. "Test compression, commonly used in DFT products, exacerbates this time because the compression algorithms are proprietary."
Meanwhile, low-power ICs present two challenges for DFT, said Synopsys' Allsup: generating power-aware ATPG patterns and coexisting with low-power design techniques. Low-power test patterns are needed, he said, because scan is the mode that consumes the greatest amount of power. "If you keep turning up the frequency of your scan, you could conceivably burn out your chip," he said. "To avoid that, you have to scan information in a way that doesn't create as much charging and discharging."
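Allsup didn't detail Synopsys' pattern generation, but one widely used power-aware ATPG trick is "adjacent fill": don't-care bits in a scan pattern repeat the last specified value instead of taking random values, cutting the charging and discharging he describes. A minimal sketch:

```python
# Minimal sketch of "adjacent fill" for low-power scan patterns:
# don't-care bits (X) copy the previous care bit, minimizing the
# transitions shifted through the chain during scan.
def adjacent_fill(pattern: str) -> str:
    filled, last = [], "0"          # assume the chain starts at 0
    for bit in pattern:
        last = bit if bit in "01" else last
        filled.append(last)
    return "".join(filled)

def transitions(bits: str) -> int:
    return sum(a != b for a, b in zip(bits, bits[1:]))

pat = "1XX0XXX1XX"
random_fill = "1100101110"          # one possible random fill: 5 transitions
low_power = adjacent_fill(pat)      # -> "1110000111": 2 transitions
print(low_power, transitions(low_power), "vs", transitions(random_fill))
```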
LogicVision's Pateras noted that low-power designs use a lot of clock gating. That is all disabled during scan testing, he said, because you need to clock all the flip-flops for scanning in data. The result is very high power consumption during test. A hierarchical approach that tests different sections of the chip, such as intellectual-property cores, separately, can help, Pateras said.
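Pateras didn't spell out a scheduling method; one common way to exploit such hierarchy is to pack per-core tests into sessions whose combined scan power stays under a budget. The greedy sketch below, with invented core names and power figures, illustrates the approach.

```python
# Illustrative greedy scheduler: pack per-core scan tests into test
# sessions so concurrent test power never exceeds a budget. Core
# names and power figures (mW) are invented for the example.
POWER_BUDGET = 400

cores = {"cpu": 250, "dsp": 180, "usb": 90, "ram_bist": 120, "gpu": 300}

sessions = []
for core, power in sorted(cores.items(), key=lambda kv: -kv[1]):
    for session in sessions:       # first-fit decreasing
        if sum(cores[c] for c in session) + power <= POWER_BUDGET:
            session.append(core)
            break
    else:
        sessions.append([core])    # open a new test session

for i, s in enumerate(sessions, 1):
    print(f"session {i}: {s} ({sum(cores[c] for c in s)} mW)")
```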
The second low-power challenge is adding test circuitry to designs with multiple voltage islands that may switch off and on. The software that generates scan chains has to wrap the chains inside the islands or, where a chain crosses between them, make sure the appropriate level shifters and latches are provided. "We want to minimize the number of crossings from one voltage island to the next by making sure the system understands what will happen with the voltage islands," Allsup said.
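As a rough sketch of the stitching problem Allsup describes, ordering the scan chain so each island's flip-flops sit contiguously minimizes crossings, each of which needs a level shifter. The flop names and island assignments below are invented.

```python
# Illustrative scan stitching across voltage islands: keeping each
# island's flops contiguous in the chain minimizes island crossings
# (each needing a level shifter). Flops and islands are invented.
flops = [("ff1", "VDD_CORE"), ("ff2", "VDD_LOW"), ("ff3", "VDD_CORE"),
         ("ff4", "VDD_LOW"), ("ff5", "VDD_CORE")]

def crossings(chain):
    return sum(a[1] != b[1] for a, b in zip(chain, chain[1:]))

naive = flops                                  # netlist order
stitched = sorted(flops, key=lambda f: f[1])   # group flops by island

print("naive crossings:", crossings(naive))        # -> 4
print("stitched crossings:", crossings(stitched))  # -> 1
```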
Synopsys and Cadence both claim to have power-aware ATPG. Cadence will also provide an ability to test different power domains independently by adding new capabilities to its Encounter Test GXL suite, Taneja said.
At ITC, Synopsys will describe a new ATPG capability for handling small delay defects. At 90 nm and below, Allsup said, process variations can introduce small delays that affect timing-critical paths. While transition-delay ATPG supports at-speed testing, it lacks the precise timing information needed to target small delay defects, he said. Synopsys last week revealed new technology that uses pin-slack information from the PrimeTime static timing-analysis tool to bring critical-path information into the ATPG process. It was developed in collaboration with the Japan-based Semiconductor Technology Academic Research Center (Starc).
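Synopsys hasn't published the algorithm, but the core idea of slack-based small-delay ATPG is simple: a defect adding delay d at a fault site only causes a failure if d exceeds the slack of the path through that site, so ATPG should exercise each fault along its least-slack path. The pin names and slack values in this sketch are invented.

```python
# Sketch of slack-aware fault targeting for small delay defects:
# a defect adding delay d at a pin only violates timing if d exceeds
# the minimum slack of paths through that pin, so faults on low-slack
# pins are targeted first. Pin slacks (ns) are invented.
pin_slack = {"u1/A": 0.05, "u2/Z": 0.40, "u3/B": 0.02, "u4/Z": 1.10}

MIN_DETECTABLE = 0.25  # smallest defect size (ns) worth screening

# Prioritize fault sites where even a small defect violates timing.
targets = sorted((slack, pin) for pin, slack in pin_slack.items()
                 if slack < MIN_DETECTABLE)
for slack, pin in targets:
    print(f"target {pin}: slack {slack:.2f} ns "
          f"-> detects defects > {slack:.2f} ns on this path")
```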
Mentor Graphics says it has worked with Starc on small delay defects. At ITC, Mentor and LogicVision will announce a combined capability that lets Mentor's FastScan ATPG and TestKompress test-compression tools find small delay defects.
Magma has an upcoming ATPG product, Burek revealed, built on the company's unified data model so that it can handle small delay defects. Cadence claims to handle them through its TrueTime timing-driven ATPG technology.