Regardless of the test methodology employed, the goal of manufacturing test is to identify, or screen out, defective devices before they are embedded into a system or shipped to the end customer. More effective (or higher-quality) testing means fewer "bad" parts assembled into systems or shipped to end customers.
The standard manufacturing-test methodology for multimillion-gate designs is scan and automatic test pattern generation (ATPG). Scan provides a relatively nonintrusive technique for testers to access all the internal sequential elements (flip-flops and latches) of the design, enabling an efficient structural-test methodology that lends itself to a high degree of automation.
The predominant type of ATPG pattern used for manufacturing test today is the deterministically generated "stuck-at" test pattern. Deterministically generated patterns provide the highest test coverage, since every fault in the design is explicitly targeted during test pattern generation. Stuck-at test patterns target faulty nodes in a circuit that are "stuck" at a logic 0 or logic 1 state. These are static tests, however, and are not well suited to detecting the myriad speed-related defects that are emerging as process technologies shrink to 130 and 90 nanometers.
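The stuck-at idea can be illustrated with a minimal sketch (the circuit, node names, and functions here are illustrative, not from any real ATPG tool): a test vector "detects" a stuck-at fault when the good circuit and the faulty circuit produce different outputs.

```python
# Illustrative sketch: detecting a stuck-at fault on a tiny AND-OR circuit.
# Circuit: y = (a AND b) OR c, with internal node n = a AND b.

from itertools import product

def simulate(a, b, c, stuck=None):
    """Evaluate the circuit; `stuck` forces internal node n to 0 or 1."""
    n = a & b
    if stuck is not None:
        n = stuck            # inject the stuck-at fault on node n
    return n | c

def find_tests(stuck):
    """Vectors that detect the fault: good and faulty outputs differ."""
    return [v for v in product([0, 1], repeat=3)
            if simulate(*v) != simulate(*v, stuck=stuck)]

print(find_tests(0))   # only (1,1,0) exposes n stuck-at-0
print(find_tests(1))   # three vectors expose n stuck-at-1
```

Deterministic ATPG does essentially this search, but symbolically and at the scale of millions of gates rather than by enumeration.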
To ensure effective testing for these smaller process technologies, stuck-at or static testing must be supplemented with some type of "at-speed" test. Historically, this has been done using functional test patterns and running them at system speeds during manufacturing test. As device sizes and clock speeds continue to increase, two major issues have emerged with the functional approach to at-speed test.
First, the effort needed to create high-coverage functional patterns for multimillion-gate designs has made this a costly and unpredictable approach. Second, device clock frequencies now exceed what today's automatic test equipment can accurately supply.
A scan-based at-speed methodology is now emerging that allows engineers to use deterministic ATPG techniques to generate high-coverage at-speed tests reliably and on predictable schedules.
Scan-based at-speed test patterns are built on a different fault model, the "transition" fault model, but are generated in much the same fashion as static stuck-at patterns. The difference is that several at-speed clock cycles are issued during the active period of the test so that any speed-related problems on a device can be accurately detected. To generate accurate clock cycles for these tests, new ATPG techniques use the device's own internal clocks to produce the test clocks needed.
An external signal or an internal register can be used and programmed through the ATPG process itself to generate the various internal clock sequences needed on a per-pattern basis. Utilizing this technique, test engineers can explicitly define specific clocking sequences that are valid and restrict the ATPG engine to only those valid sequences.
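The launch/capture timing behind these clock sequences can be sketched as follows (a toy model under illustrative assumptions; the delays, period, and function names are invented for the example): a transition is launched into a path and captured one at-speed clock period later, so a path that is too slow latches the stale value and exposes the transition fault.

```python
# Illustrative launch/capture model for a transition (at-speed) test.
# A slow-to-rise defect is detected when the new value fails to
# propagate within one at-speed clock period.

def capture(launch_old, launch_new, path_delay_ns, clock_period_ns):
    # The capture flop sees the new value only if the path meets timing.
    return launch_new if path_delay_ns <= clock_period_ns else launch_old

# Good path: 4 ns delay meets a 5 ns (200 MHz) at-speed cycle.
print(capture(0, 1, 4.0, 5.0))   # transition observed -> pass
# Defective path: 7 ns delay misses the same cycle.
print(capture(0, 1, 7.0, 5.0))   # stale value captured -> fault detected
```

A static stuck-at test, by contrast, allows arbitrarily slow clocks, which is exactly why it cannot distinguish the two cases above.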
A scan-based ATPG solution offers a highly automated approach to generating high-coverage "at-speed" patterns. Although this approach can effectively improve test quality so that defect rates (as measured in defects per million) are minimized as devices move to smaller process technologies, the additional testing comes at a cost. Scan-based at-speed tests increase test data volume to three to five times that of traditional stuck-at patterns alone.
There is clearly a shift in test requirements affecting today's complex system-on-chip designs, resulting in a huge increase in pattern count. Ignore the requirements and more defective devices will escape manufacturing test, which could result in enormous costs and could damage a company's reputation. On the other hand, implementing tests that could be 10 times larger and take 10 times longer to apply during manufacturing will have a significant impact on production throughput and profitability.
Several embedded test technologies are available to help control the cost of test as test quality is improved. One methodology utilizes a built-in self-test approach. Logic BIST embeds a pseudorandom pattern generator into the design so that test patterns can be generated on-chip as opposed to stored externally on the tester. Since logic BIST can also support a scan-based at-speed test, it is also capable of detecting the speed-related defects that are prevalent in smaller process technologies. Although logic BIST is ideal for applications like in-system test, where tester access is not possible, there are several issues associated with trying to adapt this methodology to support high-quality manufacturing test.
One of the most significant issues is that the methodology requires changes to the core, or functional, design, and thus to the design process itself: something most design groups vehemently resist. Since logic BIST uses random patterns and does not deterministically target faults in the design, test points must be added throughout the design to bring test coverage to acceptable levels.
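The on-chip pseudorandom pattern generator in a logic BIST controller is typically a linear-feedback shift register (LFSR). A minimal sketch, assuming a 4-bit Fibonacci LFSR with the primitive polynomial x^4 + x^3 + 1 (real BIST controllers use much wider registers, but the principle is the same):

```python
# Illustrative 4-bit maximal-length LFSR (taps at bits 3 and 2,
# polynomial x^4 + x^3 + 1) generating pseudorandom test patterns
# on-chip, with no pattern storage on the tester.

def lfsr_patterns(seed=0b1000, count=15):
    state = seed
    patterns = []
    for _ in range(count):
        patterns.append(state)
        fb = ((state >> 3) ^ (state >> 2)) & 1    # XOR of the tap bits
        state = ((state << 1) | fb) & 0xF         # shift left, insert feedback
    return patterns

pats = lfsr_patterns()
print(len(set(pats)))   # 15: a maximal-length LFSR visits every nonzero state
```

Because the sequence is fixed by the polynomial and seed, it cannot be steered toward hard-to-detect faults, which is why test points become necessary to recover coverage.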
Logic BIST also compresses test pattern responses into a signature register, which has the restriction that all inputs be known values. Since most designs have unknown states (or "x" generators), the logic BIST approach also requires that additional logic be added throughout the design to "bound" any unknown states so that they are not propagated to the signature register. Unless the design and design schedule can tolerate the functional changes required, logic BIST may not be a viable solution for high-quality manufacturing test.
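Why an unbounded unknown breaks the signature can be shown with a toy serial signature register (the register width, taps, and bit patterns below are illustrative): because the register XOR-folds every response bit into the signature, the two possible silicon resolutions of a single X value produce two different final signatures, making pass/fail comparison impossible.

```python
# Illustrative serial signature register (4-bit, taps at bits 3 and 0).
# A single unknown response bit corrupts the final signature.

def signature(bits, width=4, taps=0b1001):
    sig = 0
    for b in bits:
        fb = (bin(sig & taps).count("1") + b) & 1   # parity of taps XOR input
        sig = ((sig << 1) | fb) & ((1 << width) - 1)
    return sig

good = [1, 0, 1, 1, 0, 0, 1, 0]
# Bit 3 is an X: on silicon it may resolve to 0 or to 1.
x_as_0 = good[:3] + [0] + good[4:]
x_as_1 = good[:3] + [1] + good[4:]
print(signature(x_as_0) != signature(x_as_1))   # True: signature corrupted
```

Bounding logic forces such X sources to a known value before they reach the compactor, which is the design modification the article refers to.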
Hybrid solutions like deterministic BIST use compressed external patterns that are then expanded on-chip. Although these solutions resolve the need for test points, since they utilize deterministic patterns as opposed to random patterns, they still require modifications to the functional design for "x" bounding. Today's deterministic BIST solutions also have limited encoding capacity, which makes them impractical for certain design styles and for generating some of the sequential at-speed patterns needed to ensure high-quality testing.
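The compression these hybrid schemes exploit comes from the fact that a deterministic pattern specifies only a few "care bits," with the rest filled on-chip. A minimal sketch of the storage side of that idea (the storage format and fill rule here are invented for illustration; real solutions expand seeds through on-chip decompressor logic rather than storing positions):

```python
# Illustrative pattern expansion: store only (position, value) care bits
# and fill the don't-care positions on-chip (here with constant 0-fill).

def expand(care_bits, length, fill=0):
    pattern = [fill] * length
    for pos, val in care_bits:
        pattern[pos] = val
    return pattern

# A 32-bit scan slice with only 3 care bits: far less stored test data
# than the full 32-bit vector.
stored = [(2, 1), (17, 0), (30, 1)]
p = expand(stored, 32)
print(p[2], p[17], p[30])   # -> 1 0 1
```

The encoding-capacity limit mentioned above arises when a pattern has more care bits than the decompressor can encode, forcing the tool to drop or split the pattern.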
A third approach, called embedded deterministic test, offers compression levels of up to 100x while mitigating many of the disadvantages associated with other embedded approaches.

http://www.eet.com