If semiconductor firms are highly sensitive to the cost of field escapes, they will generate more test patterns to target more silicon defects and use compression to reduce the resulting increase in test data volume so that the entire test pattern set still fits in tester memory. This approach to increasing test quality takes advantage of different types of tests to target different types of defects.
For example, path delay and transition delay tests target delay defects, whereas bridging tests target resistive shorts. Typically, applying at-speed and bridging ATPG patterns in addition to stuck-at patterns increases test data volume by a factor of 4 to 6.
Figure 3 models the relationship between field escapes, as measured by defective parts per million (DPPM), and compression for an example production IC. Each point on the graph represents the number of field escapes observed for a given compression factor, assuming the compressed ATPG pattern set just fits within the dynamic memory of the tester.
Figure 3 Quality versus compression
Increasing compression by even a small amount makes room in tester memory for additional patterns that detect more defects, thereby decreasing the number of field escapes. The cost of imperfect test quality shown in Figure 2 is directly proportional to the field escape rate and, depending on the manufacturer’s tolerance for field escapes, this cost component may be the largest single contributor to total test cost.
If the key benefit of compression is improved test quality, is TDVR needed beyond the 4 to 6x level that the additional DSM tests require? ATPG fault models and techniques in use today are limited in their ability to resolve all possible physical failure mechanisms, including small delay defects that can lead to functional, speed-, or noise-related failures.
This limitation places an upper bound on the quality that can be achieved for any given design by generating more test patterns and using compression to reduce the test data volume. This is why, all other factors being equal, TDVR beyond roughly 10x yields diminishing improvements in quality, as shown in Figure 3.
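The diminishing-returns behavior in Figure 3 can be sketched with a toy model. All parameters below (tester memory size, bits per pattern, the 95% cap on resolvable defects, the baseline DPPM) are illustrative assumptions, not values from the text; the coverage cap stands in for the fault-model limitations described above.

```python
import math

def dppm(compression, mem_bits=1e9, bits_per_pattern=2e5,
         base_dppm=500.0, max_resolvable=0.95, k=2e-4):
    """Illustrative model: DPPM falls as compression admits more
    patterns into fixed tester memory, but fault-model limits cap
    the achievable defect coverage (max_resolvable)."""
    patterns = compression * mem_bits / bits_per_pattern
    # saturating defect coverage as a function of pattern count
    coverage = max_resolvable * (1 - math.exp(-k * patterns))
    return base_dppm * (1 - coverage)

for c in (1, 2, 4, 10, 20, 50):
    print(f"{c:>3}x -> {dppm(c):7.1f} DPPM")
```

With these assumed parameters, DPPM drops steeply up to about 4x, improves only marginally between 10x and 20x, and flattens at a floor set by the fraction of defects the fault models cannot resolve.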
The role of Moore’s Law and future expectations
As designs grow in circuit complexity, the number of scan elements and the length of scan chains increase by approximately the same factor, so test data volume and test application time increase as well. With today’s ATPG technologies, however, the earlier observations regarding compression limitations hold for any given design: TATR beyond 20x and TDVR beyond 10x achieve diminishing returns on DFT resources as a percentage of total cost savings. The incremental savings from further reductions in test application time and test data volume, assuming the pattern set fits in tester memory, are small compared with the design's total test cost structure.
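The scaling pressure on compression can be made concrete with a back-of-the-envelope calculation. The tester memory size, scan-cell counts, and pattern counts below are hypothetical, and the sketch assumes pattern count grows roughly with design size; the point is only that when scan cells and patterns each double per generation, uncompressed test data volume quadruples, so the required TDVR grows faster than gate count.

```python
def required_tdvr(scan_cells, patterns, tester_mem_bits=2e9):
    """Minimum test-data-volume reduction needed so the pattern set
    (stimulus + expected response, one bit per scan cell each) fits
    in tester memory. All parameters are illustrative."""
    volume_bits = 2 * scan_cells * patterns
    return max(1.0, volume_bits / tester_mem_bits)

# hypothetical successive design generations: scan cells and
# pattern counts both double with gate count
for gen, (cells, pats) in enumerate(
        [(2e5, 1e4), (4e5, 2e4), (8e5, 4e4)], start=1):
    print(f"gen {gen}: need >= {required_tdvr(cells, pats):.0f}x TDVR")
```

Under these assumptions the required TDVR rises from 2x to 8x to 32x across three generations, which is why compression requirements are driven by multi-generation expectations rather than any single design.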
While these conclusions hold for any given design, design organizations will base their maximum compression requirements, and hence their choice of compression technology, on future expectations: the anticipated needs of many designs over a time horizon of several years. Moore’s Law predicts that gate density doubles every 18 months. Semiconductor firms may therefore reasonably expect an order-of-magnitude increase in circuit complexity over several years encompassing several generations of products.
They will require relatively higher levels of compression if they plan to use their current tester equipment over the same period until it is fully depreciated. Higher compression must compensate for the limitations of older testers, permitting gradual deployment of newer, more expensive testers with greater memory capacity and higher clock frequencies.
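The order-of-magnitude expectation follows directly from the 18-month doubling period quoted above: a tester lifetime of five years spans 60/18 ≈ 3.3 doublings, or roughly a 10x increase in complexity. The lifetimes chosen below are illustrative.

```python
def complexity_growth(years, doubling_months=18):
    """Complexity growth factor given a fixed doubling period
    (18 months per the Moore's Law figure cited in the text)."""
    return 2 ** (years * 12 / doubling_months)

# hypothetical tester depreciation horizons
for years in (3, 5, 7):
    print(f"{years} years -> {complexity_growth(years):.1f}x")
```

Three years corresponds to a 4x increase, five years to about 10x, and seven years to roughly 25x, which frames how far ahead of any single design's needs a compression technology must reach.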