The electronics and semiconductor industries have relied on a modular building-block approach since their earliest days. By defining a limited number of interfaces and using them to connect components, this approach has enabled a fairly simple separation of the functional pieces. These distinct functional pieces are designed separately and integrated at the chip, board, or system level. This has often been compared to the “Lego” building-block approach.
Another common design practice has been to minimize the frequency of communications between the functional blocks because those interfaces generally have long latencies (compared to processing speeds) and are often the congestion points in a system. This also simplifies the integration process because it decreases the number of problems that can be created due to temporal interactions. For many years, companies making the most complex system on chips (SoCs) have been quite successful performing the bulk of their verification at the block level. When the components are integrated, a small number of system-level tests are run to ensure that the blocks were properly interconnected.
This strategy, often called stitch and ship, is increasingly leading to failure because of growing complexity at the system level. In addition, increasing amounts of functionality are defined at this level. New verification strategies are required to bring system-level verification into the mainstream development flow.
Introduction

System complexity used to be driven by Moore’s Law and, while this is still alive and well, two additional laws have been added to the mix. The first is Amdahl’s Law, which describes how the total throughput of a system is constrained by its slowest piece. When processor speeds stopped increasing, the industry transitioned to multiple processors and, in the embedded world, these are usually heterogeneous in nature. Processors, memories, buses, and peripherals now have many more connection points than they did in the past, and it has become increasingly difficult to analyze these both functionally and in terms of performance.
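Amdahl’s Law can be stated concretely: if a fraction p of the work can be sped up by a factor n, the serial remainder (1 − p) bounds the overall gain. The short sketch below (function names are illustrative, not from any standard library) shows why even generous parallel hardware hits a ceiling:

```python
def amdahl_speedup(parallel_fraction: float, n_units: float) -> float:
    """Overall speedup when `parallel_fraction` of the work scales
    across `n_units` and the remainder stays serial (Amdahl's Law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_units)

# A 10% serial portion caps speedup near 10x, no matter how many
# processors are thrown at the parallel portion.
print(round(amdahl_speedup(0.9, 1_000_000), 2))  # → 10.0
```

This is the quantitative version of "the slowest piece constrains the system": the serial fraction plays the role of the bottleneck interface or block.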
The second new law is Metcalfe’s Law, which addresses the complexity and utility of systems in which multiple independent pieces are able to communicate with each other. Systems today have many pieces of interconnected functionality that can be combined in numerous ways to create any number of user experiences. Compare this with the single-processor, single-function systems of yesteryear. In addition, SoCs of today have functionality that is not confined to leaf blocks. Voltage and frequency adjustments on power domains are system-level functions, and these have established a new set of verification challenges that cannot be addressed at the block level.
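The combinatorics behind Metcalfe’s Law are easy to see: the number of potential pairwise interactions among n communicating blocks grows as roughly n², which is exactly what makes exhaustive block-pair verification impractical. A minimal illustration (the helper name is ours, not from any tool):

```python
def potential_links(n_blocks: int) -> int:
    """Distinct communicating pairs among n independent blocks:
    n * (n - 1) / 2, i.e. quadratic growth in n."""
    return n_blocks * (n_blocks - 1) // 2

# Doubling the block count roughly quadruples the pairs to consider.
for n in (4, 8, 16, 32):
    print(n, potential_links(n))  # 4→6, 8→28, 16→120, 32→496
```

Each of those pairs is a possible temporal interaction that block-level verification, by construction, never exercises.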
All of this leads to a growing myth within verification that if the individual blocks work and the communications fabric works then, when integrated, the system will either work or can be fixed in software. This is rarely true and the number of failures in the field is testament to the fact that system-level verification has been ignored for longer than it should have been. Continued insufficient allocation of resources will lead to an increasing number of failures at this level.