Verification of complex system-on-chip (SoC) designs, especially in the networking space, is an enormous challenge.
Traditional simulation-based verification techniques are being pushed past their limits in the effort to produce functionally correct first silicon. It is not uncommon for SoC teams to write hundreds or thousands of tests and back them up with many months of pseudo-random simulation in an attempt to exercise all chip functionality and hit every corner case of the design.
Ultimately, trying to verify large SoCs by simulation alone is a losing battle. The very nature of simulation means that exercising a particular part of the design is probabilistic. Writing more tests and running more pseudo-random vectors raises the probability only slightly, while the size and complexity of SoCs grow much faster than development teams can generate tests and servers can run them. A fundamentally different approach is needed to break this impasse.
The only viable alternative to simulation is formal verification: the use of mathematical analysis to prove properties (assertions) about a design or, when proofs cannot be found, to generate diagnostic data showing exactly how the design is inconsistent with those properties. Formal verification has been around for decades in academia, has been used in a few large companies for nearly as long, and has been available in commercial EDA products for about ten years.
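To make the "prove or produce a counterexample" idea concrete, here is a toy sketch in Python. It is not any real EDA tool's algorithm, and the counter model and property are invented for illustration; it simply explores every reachable state of a tiny design and either proves a property everywhere or returns a concrete trace showing how the property is violated.

```python
from collections import deque

def check(init, step, prop):
    """Breadth-first reachability: prove `prop` on every reachable state,
    or return the shortest counterexample trace found."""
    seen = {init}
    queue = deque([(init, [init])])
    while queue:
        state, trace = queue.popleft()
        if not prop(state):
            return ("counterexample", trace)   # diagnostic data for the designer
        for nxt in step(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, trace + [nxt]))
    return ("proved", None)                    # property holds on all reachable states

# Toy model: a 2-bit counter whose buggy wrap logic uses modulo 5 instead of 4.
def step(s):
    return [(s + 1) % 5]

verdict, trace = check(0, step, lambda s: s <= 3)
print(verdict, trace)   # counterexample [0, 1, 2, 3, 4]
```

The trace is the key output: rather than a failing test with an unknown cause, the designer gets the exact sequence of states leading to the violation.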
The types of design blocks that make up networking chips are well-suited for formal verification. They tend to have complex state machines, flow-control logic, and handshake protocols to exchange packetized data with other blocks. Writing properties to describe proper operation of these functions, while not trivial, is a task that designers and verification engineers can accomplish with a little training. In addition, most formal verification tools generate automatic properties that can detect common design errors.
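A handshake property of the kind mentioned above can be sketched as follows. This is a hypothetical valid/ready rule with an invented producer model, not code from any real design: while `valid` is high and the consumer has not raised `ready`, the producer must hold `valid` and `data` stable. The sketch checks the rule exhaustively over every bounded `ready` pattern.

```python
from itertools import product

def producer_outputs(ready_seq):
    """A tiny producer that sends payloads 10, 20, ... via a valid/ready handshake."""
    outs, payload, valid, data = [], 10, True, 10
    for ready in ready_seq:
        outs.append((valid, data))
        if valid and ready:            # transfer accepted: advance to next payload
            payload += 10
            data = payload
    return outs

def stable_until_accepted(ready_seq):
    """Property: outputs may not change while the consumer stalls the transfer."""
    outs = producer_outputs(ready_seq)
    for t in range(len(outs) - 1):
        valid, data = outs[t]
        if valid and not ready_seq[t]:           # stalled cycle
            if outs[t + 1] != (valid, data):     # outputs changed: violation
                return False
    return True

# Exhaustive check over every ready pattern of length 6.
holds = all(stable_until_accepted(seq) for seq in product([0, 1], repeat=6))
print(holds)   # True
```

In a real flow this property would be written once, in an assertion language such as SVA, and a formal tool would prove it for all input patterns rather than a bounded set.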
One of the biggest complaints about formal verification is that the current algorithms cannot handle an entire chip. This is only partly accurate. It is true that throwing millions of gates and thousands of properties into a formal tool will produce only a small number of proofs. However, there is a very well-established "assume-guarantee" method that allows adjoining blocks to be formally verified independently while also verifying the connections between them.
The basis of this method is that each block is verified relative to assumptions made about its inputs, and then each block driving those inputs is verified to guarantee that its outputs match the assumptions. This technique is very effective with handshake protocols between blocks in a networking chip. Further, the same technique can be used to verify higher levels in the SoC design hierarchy based upon formal verification of the blocks at the lower levels.
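The assume-guarantee structure described above can be sketched in a few lines of Python. The block names, the credit signal, and both properties are invented for illustration: consumer block B is verified against an assumption on its input, and producer block A, which drives that input, is separately verified to guarantee the same condition.

```python
def block_a_output(cycle):
    """Producer block A: emits a credit count it promises will stay <= 4."""
    return cycle % 5

def block_b_ok(credits):
    """Consumer block B: indexes a 5-entry table, safe only if credits <= 4."""
    table = [0, 1, 2, 3, 4]
    return table[credits] == credits

ASSUMPTION = lambda credits: 0 <= credits <= 4

# 1) B's proof obligation: B is correct for every input the assumption allows.
b_verified = all(block_b_ok(c) for c in range(10) if ASSUMPTION(c))

# 2) A's proof obligation: A's output always satisfies the assumption.
a_guarantees = all(ASSUMPTION(block_a_output(t)) for t in range(100))

# Both local checks passing verifies the connected pair A -> B
# without ever exploring the product of their state spaces.
print(b_verified and a_guarantees)   # True
```

The payoff is decomposition: each block is checked against a small local contract, and the contracts compose into a verification result for the connected whole.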
The result is hierarchical formal verification of major portions of the chip, or even all logic portions of the entire SoC. Chip-level formal verification is usually applied as a supplement to simulation, since expressing all chip functionality in the form of properties or assertions is hard.
Chip-level simulation tests are effective at verifying end-to-end behavior and interaction with software. Formal verification is effective at verifying internal behavior and the interconnections between the blocks in the design.
Although formal verification is still an emerging methodology, virtually all major developers of networking chips use it as part of their toolkits. As SoC size and complexity continue to grow, and simulation-only approaches hit the breaking point, more verification teams will adopt formal techniques.
Fortunately, with a decade of commercial development behind them, EDA vendors are ready to meet the challenge of making formal verification an essential component of SoC verification.
Prakash Narain is president and CEO of formal verification provider Real Intent, Inc.