In the second part of this panel discussion on SoC verification, Gary Smith raises red flags with industry experts about how complexity now spans both hardware and software, and how continuity between architecture and implementation is breaking down.
In the first part of this panel discussion on verification, the experts talked about the state of verification technology and the progress that has been made. It concluded by bringing up the subject of software verification. Taking part in the discussion are: Gary Smith, chief analyst with Gary Smith EDA; Paul Martin, senior manager for debug, trace and performance modeling at ARM; Rajeev Ranjan, CTO of Jasper Design Automation; Harry Foster, chief verification scientist at Mentor Graphics; and Viresh Paruthi, senior technical staff member at IBM.
Brian Bailey: Is it our responsibility, as an industry, to take on software verification?
Gary Smith: Yes.
Viresh Paruthi: In bits and pieces. We don't want to conquer the whole space, but, for example, aspects of the security path need to be trusted. That path needs to be verified stand-alone, and then integrated into the overall system. So you check the entire path with hardware-based methods and then you do the end-to-end verification.
Rajeev Ranjan: That kind of verification has been happening for a long time. Some aspects of software can be verified through hardware verification methods. We do that today for low-power emulation, and we need to extend that. If you do emulation, the coverage space is larger. Brute-force techniques are only going to go so far. We have to think of new ways to tackle the coverage space.
Harry Foster: A lot of the problems are due to a lack of good concurrent engineering practices between the hardware and software teams. In fact, if done properly, introducing a level of abstraction between them minimizes the amount of verification. We saw one company that was forced to respin the hardware because the software couldn't do what the hardware assumed -- the hardware made timing assumptions, such as the software resetting the circuit in zero time.
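To make the kind of mismatch Foster describes concrete, here is a minimal testbench sketch, assuming a hypothetical cocotb environment with invented signal names (dut.clk, dut.rst_n, dut.ready) and an assumed 16-cycle reset window. It fails if the device reports ready inside its documented reset window, which is the zero-time-reset assumption in the respin story:

```python
# Hypothetical cocotb sketch: catch a zero-time-reset assumption in a testbench.
# Signal names and RESET_CYCLES are invented for illustration.
import cocotb
from cocotb.clock import Clock
from cocotb.triggers import RisingEdge

RESET_CYCLES = 16  # assumed minimum reset duration from the hardware spec

@cocotb.test()
async def reset_is_not_instantaneous(dut):
    """Fail if the device reports ready before the documented reset window ends."""
    cocotb.start_soon(Clock(dut.clk, 10, units="ns").start())

    dut.rst_n.value = 0
    for cycle in range(RESET_CYCLES):
        await RisingEdge(dut.clk)
        # If software polled 'ready' here and proceeded, it would be relying
        # on a zero-time reset that the hardware never promised.
        assert dut.ready.value == 0, (
            f"ready asserted at cycle {cycle}, inside the reset window"
        )
    dut.rst_n.value = 1
    await RisingEdge(dut.clk)
```

A check like this encodes the hardware's timing assumption explicitly, so the hardware and software teams cannot interpret it differently without a test failing.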
Paul Martin: The problem you have is that this software piece is vast. We're not really trying to verify application software -- that is the validation space -- just the pieces that are key. If your OS is in there, you're abstracting a lot of that application software away from the hardware. The middleware is a different issue. It's highly connected to the hardware, and you end up using techniques such as virtual prototyping to pre-validate that software before you take it to the hardware.
Harry Foster: Even in the application space, power is becoming more of an issue.
Paul Martin: When you are talking about power, are you just talking about switching a device in different power modes or is it actually managing energy?
Gary Smith: Apple is requiring that the apps guys pass a power test.
Paul Martin: From our perspective, performance verification is starting to become a key issue. If you have coherent systems, one of the big issues is checking that the system delivers the correct performance. You could have an architect do some modeling that shows the design delivers the correct performance. If they take it through to an implementation and it fails because the verification engineer and the software engineer have different interpretations of the specification, you're not going to see that until you actually run the tests. Performance verification is becoming a key component of verification.
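Martin's point can be illustrated with a small sketch: a check that compares latencies observed during a verification run against the architect's performance model, so diverging interpretations of the specification fail a test instead of surfacing in silicon. The transaction classes, budget numbers, and trace values below are invented for illustration:

```python
# Sketch of a performance-verification check against an architect's model.
# Budgets are in clock cycles per transaction class on a coherent interconnect.
import statistics

LATENCY_BUDGET = {  # assumed numbers from a hypothetical performance model
    "read_shared":  {"max": 40, "avg": 18},
    "write_unique": {"max": 60, "avg": 25},
}

def check_performance(observed: dict[str, list[int]]) -> list[str]:
    """Return a list of budget violations; empty means the run met the model."""
    violations = []
    for txn_class, latencies in observed.items():
        budget = LATENCY_BUDGET[txn_class]
        worst = max(latencies)
        mean = statistics.mean(latencies)
        if worst > budget["max"]:
            violations.append(f"{txn_class}: worst {worst} > max {budget['max']}")
        if mean > budget["avg"]:
            violations.append(f"{txn_class}: avg {mean:.1f} > {budget['avg']}")
    return violations

# Example: latencies extracted from a simulation trace (fabricated values).
run = {"read_shared": [12, 17, 44, 20], "write_unique": [22, 30, 58]}
for v in check_performance(run):
    print("PERF VIOLATION:", v)
```

The design choice is that the model's numbers live in one shared artifact that both the verification and software teams run against, rather than in two private readings of the specification.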
Gary Smith: One of the big problems is that we don't have enough power information that can reach up to the architect. The architect does the architectural design and hardware/software partitioning, and hands it down. The hardware team is part of the silicon virtual prototyping team and these guys are re-architecting the design. They are putting in accelerators, GPUs, and MPUs. The result? The architect has lost control over this whole process and we don't have any way to verify what's going on.
Paul Martin: There's a distinction here between system architects and the SoC architects. You're talking about SoC architects.
Gary Smith: Yes, I'm talking about SoC architects. They change it so much that it's no longer what the system architect intended. There's no way to verify that they are really doing the same design. That's a crisis in the market right now.
Paul Martin: That's about partitioning functions into different implementations, whether in software or in hardware. Is that a verification problem? It's more of a design problem. It's about meeting system requirements. Designers need a way to verify that the new partitioning actually meets the system requirements.
Gary Smith: The apps guy doesn't know what he's programming anymore. He gets a view from the architect that it's a one-CPU design and here's the memory. You figure it out -- don't worry about the rest.
Paul Martin: The software you are talking about sits in the middle. Functions, such as hypervisors, sit somewhere between the firmware and the OS. We are developing techniques that allow software engineers to get a view into the hardware. Software engineers are starting to get visibility into how their tasks make use of hardware resources. They have to be aware of the energy requirements of tasks and how to make efficient use of the available hardware resources.
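As a concrete instance of the visibility Martin describes, here is a small sketch that measures the energy a software task consumes, assuming a Linux system that exposes Intel RAPL counters through the powercap sysfs interface; the path and the workload are illustrative only:

```python
# Sketch: measure the package energy a task consumes via Linux powercap/RAPL.
# Intel-specific; the sysfs path below may differ or be absent on other systems.
RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package energy counter

def read_energy_uj() -> int:
    with open(RAPL) as f:
        return int(f.read())

def measure(task) -> float:
    """Run 'task' and return the package energy it consumed, in joules."""
    before = read_energy_uj()
    task()
    after = read_energy_uj()
    # The counter wraps periodically; a production tool would handle wraparound.
    return (after - before) / 1e6

if __name__ == "__main__":
    joules = measure(lambda: sum(i * i for i in range(10_000_000)))
    print(f"task consumed ~{joules:.2f} J")
```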