Behavioral modeling has caught on quite fast in the analog verification community. An RTL-like description of analog, RF and mixed-signal blocks has opened up possibilities for more thorough top-level verification of these cores. Now, the power and finesse of digital verification is being brought into the analog domain with the aid of improved modeling and test bench methodologies that are influenced by the "digital way" of doing verification. This influence has been both a help and a hindrance to analog verification.
First things first: digital verification is much more advanced and time-tested than the more recent idea of analog verification. The digital design methodology of RTL, equivalence checking and formal methods complements the verification process beautifully, and everything fits together. Most often, digital design engineers double as verification engineers and run extensive test cases on their own designs. I strongly believe that this strong bond between digital design and verification makes the process run smoothly.

Analog verification is a different beast. The designers and verification engineers don't play well together, mostly because of incompatible methodologies. While the analog verification camp is beginning to believe that digital verification techniques (like monitors, assertions and test benches) have to be adopted, they are involuntarily alienating analog designers, who are not concerned with these aspects of verification. Analog design engineers are super-smart, keen and intuitive. However, most of the time they are engrossed in the performance of their own designs. They don't worry about the integration issues their block or module might face with a digital core or with other analog cores; they offload that worry to module/integration leads, which I believe is not necessarily the right thing to do. While a verification engineer worries about capturing connectivity and functionality issues across the whole chip, design engineers worry about the performance of their blocks. On the other hand, analog verification engineers these days don't have an appreciation for analog design. There is a chasm between what they think analog design is and what it really is. Because of this lack of understanding and appreciation, there are communication issues between the modeling/verification teams and the design teams.
Modeling and verification can only be successful when the design and verification engineers are part of the same team. There should be chemistry between the two, who are, after all, working toward the same goal: a successful product. In my personal experience, remote interaction between a modeling/verification engineer and a design engineer does not work that well, because continuous interaction is required to make the process flow smoothly. Typically, verification teams raise a number of concerns like:
• We cannot start modeling early. The schematics and symbols are not stable.
• There is too much model churn because of design changes, which keeps us occupied.
• Design engineers do not respond to us in time.
• There are no specifications to base the models on.
• There is no appreciation for verification efforts.
Similarly, design teams have several concerns of their own:
• It is too much work to support the modeling.
• We have to spend time to explain the functionality and later spend time to validate the models.
• We already run verification on our blocks; we don't need models for that. For us, verification is more than connectivity. It is performance as well.