"ESL" is a hot topic these days. You get somewhat different definitions of ESL depending on whom you talk to, but the common theme is around having system-level design and verification environments (or processes) enabling co-design and co-verification of hardware and software. More narrowly, an ESL solution (or tool suite) can be thought of as a combination of tools used early in the design process to model architectures at high-levels of abstraction (C, C++ and SystemC), develop software and synthesize logic with a path to implementation and RTL verification.
Raising the level of design abstraction to increase engineering productivity has always been the "story of EDA". From the early days of SPICE and transistor-level design, through gate-level schematic capture, and finally to RTL, each methodology transition has enabled a 10-100x improvement in designer productivity. Since IC design transitioned to RTL about 15 years ago, designs have grown in size and complexity by 1-2 orders of magnitude; many IP blocks today are larger than entire chips were back then. Although certain details have changed, the basic methodology has remained the same: if you had deep-frozen a capable Verilog RTL designer in 1993 and woke her up today, she could join a typical project on a Monday morning and be productive by Friday afternoon. The main reason is that design methodology developments have remained constrained within the domain of hardware design.
Development practices and methodologies, like life-forms, evolve by natural selection. Today's hardware and software development methodologies have become highly adapted to their respective environments, but the overall environment for electronics development has changed: the amount of software running on SoCs has increased dramatically, and to meet time-to-market goals designers must develop hardware and software in parallel. Enabling that has always been the goal of ESL, but the environments for hardware and software development are extremely different, with fundamentally different paradigms for specifying, implementing and verifying functionality. Despite the immense challenges of bridging these two domains, the past five years have seen genuine progress, with EDA and electronics companies working together to develop new open standards, new technologies, and new design methodologies to tackle the challenges of system-level design.
The essence of system design is taking pre-existing functional blocks and integrating them in new ways. The majority of today's SoCs contain over 90% legacy IP (internal or third-party), and "IP-based design", a term coined in the late 1990s, has become the norm. By enabling the capture of an entire "databook" for hardware and software IP in a standard format, the SPIRIT IP-XACT standard lets EDA tools and flows read this information, automatically "plug-and-play" entire SoCs together, and generate scripts to guide tool flows for implementation and/or verification. This results in designs that are "correct-by-construction" (rather than "constructed by correction"), and a drastic reduction in the manual scripting required to connect EDA tools into design and verification flows. The benefit is that engineers spend less time doing "donkey work" and more time innovating.
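To make the idea concrete, here is a minimal sketch of an IP-XACT component description for a UART with one APB slave interface. The element names follow the SPIRIT 1.4 schema, but the component and the vendor/library/version identifiers shown are invented for illustration:

    <spirit:component
        xmlns:spirit="http://www.spiritconsortium.org/XMLSchema/SPIRIT/1.4">
      <spirit:vendor>example.com</spirit:vendor>
      <spirit:library>peripherals</spirit:library>
      <spirit:name>uart</spirit:name>
      <spirit:version>1.0</spirit:version>
      <spirit:busInterfaces>
        <spirit:busInterface>
          <spirit:name>apb</spirit:name>
          <spirit:busType spirit:vendor="example.com" spirit:library="busdefs"
                          spirit:name="APB" spirit:version="1.0"/>
          <spirit:slave/>
        </spirit:busInterface>
      </spirit:busInterfaces>
      <!-- register maps, file sets and tool-flow hooks would follow -->
    </spirit:component>

Because the bus interface is declared against a named bus definition, a tool can mechanically match and connect it to any other interface of the same type; that is what makes "plug-and-play" assembly and script generation possible.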
Since about 2004, processor speeds have stopped increasing with Moore's Law, causing software-based simulation speeds to plateau. The only ways to simulate larger, more complex systems at acceptable speeds are to raise the abstraction level (i.e. "transactions" vs. "signals") or to increase "horsepower" (a major reason hardware-assisted verification is one of the fastest-growing segments of the EDA industry). OSCI TLM 2.0 defines standard ways for transaction-level models to communicate with each other, and therefore enables the creation of virtual platforms that simulate at speeds where hardware-software co-design and co-verification become practicable.

On the RTL simulation front, the plateau in software-based simulation speeds has made hardware-based emulation/acceleration the de facto standard methodology for integrating and verifying hardware and software. Moreover, as testbenches have grown in complexity along with the systems they verify, getting test signals onto the hardware platform has become a major bottleneck. Another standard, SCE-MI 2.0, leverages the same OSCI transaction-level modeling methodology and defines standard interfaces between transaction-level models running in software-based simulators and designs running on hardware-based emulators/accelerators. This lets engineers move portions of their testbenches onto the hardware platform and dramatically accelerate them. As a result, hardware-assisted verification becomes the system-integration point for combining software with new IP designed at high levels of abstraction, along with legacy IP originally developed at RTL.
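To illustrate the kind of interoperability TLM 2.0 standardizes, here is a minimal sketch of an initiator and a target communicating through the standard generic payload and blocking transport interface. The socket and payload classes are the real OSCI/Accellera ones; the module names and the 10 ns latency are invented for illustration:

    #include <systemc>
    #include <tlm>
    #include <tlm_utils/simple_initiator_socket.h>
    #include <tlm_utils/simple_target_socket.h>
    #include <iostream>

    using namespace sc_core;

    // Target: one memory-mapped register, modeled at the transaction level.
    struct Target : sc_module {
      tlm_utils::simple_target_socket<Target> socket;
      unsigned int reg;

      SC_CTOR(Target) : socket("socket"), reg(0) {
        socket.register_b_transport(this, &Target::b_transport);
      }

      // One call models one whole bus transaction; the delay argument
      // carries timing, so no clock edges need to be simulated.
      void b_transport(tlm::tlm_generic_payload& trans, sc_time& delay) {
        unsigned int* data =
            reinterpret_cast<unsigned int*>(trans.get_data_ptr());
        if (trans.is_write()) reg = *data;
        else                  *data = reg;
        delay += sc_time(10, SC_NS);  // assumed access latency
        trans.set_response_status(tlm::TLM_OK_RESPONSE);
      }
    };

    // Initiator: issues a write then a read through the standard socket.
    struct Initiator : sc_module {
      tlm_utils::simple_initiator_socket<Initiator> socket;

      SC_CTOR(Initiator) : socket("socket") { SC_THREAD(run); }

      void run() {
        tlm::tlm_generic_payload trans;
        sc_time delay = SC_ZERO_TIME;
        unsigned int data = 42;
        trans.set_address(0);
        trans.set_data_ptr(reinterpret_cast<unsigned char*>(&data));
        trans.set_data_length(sizeof(data));
        trans.set_streaming_width(sizeof(data));
        trans.set_command(tlm::TLM_WRITE_COMMAND);
        socket->b_transport(trans, delay);

        trans.set_command(tlm::TLM_READ_COMMAND);
        socket->b_transport(trans, delay);
        std::cout << "read back " << data
                  << ", accumulated delay " << delay << "\n";
      }
    };

    int sc_main(int, char*[]) {
      Initiator init("init");
      Target    targ("targ");
      init.socket.bind(targ.socket);  // any TLM 2.0 model can bind here
      sc_start();
      return 0;
    }

Because both sides speak the same generic-payload/blocking-transport protocol, the target above could just as easily be an IP vendor's processor model, a memory model, or a model produced by a high-level synthesis flow.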
Any viable design methodology requires tight links to implementation, and to meet this need a new generation of High-Level Synthesis (HLS) tools is emerging, based on SystemC. Of course, the idea of high-level synthesis is not new; many companies have tried and failed to deliver such solutions in the past. What is different now is the existence of a standard extension library for the C++ programming language, SystemC, which makes it easy to add the hardware synchronization points needed to automatically generate logic that will interface with the RTL world. A related side-benefit of the OSCI TLM 2.0 standard mentioned earlier is that it facilitates integration of SystemC models provided by IP vendors with those generated by high-level synthesis tools. This enables one to create virtual platforms for system-architecture exploration and software development, and to generate RTL, starting from the same SystemC models.
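As a sketch of what "adding hardware synchronization points to C++" means in practice, here is a small multiply-accumulate block in the clocked-thread coding style that SystemC-based HLS tools typically accept. The module itself is hypothetical, and each tool defines its own synthesizable subset:

    #include <systemc.h>

    // A multiply-accumulate block in the clocked-thread coding style.
    // Each wait() is a hardware synchronization point: it tells the HLS
    // tool where one clock cycle ends and where registers are inferred.
    SC_MODULE(Mac) {
      sc_in<bool>          clk, rst;
      sc_in<sc_uint<16> >  a, b;
      sc_out<sc_uint<32> > acc;

      void run() {
        sc_uint<32> sum = 0;
        acc.write(0);
        wait();                        // end of reset behavior
        while (true) {
          sum += a.read() * b.read();  // plain C++ datapath: mult plus add
          acc.write(sum);
          wait();                      // one accumulation per clock cycle
        }
      }

      SC_CTOR(Mac) {
        SC_CTHREAD(run, clk.pos());
        reset_signal_is(rst, true);
      }
    };

The datapath between the wait() calls is ordinary C++, which is what lets the same source be compiled for fast virtual-platform simulation and also handed to an HLS tool for implementation.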
All these recent developments will positively impact designer productivity in major ways. With engineers able to write only about 70-100 lines of code per day (on average, regardless of programming language), keeping up with Moore's Law means each of those lines must "count" for more gates on chip. Using latest-generation HLS, 100 lines of SystemC code can now be translated automatically, within minutes, into 50K gates: roughly 500 gates for every line the designer writes. The "trick" to making HLS work well is very tight integration with the downstream implementation flow. Just as logic synthesis eventually had to be tightly integrated with floorplanning and place-and-route, high-level synthesis requires tight integration with logic synthesis: a tight loop is needed to feed timing, area and power data from the physical level back up to the top. "Integrated circuits" are called "integrated" for good reason.
Finally, we must remember that the heart of systems engineering is integration and verification, and when systems become large and complex, this step becomes the most critical challenge. Today's most advanced coverage/metric-driven verification techniques automate the process of deriving a system verification plan from the system specifications, creating test cases, and measuring progress against that plan. Another major recent advance has been the extension of metric/coverage-driven verification techniques from the hardware domain to the software domain. These now enable engineers to apply concepts such as constrained-random testing with coverage measurement to verifying hardware and software together, systematically and thoroughly.
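As a concept-level sketch only (production flows use SystemVerilog, e, or the SystemC Verification library rather than hand-rolled C++), the following program randomizes bus transactions under a constraint and measures coverage as the fraction of address "bins" exercised. The 64 KB address map, the "no writes to ROM" constraint, and the 4 KB bin size are all invented for illustration:

    #include <cstdio>
    #include <random>
    #include <set>

    // Constrained-random stimulus with coverage measurement, in miniature:
    // generate random transactions, reject those violating a constraint,
    // and track which coverage bins have been hit.
    struct Txn { unsigned addr; bool write; };

    int main() {
      std::mt19937 rng(1);
      std::uniform_int_distribution<unsigned> addr_dist(0, 0xFFFF);
      std::bernoulli_distribution write_dist(0.5);
      std::set<unsigned> covered;          // bins: 4 KB address regions hit
      const unsigned total_bins = 16;      // 64 KB space / 4 KB per bin

      int txns = 0;
      while (covered.size() < total_bins) {
        Txn t{addr_dist(rng), write_dist(rng)};
        if (t.write && t.addr < 0x1000)
          continue;                        // constraint: no writes to ROM
        covered.insert(t.addr >> 12);      // record which 4 KB bin was hit
        ++txns;
      }
      std::printf("100%% address coverage after %d transactions\n", txns);
      return 0;
    }

The point of the methodology is the feedback loop: coverage data tells you which scenarios random generation has not yet reached, so constraints can be steered until the plan is met.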
Given all these recent developments, the good news is that we can finally say all the key technology building blocks required to design and verify at the system level and "hand off" to implementation exist (see Chart 1). The next step is to create a methodology enabling EDA companies, IP providers and SoC developers to integrate all those pieces into cohesive system-level design and verification flows, with tight links to implementation. To be sure, the challenge and complexity of that integration should not be underestimated, and one could argue that only companies which have all of those building blocks "under one roof" are positioned to do that integration successfully.
About the Author:
Michael (Mac) McNamara helped start Chronologic in the early 1990s, which brought VCS (compiled Verilog simulation) to the world; he later co-founded SureFire Verification (which became part of Verisity) to improve the state of verification software. After Cadence acquired Verisity, Mac was asked to lead the effort to improve high-level design, and he serves as general manager of Cadence's incubation project targeting this area.