In the 30 years and 1,225 issues since EE Times first appeared, perhaps no human pursuit has changed as profoundly and fundamentally as the practice of electronics engineering. It's not just that the raw materials have been transformed since the early days of the IC. Even the way designers think about systems and the methodologies they use to convert requirements into implementations have changed in fundamental ways. In the end, the process of system-level IC design today bears as little relationship to the component-based design of the early 1970s as digital photography does to line drawing.
That statement is no surprise to anyone, but it needs to be said, since it is the motivating force behind one of the most fundamental changes EE Times has ever undertaken: the introduction of a new way of writing about the design process. To launch our new approach, and the Silicon Engineering section that will embody it, a detour tracing the evolution of design practice through the lifetime of the publication can supply some perspective.
Little black boxes
In 1972, after the advent of the IC, after the first tentative steps into medium-scale integration (MSI) but before the unveiling of the first commercial microprocessor, design meant component-based design. Engineers selected passives, transistors and ICs based on their data sheet characteristics and fit them together into systems. The process of design was the process of decomposing the design requirements into a network of commercially available black boxes (vendors rarely gave much more than schematic information about what actually went on within ICs back then) linked through well-defined bit-level interfaces.
Tools included paper and pencil for block diagrams and schematics, vector boards and prototype pc boards for building the designs, and signal generators and oscilloscopes for trying to figure out what they were doing. Even logic analyzers would come along later. Analog designers were beginning to evolve simulation tools, but they stood virtually alone in that regard.
The information designers got concerned the functions the little black boxes performed and the interfaces into and out of them. For passives and transistors, that information came in standard models. For increasingly complex MSI ICs, it came in text descriptions, occasional truth tables, or state diagrams and lists of timing data.
To write about design meant to write about this process of piecing parts into systems. In effect, it meant to write about the parts. EE Times did this for the most part by covering new products.
The microprocessor arrives
With the growing complexity of MSI and then large-scale integration, writing about design by writing about components became more challenging. It was no longer enough to just tell what a vendor said about the part in a data sheet or press release. It was necessary to report more: novel applications, disparities or omissions in the data sheet specs, user experience with the devices and industry debates about interface standards. Writing about design took on an investigative as well as a stenographic aspect.
With the advent of the microprocessor, that trend accelerated. Now the appearance of software broke the solid link between design requirements and hardware design. Increasingly, the methodology was to translate the design requirements into software and do a component-level design to make sure an MPU executed the code with sufficient speed. To the hardware engineer, the characteristics of microprocessors and memories loomed far larger than the behavior of the MSI and LSI components that surrounded them.
Naturally, this change was reflected in the editorial direction of EE Times. Microprocessors were far too complex for their behavior in a given system to be predictable from data sheets. We dove, along with our readers, into the depths of instruction-set architectures, memory systems, interrupt handling and bus timing. We reported on a number of conflicts, including the early architecture wars, the RISC vs. CISC dogfight and too many bus battles to even count. Drawing on the views of architects and analysts, the experience of design teams and the claims of visionaries became important adjuncts to coverage of specifications and interfaces. But still, to a great extent, we wrote about design by writing about components.
With the first flickerings of ASIC design began a fundamental change in the design process. In the beginning, ASICs were designed from a schematic: It was component-based design with a mysterious back-end process that happened out of the view of the design team. But then logic-optimization software appeared and morphed into logic synthesis. And a new kind of design methodology was born.
In this approach, the design requirements were implemented in a language that described digital circuits at the register-transfer level (RTL). Back-end design (the part that actually involved gates, transistors and interconnect) became a process of creating a hardware implementation of this RTL abstraction. This was not unlike microprocessor-based design in that a formal language captured the structure of an implementation, leaving the details up to serious hardware types.
The ability to design at the RT level rested on the ability of design-automation tools to manage abstraction and complexity. And hence, writing about tools became a fundamental part of writing about design. Just as the skilled analog or LSI designer had to understand components far beyond what appeared on the data sheet, the successful ASIC designer had to understand tools at a level far beyond what the EDA vendors told them.
Threshold of change
Today, spiraling complexity and the inscrutable problems of extreme-submicron processes have placed us at the threshold of yet another shift. Understanding design requirements, libraries and tools is no longer sufficient to produce a system-level IC.
It became clear perhaps two years ago, for example, that verification of system-on-chip (SoC) devices was so complex that addressing it would take fundamental changes in the design flow, supported by fundamental changes in chip, and sometimes system, architecture.
Now more issues are forcing themselves on the embattled design team. Testability, long an afterthought, is proving intractable unless addressed from the very front end of the design process. As failure modes shift from defect-driven to feature-driven in aggressive processes, designing for yield has become another front-end issue with implications throughout the flow. Even failure analysis, once totally uncoupled from design, is demanding an intimate link into the design process. These issues are closely tied to decisions made by the process engineers, the mask makers, even the chip equipment manufacturers.
Hence, we are once again evolving the coverage of EE Times. In the new Silicon Engineering section, we will attempt to cover the process of design from the point of view of the leading-edge SoC design team. That will mean talking to design teams about design practice, about the methodologies in which the tools are lodged, about the victories and pitfalls that are the sinew of day-to-day design. And it will mean reporting on the new issues, from verification and testability to mask making, process engineering and failure analysis, that advanced design teams can no longer ignore.
At the heart of it all lies one goal: to write about what leading-edge engineers are actually doing in their work. The undertaking will require, above all, feedback from readers who are involved in SoC design, to keep us on course and to help us spot the vital issues that always seem to lie just below the surface. With help and perseverance, we will succeed in being a valuable partner in this brave new world of engineering in silicon.