In the world of circuit design, we often hear that analog doesn't scale. This, apparently, is hogwash.
"You can scale it [analog]. You just have to do some optimization," said Joseph Shor, a principal engineer at Intel Corp., in a presentation made at the International Solid-State Circuits Conference in San Francisco Tuesday (Feb. 28).
Shor, who presented a paper about a thermal sensor included on Intel's newest microprocessors, said his group demonstrated the ability to scale the sensor by a factor of three from the 32-nm node to the 22-nm node. "It sort of busts the rumor at Intel that analog doesn't scale," Shor said.
Shor said that since his team presented data about the processor at an internal Intel conference last year, other groups at Intel have been scaling analog.
According to Ajith Amerasekera, a Texas Instruments Fellow, the power dissipated in analog circuits does not scale as readily as that of digital circuits, for two main reasons. First, at smaller dimensions, matching of analog devices becomes more difficult. Second, the lower supply voltages of scaled technologies leave less voltage headroom and a lower signal-to-noise ratio, which takes greater design effort to compensate for.
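The matching problem Amerasekera describes is commonly quantified with Pelgrom's mismatch model, in which the threshold-voltage mismatch of a device pair grows as gate area shrinks. The sketch below illustrates the trend only; the matching coefficients and device sizes are assumptions for illustration, not data from any particular Intel or TI process.

```python
import math

def vt_mismatch_mv(a_vt_mv_um, w_um, l_um):
    """1-sigma threshold-voltage mismatch (mV) for a device pair,
    per Pelgrom's model: sigma(dVt) = A_vt / sqrt(W * L)."""
    return a_vt_mv_um / math.sqrt(w_um * l_um)

# Hypothetical numbers: if the matching coefficient A_vt improves
# only modestly while device area shrinks roughly 2x per node,
# the mismatch a designer must tolerate actually gets worse.
old_node = vt_mismatch_mv(3.0, 1.0, 0.10)   # assumed 32-nm-class device
new_node = vt_mismatch_mv(2.5, 0.7, 0.07)   # assumed 22-nm-class device
print(f"older node: {old_node:.1f} mV, newer node: {new_node:.1f} mV")
```

Under these assumed numbers the mismatch grows from roughly 9.5 mV to over 11 mV, which is why matching-critical analog blocks cannot simply ride the area shrink the way digital logic does.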
Amerasekera, who said TI has been successful in scaling down the power consumption of its analog chipsets, said analog scaling is accomplished by paying close attention to power dissipation and by using innovative design techniques that eliminate unnecessary circuits from a design.
"It's a question of how hard do you want to work to get there," Amerasekera said. "A lot of people give up and say that you cannot scale analog circuits."
Amerasekera said digital circuits faced the same power-scaling issues that analog is dealing with now, particularly around the 90-nm node. Chip makers overcame those hurdles with new design techniques for power management and advances in EDA tools.
"It's the same concept for analog. It's just more complicated because there are so many more different types of unique circuits," Amerasekera said.
Gene Frantz, a TI Principal Fellow, said a lot of power scaling issues can be overcome through asking fundamental questions about design techniques and procedures. "It's always worthwhile to question the status quo, particularly in technology," Frantz said.
This is the kind of debate where people argue about semantics. What do you mean by area scaling? Do you mean, "an analog circuit's performance scales similarly to a digital circuit on a logic process with a 0.5x reduction in active device area"? If so, it has been known for a long time that headroom collapse, device noise, variation, and increases in specs (such as VCO gain) make this impossible. Analog on advanced logic processes (which Mr. Shor's application concerns) has dwelled in a "relaxed" scaling zone where elimination of margin and over-design, along with statistical design techniques, allows approximately 0.7x to 0.8x scaling per generation. This is not sustainable, and we are already seeing the need to increase power for iso-performance.

Now, if you mean, "can an analog system's area be reduced by 50% per generation through optimization, digitally assisted design, circuit innovation, etc.?", then as we have seen the answer is yes, for now. We will see big reductions as designs move to the digital domain.

I have had the pleasure of designing analog circuits on Intel's lead technology for almost 20 years, but I do not speak for Intel in this forum.
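The gap between the two scaling rates the commenter quotes compounds quickly across nodes. A quick back-of-envelope sketch, using only the factors stated above (0.5x per generation for ideal digital-like shrink versus 0.7x to 0.8x for the "relaxed" analog zone):

```python
# Compound the per-generation area-scaling factors quoted in the comment.
# The number of generations is an arbitrary choice for illustration.
generations = 4

ideal = 0.5 ** generations          # digital-like 0.5x per generation
relaxed_best = 0.7 ** generations   # best-case "relaxed" analog scaling
relaxed_worst = 0.8 ** generations  # worst-case "relaxed" analog scaling

print(f"remaining area after {generations} generations:")
print(f"  ideal 0.5x/gen   : {ideal:.4f} of original")
print(f"  relaxed 0.7x/gen : {relaxed_best:.4f}")
print(f"  relaxed 0.8x/gen : {relaxed_worst:.4f}")
```

After four generations, ideal scaling leaves about 6% of the original area, while the relaxed analog rates leave 24% to 41% -- a 4x to 6x area penalty relative to digital, which is the commenter's point about the relaxed zone not being sustainable.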
The message in this article would be much more believable if a complex analog circuit were presented. When Intel gets to the point that it is combining power amplifiers, analog signal conditioning, and frequency synthesizers on a single chip at 22 nm, I will stand up and take notice. Until then, this reads like me adding a flip-flop to one of my circuits and declaring I'm a digital designer.
Analog does scale.
But nobody so far has mentioned that its performance starts to really suck. That's what Shor slyly refers to as "just have to do some optimization."
Painting analog with a broad brush from a temperature-sensor implementation in a microprocessor, using a "digital" process, is a joke. Intel speaking against TI? Come on, EE Times - who's the analog expert among those two, and who is the RTL jockey, in terms of credibility?
Are there any engineers in da house? All I hear is bleating.
Analog does scale, but not to the same degree as digital -- and some analog circuits scale better than others. This isn't really a new revelation, although the issues of voltage headroom and matching are no doubt worse at 32 or 22 nm than they were back at 130 or 90 nm.