MONTEREY, Calif. In American folklore, John Henry represents man's struggle against obsolescence. Legend has it that John Henry and his sledgehammer beat a steam-powered drill in a tunnel-digging contest, but his heart burst in the effort. An evening panel at FPGA 2000 entitled "The John Henry Syndrome" recast that legend in the FPGA world, asking whether software tools can ever outpace human intervention. The answer was a resounding "No."
The joking barbs and good-natured heckling that characterized the panel started as early as the introduction by moderator Herman Schmit of Carnegie Mellon University. While all involved agreed that the tools have their shortcomings, debate arose over whether tools are good enough in a majority of cases, and whether the nature of required human intervention is changing.
The chief grievance against the software was summed up by Ray Andraka, president of Andraka Consulting Group, who noted that tools are best at synthesizing "fat, dumb, slow" logic. He and former consultant Stephan Wasson, now director of reconfigurable logic for MorphICs, offered examples in which hand-crafted designs grossly outperformed synthesized ones.
One extreme example from Andraka was a FIR filter in which the synthesis tool scattered gates into two opposite corners of a Xilinx Virtex FPGA. Andraka's hand-drawn version of the design, which naturally kept all the circuits in one area, ran more than 30 percent faster.
As one of the designated defenders of software, Altera Corp. vice president Tim Southgate noted that tool-synthesized designs are fast enough for most projects. Designers are growing less interested in tweaking and more obsessed with time-to-market, Southgate said.
"We see customers now that just press the button and get their design, and they're happy," he said. "Sure, you can beat us in speed, but most of the time, you don't need to."
Wasson actually agreed, noting that "people are willing to pay [the price of] more fat in the design in exchange for time-to-market."
Even while agreeing that "John Henry" work will live on in FPGAs, audience and panel members struggled to predict the future of that work. One attendee pointed out that the type of tweaking required is becoming higher-level all the time, pertaining more to algorithms than to actual gates. But Andraka argued that some knowledge of gate-level design will forever be required. "To do a good algorithmic design, you need to understand what the pieces of that algorithm are," he said.
Some also contended that hand-tweaked design is possible only because designers can visualize a sensible layout, and concern arose that the increasing complexity of designs and algorithms could eclipse that kind of talent. "In a few generations, I don't think you can comprehend what is a good layout," said Satnam Singh, senior staff engineer for Xilinx Inc. At that point, the reliance on tools is likely to be much greater, he said.
Users are partly to blame if tools are lackluster, said professor Jason Cong of UCLA, noting that ASIC synthesis tools sell for six figures while FPGA software is practically (if not literally) given away. Some questioned whether that's the fault of overzealous marketing, but Southgate pinned the problem on perception.
"Why can't we get money for tools? It's because we're viewed as selling chips," he said. Customers who already spend big money on silicon get testy when asked to shell out more for software, he said.
But Wasson stressed that tools won't be improved unless vendors are willing to take input from experts.
"I've gone to the vendors and tried to share the knowledge of what I know works," he said. Wasson and Andraka agreed that their suggestions get labeled as "the 5 percent designs from hell," and are rarely acted upon by vendors. One notable exception, for which Andraka commended Xilinx, was that company's decision to add a floor planner to its software for the Virtex FPGA, a part that initially was marketed as having enough routing to make manual placement unnecessary.