SAN JOSE, Calif. – A DesignCon panel probed the leaders of the major test and measurement companies on a broad range of issues. All agreed test problems are becoming increasingly thorny, and at least one member of the audience gave panelists flak for not doing enough to solve customers’ toughest challenges.
All sides agreed test engineering increasingly involves multi-disciplinary skills.
“There is some shortening of the technical waves as customers now look across multiple disciplines of software, silicon and physical-layer channels—and that pace is quickening,” said Greg Peters, general manager of the component test division at Agilent Technologies.
“Engineers used to focus on just analog, digital or RF—but now it’s all three at once,” said Kevin Ilcisin, chief technology officer of Tektronix. “When engineers were going through school they probably only focused on one domain, but now they are dealing with all three in their designs,” he said.
“Engineers tend to forget the debug part of the cycle—we want to believe it’s not going to be there—but you have to plan for it because it’s probably one of the top reasons things fail to get to market on time,” he added.
For customers verifying wireless SoCs, “the interactions between all those embedded radios and protocols [are] very complex, involving multidisciplinary layers in the stack,” said Eric Starkloff, vice president of systems platforms at National Instruments.
In terms of technology, Agilent is the sole large test company with its own fab, where it uses unique III-V processes for its analog chips. But the company also uses straight CMOS ADCs, said Peters.
Both LeCroy and Tek make their chips in IBM’s silicon germanium process technology. The technology provides key “bandwidth, noise isolation and sample-rate capabilities,” said Ilcisin of Tek, which has been working with IBM for 15 years.
From the audience, Ransom Stephens, a former test engineer turned speaker and consultant, chided the test vendors for not collaborating closely enough to solve thorny jitter and crosstalk problems.
“Our first cut at a crosstalk analysis tool is in the right direction,” said David Graef, chief technology officer of LeCroy.
Measuring jitter is “a tough problem” said Peters. “The sources are driven by the physical structure of the channel, [but] I think you can drive [the process of tracking it down and eliminating it] back into the design process,” he said.
Peters said data rates will clearly go “substantially higher” but predicted copper still has a long life ahead. “Every time someone predicts the death of FR4, someone else finds a way around the problems,” he said.