National Instruments’ 2013 edition of the Automated Test Outlook aims to capture the latest trends in test technology. The company surveyed customers as well as its technology suppliers and advisory panels during 20 global events in order to understand what’s going on in test. NI grouped the results and identified five major trends for 2013: test economics, software-centric ecosystems, big analog data, test software quality, and Moore’s law meets RF.
The first trend identified has to do with what NI calls “Test Economics.” The research indicates that two-thirds of a typical test budget is spent on maintenance, and only one-third on new equipment. Because test managers must justify their budgets, they need tools that compute the total cost of ownership (TCO) of their test systems. An accurate TCO calculation should include upfront development costs, deployment costs, and ongoing operational costs, and test managers need to weigh TCO when considering the replacement of test equipment.
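The TCO comparison described above can be sketched in a few lines. All figures below are hypothetical placeholders for illustration, not numbers from NI's report:

```python
# Minimal sketch of a total-cost-of-ownership (TCO) comparison for test
# equipment. All dollar figures are hypothetical, not from NI's report.

def tco(development, deployment, annual_operation, years):
    """TCO = upfront development + deployment + operating cost over the period."""
    return development + deployment + annual_operation * years

# Option 1: keep the existing tester (no new spend, but high maintenance).
keep = tco(development=0, deployment=0, annual_operation=180_000, years=5)

# Option 2: replace it (upfront cost, but lower annual maintenance).
replace = tco(development=250_000, deployment=50_000,
              annual_operation=80_000, years=5)

print(f"Keep:    ${keep:,}")     # $900,000
print(f"Replace: ${replace:,}")  # $700,000
```

Even this toy comparison shows why maintenance-heavy budgets matter: the operational term dominates over a multi-year horizon, so a replacement with a large upfront cost can still win on TCO.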
NI cites a case study with Philips Home Healthcare Solutions, where the test division determined that improved TCO management of its test tools led to an 86 percent reduction in embedded software defect capture cost, a 347 percent increase in test application development productivity, and a 73 percent reduction in verification test manpower. “If you can show this type of reduction in software defects, then your company will buy into test,” says Luke Schreier, Senior Group Manager, Automated Test at NI.
NI also concludes that the software-centric nature of technology will change the way automated test is done. In response to this trend, the company is aiming to build a software-centric ecosystem around its LabVIEW product. NI’s Bill Driver, Senior Product Manager, Automated Test, notes that “there are other vendors and platforms that could foster this software-centric network too.”
Conducting tests more efficiently and collecting more data points are great advancements in test, but what do you do with all of the data? The next trend identified in the report is “Big Analog Data,” where companies need to leverage their IT infrastructure and analytic tools to make quicker decisions on test data. For instance, one transatlantic flight generates 640 terabytes of data. So, the questions before the test industry now are where to store this data and how to mine it. The challenge is that there is no integrated solution for connecting a distributed automated test node (DATN), such as an oscilloscope or spectrum analyzer, to the IT server infrastructure with the ability to mine the resulting data. “Test and measurement will need to work with IT as DATN vendors add the capability to link to servers and mine data,” says Driver (see Figure).
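A back-of-envelope calculation shows the scale of the storage question. The 640 TB/flight figure comes from the report; the flight count and drive capacity below are assumptions chosen only to illustrate the arithmetic:

```python
# Back-of-envelope illustration of the "Big Analog Data" storage problem.
# 640 TB/flight is cited in the report; the other figures are assumptions.

TB_PER_FLIGHT = 640
FLIGHTS_PER_DAY = 100   # assumed fleet activity, for illustration only
DRIVE_TB = 16           # assumed capacity of a single server drive

daily_tb = TB_PER_FLIGHT * FLIGHTS_PER_DAY
drives_per_day = daily_tb // DRIVE_TB

print(f"{daily_tb:,} TB per day")        # 64,000 TB per day
print(f"{drives_per_day:,} drives/day")  # 4,000 drives/day
```

At this rate, storing everything raw is untenable, which is why the report points toward IT-side analytics and mining rather than archiving alone.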
Figure: The test and IT industries need to develop best practices to link test equipment with servers in order to mine test data.
The next big trend identified in NI’s report is the need for quality standards to be applied to test software. This will require test and engineering departments to work together and to retain test data for years over the life of a product. To do this, companies will need a data storage plan that includes best practices for storing test data. This trend is being driven by new standards in the automotive and aerospace industries that require software tool qualification.
For the last trend in the report, NI identifies what it calls “Moore’s Law meets RF.” Finally, RF test equipment is seeing the benefits of technology advancements that drive up performance and drive down cost. NI has found that much more of the signal chain can be handled with digital signal processing (typically in FPGAs), allowing more of the RF function to be implemented in CMOS. These changes lead to lower power consumption and therefore less heat, which is making smaller, modular form factors possible for more RF test equipment.
More Information: NI’s Automated Test Outlook web page