Automated measurements have evolved from computer plug-in cards to external peripherals. Now the computer is being embedded into the measurement system, or is data-acquisition being added to embedded systems?
An LSI-11 chassis.
Computers have hosted data-acquisition systems for over 40 years. In the 1970s and 1980s, DEC's PDP series was the computer of choice, with the larger models later supplanted by the compact LSI-11. Analog I/O boards were designed to mate with the LSI-11 bus (later known as the Q-bus). Also in the 1980s, Intel's SBC-80 single-board computer came along and hosted peripheral boards of the same form factor. Many of the applications were industrial or scientific.
The choice of computer changed abruptly in the 1980s with the arrival of the IBM PC and its many clones. It offered the right price, a standard operating system, and broad applicability: not only industrial (in more ruggedized versions) and scientific use, but also IT and commercial applications, whose volume drove the price down and further widened adoption in many countries.
The PC went through many iterations over the next 25-30 years, improving in every respect. Intel and Microsoft benefited greatly from their initial selection by IBM, and further from improvements in speed, processing power, and utility across many types of commercial, scientific, and industrial applications. Other companies benefited as well, leading to greater usage in almost all systems.
The use of plug-in boards for data-acquisition applications changed in 1999-2000, when USB allowed easy connection of external modules. Measurement systems could then be implemented more easily, at lower cost, and with portability, because power was available through the USB connection. Performance improved further as USB 2.0 raised connection speeds significantly (480 Mb/s versus 12 Mb/s for full-speed USB 1.1) while maintaining compatibility with previous peripherals. During this time frame, Ethernet also became an alternative connection option in larger systems.
The cost of the PC and Windows has become a bigger factor as applications strive for the lowest-cost approach. This has given rise to different schemes for getting the highest performance at the lowest cost.
For data acquisition, the PC/Windows era is in its later innings. USB and Ethernet remain dominant in most data-acquisition applications, but the arrival of ARM processors and embedded solutions is giving users more options.
The architecture of data-acquisition systems has expanded from purely multiplexed inputs to an ADC per input. The lower cost and higher performance of ADC and amplifier chips, plus their wide availability, now let engineers sample many channels simultaneously without interference between them; multiplexed configurations have always been prone to such crosstalk, especially as throughput speeds have increased. The questions now forming concern on-board computation with an ARM processor: Is it fast enough? How many channels can one run, and at what throughput rate?
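The throughput trade-off between the two architectures can be illustrated with some simple arithmetic. The figures below are hypothetical, chosen only for illustration: a single multiplexed ADC with a 1 MS/s aggregate rate shared across a scan list, versus a dedicated 100 kS/s ADC on every channel.

```python
# Hypothetical example: compare per-channel sample rates for a multiplexed
# single-ADC system versus an ADC-per-channel (simultaneous) system.

def per_channel_rate_multiplexed(aggregate_sps: float, channels: int) -> float:
    """One ADC scanned across all channels: each channel gets only a share
    of the converter's aggregate rate."""
    return aggregate_sps / channels

def per_channel_rate_dedicated(adc_sps: float) -> float:
    """One ADC per channel: every channel samples at the converter's full
    rate, with no scan-related skew between channels."""
    return adc_sps

channels = 16
mux = per_channel_rate_multiplexed(1_000_000, channels)
simultaneous = per_channel_rate_dedicated(100_000)

print(f"Multiplexed, {channels} channels: {mux:,.0f} S/s per channel")
print(f"Dedicated ADCs: {simultaneous:,.0f} S/s per channel")
```

Beyond the raw numbers, the dedicated-ADC approach also removes the inter-channel timing skew inherent in scanning one converter across inputs, which is part of why crosstalk and settling become problems in fast multiplexed systems.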
USB data-acquisition module.
Today's off-the-shelf data-acquisition modules give choices that could only amaze past users of such products. Inexpensive, complete single-board computers, such as the BeagleBone Black and Raspberry Pi, allow direct connection through USB or Ethernet. Linux is the operating system of choice for these inexpensive approaches.
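On such a Linux board, the host-side work often amounts to decoding sample frames arriving over the USB or Ethernet link. The sketch below assumes a purely hypothetical wire format (a 2-byte little-endian channel count followed by one unsigned 16-bit sample per channel, over a 0-5 V input range); real modules define their own framing, documented by the vendor.

```python
import struct

def parse_frame(frame: bytes) -> list[float]:
    """Decode one hypothetical DAQ frame: a little-endian uint16 channel
    count, then one uint16 raw count per channel, scaled to volts over
    an assumed 0-5 V input range."""
    (n,) = struct.unpack_from("<H", frame, 0)
    counts = struct.unpack_from(f"<{n}H", frame, 2)
    return [c * 5.0 / 65535 for c in counts]

# Simulate one 4-channel frame as it might arrive over a USB or TCP stream.
raw = struct.pack("<H4H", 4, 0, 16384, 32768, 65535)
print(parse_frame(raw))  # approximately [0.0, 1.25, 2.5, 5.0]
```

In a real system the `frame` bytes would come from a serial device node or a TCP socket; the decoding step itself is cheap enough that even a small ARM board can keep up with many channels.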
The overall summary from a historical perspective is all good: Performance is many times better, costs are many times lower, size is much smaller, application software is readily available, and signal-measurement integrity is unequaled. Measurement capability can now be used widely, in many different applications, by relatively non-technical users. This high-integrity data flow has expanded far beyond the early adherents' dreams.