>> Because people already know how to use the physical instrument. No learning curve.
But I thought all tablet software is intuitive :)
Seriously, it should be possible to make DMM & Oscope tablet interfaces pretty intuitive while still taking advantage of the tablet's strengths (otherwise, what's the point?). Does the Oscium mimic a traditional scope? Does PicoScope's PC software mimic a scope?
Red Pitaya's Kickstarter campaign reads: "Technologies yesterday available only to research labs and industry turn your iPhone, tablet or PC into an amazing instrument." The main advantage is that this network-attached device is not limited to any specific platform or operating system.
Also, RS Components / Allied Electronics just signed an exclusive global deal with Red Pitaya to distribute this revolutionary new open-source test and measurement instrument.
Why do all these "UI Experts" just copy the look of the physical instrument?
For another example, look at Redfish Instruments: why in the ******** world do they have to just imitate the look of a physical DMM, and not do a totally different UI that really takes advantage of a multi-touch screen? (BTW, I do think the iDVM lte is a neat concept, although of no use to me personally.)
Besides, what's wrong with both (remote/web/tablet and physical)? Why slavishly follow the current fad? Pinch and Zoom might be great, but so are physical buttons (there's a reason I do most of my content creation on a mechanical keyboard, not a tablet's on-screen keyboard).
And, no, I don't think portable test equipment will ever lose the buttons - at least not until we have something that works reliably in the cold, with big gloves.
I too used to like knobs--you could glance at the scope panel and see what your settings were, very handy while trying to figure out where you were and what you were seeing. But now, all scopes annotate the settings directly onto the screen, and you can also store/recall/output the data and the settings pretty easily.
So the need to be able to glance at the front panel is lessened--in fact, you get all the information in one place--the screen--instead of having to look all over the unit's panel. Plus, the trend to soft knobs and buttons means looking at the panel doesn't tell you what you wanted to see about the actual settings anyway. So the touch screen makes even more sense now.
@TonyTib, remember that although engineers develop these new technologies, they are often lagging in adopting them. Gestures will take 20-30 years to get to test equipment. By then our phones won't need gestures, swipes, or even voice commands. We will need only think about what we want them to do. Test equipment? Maybe in 50 years.