USB has too many layers, too much of which is hidden from the user. Too many brands do not comply with the specifications, it is often implemented very badly, and last but not least: my own experience.
Do yourself a favour and try the following. Get a USB analyzer. Get two computers and a bunch of USB devices: one computer a Windoze PC, the other Linux or Mac. Now try out the different devices and watch on screen what it all does. It is HORRIFIC. What a piece of crap most devices are. Even HID devices are not always implemented properly, what a shame. Even a hobbyist does it better. Trying things out on Linux or Mac shows you a little less traffic, but even then I would not dare to use it long-term stand-alone.
Sorry, but my first oscilloscope was (still have it, worked the last time I plugged it in) a Heathkit 104. On those analog oscilloscopes the knobs could be used for fine adjustment. Digital oscilloscopes, unless told otherwise ("vernier mode"), generally step in increments when one adjusts parameters, typically 1, 2, 5, 10, etc. Also, I normally like my zero point at some "rounded" vertical location (at least on a minor, dot, location on the vertical axis, with a preference for major, line, locations). As such, and as a long-time digital scope user (since the early '90s), I am VERY ready for a touch-screen tablet/phone type interface on a digital oscilloscope. The problem with most digital scopes is that a 4-channel scope with cursors, trigger, and horizontal controls needs a lot of knobs. Alternatively, one spends an inordinate amount of time going through button menus and submenus to use the shared controls provided. I quit counting how many times I have been frustrated by switching among the Agilent, Tektronix, Fluke, etc. interfaces and reflexively operated the WRONG controls. One BIG request to all you oscilloscope interface designers out there: please get together and make at least most of your human-machine interfaces consistent.
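The 1-2-5 coarse stepping described above can be sketched as a small helper function. This is a hypothetical illustration of the detent behavior, not any vendor's actual firmware:

```python
import math

def step_125(value, direction):
    """Step a setting through the 1-2-5 decade sequence
    (..., 0.5, 1, 2, 5, 10, 20, 50, ...) that most digital scopes
    use for coarse volts/div and time/div adjustment.
    direction: +1 for one detent up, -1 for one detent down."""
    # Split the value into a mantissa and a decade exponent.
    exp = math.floor(math.log10(value))
    mant = value / 10 ** exp
    seq = [1, 2, 5]
    # Snap to the nearest mantissa in the 1-2-5 sequence, then step.
    idx = min(range(3), key=lambda i: abs(seq[i] - mant))
    idx += direction
    exp += idx // 3  # roll over into the next decade when needed
    return seq[idx % 3] * 10 ** exp
```

For example, turning the volts/div knob up from 2 V/div lands on 5 V/div, and down from 1 V/div lands on 0.5 V/div; "vernier mode" would bypass this snapping entirely.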
I also am one of those old fuddy-duddies who likes KNOBS on my scope. Sorry. Touch-screen controls are very imprecise. The click of a knob that is always in the same location, with a minimum of menus, is my kind of scope.
My engineering gut feeling tells me that the only proper replacement for the IEEE-488 interface is a 10BASE-T or 100BASE-T interface. It is sooo easy to connect those transformer-coupled, wonderfully interfaced devices. But hey, then how usable your newly bought equipment is (and in what way) depends on the software running on it. It also depends on the application on your PC or Mac.
Lots of devices have a USB interface. In field use USB is absolutely not stable enough for long-term measurement jobs, not to speak of stand-alone applications. So for quick-and-dirty measurements it is nice to have USB at hand for screen dumps etc., but nothing else (in my opinion).
In my younger years I wrote assembly language. Over the years I grew into an analog/RF engineer. Seeing the amount of firmware in measurement devices, I find they have often become too complicated to use. As an example: I love the old LeCroy home-brew OS. Compare it to the newer ones with Windoze... horrific. But on the other hand, I realize that a Windoze PC in a box is handy for sharing spreadsheets etc. And for IEEE-488 lots of free software is around to get measurements into your PC or, preferably for me, my Mac.
Also, you already know about my "give it away" statement. The *reason* for software development written against a piece of measurement hardware (spectrum analyzer, RF generator electronics, etc.) is to *expand* and *ease* the possibilities of the equipment. So I have never understood why equipment companies ask money for this and that and all the rest, per each bit they have programmed. It is already there, finished, tested and programmed. I am happy to talk to the IEEE-488 interface, as long as I can generate that specific modulation form, or analyze I2C, so why not give these options away... ;-)
I'm one of the fuddy duddies that Agilent has apparently been dragging their feet for (or listening to). Smudged screens and imprecise back-and-forth attempts to get a touch-screen interface to guess what I want - yuck! Give me a dedicated knob with detents and I'll get there in 25% of the time with 10% of the swear words.
Still, I welcome the presence of a touch screen interface (even though one of my pet peeves is co-workers who cannot point out something in a waveform without leaving fingerprints) as long as the knobs are available. If (I guess I should say "when") I have to pay extra for a knob-interface box, I'll scrape up the $$ somehow.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act upon data in real time, 24/7. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.