It's not clear that chattiness is always bad. What if the G map consumes fewer MB precisely because it is chatty? For example, maybe A buffers larger margins (so you can scroll), which wastes MB, while G might have more responsive infrastructure, so they can use narrower margins but have to send more requests.
There are two sides to this: one is the impact on the phone, the other is on the network. It does not seem clear that the story on the phone is fixed; couldn't the chips and OS be optimized to handle brief messages efficiently? How do the operators decide that chattiness is the wrong strategy for battery life? It likely depends a lot on the software.
Then the other side is how well the operators are organized for chattiness. Long data flows are relatively easy on their backhaul. Short messages are a nuisance because they have no steady pace, and there may also be issues in how the operators configure their wireless signaling. A decade ago the operators in North America had huge headaches with SMS because they did not anticipate the demand and had not set up their cells with the right traffic allocation. Are they doing the same again? I assume there are aspects of the 3G and 4G protocols that force transient messages (chats) to compete for signaling and scheduling with predictable long data flows, and that is what really bugs them here?
It would be interesting to have guidance on the network impact of different signaling patterns: for example, what balance of short, medium, and bulk messages the operators assume in the traffic when they set up a network, and what the cost in latency or overall capacity is if those fractions change.
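To make the point concrete, here is a back-of-envelope sketch of why operators dislike chatty apps. Every number in it is my own assumption for illustration (the per-connection signaling cost and transfer sizes are made up, not measured from any real network): if each small transfer forces a fresh radio connection setup, the signaling load per MB explodes for chatty traffic compared with bulk streams.

```python
# Toy model: signaling messages generated per MB of payload, assuming
# each data transfer triggers a fresh radio connection setup/teardown.
# SETUP_COST is an assumed, illustrative number, not a 3GPP figure.

SETUP_COST = 30  # assumed signaling messages per connection setup/teardown

def signaling_per_mb(transfer_kb):
    """Signaling messages needed to move 1 MB in chunks of transfer_kb."""
    transfers = 1024 / transfer_kb  # number of transfers to move 1 MB
    return transfers * SETUP_COST

chatty = signaling_per_mb(transfer_kb=2)    # e.g. keep-alives, map tiles
bulk = signaling_per_mb(transfer_kb=512)    # e.g. large streaming chunks

print(f"chatty app: {chatty:.0f} signaling msgs per MB")
print(f"bulk app:   {bulk:.0f} signaling msgs per MB")
```

Under these assumptions the chatty pattern generates hundreds of times more signaling per MB delivered, which is exactly the kind of asymmetry the cell planners would need to budget for.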
I see your point Junko...not sure whether I agree...the network operators might claim they don't want to build more infrastructure to handle bandwidth...but their pricing plans seem to indicate otherwise...here in Vancouver I can get 1 GB of data for $30, 2 GB for $35, 10 GB for $40, and 100 GB for $50...this does not scale! it is more like a logarithmic function...they actively encourage me to consume more bandwidth...(no luck with me as I am having difficulty filling 1 GB a month, I am a sissy, I know)...Kris
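Kris's quoted plans make the point numerically. A quick check of the price per GB (using only the figures from the comment above) shows how steeply the unit price falls, which is sub-linear, roughly logarithmic pricing rather than pricing that scales with the bandwidth delivered:

```python
# Price per GB for the Vancouver plans quoted in the comment: GB -> $CAD.
plans = {1: 30, 2: 35, 10: 40, 100: 50}

for gb, price in plans.items():
    print(f"{gb:>3} GB for ${price}: ${price / gb:.2f}/GB")
# The unit price drops from $30/GB to $0.50/GB, a 60x decrease,
# while the cap grows 100x.
```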
I always pay attention to your articles, Junko! :)
It would be interesting to get a normalized listing of these different apps, so you get a better idea of how well they are written. Obviously, a very popular app will create a greater overall load, but an elegantly written app has its own appeal.
We need obsessive-compulsive designers, who don't mind going back to work the next day and fixing what they did yesterday, which has kept them on edge all night. And managers who understand why this is important. In the rush to get things out, there's a lot of half-*ssed work going un-optimized, I'm afraid.
So in spite of Comcast's much-advertised worries, Netflix creates a little less data volume than YouTube, and is much better than most of the listed apps in signaling efficiency. Not bad, I'd say, for a service that streams HD movies.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act on data in real time, 24/7. Are the design challenges the same as with embedded systems, but with some developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.