One of the most frustrating things for reporters is this: when we ask chip vendors about the performance of their application processors for smartphones, for example, their answers tend to be specific numbers based on CPU and GPU benchmark results; or they respond, "Well, it all depends on use-case scenarios."
I'd love to have some answers based on system-level benchmarks -- a tool that measures the performance of a whole phone, based on specifics that consumers care about.
Structured in a manner that will emphasize the strengths of their offerings?
It reminds me of the comment "The nice thing about standards is there's so many of them!"
I don't have a problem with BDTI being funded by Qualcomm to do this, as long as they are open about their methodology, what they are measuring, and what conclusions might be drawn from it. Someone has to pay the costs of developing such a thing.
Once they have something they think works, the next step is to pitch it to the appropriate industry consortium as the standard way to measure what the benchmarks track. Then you make popcorn, sit back, and watch the fun.
DMcCunney, I agree. As long as BDTI makes its benchmarking methodology transparent, smart people in the industry should be able to spot any anomalies, convenient omissions, or modifications to the testing itself.
Ideally, I want something that is effectively open source: I want to be able to reproduce the test environment, run the same tests on the same gear, and get the same test results as the people posting their benchmarks. If I don't, something is off somewhere.
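To make "reproducible" concrete, here's a minimal sketch of what such a harness might report (this is a hypothetical illustration, not BDTI's methodology): publish the environment alongside the timings, plus a checksum of the workload's result so anyone re-running it can confirm they measured the same computation.

```python
# Hypothetical sketch of a reproducible benchmark run: record the
# environment alongside the result so others can try to match it.
import json
import platform
import statistics
import time

def workload():
    # Deterministic stand-in for a real benchmark kernel; the same
    # inputs should produce the same answer on any machine.
    return sum(i * i for i in range(100_000))

def run_benchmark(repeats=5):
    timings = []
    for _ in range(repeats):
        start = time.perf_counter()
        workload()
        timings.append(time.perf_counter() - start)
    return {
        "environment": {
            "python": platform.python_version(),
            "machine": platform.machine(),
        },
        # Median is less sensitive to one-off OS hiccups than the mean.
        "median_seconds": statistics.median(timings),
        # Checksum lets a third party verify the workload was identical.
        "checksum": workload(),
    }

report = run_benchmark()
print(json.dumps(report, indent=2))
```

The timings will of course differ across machines; the point is that the checksum and the published environment make a mismatch diagnosable rather than a mystery.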
Benchmarks should be renamed to Benchmarketing. (Did I just coin a new term?) I have been involved with benchmarks since the PREP benchmarking days for FPGAs in the late 1980s and early 1990s, which were gamesmanship at best. Benchmarketing - that's what it is - your mileage may vary ...
The reality is, though, no matter how skeptical and cynical you might get with benchmarketing, benchmark results are often touted and marketed aggressively by those who have money, power and vested interest.
Like the "benchmarketing" term. (I noticed BDTI is marketing its benchmarket study in this article--marketing benchmarketing.) It will be interesting to see the results of the study. I'd like to compare BDTI results to Consumer Reports'.
I like #2, too. But the problem is that BDTI is attempting to develop hardware benchmarks, and the number of strokes/touches needed to do something is a UI matter, and will be dependent on the OS, the UI it presents to the user, and the apps the user runs. It will have only an indirect relation to hardware, and a well-designed phone might handily beat the competition on that measurement while having the least powerful hardware.
Well, maybe it's high time for some bright marketing people to come up with UI benchmarketing!
That should be hilarious if someone tries, because UIs are so highly subjective. Your idea of a good UI might drive me screaming from the room. Which is "best"?
Consider Linux, where you have a variety of desktop environments - GNOME, KDE, Xfce, LXDE, Unity... each somewhat to very different from the others, and each with a following that insists it is best. (One reason many folks like Linux is because you have that range of choice in UIs.)
"The least keystrokes/touches needed to accomplish a task" may not be a meaningful benchmark. What if it only takes one or two to run the app and perform a function, but using more gets you extra options and better control? (Like when you're taking a picture with your phone's camera, and you might get a better result if you don't just accept the camera app's presets.) An app that takes one tap to bring up the photo app and snap a picture might win on simplicity, but a competitor that used more taps might take a better photo. Who wins?
I'll settle for marketing and reviews that do a decent job of explaining the design assumptions and giving a feel for how you use the device.
That said, simplicity of a user interface does win a lot of "average" consumers' hearts.
Indeed, and that's been a major factor in Apple's success. They put a lot of thought into their UI, and users of Apple gear can tap an icon or select a menu choice and largely expect it to do what they think it will do, the way they think it should do it.
Apple also enforces UI guidelines on third-party apps, so Apple's overall approach is "Have it our way." Apple preserves the quality of the user's experience by placing restrictions on what that experience is. Since Apple users can generally do what they want, or get an app that does, they don't see this as an imposition. Folks who want more control over, and ability to customize, their device won't agree, but they aren't likely to buy Apple gear in the first place.
As an average consumer, I would pay attention to such matters
Which is why the tech sites all carry lengthy reviews that delve into usage, to give users an idea of what to expect if they get the device.
even if that is not the kind of thing benchmarks are good at quantifying.
And you touched on the key point. Benchmarks quantify, and UI quality is the sort of thing that largely can't be quantified. Benchmark values will affect it, by measuring capacities of the underlying hardware to do what the user wants when they tap something. You can assume more powerful hardware will be more responsive, and may even make possible actions that can't be performed on a less powerful device. But this, too, is something I expect a review to cover.
I think the question boils down to this: since an average consumer of smartphones is not all that benchmark savvy, do these metrics really matter when selecting a smartphone?
Consumer Reports, for example, offers the service of comparing mobile phones. But I am not sure if that's regarded as highly as their reports on cars. Part of the reason is mobile handsets are cheaper (than cars) and they reflect more individual tastes.
But I believe what BDTI is trying to do is not to market its benchmark results to consumers directly, but to give the electronics industry an opportunity to look at the performance of a whole phone (rather than individual components).
Looks like the industry loves benchmarking - I didn't know the hangover from Dhrystone & Whetstone would last this long! I find this whole exercise amusing, and largely place this on the bright marketing folks looking for that 'differentiation' now that the mobile computing space is getting more competitive.
I think one point lost in this 'system-level' benchmarking is the definition of the 'system' itself! Smart phones are much more than yesterday's pure computing devices. A phone with the best hardware benchmark can end up losing when it lacks the best operating system the user interacts with.
There is a good conversation on a separate but related topic going on over at SemiWiki. It's a follow-on thread to an excellent post by Paul McLellan specifically about benchmarking processors. http://www.semiwiki.com/forum/content/2675-how-benchmark-processor.html
Which phone is best? It's like asking which shirt fits everyone... Every user has different expectations of a phone. I have a Galaxy S4, and if I dial down the network usage it can sustain battery life for up to 4 days for me.. good enough, beats the iPhone. But for other users who are looking to stay connected, the iPhone performed better (for one person I know :) ).
Benchmarks can only compare a particular aspect of a phone. OK, graphics are brilliant on a particular phone, but for how many seconds (not even minutes) of the phone's life have we pushed graphics to that level? LTE can achieve 100 Mbps, but in how many places do we get a signal to hit that speed, and even if we do, how many of us need it? Battery life is purely down to the user's settings.. a phone can vary its battery life from 1x to 7x (or more) depending on what functions we enable and how often.
I don't believe in a benchmark that can clearly tell which phone is better than another.
One of the most effective ways of benchmarking - performance and other aspects - is to give these devices to different groups of teenage schoolchildren. They are probably the most prolific users of these devices and hence can test them best. Their feedback is more authentic and less prejudiced. If teenagers like and approve of a mobile device, it will be a good indication of success too.