On my desktop, I don't care about power consumption. It's plugged into an outlet, and always on. There's no battery to drain.
On my laptop and notebook, I start to care, because they can be used without being plugged in. I don't care much, because I don't normally run them solely off of battery. If I travel, they get set up and plugged in once I'm at my destination. I don't normally use them when I'm actually in transit.
On my cell phone and PDA, I care a lot, because they are normally running off of battery, and I've made it a reflex to plug them in to a charger whenever I'm not actually out and about so I don't find myself suddenly running dry.
Yes, more powerful processors make possible more sophisticated applications, at the trade-off of increased power usage.
Power concerns are becoming relevant in the server market. ARM has a shot at the server market because data centers keep growing, packing ever more servers into racks, and power requirements are continually escalating. ARM's planned 64-bit processors stand to score significant design wins, because they will use far less power, and power costs are a significant fraction of a data center's operating costs.
The CPU doesn't have to be the most powerful available - it just has to be powerful enough, especially as applications move to parallel processing, and multiple CPUs will be engaged on any particular task.
In the mobile market, the big challenges in processing power I see are in the GPU. The "killer apps" tend to be those that demand video performance, and we are seeing devices with screen resolution and 3D acceleration that used to be the province of the desktop and laptop. Intel is far behind in GPU performance. (I have Intel graphics on board in my machine. It's adequate for what I do, because I don't do things like serious gaming. If I did, I'd be looking at shifting from mobo graphics to a dedicated video card, or getting a whole new machine.)
The desktop market is shrinking, as tasks formerly performed on the desktop migrate to laptops, notebooks and tablets. The mobile market and the server market are booming, as things increasingly move to the cloud.
Power consumption is the new critical factor, and Intel is playing catchup.
The issue is that no real workloads will ever benefit, not from these optimizations and not from ICC.
Android uses GCC as the default compiler, so ICC is irrelevant, even if it happened to be better than GCC. By secretly having AnTuTu replace GCC with ICC in this closed-source benchmark and adding specific optimizations which only speed up this particular code, Intel is cheating by manipulating the benchmark scores.
If changing compilers, changing settings and adding specific benchmark-busting optimizations is OK, what if someone wrote a hand-compiled version of the benchmark - would that be legitimate too? After all, the best compiler is still a human.
You're quite right that software and compilers are important. Intel could, like ARM, invest more in GCC and speed up real Android workloads rather than showing off cheated ICC results while pretending they are in any way relevant for Android performance.
"What's wrong with Intel getting ahead using better compiler technology?"
Nothing, if we're talking about making real applications run faster.
But that's not what we're talking about here.
What we're talking about here is the compiler removing portions of the benchmark, contrary to the intent of the benchmark. As a result, the benchmark results become meaningless.
As Reinhold P. Weicker, co-author of Dhrystone, wrote in 1988:
...optimizing compilers should be prevented from removing significant statements. It has turned out in the past that optimizing compilers suppressed code generation for too many statements (by "dead code removal" or "dead variable elimination"). This has lead to the danger that benchmarking results obtained by a naive application of Dhrystone - without inspection of the code that was generated - could become meaningless. [http://bit.ly/1atTdWZ]
What's wrong with Intel getting ahead using better compiler technology? I understand if the gains are only on a single benchmark, but if a broad range of real workloads benefit then it's definitely legitimate. Good compilers are an essential part of any microprocessor platform. If ICC only supports x86/64 then it's one of Intel's strategic assets, just like they don't share their superior fabs with ARM. Software is really important. People need to better appreciate this. ARM needs to invest more in compiler technology.
Remember that when exploring the microarchitecture design space, simulations are done using benchmarks as input. Benchmark tuning also happens at the hardware level.
I'm not convinced x86 is a losing battle. It depends on the direction device platforms go. I rather think Intel is attempting to apply what it learned in the PC world. An increase in performance allows ever more complex and capable software, and once those apps are available nobody wants to go back. So just as with PC evolution, an increase in performance that enables the next "killer" app will raise the performance bar for all players.
I see no reason to believe that the evolution of mobile devices will not closely follow the evolution of the PC. So while ARM is playing to their strong point and pressuring Intel to lower power usage, Intel will be pressuring ARM to increase performance. In a stagnant software world, ARM would surely win. But it is not a stagnant software world, so it is still unclear whether or not x86 will compete in the mobile market.
Great article! I have always been surprised at the gullibility of many people when it comes to such sensational "benchmark" results. People should take stock and critically analyze the hypotheses, experimental set-up etc. before coming to a conclusion. It does not take a genius to realise that the recent Intel claims are just marketing nonsense with no solid scientific foundation. It also does not take much to realise that Intel is fighting a losing battle as long as it sticks with x86. Superior fabrication technology will only take you so far (incremental linear gains at best). The much higher gains are at the architectural hardware and software levels. Indeed, anyone who has optimized software will tell you that a bit more care in how we code can often get you 10x performance gains. Try to get that gain at the transistor level. So even if Intel is ahead with FinFET etc., it will not be able to compensate for its inferior power management technology and inadequate software stack. It might reduce the gap a bit every now and then, but it cannot keep doing that forever. One thing that has really changed since the '80s is that the competition is strong and diverse. Consumers have a choice, and they will vote with their pockets.
What makes you so sure Intel will be better than ARM on cost, performance or power? Given the large complexity, overhead and cost of x86 that is by no means certain even with a manufacturing advantage. We all know how bad Atom really is despite the marketing claims - even OEMs aren't fooled: Atom has just 0.2% mobile market share. Who knows, in 10 years Atom may well be remembered as yet another Itanium, iAPX 432 or i860...
The main point is that Intel is closing the ARM gap quickly. Whether they are already better or just will be better in the next generation is not really important. They decided to be number one in the performance/power game for mobile processors and if history is any guide they will be. Whether in 6 months or 18 months matters little.
Agree. Benchmarks are a pure marketing tool. And I agree 100 percent that Intel will not get a single design win based on the benchmark. It did what it's basically designed to do: give Intel a PR boost.