Atom is Intel's current foray into low-power computing. Before that, they spent 30 years perfecting the standard platform, which used power like there's no tomorrow: the goal was speed, and the only limit was the roughly 150 W TDP region beyond which air-cooled heatsinks are no longer sufficient. So maybe this first attempt with low power as the primary design driver isn't that great, but don't discount Intel: their x86 CPUs use leading-edge semiconductor technology (sub-20 nm geometries, FinFET transistors, high-k dielectrics, exotic metallization) plus Intel's proprietary layout tricks.
ARM may have a better architecture, but ARM chips are built with standard-cell layouts and more conventional processes.
I can't possibly go wrong by predicting that ARM will improve the speed of ARM chips, just like Intel will improve the Atom. In the end, the differentiating factor may be that, because of all this fanciness, Intel's cost structure is higher and they can't sell Atoms for around a dollar, which is where ARM seems to be. So Intel will keep occupying the high-performance end of the market, in particular serving those unfortunate folks who can't port their software away from its x86 binary dependence. ARM will make inroads on the strength of better native performance (running ARM code, not emulated x86) and superior power efficiency. Emulation may matter to some, but more as a checkbox item, a security blanket that makes it easier to switch platforms; I can't see it having significant practical use.
I'm citing AnandTech's benchmark:
If the Atom (or the x86 architecture in general) can't cut it running in a cell phone, what makes you think it's all that superior in a data center? (I'm suggesting a reasonable apples-to-apples comparison here; I'm sure that, given time and semi-infinite available power, ARM will be able to extend its concept even further into the "big iron" marketplace. After all, the cell phone and server arenas differ far more in their power budgets than they do in their software.)
Right, I read the article, and it says they emulate x86 at 40% (or eventually even 80%) of the native performance OF THE ARM PLATFORM (not of native x86, as you seem to believe), which is significantly lower than the performance of the real deal, server-class x86 from Intel. Why would anyone build a data center on those ARM chips if Intel software compatibility were important to them?
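A quick back-of-envelope sketch (the per-core ratio is my own assumption; only the 40%/80% figures come from the article): suppose an ARM server core delivers about half the single-thread throughput of a current Xeon core. Emulated x86 code would then run at roughly 0.4 x 0.5 = 0.2x of real x86 per core today, and even at the projected 80% translation efficiency only 0.8 x 0.5 = 0.4x. You'd need somewhere between two and five of those cores just to break even on the very code you supposedly care about.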
Don't get me wrong: I love ARM, as an embedded platform and even as an energy-efficient server platform. I just don't think using it to emulate x86 makes sense.
By the way, what does it even mean that "iPhone 5 is the world's fastest cell phone"? It makes calls faster? Even if it were fast on some metric (video decoding? running games?), you're talking about the smartphone market segment, which is irrelevant to data centers.
Like I said, READ THE ARTICLE! Elbrus has developed binary-translation technology; they can currently demonstrate 40% of native x86 performance and foresee pushing that to 80% shortly. And servers ARE the "cloud-based back end", where compatibility is still quite important. Also, ARM doesn't generically "lag in performance": the ARM-based iPhone 5 is the world's fastest cell phone, despite several Atom-based platforms currently on the market to challenge it.
I just don't understand how x86 emulation on ARM can help. ARM lags in performance but makes up for it, in its market segment, with superior power efficiency. Emulating x86 on these slower chips is bound to be slow squared. The only reason to do it would be legacy applications, but the whole point of the post-PC era is that it's no longer the Wintel monopoly: the backwards-compatibility requirement is over.
The new software is portable, either because it's open source (like Android), or because it's written in Java or HTML5, or because it's really a light client running against a cloud-based back end.
Right, but look at the current article about Elbrus Technologies creating an efficient x86 emulator for ARM chips, effectively negating the major reason behind Intel's dominance in the server market. (Intel had an opportunity to develop ARM technology but sold it to Marvell, if you recall.) If Intel becomes less relevant, then so does Wintel. That doesn't make Microsoft "irrelevant" overnight, but it sure isn't doing much to bolster its long-term survival. That puts two more major tech companies in the "aging dinosaur" column...!
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act upon data in real time, 24/7. Specifically, the guests will discuss sensors, security, and lessons learned from IoT deployments.