SAN JOSE, Calif. – Intel claimed its latest tablet SoC outperforms the Nvidia Tegra 3 while consuming less power. On another mobile front, the x86 giant will show Haswell Ultrabooks at the Consumer Electronics Show and discuss a sub-10W variant.
Intel’s Clover Trail, a dual-core Atom SoC for tablets, consumes as much as half a watt less than Tegra 3 on some tasks, Intel told a group of analysts recently. In particular, the Imagination Technologies graphics block in Clover Trail draws less energy than the graphics block in the rival Nvidia chip, the company said.
“The tablet performance is as good with Clover Trail as with ARM SoCs if not better and now the surprising thing is they are using less power,” said Nathan Brookwood, principal of market watcher Insight64 (Saratoga, Calif.). “I’ve always been a little critical of that Atom core because I think they went a little too far in emphasizing power savings, but even so Clover Trail is beating Tegra 3.”
Intel compared a Windows 8 tablet using Clover Trail with a Microsoft Surface tablet running Windows RT on Tegra 3. The systems were hooked up to meters measuring consumption on the processors’ power rails on a variety of jobs.
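Benchmarks of this kind typically log instantaneous power on the rails and integrate over the run to get energy per task. A minimal sketch of that bookkeeping is below; the sample traces and sampling interval are invented for illustration, not Intel's measurements.

```python
# Integrate sampled rail power (watts) over time to get energy (joules).
# The power traces below are hypothetical, not data from Intel's demo.

def energy_joules(samples, dt_s):
    """Trapezoidal integration of power samples taken every dt_s seconds."""
    return sum((a + b) / 2 * dt_s for a, b in zip(samples, samples[1:]))

clover_trail_w = [0.9, 1.4, 1.3, 0.8, 0.5]   # hypothetical rail-power trace
tegra3_w       = [1.1, 1.8, 1.7, 1.2, 0.7]   # hypothetical rail-power trace

print(f"{energy_joules(clover_trail_w, dt_s=0.5):.1f} J")  # 2.1 J
print(f"{energy_joules(tegra3_w, dt_s=0.5):.1f} J")        # 2.8 J
```

Comparing energy per completed task, rather than instantaneous power, is what makes such rail measurements meaningful across chips that run at different speeds.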
The Nvidia chip uses four main cores and one low-power helper core. Brookwood said Windows RT does not use the helper core as extensively as Android does. Intel did not show Clover Trail’s performance running Android.
In the second half of 2013, Intel will roll out Bay Trail, which is expected to use the first out-of-order Atom cores to bolster performance. It will also roll out Haswell, a new 22-nm PC processor geared for a variety of systems, including Ultrabooks.
Intel “hasn’t given up on Ultrabooks by any means,” despite sluggish market uptake to date, said Brookwood.
“They are convinced ultrabooks represent a journey,” Brookwood said. “They haven’t been able to quite deliver all the features and form factors they wanted to yet, but they think Haswell will give them the muscle they need” and better than all-day battery life, he added.
Intel is expected to show at CES a reference design using a Haswell chip that consumes less than 10W and fits into an Ultrabook as thin as an Apple iPad.
Separately, Intel demoed its Medfield chip running in Android smartphones, outperforming Qualcomm and Nvidia chips in some cases while delivering similar battery life. “The problem Intel faces is they have no hope of the U.S. market due to their lack of LTE support--that forces them to focus just on Asia and Europe,” Brookwood said.
Intel is becoming a joke.
Intel’s board needs to clean out all senior management (more than just the CEO).
Clover Trail should not be compared to a two-year-old shipping product (Tegra 3). No one is designing Tegra 3 into new hardware today.
The correct comparison is Clover Trail versus the A6, the current Snapdragon (Krait), or, if you want to compare against Nvidia, Tegra 4, which many OEMs already have samples of (it is being designed into tablets now) and which we will see announced at CES in January.
ARM Cortex-A15-class products at 28 nm make a joke of Intel's Atom (at 32 or 22 nm), and a joke of Intel itself.
Medfield is a joke. Lenovo Mobile and ZTE use Medfield in their poorly selling smartphones because they want to "keep" the relationship. Further, I suspect that Intel is providing some kind of development funds to incentivize OEMs to do the work. In fact, no OEM in its right mind will spend resources and money on Medfield development or products. Bay Trail and its equivalents, on the other hand, are a reasonable bet against the parts from Nvidia and Qualcomm, because 22 nm is in theory superior to 28 nm and the mediocre 20 nm.
Yes: when Atom is compared to the A15 (a fair comparison) instead of the old Tegra 3, it's "disastrous for Intel."
I work on modem chip architecture. LTE is 3G plus a much more complicated 4G layer, so in general it draws more power in apples-to-apples benchmarking. My experience matches this.
Medfield leads most of the ARM chips (except the A15) in browser experience.
I think judging performance depends on the use case: whether the smartphone is used to crunch numbers or to browse the internet.
As for power consumption, it also depends on the usage pattern. Yes, the LTE modem consumes more power when active, but it also lets the entire phone (including the CPU) spend more time idle. As an overall result, it can actually save power. Real-world measurements support this:
You can see that a phone with LTE enabled has longer battery life than one without.
Well, of course the 4G chip is more complicated than a 3G chip; everyone knows that, even someone who hasn't worked on 4G. But just from the HUGS principle (hurry up and go to sleep), the 4G chip completes the download quicker and drops into a low-power state sooner.
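The race-to-sleep trade-off above can be sketched with toy numbers: a radio that burns more watts while active can still win on total energy if it finishes the transfer fast enough. All figures below are made up for illustration, not measurements of any real modem.

```python
# Toy race-to-idle comparison (all numbers are hypothetical).

def download_energy_j(size_mb, rate_mbps, active_w, idle_w, window_s):
    """Energy to fetch size_mb within a fixed window, idling afterwards."""
    active_s = size_mb * 8 / rate_mbps       # seconds the radio stays active
    idle_s = max(window_s - active_s, 0.0)   # seconds spent in low-power state
    return active_w * active_s + idle_w * idle_s

# Hypothetical: 4G draws more while active but finishes the transfer sooner.
e_3g = download_energy_j(size_mb=20, rate_mbps=10, active_w=1.2,
                         idle_w=0.05, window_s=30)
e_4g = download_energy_j(size_mb=20, rate_mbps=50, active_w=2.0,
                         idle_w=0.05, window_s=30)

print(f"3G: {e_3g:.1f} J, 4G: {e_4g:.1f} J")  # 3G: 19.9 J, 4G: 7.7 J
```

With these toy inputs the 4G radio is active for only 3.2 s versus 16 s, so despite its higher active power it uses well under half the energy over the same 30-second window.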
Good job Intel. You managed to beat a 40-nm, 80-mm² chip that came out a year ago, and wasn't that good to begin with. You should be truly proud of yourself!
Too bad you can't hold a candle to the Cortex-A15-based Exynos 5 in the Nexus 10, or the A6X in the iPad 4, in either CPU or GPU performance (especially GPU performance), and your chip is a lot more expensive than those, too.
But, sure, let's pretend those chips don't exist so we can all pretend that Intel's mobile chips are better than ARM's. But hey, it's the winter holidays, a time of magic. We can make believe.
Not an apples-to-apples comparison. Please look at what was actually compared. The Tegra 3 ran Windows RT, which does not support the fifth (companion) core integrated to lower idle power consumption. As a result, its idle consumption is higher. Windows RT also needs more instructions executed to boot than the Windows 8 that ran on the Atom-based tablet. Consequently the Tegra 3 consumed more energy to boot. All in all, that benchmark is pure marketing.