Intel is becoming a joke.
Intel's board needs to clean out all senior management (more than just the CEO).
Clover Trail should not be compared to a two-year-old shipping product (Tegra 3). No one is designing Tegra 3 into new hardware today.
The correct comparison is Clover Trail versus the A6, or the current Krait-based Snapdragon, or, if you want to compare against Nvidia, Tegra 4, which many OEMs already have samples of (it's being designed into tablets now) and which we'll see announced at CES in January.
ARM Cortex-A15-class products at 28nm make a joke of Intel's Atom (at 32nm or 22nm), and of Intel itself.
Medfield is a joke. Lenovo Mobile and ZTE use Medfield in their not-selling-well smartphones because they want to "keep" the relationship. Further, I suspect Intel is providing some kind of development funds to incentivize OEMs to do the work. In fact, no OEM in its sane mind will spend resources and money on Medfield development or products. Bay Trail and its equivalents, on the other hand, are a reasonable bet against parts from Nvidia and Qualcomm, because 22nm is in theory superior to 28nm and to a mediocre 20nm.
Yes, when Atom is compared to the A15 (a fair comparison) instead of the old Tegra 3, it's "disastrous for Intel."
I work on modem chip architecture. LTE is 3G plus a much more complicated 4G, so in general it draws more power in apples-to-apples benchmarking. So my experience matches this.
Medfield leads most ARM parts (except the A15) in browser experience.
I think judging performance depends on the use case: whether the smartphone is being used for crunching numbers or for browsing the internet.
As for power consumption, it also depends on the usage pattern. Yes, an LTE modem consumes more power while active, but it also lets the entire phone (including the CPU) spend more time idle. As an overall result, it can actually save power, and real-world measurements support this:
you can see that the phone with LTE enabled has longer battery life than the same phone without it.
Well, of course a 4G chip is more complicated than a 3G chip. Everyone knows that, even someone who hasn't worked on 4G. But just from the HUGS principle (hurry up and go to sleep), the 4G chip completes the download quicker and drops into a low-power state sooner.
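The HUGS argument is just energy-versus-time arithmetic, and it's easy to sketch. All the throughput and power figures below are made-up illustrative assumptions, not measurements of any real modem:

```python
# Back-of-the-envelope race-to-sleep ("hurry up and go to sleep") comparison.
# Every number here is an illustrative assumption, not measured data.

def download_energy_mj(size_mb, throughput_mbps, active_mw, idle_mw, window_s):
    """Energy in millijoules to fetch a file, then idle for the rest of the window."""
    active_s = size_mb * 8 / throughput_mbps       # seconds spent transferring
    idle_s = max(window_s - active_s, 0)           # remaining time spent asleep
    return active_mw * active_s + idle_mw * idle_s # mW * s = mJ

# Hypothetical: the 4G radio burns 50% more power while active, but finishes 5x sooner.
lte_mj = download_energy_mj(size_mb=20, throughput_mbps=40,
                            active_mw=1200, idle_mw=10, window_s=60)
hspa_mj = download_energy_mj(size_mb=20, throughput_mbps=8,
                             active_mw=800, idle_mw=10, window_s=60)

print(f"LTE:  {lte_mj:.0f} mJ")   # 1200 mW * 4 s  + 10 mW * 56 s = 5360 mJ
print(f"HSPA: {hspa_mj:.0f} mJ")  #  800 mW * 20 s + 10 mW * 40 s = 16400 mJ
```

With these assumed numbers, the higher-power radio still wins on total energy because the active burst is so much shorter; whether it wins in practice depends on real traffic patterns and how deeply the modem and CPU can sleep.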
Good job, Intel. You managed to beat a 40nm, 80mm² chip that came out a year ago and wasn't that good to begin with. You should be truly proud of yourselves!
Too bad you can't hold a candle to the Cortex-A15-based Exynos 5 in the Nexus 10, or the A6X in the iPad 4, in either CPU or GPU performance (especially GPU performance), and your chip is a lot more expensive than those, too.
But sure, let's pretend those chips don't exist so we can all pretend Intel's mobile chips are better than ARM's. But hey, it's the winter holidays, a time of magic. We can make believe.
Not an apples-to-apples comparison. Please look at what was actually compared. The Tegra 3 ran Windows RT, which does not support the fifth (companion) core integrated to lower idle power consumption; as a result, its idle consumption is higher. Windows RT also needs more instructions executed to boot than the Android build that ran on the Atom-based tablet, so the Tegra 3 consumed more energy just booting. All in all, that benchmark is pure marketing.