In my opinion, "Why does it need ...?" is the wrong question. I prefer the "Is it more economical to ...?" pattern.
One big cost driver here in Europe, and I think also in the USA and Japan, is human resources. The big human-resource advantage of the ARM ecosystem is that the ARM M*, R*, and A* families all share the same ISA concepts and the same toolchain. As a result, teaching/learning effort is minimized.
@daleste, your question might have a very simple answer that has nothing to do with complex 8/16/32-bit architectural analysis. Per ARM's own claim, the M0 core is 32-bit and is the smallest ARM core available. ARM essentially skipped over developing 16-bit cores.
I'm very interested in knowing the power consumption and, more importantly, the operating conditions. 32 KB of flash and 4 KB of SRAM do not sound like a lot. I have the same thought as daleste: for the application areas mentioned in the article, 16 bits might very well be enough.
The issue of power consumption is a tricky one, and it depends on what one employs as a system, whether the part above or others like the i.MX283, also from Freescale. If the IoT node is deployed/configured using ZigBee, battery life is measured in years; in days for Bluetooth; and in hours for WiFi!
Typical WiFi 802.11a/b/g/n versions will restrict IoT nodes to roughly a 30 m node-to-node reach; one can always add router nodes in the mesh to extend its range, but that is impractical for outdoor, low-data-rate applications (like weather or infrastructure health monitoring).
For ZigBee, Freescale has the MC1322x SoC, which I believe consumes sub-mW power in its sleep/quiescent state.
If you were designing a single product for today, then 8 or 16 bits may be fine. But if you are starting out at the architectural decision stage of a project targeting Internet connectivity of some sort, would you really want to risk being hobbled by an 8- or 16-bit architecture?
Comparatively, a 16-bit architecture offers only a limited code-size advantage over 32-bit ARM, and with the expected die shrinks over time, that advantage pretty much goes away.
With 8 bits you will quickly run into performance issues, not to mention limited off-the-shelf support in terms of software IP (protocol stacks, etc.).
Add in the potential time-to-market advantages of common toolchains, and you have what is likely the best architecture for the targeted market.
The hard thing about IoT, one Berkeley researcher told me recently, is that many of the systems address Luddite markets where they are replacing mechanical systems or no systems at all: a hard sell, and a more fragmented market than even the so-called catch-all embedded sector.
This will take time and hard work.
Rick, you are hitting on one of the hurdles M2M faces in both existing (brownfield) and new (greenfield) markets. Industrial automation (SCADA, CAN) offers many applications, but penetration there has been a tough sell for wireless M2M nodes. Legacy implementations from the likes of GE and Rockwell are not going to be displaced any time soon. Naturally, some of the attention has turned toward medical monitoring, where many new startups have sprung up lately in Silicon Valley (I think I introduced one of them to you at DesignCon 2013).
I hope 2013 is the year a better picture emerges for IoT. A lot is still needed in software solutions such as analytics and prognostics.
In India the IoT market is currently growing well, and most of the chipsets we use here are either from TI or from Chinese vendors. Freescale does not have a strong distribution network in India, and its support quality is also very low. It is doubtful that this chipset will capture a considerable share of the Indian market.
There's an ongoing discussion about which wireless architecture will be "the one" for the (hopefully) rising IoT market. Low-power WiFi, Bluetooth LE, and 802.15.4/ZigBee are leading the market, but there are also more than a dozen so-called "optimized for IoT" architectures (ANT+, Z-Wave, EnOcean, MyraNet, DASH7, WirelessHART... you name it). Many use 2.4 GHz, but some use the sub-GHz band.
Freescale offers two types of "Kinetis W" series radio-integrated MCUs: the KW01 with a proprietary sub-GHz radio, and the KW20 with a 2.4 GHz 802.15.4 radio. As far as I know, no Freescale Bluetooth (LE) chip has been released yet. It will be interesting to see how well they do with a "multi-mode radio" chip, and how they market discrete radio chips versus the KW series radio-integrated SoCs.
Why 32 over 16 or 8 bit? Many reasons...
First of all, ARM's 32-bit cores scale well from the small M-class up to the high-performance A-class: one can reuse applications written for an M-class core on higher-performance cores when more performance is needed (e.g., moving from sensors toward the cloud in IoT).

Another reason is that memory is costly, and it usually requires fewer 32-bit instructions to accomplish a given task than 8- or 16-bit instructions (ARM's M0/M0+, by the way, use the 16-bit Thumb instruction encoding), thus requiring less memory for code storage. Performance is also typically better on 32-bit devices (larger registers, greater addressing range). More power may be consumed switching 32-bit registers and supporting a 32-bit pipeline, but since fewer instructions are executed than on an 8/16-bit part, one draws more power over a shorter time, so the total energy per task can be comparable or even lower.
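A small C example makes the code-density point concrete. The instruction counts in the comments are typical figures for these architecture classes, not measurements from any specific compiler:

```c
#include <stdint.h>

/* Sum an array of 32-bit sensor samples.
 * On a 32-bit Cortex-M0+ (Thumb), the inner add is a single ADDS
 * instruction per element. A typical 8-bit MCU has to synthesize each
 * 32-bit add from roughly four byte-wide add/add-with-carry
 * instructions plus extra register shuffling, so the loop body is
 * several times larger and slower, even though the C source is
 * identical on both targets. */
uint32_t sum32(const uint32_t *samples, int n)
{
    uint32_t total = 0;
    for (int i = 0; i < n; i++)
        total += samples[i];  /* one native 32-bit add on a 32-bit core */
    return total;
}
```

The same effect applies to pointer arithmetic and addressing: a 32-bit core handles addresses beyond 64 KB natively, where 8/16-bit parts need banking or far-pointer tricks that cost further code space.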
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act upon data in real time, 24/7. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.