"Don't knock it just because has 2MB of flash."
Who's knocking that much flash? I'd love the laziness 2MB of flash can bring! It does underline the point, however, that the core matters less and less.
On my bench is an instrument that measures Period, Time Interval, and Reciprocal Frequency, and has an 8-digit floating-point display.
It uses an 8748, which Google tells me has 1K of code ROM and 64 bytes of RAM.
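For anyone unfamiliar with the reciprocal-frequency mode that instrument offers: a reciprocal counter times a whole number of input cycles against a fast reference clock and computes frequency as a ratio, which keeps full display resolution even at low input frequencies. A minimal sketch of the arithmetic (the function name and the 10 MHz reference are my assumptions for illustration, not the instrument's actual design):

```python
# Reciprocal frequency counting: instead of counting input edges in a fixed
# 1 s gate (which limits resolution to 1 Hz), count BOTH whole input cycles
# and fast reference-clock ticks over the same gate, then divide.

REF_CLOCK_HZ = 10_000_000  # assumed 10 MHz timebase

def reciprocal_frequency(input_cycles: int, ref_ticks: int) -> float:
    """Frequency = whole input cycles / elapsed time measured by the reference."""
    elapsed_s = ref_ticks / REF_CLOCK_HZ
    return input_cycles / elapsed_s

# Example: 50 input cycles observed while the reference counted 10,000,000
# ticks (exactly 1 s), so the input is 50 Hz. The resolution now comes from
# the ~1e7 reference counts, not the 50 input counts -- enough digits to
# fill an 8-digit display even at mains frequencies.
print(reciprocal_frequency(50, 10_000_000))  # -> 50.0
```

The point is that the arithmetic is trivial; even a 1K-ROM, 64-byte-RAM part like the 8748 can manage it, which is why the core mattered so little even then.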
Regarding ARM low-power "multi-core on SoC"
Hi Jon, that's true, of course. I do wish they had started using and integrating some of Transputer Dave's IP in their designs back then, though, so we could be far more advanced today, with low-power "multi-core on SoC" just around the corner. :)
Still, they can get him on the phone and make some arrangements for the future, I guess.
HP's choice of a proprietary architecture instead of ARM for its tablet hardly makes any sense. I feel the peripherals make more of a difference than the core. But for tablets, who cares about the processor anyway?
PCs and even notebooks are way too expensive for most new users in the emerging economies, and a sub-$100 tablet is what they want. Profit from selling billions of SoCs to this market will help new ARM licensees like MediaTek of Taiwan (which already sells dual-core processors to Lenovo and others) become major players with real money, and ARM will keep them supplied with the latest-design A__ cores to attack even the high-end market in the US or Europe.
The growing club of ARM-based processors will first destroy Intel's PC franchise, but eventually they themselves will turn into a standard commodity like DRAM, and their vendors will gradually disappear the way the DRAM vendors did.
Intel's strategy to survive will probably have to be like IBM's 20 years ago, when Intel-inside PCs ate into IBM minis and even mainframes. Intel will now have to focus on server chipsets (every 70 smartphones/tablets sold create the need for one more server). In that niche Intel's lead in fab technology will still count. But of course they will need the funds to keep R&D going (IBM doesn't, and has to shill its lame versions of Intel's leading-edge technologies to oriental foundries).
But cell phones do not make up the whole CPU market, for heaven's sake.
The way I read this thread, to use a similar metaphor: it's a given that ARM is a nice, light running shoe, but there is a need for other kinds of shoes. ;-)
ARM a 12-year-old? That's mendacious. ARM as a corporate entity might be, but the ARM architecture is nearly thirty years old.
It's worth comparing and contrasting ARM and Intel architectures. What Intel was producing when the ARM was being taped out for the first time bears precious little resemblance to the current Intel chips. Those current chips are onion-layered instruction sets on instruction sets on instruction sets, all driven by the need for backward compatibility to chips no-one cares about any more. The fact that they're having to maintain those layers speaks volumes about the original designs.
By contrast, the ARM instruction set now, even on the Cortex-A15 MPCore, looks remarkably similar to the set I was programming at Acorn, at ARM's starting point, when I did the first port of the Berkeley Unix+GNU toolkit to the first standalone ARM hardware.
Basically, Acorn (and then ARM) got it right - and they still are.
My new thermostat has WiFi, and runs sophisticated adaptive algorithms for setting the cycle. It's a useful technology: it saves energy in the HVAC system as a whole, and may even use less energy than the electro-mechanical unit it is replacing.
Don't knock it just because it has 2MB of flash.
I am glad that I did my latest design using an ARM-based architecture. The Stellaris Cortex-M3 part I originally designed with was plagued with multiple bugs and delayed the project by over eighteen months (note to self: do not use TI again). I was able to salvage the project by moving over to an ST Micro ARM-based part and tumbling the I/O pins. If not for the compatibility, we would have been out hundreds of thousands of dollars instead of tens of thousands.
There are also a lot of non-ARM low-end MCUs floating around. The "eventual move to 32-bit" is still being contested by the 8-bit MCU vendors. ARM has certainly done a spectacular job of providing a very wide range of offerings, but it's far from won or lost in either case.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Are the design challenges the same as with embedded systems, but with a little developer and IT skill added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests discuss sensors, security, and lessons from IoT deployments.