I agree that 32-bit applications will (and should) stay around a long, long time. Very few user applications need 64 bits. The OS is another story, since it needs to address the entire physical memory space and the sum of the virtual memory spaces of all running user applications may be many GB. However, for mobile computers the 40-bit physical address space of ARM Cortex-A7, A12, A15 and A17 is probably enough.
ARMv8 has both 64-bit and 32-bit operating modes. I believe that in AArch32 mode it's compatible with ARMv7 user-space applications. IMO there's no reason to expect ARM to drop the 32-bit mode any time soon, if ever.
I think even if 64-bit Android takes over soon (and that is very questionable), most applications will remain 32-bit for a very long time. As people have noticed on iOS, memory use increases a lot (both executable size and data size), so it makes sense to continue using 32-bit apps to conserve memory.
As for ditching ARMv7 support, the cost of supporting the ARM/Thumb-2 ISAs is small (it's not nearly as complex as the x86 ISA!), and since software will remain 32-bit for a long time, I don't think that will happen soon. For custom v8 implementations aimed exclusively at servers the answer may be different as there is no backwards compatibility issue.
I believe that Apple's A7 set the ball rolling for 64-bit processing in mobile devices and hence gained the first-mover advantage. Even though the ecosystem "does not require" 64-bit processing, it is pretty much evident that 64-bit will become a de facto standard in the near future. Migrating apps to the new 64-bit format "should" not be a very expensive affair as the volume catches up.
The shift of mobile processors to 64-bit would increase the adoption of tablets and phablets as a replacement for PCs. A spinoff would be the prevalence of Grosch's law in computing.
I agree. You will see 64-bit Android; it's just a matter of time. And you will eventually need it in the larger and higher-end platforms like tablets and phablets, but many handsets will not be able to use the added memory capabilities due to power and space limitations. However, you will still see some performance improvement from the more advanced 64-bit architectures. In fact, more silicon vendors are working on or considering custom cores with the ARMv8 ISA than with ARMv7, and Intel will be more competitive in the 64-bit realm.
I pinged Google PR about 64-bit Android and got this response:
"I think it's premature to comment at this time, but as a bit of background, the open source community has been contributing 64-bit support to Android for a while. For example, the native code for java.* libraries were updated to 64-bit in Jelly Bean."
As far as I know from watching news and scant blog posts by Google's engineers, Google has no plans for 64-bit Android any time soon.
The main rationale is that, at a time when software developers are already fleeing the ecosystem, Google doesn't want to anger them further by increasing their maintenance burden with yet another ISA to support on top of the 3 existing ISAs and 6 ABIs (with 2 of the ABIs being mandatory under "app quality" requirements for bigger publishers).
But since such major players as Mediatek have already announced their intent to push 64-bit to clients, they should have a plan for how to deal with that. My guess is they will initially ship 64-bit chips with an ARMv7 compatibility mode for a few generations, before ditching it from silicon.
I also think there will be an opportunity for smaller players running non-Android OSes to move forward faster, as they are not as dependent on ISAs and binary distribution as Android phone makers.
The best strategy is not letting the switch become a big event: start it before you need it, don't make a big deal of it, and make sure you've planned enough time (say 2 years) so that when you do need it, the ecosystem is already there (95% plus). IMHO.