You don't need physical memory beyond 4GB to make good use of more than 32 address bits. The operating system can use some of them as "type" bits to partition different kinds of objects from one another and improve the security architecture in fundamental ways that aren't possible without the luxury of lots of address bits. As an example, see the Capsicum project out of Cambridge (built on FreeBSD, an extremely close genetic relative of OSX and iOS). The sandbox container model evolving in both OSX and iOS points to much more fine-grained protection even within a single "user" domain. A general rule is that the smaller the objects and the greater the variety that needs distinguishable references, the more "address" bits are required to do the job. Other architectures, like the Burroughs B-series stack machines, the Cambridge CAP, and the Symbolics LISP machine, used "tags" attached to each word in memory so that the hardware could do type enforcement. These days, the MMU protection machinery is used to provide rudimentary capabilities like "no execute" mode. The ability to segment the address space using address bits as type tags can enable more aggressive rights enforcement by hardware without incurring unacceptable emulation costs.
The reason this processor is 64-bit is not related to smartphones. Apple is laying the foundation for future MacBooks based on Ax processors, and possibly throughout their whole lineup. This chip is powerful enough to run OSX today.
That test was interesting, but it didn't take into account the different battery and display sizes. The iPad had to draw and display 2.4 times as many pixels using a battery with only 80% of the capacity of the Air. Despite that, it ended up only 7% behind on average over those two tests. The article mentions the iPad backlight uses 7W for the high-res screen. It would be fairer to compare with an identical display, so that the differences come down to the CPU, not the screen.
Anandtech ran the MacBook Air through their battery test for tablets to see how Haswell compared to the iPads. What they found was that the Haswell-powered Air used less energy than the iPad. With the A7 using slightly more energy than the A6, an A7-powered iPad would use more energy than the MacBook Air and still not match its performance.
Is Haswell Ready for Tablet Duty? Battery Life of Haswell ULT vs Modern ARM Tablets
In addition, looking at the Apple A7 performance results, here is how the iPhone 5S compares to the latest MacBook Air (using a 15W 1.7/3.3GHz Haswell) on single-threaded Geekbench 3:
Apple-A7 1.3GHz: 1471 (INT), 1339 (FP)
i7-4650U 3.3GHz: 3024 (INT), 3003 (FP)
This is absolutely amazing - here we have a 28nm phone SoC getting roughly half the performance of a 15W Haswell! So a quad-core version would already be competitive with this dual-core Haswell while using a fraction of the power. Next year on TSMC 20nm it might reach 2GHz.
So it looks to me like Apple's future strategy is quite clear. There is simply no place for Haswell in the MacBook Air if Apple can get better performance and power with their own CPUs.
Rick, Intel has a maximum of 12 cores per socket, but that is a $2600 server chip. Nobody will ever have that in their PC! Typical desktops have 4 cores, and typical laptops have just 2. Let's assume Apple is a factor of 2 behind on single-threaded performance with their new A7. That means they would only need 4 cores to make a MacBook Air that runs OSX with comparable performance. Note there is no need to beat Intel on performance; it just needs to be good enough. Slightly lower performance but longer battery life would be preferable to most people.
Yes, it is all speculation, but the fact remains that Apple went 64-bit without getting any benefit from it at all, as you explained in your article (note that modern ARM cores like the A15 do support more than 4GB of memory while still being 32-bit). So why do it at all, unless it is part of a much wider strategy? I certainly won't be surprised if Apple releases 64-bit ARM-based laptops.
Actually, I've thought that the functionality of something like the Fitbit or other wearables out there could be performed by the phone itself, given enough appropriate sensors and the necessary logic... so why would the wearable be required?
The phone is itself a wearable, right? Unless, of course, you need something that's worn 24 hours a day.
Let's see what 64 bit architecture does for innovation...
Replay available now: A handful of emerging network technologies are competing to be the preferred wide-area connection for the Internet of Things. All claim lower costs and power use than cellular, but none have wide deployment yet. Listen in as proponents of the leading contenders make their case to be the metro or national IoT network of the future. Rick Merritt, EE Times Silicon Valley Bureau Chief, moderates this discussion. Join in and ask his guests questions.