Back when the App Store opened, developers could register to become iPhone app developers and submit apps to sell. Who would have anticipated what the App Store has turned into, and what Apple has become?
64-bit mobile devices seem over-engineered. Easier memory addressing may improve I/O throughput, but not significantly. So what made Apple decide to go 64-bit on the iPhone 5S? My guess is that Apple has been innovating and being first at almost everything; now it becomes the first company to launch a mobile device with a 64-bit processor. Big deal! IMO, a 64-bit mobile device will not benefit regular users. However, if the device is in the hands of curious engineers, I am sure there will be a lot of hacking going on. Maybe we will see an iPhone 5S being used as a regular computer with Linux as the OS. Maybe there will be a powerful parallel computer built with 1,000 iPhone 5S units.
Sometimes, platform makers just need to open the door and the future will take us to where we are supposed to be.
PS: To Apple, what's the benefit? Maybe Apple is checking out 64-bit ARM to prepare for future Macs.
I've long thought that smartphones could effectively replace regular PCs for a good portion of the population. They clearly already have the power to do so. The CPUs in them may be weak compared to Intel's current PC CPUs, but they still provide far more power than the average person needs. I'm talking about the people who use their PCs only for email, web browsing, online shopping, Facebook, writing term papers, managing photos, etc. This would obviously be inappropriate for the engineers reading EE Times, so please don't tell me that such a solution wouldn't meet your needs. Even if it only works for 50% of current PC buyers, that's a massive market for Apple to tap, and a huge potential problem for Microsoft.
The reason a smartphone doesn't replace a PC already for these people is that the interface sucks. The screen is way too small (even a phablet screen) and the keyboard is terrible for trying to do lots of typing. So provide a way to connect your phone to a big monitor, which has USB ports on it to connect a standard keyboard and mouse. Thunderbolt or HDBaseT 2.0 would work great for this, and either could be done with a Lightning interface cable that plugs into a Thunderbolt monitor or HDBaseT 2.0 supporting TV set in a year or two. When the iPhone detects this connection, it starts an 'app' that is basically the full OS X GUI, including all the userland libraries that OS X apps expect.
I think this is coming, but you really want 64 bits before you do it, simply because you need more memory in a phone to run a desktop GUI and desktop apps, and you don't want to be bothered with short-term support for a 32-bit ARM ABI for those desktop apps. This would be really simple for Apple to do. OS X already supports fat binaries; it could start producing fat x86/ARM64 binaries. iOS is based on OS X and could be modified fairly easily to support almost all OS X binaries that were recompiled as fat binaries.
Excellent point. Targeting multiple ISAs is a real pain, and the sooner Apple can unify all its development targets, the better. Having a 64-bit ARM now means getting everything in the toolchain in place, as well as the hardware engineering, so Apple could launch ARMv8-based laptops within a year.
This is really the big question. It looks like new Apple devices, including Macs, will get 64-bit processors. This will also help unify iOS and OS X. We may be in for a sudden surprise with powerful new applications from Apple.
Hey Rick: I didn't think that I would ever be interested in any kind of fitness app until my husband and I both bought FitBits. We are an extremely competitive family, and the powerful thing about FitBit is that it shares your daily activity numbers with friends in your network. So if I slack off during the week, my friends and relatives will taunt me. Sounds sort of sick, but it is actually motivating me to work out more (and come in first!)
There are times when it would be convenient to have the FitBit pedometer functions simply as an iPhone app. But I think the app will supplement rather than replace the demand for the FitBit hardware devices. There are exercise situations in which wearing the FitBit device might be fine, but carrying your smartphone would be problematic. It will be nice to have both options.
Rick, you understand how much engineers love data, right? In all seriousness though, it's not so much the competition with others but the competition with oneself, and the motivation FitBit gives to do at least 10,000 steps a day, and praise messages for exceeding that number.
FitBit also integrates nicely with the myfitnesspal app, which is great for tracking your food intake & exercise.
This stuff is all still in its infancy, but the digital feedback is quite valuable & motivating. My current profile pic on EE Times is recent, and is a much slimmer me than the one from a couple years ago. I attribute that mostly to collecting & tracking this data -- an activity that is not only motivational, but quite educational.
Gee, I'm amazed to hear that, over the years, each new generation of CPU (8, 16, 32, etc.) has only brought me increased address space and nothing more. I could have sworn that my 16-bit CPU could add 16-bit numbers a lot faster than my 8-bit one could. But perhaps I was mistaken. Now, I suppose no one really cares about adding 64-bit numbers faster, but wait, doesn't fast encryption have something to do with math operations on large numbers? But I'm sure that the author of this article knows all about how fundamental security mechanisms, such as secure hash algorithms, won't be able to take advantage of 64-bit math, and how this code will work just as well on older, 32-bit CPUs. I'm so glad I have a technical magazine like EE Times to keep me fully informed by such knowledgeable authors.
I remember being told in 1979 that the 32k (yes, kilobytes) of memory on my Commodore PET was all I'd ever need for programming. It wasn't long before the "SuperPET" came out with 96k of bank-switched memory, and the memory arms race continues to this day. Today the start-up graphics on my computer use more memory than that. We have a nearly unlimited capacity to utilize computer memory and processing resources. I'm intrigued that Windows computers seem to coexist in 32-bit and 64-bit versions. How much of a performance benefit do the 64 bits bring? My sense is that the downside is that many programs are not available in 64-bit versions.
CPUs already use special instructions for encryption, so the impact of 64 bits would be limited to the types of encryption that aren't supported in the ISA. So, for instance, maybe you have AES acceleration making the 64-bit transition moot for AES, but need to use integer instructions for ECDSA and thus get a big speedup there.
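As a concrete illustration (my own sketch, not from the article): multi-precision arithmetic, the core of RSA and ECDSA, splits big numbers into machine-word "limbs". With 64-bit limbs a 256-bit addition takes four limb operations; with 32-bit limbs it takes eight, and the gap widens further for multiplication.

```c
#include <stdint.h>
#include <stddef.h>

/* Add two 256-bit numbers stored as little-endian 64-bit limbs.
 * On a 64-bit CPU this is 4 limb additions; the same values split
 * into 32-bit limbs would need 8. Returns the final carry. */
uint64_t add256(uint64_t r[4], const uint64_t a[4], const uint64_t b[4])
{
    uint64_t carry = 0;
    for (size_t i = 0; i < 4; i++) {
        uint64_t sum = a[i] + carry;      /* carry is 0 or 1 */
        carry = (sum < carry);            /* overflow from adding carry */
        sum += b[i];
        carry += (sum < b[i]);            /* overflow from adding b[i] */
        r[i] = sum;
    }
    return carry;
}
```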
When we switched from IA-32 to x86-64 processors, we immediately got a 3x speedup with the same amount of memory. The main reason for this is that IA-32 was register-starved (8 not-so-orthogonal registers), while x86-64 has 16 orthogonal registers.
ARMv8 is a totally different ISA from ARMv7, not ARMv7 plus 64-bit addressing. It goes from 16 not-so-orthogonal registers to 32 orthogonal ones. This will not be as big a boost in performance as Intel got, because ARMv7 wasn't so crippled, but it will be a boost. And that's exactly what Apple is saying right now: that the ARM64 platform means twice the number of registers for applications to use. And that can matter a lot.
My ideal device would be a portable thing I can carry with me all the time and be able to expand its capabilities (memory, parallel CPUs, graphics/video, input/output devices) when connected to the docking unit.
This 64-bit CPU is far from that, but maybe, just maybe, one step closer to it.
One thing that would be interesting is the ability to run a full virtual machine inside a smartphone, and I guess 64-bit addressing is kind of important there.
I think the way things are going, the successor to the PC will be a smart phone (or a device based on smart phone internals) + cradle to link a full sized screen, keyboard and other traditional PC peripherals.
@jdes: ah, virtualization! Good thought. It is sweeping the server and comms world in a huge way, but so far we have only seen dabbling in mobile virtualization and what it could mean using some hefty 64-bit CPUs.
Hopefully it does not mean a phone instance my carrier charges my company for and one it charges me for!
If 64-bit address space will allow true speech recognition, then I say it is high time that it becomes the norm.
Maybe then we can finally consign these silly little virtual keyboards to the technology graveyard, where the old slide rules and typewriters went to retire!
And maybe then there will be no excuse left for programmers to hide behind as to why we are still slowing down the masses in achieving the true and speedy communication that has been yearning to come out of the shadows for the past 3+ decades.
Rick, you do realize that getting from current top-end ARM to Haswell performance is a relatively small gap? It's less than a factor of 2 in IPC and less than a factor of 2 in frequency. Basically Intel has stood almost still in the last few years while ARM CPUs have doubled performance year after year.
Apple's new 64-bit CPU may well be a 4-way OoO and very close to Haswell in terms of IPC. However it's unlikely frequency will match as their main goal is low power (hence high IPC at a low clock). We'll soon get 20nm 64-bit ARM server CPUs which are aiming for high performance, so things will get very interesting in 2014.
@Wilco: I don't think it's a fair characterization to say Intel has been standing still in performance while Apple has been making 2x leaps.
Also, in addition to performance, there are a lot of compatibility issues, peripheral support, and other plumbing Apple would have to put in place to make its own Mac SoCs. Not that they couldn't do that someday, but it would not be trivial and may not be worth the effort when you can just keep writing purchase orders for far fewer person-hours of work.
Well, do you remember the last time when Intel provided a large speed boost? Nehalem to Haswell gave approximately 4.5% speedup per year over 5 years. That's peanuts compared to Apple's claimed improvement of about 75% per year over the last 6.5 years. I have a Sandy Bridge system and am not planning to upgrade for many more CPU generations. There is just no measurable improvement otherwise.
You're right that Apple has to add some of the missing functionality to their current SoC. However there are already ARM SoCs with SATA, PCI, GbE etc, so I don't think it is that much work, especially for a company with as much money and talent as Apple. And yes, rolling your own is actually better for the bottom line. If you wanted say 10 million chips, and they cost $300 to buy from someone else but you had the ability to make them yourself for $100 with an investment of 1 billion, what would you do? What if it were 100 million chips? Still keep writing those purchase orders?
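Running the buy-vs-build numbers above (all figures are the commenter's hypotheticals, not real Apple costs) as a quick sanity check:

```c
#include <stdint.h>

/* Buy-vs-build net savings for the hypothetical figures above:
 * $300 to buy a chip, $100 to build one, $1B up-front investment. */
int64_t net_savings(int64_t units, int64_t buy_usd,
                    int64_t build_usd, int64_t investment_usd)
{
    return units * (buy_usd - build_usd) - investment_usd;
}
```

At 10 million units this nets $1B; at 100 million it nets $19B, which is why volume decides the question.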
@Wilco: Ah, but clock speed has not been on the same track as performance in PCs for years. Clock speed is staying roughly constant to keep leakage current from boiling over. One of the new performance metrics is cores per socket, where Intel is at 4-20 and Apple is at... one? Two?
Anyway, it's still juicy speculation: an A8a that runs MacBooks and an A8b that runs iPhones and iPads. Maaaaaayyyybe!
Rick, Intel has a maximum of 12 cores per socket, but that is a $2600 server chip. Nobody will ever have that in their PC! Typical desktops have 4 cores, and typical laptops have just 2. Let's assume Apple is a factor of 2 behind on single-threaded performance with their new A7. That means they would only need 4 cores to make a MacBook Air that would run OS X with comparable performance. Note there is no need to beat Intel on performance; it just needs to be good enough. Slightly lower performance but longer battery life would be preferable to most people.
Yes it is all speculation, but the fact remains that Apple went 64-bit without getting any benefit from it at all as you explained in your article (note that modern ARM cores like A15 do support more than 4GB of memory while still being 32-bit). So why do it at all, unless it is part of a much wider strategy? I certainly won't be surprised if Apple releases 64-bit ARM based laptops.
Anandtech ran the MacBook Air through their battery test for tablets to see how Haswell compared to the iPads. What they found was that the Haswell-powered Air used less energy than the iPad. With the A7 using slightly more energy than the A6, an A7-powered iPad would use more energy than the MacBook Air and still not match the performance.
Is Haswell Ready for Tablet Duty? Battery Life of Haswell ULT vs Modern ARM Tablets
That test was interesting but it didn't take into account the different battery and display sizes. The iPad had to draw and display 2.4 times as many pixels using a battery with only 80% of the capacity of the Air. Despite that it ended up 7% behind on average over those 2 tests. The article mentions the iPad backlight uses 7W for the high res screen. It would be more fair to compare with an identical display so that the differences are in the CPU, not the screen.
In addition, looking at the Apple A7 performance results, here the iPhone 5S is compared to the latest MacBook Air (using a 15W 1.7/3.3GHz Haswell) on single-threaded Geekbench 3:
Apple-A7 1.3GHz: 1471 (INT), 1339 (FP)
i7-4650U 3.3GHz: 3024 (INT), 3003 (FP)
This is absolutely amazing: here we have a 28nm phone SoC getting half the performance of a 15W Haswell! So a quad-core version would already be competitive with this dual-core Haswell while using a fraction of the power. Next year on TSMC 20nm it might do 2GHz.
So it looks to me like Apple's future strategy is quite clear. There is simply no place for Haswell in the MacBook Air if Apple can get better performance and power with their own CPUs.
Actually, I've thought that the functionality of something like the FitBit or other wearables out there could be performed by the phone, given enough appropriate sensors and the necessary logic... so why would the wearable be required?
The phone is a wearable too, right? Unless, of course, you need something that is worn 24 hours a day.
Let's see what 64 bit architecture does for innovation...
The reason this processor is 64-bit is not related to smartphones. Apple is laying the foundation for future MacBooks based on Ax processors, and possibly throughout their whole lineup. This chip is powerful enough to run OS X today.
You don't need physical memory beyond 4GB to make good use of more than 32 address bits. The operating system can use some of them as "type" bits to partition different kinds of objects from one another and improve the security architecture in fundamental ways that don't work without the luxury of lots of address bits. As an example, see the Capsicum project out of Cambridge (built on FreeBSD, an extremely close genetic relative of OS X and iOS). The sandbox container model that is evolving in both OS X and iOS points to much more fine-grained protection even within a single "user" domain. A general rule is that the smaller the objects and the greater the variety that needs distinguishable references, the more "address" bits are required to do the job. Other architectures, like the Burroughs B-series stack machines, the Cambridge CAP, and the Symbolics LISP machine, used "tags" attached to each word in memory so that the hardware could do type enforcement. These days, the MMU protection machinery provides only rudimentary capabilities like "no execute" mode. The ability to segment the address space using address bits as type tags can enable more aggressive rights enforcement by hardware without incurring unacceptable emulation costs.
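A minimal sketch of the idea (my own illustration, assuming a system where user pointers fit in the low 48 bits, as on today's 64-bit ARM and x86 platforms): the unused high bits of a 64-bit pointer can carry a type tag that is checked before the pointer is used.

```c
#include <stdint.h>

#define TAG_SHIFT 48
#define ADDR_MASK ((UINT64_C(1) << TAG_SHIFT) - 1)

/* Pack a 16-bit type tag into the unused high bits of a pointer. */
uint64_t tag_ptr(const void *p, uint16_t type)
{
    return ((uint64_t)type << TAG_SHIFT) | ((uint64_t)(uintptr_t)p & ADDR_MASK);
}

/* Recover the type tag for a rights/type check. */
uint16_t ptr_type(uint64_t tagged)
{
    return (uint16_t)(tagged >> TAG_SHIFT);
}

/* Recover the raw address; the tag must be stripped before dereferencing
 * (unless the hardware ignores high bits, as with ARMv8 top-byte-ignore). */
void *ptr_addr(uint64_t tagged)
{
    return (void *)(uintptr_t)(tagged & ADDR_MASK);
}
```

A 32-bit address space has no spare bits for this, which is one concrete security benefit of going 64-bit even without more than 4GB of RAM.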