We know what a 64-bit CPU can do, so the real question is: why?

Apple's supporters and detractors are debating whether the new A7 SoC is anything more than marketing fluff. Here we want to share our take on the reasons that could have led the company to switch its CPU architecture.

A quick web search about the 32-bit to 64-bit CPU architecture change Apple implemented in its new iPhone 5s will lead you to believe that it brings few benefits, if any, beyond the ability to address more than 4 GB of RAM. Those of us who witnessed the same transition in the desktop segment (commercially started by AMD's Opteron in 2003) already know that, even if a larger address space is the first and most obvious advantage of a 64-bit processor, there are ways (hacks, if you want) to make a 32-bit CPU address a larger memory space (see PAE), and in the long run the benefits of the architecture change run deeper than the theoretical ability to handle 16 exabytes of RAM.
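To put that address-space arithmetic in perspective: a 32-bit pointer can name 2^32 bytes (4 GB), while a 64-bit pointer can name 2^64 bytes (16 exabytes). Here is a minimal C sketch, purely our own illustration, that reports both figures for whatever machine it is compiled on:

```c
#include <stdio.h>

int main(void) {
    /* A pointer's width sets the theoretical address space:
       32 bits -> 2^32 bytes = 4 GiB
       64 bits -> 2^64 bytes = 16 EiB */
    size_t ptr_bits = sizeof(void *) * 8;
    printf("Pointer width: %zu bits\n", ptr_bits);
    if (ptr_bits >= 64)
        printf("Theoretical address space: 16 EiB (2^64 bytes)\n");
    else
        printf("Theoretical address space: %llu GiB\n",
               (unsigned long long)1 << (ptr_bits - 30));
    return 0;
}
```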

However, our purpose here is not to explore the pros and cons of 64-bit CPU design; that would require a rather lengthy and technical white paper. Instead, we want to share our theories on why Apple, a company lately known for betting on proven technologies when adding them to its flagship phone, decided to change its SoC architecture.

I want a V8 in my Prius.

The chances of getting an iPhone with more than 4 GB of RAM in the next couple of years are slimmer than seeing Toyota release a Prius with a twin-turbocharged 6.0L V8 under the hood, so it's very unlikely this was a compelling reason for Apple. As a matter of fact, the major reasons for adopting 64-bit architectures simply aren't found in mobile devices yet. Putting into context Phil Schiller's words while introducing the iPhone 5s, "this is the most forward-thinking phone we've ever created," we can speculate that this chip will be the foundation for upcoming products.

Maybe this time Mr. Schiller meant what he said and gave us a hint of what's coming next: "...forward-thinking..."

13" iPad + 13" MacBook Air = 13" Hybrid

Apple (read: Jobs) said at some point that 3.5" was the perfect screen size for smartphones and that 7" tablets were "dead on arrival," and yet here we have the iPhone 5/5c/5s and the iPad Mini. Last year Tim Cook said: "Anything can be forced to converge, but the problem is that products are about tradeoffs, and you begin to make tradeoffs to the point where what you have left at the end of the day doesn't please anyone. You can converge a toaster and refrigerator, but those things are probably not going to be pleasing to the user."

But these words don't mean that a hybrid tablet/laptop is something Apple would never make; rather, they mean that Apple won't do it until it can overcome those tradeoffs.

A recent article published by The Wall Street Journal mentioned the possibility of Apple testing thirteen-inch displays for tablets, and not-so-recent rumors have suggested a switch to the ARM architecture for its laptop lines. At first sight, the idea of dumping Intel for ARM processors might seem insane, especially considering the market segment targeted by the MacBook Pro, and even the MacBook Air; but a 64-bit ARM-based CPU brings Apple (and any manufacturer, for that matter) one step closer to being able to put one in a laptop or a tablet/laptop hybrid.

A 13" hybrid laptop would potentially include 4Gb of RAM or more, and it would make perfect sense as well: a base/docking station with a real keyboard and Intel CPU that runs Mac OS and a detachable touch screen that switches to iOS when taken away and runs on the new ARM architecture when working by itself.

Et tu, Brute?

Apple has made it clear enough that its mobile CPUs are custom designed. The purchases of PA Semi in 2008 and Intrinsity in 2010 provided it with the tools it might have lacked to do so, and starting with the A6 included in the iPhone 5 we have seen the best results of those transactions. The only problem (for Apple) is that those processors are not fully engineered in-house; instead, they are built on top of ARM's foundation. And Apple is far from ARM's only licensee: a quick look at the licensee list will make you notice the name of another 800-pound gorilla: Samsung.

Sooner or later somebody was going to implement the ARMv8 architecture, and that somebody would most likely have been Samsung. No company resides in a total void that shields it from being influenced by others' developments, designs, and mistakes, so it is quite possible that Apple jumped in so quickly precisely to grab yet another "first" title: first fingerprint reader, first 64-bit CPU in a mobile phone. As a matter of fact, a Samsung executive has apparently already confirmed that upcoming phones from the Korean manufacturer will also include 64-bit CPUs.

Even if Apple was pushed by the market, or by knowledge of its archrival's plans, it certainly won't hurt to be able to brag for a couple of years about being first to one more thing.

Make me one with all!

Since we are exploring theories here, let me shout out a scientific wild-ass guess: it is what I call (maybe somebody named it before; I don't want to claim copyright on it!) the Unified Field Theory of Computing Use.

It has been extensively discussed and almost unanimously agreed that one of Apple's strongest selling points is its ecosystem and how users are locked into it. It hardly matters how "old" (or at least not cutting-edge) the hardware may be; Apple still manages to offer a user experience superior to that offered by many other OEMs boasting the latest hardware technologies, which, frankly, often amount to mere experiments.

When the iPhone was launched, Apple aimed at making the mobile OS look like the desktop OS, but over the last three Mac OS iterations it has reversed the concept and is now making the traditional OS look, feel, and behave like iOS. Its hardware, software, and services grow more interconnected every day, and this tendency has spread to Microsoft's operating systems, services, and hardware; even Google is taking serious steps in this direction with Chrome OS and a strong push on Chromebooks. (You can read our review of the Acer C7 Chromebook here.)

What if Apple is gearing up for the moment when your mobile device, your watch, your multimedia/entertainment center, and your workstation will all be one? Theoretically, porting software developed for Mac OS to iOS should be easier with 64-bit APIs and libraries, even though Intel's architecture differs greatly from ARM's. I can't foresee any professional user ditching his dual-Xeon workstation in the next couple of years, but Adobe will probably be among the first to take advantage of those extra 32 bits, and I won't be surprised to see a full-fledged version of Photoshop, for iOS this time, running on a 2 GB iPad pretty soon. Your mobile device would then begin to morph into your workstation.
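To make that porting argument a bit more concrete: much of what breaks when code moves between 32-bit and 64-bit platforms is assumptions about type widths (a long is 32 bits under ILP32 on 32-bit ARM, but 64 bits under LP64 on 64-bit ARM and Intel). The sketch below is our own illustration of the usual remedy, fixed-width types, and is not taken from any Apple porting guide:

```c
#include <stdint.h>
#include <stdio.h>

/* Under ILP32 (32-bit ARM) 'long' is 32 bits; under LP64 (64-bit ARM, x86-64)
   it is 64 bits. Fixed-width types behave identically on both sides of a port. */
int main(void) {
    uint32_t pixel  = 0xFFAA5500u;      /* exactly 32 bits everywhere */
    int64_t  offset = -1;               /* exactly 64 bits everywhere */
    intptr_t addr   = (intptr_t)&pixel; /* always wide enough for a pointer */

    printf("long is %zu bits, intptr_t is %zu bits\n",
           sizeof(long) * 8, sizeof(addr) * 8);
    printf("offset = %lld, addr = %jd\n", (long long)offset, (intmax_t)addr);
    return 0;
}
```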

And certainly, following the line of the unifying theory, Apple's new chip will help reduce costs by allowing the same core to be used in the iPad and the iPad Mini, especially if the latter gets the awaited and much-needed upgrade to a Retina display.

How long is "forward-thinking"?

It is very hard to tell right now, but we estimate Apple will start reaping substantial benefits from the new CPU architecture somewhere between twelve and eighteen months from now. It is worth highlighting that what Apple is saying and what the press says Apple is saying are not exactly the same: it was never claimed at the September 10th event that the performance boost of the A7 over the A6 comes from its 64-"bitness"; rather, it comes from a combination of big improvements in the CPU, the GPU, drivers and software optimization, and the addition of ARMv8 instructions.
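As one concrete example of what those ARMv8 instructions add beyond mere "bitness": the architecture's optional Crypto extension collapses an entire AES round into two instructions, exposed to C through compiler intrinsics. The sketch below is our own illustration, assuming a toolchain built with ARMv8 crypto support (e.g. compiled with -march=armv8-a+crypto); Apple has published no such sample code for the A7:

```c
#include <arm_neon.h>   /* ARMv8 NEON/Crypto intrinsics */

/* One (non-final) AES encryption round on a 128-bit block:
   AESE = AddRoundKey + SubBytes + ShiftRows, AESMC = MixColumns.
   On a 32-bit ARMv7 core the same work costs dozens of plain instructions. */
uint8x16_t aes_round(uint8x16_t state, uint8x16_t round_key) {
    return vaesmcq_u8(vaeseq_u8(state, round_key));
}
```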

For the next few months, the 64-bit character of the A7 will probably be little more than marketing fluff for Apple, but the key word at the event was "can," and it was clear to us that the exciting aspect of the A7 is not what it does, but what it can do and where it leads.

 

Article was originally posted on September 19, 2013