About 72 minutes into the annual iPhone launch event, Apple senior vice president of marketing Phil Schiller invited Sri Santhanam to come onstage and talk about the brand-new A13 Bionic chip found inside all three of the new phones. The slight and shy Santhanam, Apple's vice president of silicon engineering, then spoke for four minutes. In many ways, they were the four most important minutes of the entire event. Not that anyone noticed—the audience was seduced by the shiny new iPhones, the three-camera system, the magical Night Mode, the impressive video capabilities and, most importantly, the boost in battery life.

By the time Santhanam was done talking, all I could think of were the numbers. Apple's new chip contains 8.5 billion transistors. There are six CPU cores: two high-performance cores running at 2.66 GHz (called Lightning) and four efficiency cores (called Thunder). It has a quad-core graphics processor, an Apple-designed image processor, and an octa-core neural engine for machine intelligence functions that can run over five trillion operations per second. (The LTE modem, by contrast, remains a separate chip.)

This new chip is smarter, faster, and beefier, and yet it somehow manages to consume less power than its predecessor. It's about 30 percent more efficient than last year's A12 chip, one of the factors behind the up to five extra hours of battery life in the new iPhones.

The launch of the iPhone 11 Pro and its siblings only reaffirms that Apple's real advantage over its competitors comes from owning the entire vertical stack: the software, the system hardware, and the chip design. You can see the benefits of these gains in the iPhone’s feature set, from its augmented reality capabilities to its computational photography modes like Deep Fusion and Night Mode.

"One of the biggest examples of the benefits of the performance increase this year is the text to speech," Schiller said when we sat down to talk about A13 Bionic and its capabilities. "We've enhanced our iOS 13 text-to-speech capabilities such that there is much more natural language processing, and that's all done with machine learning and the neural engine."

Clock Cycles

Apple has come a long way from the launch of the original iPhone in 2007. That first handset was slow and unable to perform even the most basic tasks like copying and pasting text. It had terrible battery life. Its camera would make a supermodel look like the Bride of Frankenstein. Multitasking was almost nonexistent in the original iPhone, which was powered by a chip that ran at 412 MHz. The handset was pieced together from components that included a chip used in Samsung DVD players. It was hard to imagine that such a device could one day upend the entire idea of phones, computing, and communication.

It quickly became apparent to Apple that it would need to build the entire stack—soup to nuts—if it wanted to stay ahead of its competitors, especially those in the Android ecosystem. Apple’s decision to design and build its own silicon was made sometime in 2008. At the time, the company had a mere 40 engineers working on integrating chips from an assortment of vendors. Then, in April of 2008, Apple bought a chip startup called P.A. Semi for $278 million. That increased the total number of chip engineers to about 150 and brought home expertise on what matters most on a phone: power efficiency. The fruits of this group’s labor were first revealed to the world in the original iPad and the iPhone 4. Those devices were powered by a processor named A4, which was a modified version of a chip design from ARM Holdings. The A4’s primary focus was to make the Retina displays shine.