Another September…another suite of new AirPods, iPhones and Watches from Apple. Don’t get me wrong: in a world rife with impermanence, there’s something comforting about predictability, no matter how boring it might also be. And at this Tuesday’s event, the nexus was the most predictable (albeit simultaneously impactful) announcement of all: eleven years after unveiling the proprietary Lightning connector for its various mobile devices, replacing the initial and equally proprietary 30-pin dock connector, the transition to Lightning’s successor has now also begun. This time, though, the heir isn’t proprietary. It’s USB-C.
The switch to USB-C isn’t even remotely a surprise, as I said. The only question in my mind was when it’d start, and now another question has taken its place: how long will it take to complete? After all, more than five years ago the European Union (EU) started making grumbling noises about whether it should standardize charger connections. A bit more than four years later, last October to be exact, the EU followed through on its threat, mandating USB-C usage by the end of 2024. Later that month, Apple publicly acquiesced, admitting that it had no choice but to comply.
With today’s iPhone 15, 15 Plus, 15 Pro and 15 Pro Max, and a correspondingly updated charging case for the tweaked 2nd-gen AirPods Pro, the transition to USB-C has started in earnest. And as usual, the interesting bits (or if you prefer, the devils) are in the details. Since the iPhone 15 and 15 Plus are based on last year’s A16 Bionic SoC, the brains of 2022’s iPhone 14 Pro and 14 Pro Max, they “only” run USB-C at Lightning-compatible USB 2.0 speeds (recall that the connector form factor, USB-A or USB-C for example, and the bus bandwidth, 480 Mbps for USB 2.0 versus 5-or-higher Gbps for USB 3.x, are inherently distinct, although they’re often implementation-linked). This year’s A17 Pro (hold that thought) SoC, conversely, contains a full USB 3 controller.
The higher bandwidth potential of the new wired bus generation is particularly resonant for anyone who’s tried transferring long-duration 4K video off a smartphone using comparatively slothlike USB 2/Lightning or Wi-Fi. And Power Delivery (PD) support (assuming it actually works as intended) will be great for passing higher charging voltage-and-current payloads to the phone; the iPhone 15 series implementation is bidirectional, actually, enabling the phone’s battery to even bump up the charge on an Apple Watch or set of AirPods in a pinch. But I was curious to see what exact form this new bus would take, among other reasons due to the system complications it might create. Pre-event rumors had indicated that Apple might have instead branded it as “Thunderbolt 4” which, if true, would have offered the broadest system compatibility: with TB4 and TB3, as well as with TB2 and original Thunderbolt via adapters, and with USB-C and USB generational precursors.
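To put that “slothlike” comparison in rough numbers, here’s a back-of-the-envelope Python sketch. The 60 GB clip size is an assumption chosen for illustration (high-bitrate 4K capture can easily reach that over a ten-minute recording), and the 480 Mbps and 5 Gbps figures are raw signaling rates, so these are best-case estimates; real-world throughput runs lower:

```python
# Best-case transfer-time comparison across the two bus generations.
# 480 Mbps = USB 2.0 signaling rate; 5 Gbps = USB 3.x "Gen 1" rate.

def transfer_seconds(file_gbytes: float, bus_mbps: float) -> float:
    """Best-case seconds to move file_gbytes at a raw rate of bus_mbps."""
    file_bits = file_gbytes * 8 * 1000**3  # decimal gigabytes, 8 bits/byte
    return file_bits / (bus_mbps * 1000**2)

clip_gb = 60  # assumed size of a long 4K clip, for illustration only
print(f"USB 2.0: {transfer_seconds(clip_gb, 480) / 60:.1f} min")   # ~16.7 min
print(f"USB 3.x: {transfer_seconds(clip_gb, 5000) / 60:.1f} min")  # ~1.6 min
```

Even this idealized math makes the order-of-magnitude difference obvious, which is exactly why videographers will appreciate the new bus.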
Here’s the thing with USB-C: Apple still supports (although it doesn’t still sell) plenty of Intel-based systems containing only Thunderbolt 3 ports. And as my own past documented experiences exemplify, USB-C and Thunderbolt 3 aren’t guaranteed to interoperate, in spite of their connector commonality. Intel, for example, sold two different generations of TB3 controllers: “Alpine Ridge” (the chipset in my CalDigit TS3 Plus dock, for example, along with several other TB3 docks and hubs I own) is Thunderbolt-only, while the “Titan Ridge” successor also interoperates with USB-C devices (I plan to elaborate on these differences, along with the additional existing and future enhancements supported by Thunderbolt 4 and just-announced Thunderbolt 5, in an upcoming focused-topic post). If the A17 Pro SoC is really USB-C only, Apple will be facing a notable support burden (albeit one decreasing over time, since all newer Apple Silicon-based systems support Thunderbolt 4, and therefore also USB-C). That’s why I suspect that although Apple’s marketeers are calling the connector “USB-C” for simplicity’s sake, it’s also Thunderbolt-interoperable.
A few more notes here: Apple’s dropping sales of its Lightning-based MagSafe wireless charging accessories, a curious move considering they still work with still-sold iPhone 14 and 13 variants (RIP iPhone 14 Pro models, along with the iPhone 13 mini). And if you still want to use your Lightning-based charger or other accessory, Apple will happily sell you an overpriced USB-C adapter for it. Bus fixations now satiated, let’s broaden the view and see what else Apple announced this week.
The iPhone 15 family
You already know about the A16 Bionic SoC from last year’s coverage. And you already know about the A17 Pro SoC’s USB controller enhancements. But there’s much more to talk about, of course, beginning with the package-integrated RAM boost from 6 GBytes to 8 GBytes. Last year’s A16 Bionic was Apple’s first chip fabricated on foundry partner TSMC’s 4 nm process. This year, with the A17 Pro, it’s TSMC’s successor 3 nm process, with a commensurate increase in the available transistor budget (from 16 to 19 billion), which Apple has leveraged in various ways:
Performance- and power consumption-enhanced microarchitecture CPU cores, albeit with the same counts (2 performance, 4 efficiency) as before
An improved neural engine for deep learning inference, claimed up to twice as fast as before, but again with the same core count (16) as before
A six-core graphics accelerator with a redesigned shader architecture, claimed capable of up to 20% higher peak performance than before, derived in part from new hardware-accelerated ray tracing support, and
Enhanced video and display controllers, now capable of hardware-decoding the AV1 codec (among other things).
About that first-time “Pro” branding for the new SoC…on Monday, Daring Fireball’s John Gruber published an as-usual excellent pre-event summary of how Apple has historically transitioned its smartphone product line each year, and how it’s more recently tweaked the cadence in the era of the “Pro” smartphone tier. Although Apple has previously enhanced its smartphone SoCs’ core counts and other features to come up with iPad variants—from the A12 SoC to the A12X and A12Z, for example—this is the first time I can recall that the company has custom-branded (and high-end branded, to boot: usually you start with a defeatured variant to maximize initial chip yield) a SoC out of the chute. At least two options going forward that I can see:
Perhaps next year’s iPhone 16 and 16 Plus will be based on a neutered non-Pro variant of the A17, or
Maybe they’re saving the non-Pro version for the next-gen iPhone SE?
The iPhone 15 and 15 Plus inherit the processing-related enhancements present in last year’s iPhone 14 Pro and Pro Max, reflective of their SoC commonality.
Apple has also “ditched the notch” previously required to integrate the iPhone 14 and 14 Plus front camera into the display, instead going with the software-generated and sensor-obscuring Dynamic Island toward the top of the display. Speaking of displays, reflective of OLED’s ongoing improvements (and LCD’s ongoing struggle to remain relevant against them), these are capable of up to 2000 nits of brightness when used outdoors. And, speaking of cameras, there are still two rear ones, “main” and “ultra-wide”, the latter still 12 Mpixel in resolution. The former has gotten attention, however; it uses a 48 Mpixel “quad pixel” sensor in combination with computational photography to implement image stabilization and other capabilities, outputting 24 Mpixel images. It also supports not only standard but also 2x cropped-but-not-interpolated telephoto modes, the latter generating 12 Mpixel pictures.
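The various megapixel figures in that last paragraph all follow from simple arithmetic on a quad-pixel sensor. Here’s a short Python sketch; the 8064 × 6048 sensor dimensions are illustrative values chosen to yield a 4:3 sensor of roughly 48 Mpixels, not Apple’s published geometry:

```python
# Back-of-the-envelope pixel arithmetic for a quad-pixel main sensor.
# Dimensions below are illustrative (~48 Mpixels, 4:3); Apple's exact
# sensor geometry may differ.

SENSOR_W, SENSOR_H = 8064, 6048

def mpix(w: int, h: int) -> float:
    return w * h / 1e6

# 2x2 binning: each output pixel combines a 2x2 "quad" of native pixels,
# trading resolution for light-gathering ability.
binned_w, binned_h = SENSOR_W // 2, SENSOR_H // 2
print(f"Full readout: {mpix(SENSOR_W, SENSOR_H):.1f} Mpix")  # ~48.8
print(f"2x2 binned:   {mpix(binned_w, binned_h):.1f} Mpix")  # ~12.2

# 2x "telephoto": crop the central half-width, half-height region and
# read it out at native resolution -- cropped, but not interpolated.
crop_w, crop_h = SENSOR_W // 2, SENSOR_H // 2
print(f"2x crop:      {mpix(crop_w, crop_h):.1f} Mpix")      # ~12.2
```

Note that the 24 Mpixel default output sits between the two: it’s computationally merged from the binned and full-resolution readouts rather than read directly off the sensor.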
Now for the iPhone 15 Pro and Pro Max (again, above and beyond the SoC and RAM updates already covered). First, they switch from stainless steel to lighter-weight titanium-plus-aluminum combo frames:
They incorporate a similar 48 Mpixel main camera as their non-Pro siblings, albeit with slightly larger pixel dimensions for improved low light performance, three focal length options, and the option to capture images in full 48 Mpixel resolution. And, as before, there’s a dedicated 12 Mpixel ultra-wide camera. This time, however, instead of the main camera doing double-duty for telephoto purposes, there’s (again, as with the iPhone 14 Pro generation) a dedicated third 12 Mpixel telephoto camera, this time with 3x optical zoom range in the standard “Pro” and 5x in the “Pro” Max, the latter stretching to a 120 mm focal length. A complicated multi-prism structure enables squeezing this optical feat into a svelte smartphone form factor:
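The “3x” and “5x” zoom figures are multiples of the main camera’s 35 mm-equivalent focal length. Taking 24 mm as that baseline (an assumption on my part, but one consistent with the 120 mm figure quoted above), the arithmetic works out as follows:

```python
# Converting "Nx optical zoom" marketing figures into 35 mm-equivalent
# focal lengths. The 24 mm main-camera baseline is an assumption,
# consistent with the 120 mm Pro Max figure cited in the text.

MAIN_EQUIV_MM = 24

def tele_equiv_mm(zoom_factor: float) -> float:
    """35 mm-equivalent focal length for a given zoom multiple."""
    return MAIN_EQUIV_MM * zoom_factor

print(f"iPhone 15 Pro (3x):     {tele_equiv_mm(3):.0f} mm")  # 72 mm
print(f"iPhone 15 Pro Max (5x): {tele_equiv_mm(5):.0f} mm")  # 120 mm
```

A 120 mm-equivalent lens would normally need far more physical depth than a phone allows, hence the folded multi-prism optical path.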
Last, but not least, the previous single-function switch on the side has been swapped out for a multi-function “action” button. Here’s the summary:
Apple Watch Series 9 and Ultra 2
Although Apple claimed via its naming that the SoCs in the Apple Watch Series 6 (using the S6 chip), 7 (S7) and 8 (S8) were different, a lingering rumor (backed up by Wikipedia specs) claimed the contrary: that they were actually the same sliver of silicon (based on the A13 Bionic SoC found in the iPhone 11 series), differentiated only by topside package mark differences, and that Apple focused its watch family evolution efforts instead on display, chassis, interface and other enhancements.
Whether or not previous-generation SoC speculations were true, we definitely have a new chip inside both the Series 9 and Ultra 2 this time. It’s the S9, comprising 5.6 billion transistors that, among other things, implement a 30% faster GPU and a 4-core neural engine with machine learning (ML) processing twice as fast as before. The benefits of the GPU—faster on-display animation updates, particularly for high-res screens—are likely already obvious to you. The deep learning inference improvements, while perhaps more obscure at first glance, are IMHO more compelling in their potential.
For one thing, as I’ve discussed in the past, doing deep learning “work” as far out on the “edge” as possible (alternatively stated, as close to the input data being fed to the ML model as possible) is beneficial in several notable ways: it minimizes the processing latency that would otherwise accrue from sending that data elsewhere (to a tethered smartphone, for example, or a “cloud” server) for processing, and it affords ongoing functionality even in the absence of a “tether”. As Apple mentioned on Tuesday, one key way that the company is leveraging the beefed-up on-watch processing capabilities is to locally run Siri inference tasks on voice inputs, allowing for direct health data access right from the watch, for example. Another example is the “sensor fusion” merge of data from the watch’s accelerometer, gyro, and optical heart rate sensor to implement the new “Double tap” gesture that requires no interaction with the touchscreen display whatsoever:
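To make the sensor-fusion idea concrete, here’s a purely illustrative toy sketch. Nothing in it reflects Apple’s actual algorithm, which runs an ML model against multiple sensor streams on the S9’s neural engine; this simplistic version just flags two short accelerometer spikes within a time window as a “double tap”:

```python
# Toy "double tap" detector: NOT Apple's algorithm, just a sketch of
# the general idea of inferring a gesture from motion-sensor data.

from dataclasses import dataclass

@dataclass
class Sample:
    t: float          # timestamp, seconds
    accel_mag: float  # gravity-removed acceleration magnitude, m/s^2

def detect_double_tap(samples, threshold=8.0, max_gap=0.5):
    """True if two above-threshold spikes occur within max_gap seconds."""
    spike_times = [s.t for s in samples if s.accel_mag > threshold]
    return any(t2 - t1 <= max_gap
               for t1, t2 in zip(spike_times, spike_times[1:]))

taps = [Sample(0.00, 1.0), Sample(0.10, 9.5),
        Sample(0.25, 1.2), Sample(0.35, 10.1)]
print(detect_double_tap(taps))  # two spikes 0.25 s apart -> True
```

A real implementation would fuse gyro and optical heart rate data too, precisely because a single accelerometer threshold generates far too many false positives in everyday wrist motion; that disambiguation is where the ML model earns its keep.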
Reminiscent of my earlier comments about OLED advancements, the Series 9 display is twice as bright (2000 nits) as the one in Series 8 predecessors, and it drops down as low as 1 nit for use in dimly lit settings.
The one in the Ultra 2 is even brighter, 3000 nits max to be precise:
And both watches, as well as the entire iPhone 15 family, come with a second-generation ultra-wideband (UWB) transceiver IC for even more accurate location of AirPods-stuck-in-sofa-cushions and other compatible devices. Speaking of AirPods…
Second-gen (plus) AirPods Pro
As previously mentioned, the charging case for the second-generation AirPods Pro earbuds now supports USB-C instead of Lightning.
Curiously, however, Apple doesn’t currently plan to sell the case standalone for use by existing AirPods Pro 2nd-gen owners. The company has also tweaked the design of the earbuds themselves, for improved dust resistance and lossless audio playback compatibility with the upcoming Vision Pro extended-reality headset. Why, I wonder, didn’t Apple call them the AirPods Pro 2nd Generation SE? (I jest…sorta…)
The rest of the story
There’s more that I could write about, including Apple’s (but not third parties’) purge of leather cases, watch bands and the like, its carbon-neutral and broader “green” aspirations, and the well-intentioned but cringe-worthy sappy video that accompanied their rollout. But having just passed through the 2,000-word threshold, and mindful of both Aalyia’s wrath (again I jest…totally this time) and her desire for timely publication of my prose, I’ll wrap up here. I encourage and await your thoughts in the comments!
—Brian Dipert is the Editor-in-Chief of the Edge AI and Vision Alliance, and a Senior Analyst at BDTI and Editor-in-Chief of InsideDSP, the company’s online newsletter.