- What: iOS 27 "Extensions" API lets third-party AI apps (Claude, ChatGPT, Gemini) integrate directly with Siri
- When: Expected announcement at WWDC, June 8, 2026. Ships with iOS 27, iPadOS 27, macOS 27.
- Source: Bloomberg / Mark Gurman; not yet confirmed by Apple
- Default backbone: Google Gemini anchors native Siri intelligence under a reported $1 billion deal. Extensions are additive, not replacements.
- Revenue model: Apple takes its standard 30% App Store commission on AI subscriptions purchased through Extensions
- Privacy status: Routing paths vary by provider; each carries different data handling terms users may not anticipate
The headline writes itself: Apple, having spent two years struggling to ship a competent AI assistant, will now let you pick your own. ChatGPT, Google Gemini, Anthropic’s Claude — take your pick. Siri becomes a universal AI interface. The most closed consumer platform in tech suddenly plays nice with everyone.
That framing is not exactly wrong. But it skips the part that makes this architecturally interesting, and the part that makes it architecturally suspicious.
What the Extensions System Actually Does
The mechanism Bloomberg is describing — reportedly called Extensions — is an API framework that allows AI agents from installed third-party apps to connect with Siri and related Apple Intelligence features across iOS 27, iPadOS 27, and macOS 27. If you have Claude or Gemini installed from the App Store, you’ll be able to toggle them on in Apple Intelligence settings and route Siri queries their way.
That much is straightforward. What’s less clear from current reporting is exactly how query routing decisions get made. One version — the simpler one — is a global default setting: you pick a provider and Siri hands off complex queries to it. A more interesting version, reported by Reuters but unconfirmed, is per-task routing: individual requests could be directed to different providers based on what you’re asking. Ask a coding question, it goes to Claude. Ask a creative writing question, it goes somewhere else.
Per-task routing would be genuinely novel. It would also be genuinely complex to build in a way that doesn’t confuse users about what’s happening to their data, since each provider carries its own privacy policy and data handling terms.
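If per-task routing ships, the dispatch step might look something like the sketch below. Everything here is an assumption: Apple has published no API, so the provider names, the keyword heuristic, and the fall-through to the Gemini backbone are illustrative stand-ins drawn from the reporting, not a real interface.

```python
# Hypothetical per-task routing, per the unconfirmed Reuters description.
# Provider names and the keyword heuristic are invented for illustration.

GEMINI_BACKBONE = "gemini"  # native Siri default, per the reported Google deal

def route_query(query: str, enabled_extensions: set[str]) -> str:
    """Pick a provider for one request; unmatched queries stay on the default."""
    q = query.lower()
    if ("code" in q or "function" in q) and "claude" in enabled_extensions:
        return "claude"
    if ("write" in q or "story" in q) and "chatgpt" in enabled_extensions:
        return "chatgpt"
    # Everything else falls through to the Gemini-powered backbone.
    return GEMINI_BACKBONE
```

The privacy complication lives in that fall-through: two queries seconds apart can land on providers with different data handling terms, and nothing in the sketch (or, so far as the reporting shows, the interface) forces that difference into view.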
Security researchers had flagged the dual-pathway problem with the existing Siri/ChatGPT handoff well before the Extensions reporting surfaced: two systems answering under one interface, each with its own data handling terms. Add four or five more providers and the problem compounds.
The Gemini Situation
Here’s what most coverage is glossing over: Google Gemini is not just one option among many. According to reporting from multiple outlets, the native Siri intelligence in iOS 27 — the backbone that handles queries without invoking an Extension — runs on Gemini models. Apple reportedly pays Google $1 billion for this arrangement, with a version of Gemini processed within Apple’s Private Cloud Compute infrastructure.
This means the competitive playing field is not level. Gemini is the default. Extensions — including ChatGPT, which had held an exclusive arrangement since 2024 — are opt-in additions that require a user to have the relevant app installed and manually enable it. Third-party providers are competing against a model that is already embedded in the system by default.
The analogy Bloomberg Law draws is not subtle: this resembles the Google-Apple search deal, in which Google pays up to $20 billion annually to be the default search engine in Safari. The existence of alternative search engines in the settings doesn’t meaningfully dent Google’s traffic. Default placement in software systems is worth enormous amounts of money precisely because most users never change defaults.
Apple also collects its standard 30% App Store commission on AI subscriptions purchased through the Extensions marketplace. This creates a structural incentive to maximize the number of providers in the Extensions ecosystem — more competition, more subscriptions flowing through Apple’s payment rails.
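The arithmetic of that incentive is simple. Using a hypothetical $20/month subscription as the input (the price is illustrative, only the 30% rate comes from the reporting):

```python
# Split on a hypothetical $20/month AI subscription sold through Extensions
# at Apple's standard 30% commission. Prices in cents to avoid float rounding.

def commission_split(price_cents: int, rate_percent: int = 30) -> tuple[int, int]:
    """Return (apple_cents, provider_cents) for one billing period."""
    apple = price_cents * rate_percent // 100
    return apple, price_cents - apple
```

On that hypothetical $20 subscription, Apple keeps $6 of every month's payment without running the model that does the work.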
What This Means for Privacy
Apple’s privacy framing for all of this relies on a distinction between on-device processing and cloud routing. Simple, routine tasks are handled by Apple’s Foundation Models framework on the device itself, without network transmission. Complex queries — the ones that actually benefit from large language models — route to cloud infrastructure.
When a query goes to a third-party provider through Extensions, it routes through Apple’s Private Cloud Compute servers first, then to the provider’s infrastructure. Apple characterizes this as maintaining privacy through architectural controls: the query is processed and discarded, not logged.
The Lumia Security research, presented at Black Hat 2025 before this Extensions announcement, found that even the current system has data transmission behaviors that aren’t obvious to users. Siri automatically scans for relevant installed apps in response to voice commands and reports that inventory to Apple servers. Location data accompanies every request. Audio metadata — what you’re listening to — is transmitted without explicit user visibility. Dictated WhatsApp messages pass through Apple servers in a way that researchers argue undermines the messaging app’s end-to-end encryption claims.
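To make the gap between user expectation and observed transmission concrete, here is a toy model of a request envelope carrying the categories the researchers describe. The field names are invented; only the categories (app inventory, location, audio metadata) come from the reported findings.

```python
# Toy model of the request metadata the Lumia Security researchers say
# accompanies Siri queries. Field names are hypothetical; the categories
# are taken from the Black Hat 2025 findings described above.

def unexpected_fields(envelope: dict) -> list[str]:
    """List transmitted categories a user would plausibly not expect."""
    found = []
    if envelope.get("installed_relevant_apps"):
        found.append("app inventory")
    if envelope.get("location") is not None:
        found.append("location")
    if envelope.get("now_playing") is not None:
        found.append("audio metadata")
    return found
```

A user dictating one message would expect the transcript to leave the device; under the researchers' account, the inventory scan and location ride along with it.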
Apple disputed those findings, attributing some behaviors to third-party SiriKit integrations rather than Apple Intelligence itself. The dispute is unresolved. Adding more cloud-routing pathways to more providers doesn’t obviously make any of this simpler.
Why Apple Is Doing This
Two readings, both probably true in some proportion.
One: Siri has been genuinely embarrassing relative to the competition for years. The gap between what Apple promised at WWDC 2024 and what shipped is significant. Opening to external providers is a way to deliver capability Apple can’t build itself fast enough while it continues developing its internal models. Extensions let Apple say “yes, Siri can do that” even when the answer involves handing the query to Anthropic.
Two: This is a distribution play. Apple controls 1.5 billion active devices. Any AI provider that wants access to that install base at scale now needs to be in Apple’s Extensions marketplace, playing by Apple’s rules, paying Apple’s commission, and competing against a Google-powered default they can’t displace. The apparent openness creates a new chokepoint rather than eliminating one.
The Extensions system is a real architectural change that will give users meaningfully more control over which AI model handles their queries. That's a good thing. The framing of Apple "opening up" Siri deserves more scrutiny than it's getting.
Gemini is the backbone. Third parties compete from a disadvantaged default position. Apple collects 30% on the subscriptions. The privacy picture — already complicated with one cloud provider — gets more complicated with five. None of this makes Extensions bad. It makes it a business strategy that happens to give users more options, not a principled commitment to openness.
The thing worth watching at WWDC in June is whether Apple publishes the criteria for Extensions participation and the technical specification for how data flows through Private Cloud Compute to third-party providers. Without that, "we protect your privacy" is a marketing claim, not a verifiable one.