Siriously Delayed
The most compelling Apple Intelligence features could be delayed into 2026. Is there anything Apple can do to salvage its AI efforts?
In a statement to John Gruber on Friday, Apple admitted that Siri and Apple Intelligence's most anticipated features, including contextual understanding and cross-application actions, are "going to take us longer than we thought." Initially expected in the next iOS/macOS update, these capabilities will now roll out gradually over the coming year.
That effectively adds up to an 18-month delay for Apple Intelligence. And while some features under that umbrella (such as Clean Up, Genmoji, Image Playground, and summarization) have started rolling out, the most compelling capabilities will now be pushed back to iOS 19 and beyond.
Among these delayed features is Siri's ability to handle complex, multi-app requests, such as searching your email, reminders, and messages to answer questions like: "What's the best route to take to make my flight on time?", "What was the name of that restaurant Mark recommended?", or "What's Jamie's dog's name again?" As Gruber points out, these requests are simple to state but demand complex computational gymnastics. Taking the first example, your device needs to know: 1. where you are, 2. where the airport is, 3. which flight you're referring to, and 4. when that flight is scheduled to depart. Every one of those steps draws on deeply personal information scattered across multiple apps, and the answer has to be delivered in a way that's snappy, energy-efficient, and very reliable.
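To make the gymnastics concrete, here's a purely illustrative sketch of how that flight question decomposes, written in Swift. Every type and function in it is hypothetical; it says nothing about how Apple actually builds this, only how many distinct sources of personal context one "simple" answer needs.

```swift
import Foundation

// Hypothetical sketch of "What's the best route to make my flight on time?"
// None of these types or functions are real Apple APIs.

struct Flight {
    let number: String
    let departureAirport: String
    let departureTime: Date
}

struct Route {
    let summary: String
    let travelTime: TimeInterval
}

// 1. Where you are — would come from on-device location services.
func currentLocation() -> String { "Home" }

// 2 & 3. Which flight you mean — would require searching Mail, Calendar,
// and Wallet for a confirmation email or boarding pass.
func upcomingFlight() -> Flight? {
    Flight(number: "AC123",
           departureAirport: "YYZ",
           departureTime: Date().addingTimeInterval(3 * 60 * 60))
}

// 4. How to get there — would come from a mapping/routing service.
func bestRoute(from origin: String, to airport: String) -> Route {
    Route(summary: "Take the highway, exit at Airport Rd", travelTime: 45 * 60)
}

// Stitch the personal context from all of those sources into one answer.
func answerFlightQuestion() -> String {
    guard let flight = upcomingFlight() else {
        return "I couldn't find an upcoming flight."
    }
    let route = bestRoute(from: currentLocation(), to: flight.departureAirport)
    let leaveBy = flight.departureTime
        .addingTimeInterval(-(route.travelTime + 90 * 60)) // 90-minute buffer
    return "\(route.summary). Leave by \(leaveBy) to make \(flight.number)."
}

print(answerFlightQuestion())
```

Even this toy version touches location, mail, calendar, and maps before it can say a single word, and the real thing has to do all of that on-device, privately, and fast.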
But from the moment it was announced, Apple Intelligence has felt rushed to market, suggesting that Apple was unprepared for the sector's explosive growth and for how quickly competitors would integrate these features into their products. From the outside looking in, Apple appeared so focused on advancing its silicon, launching Vision Pro, and figuring out what to do with the now-cancelled Project Titan car effort that the talent and resources to devote to AI simply weren't there.
Apple's saving grace, for now, is that many users remain oblivious to the full potential of AI in their daily lives, and that Friday's announcement will create the most frustration among those deeply embedded in Apple's ecosystem who are also on the cutting edge of technology.
I mention that last point because, for many of my friends and family, AI is either 1) a non-issue or 2) simply synonymous with asking ChatGPT stuff. But these features will trickle down fast, and soon, manually combing through a dozen emails to find that one PDF, or piecing together who said they'd bring what to the potluck from several messages, will feel prehistoric next to what other platforms can do. And at that point, if Apple is still shipping a version of Siri that struggles to set multiple timers, they're f***ed.
So while Apple has the tiniest bit of wiggle room to figure out what the heck Apple Intelligence is/can/should be, here are three exceptionally uninformed thoughts about what it might do in the meantime:
Increase Integration with Other AI Providers
My first move would be to integrate more deeply with AI services like ChatGPT, DeepSeek, Copilot, or Gemini. Right now, Apple makes handing a query off to ChatGPT a very stilted experience. But if there's a way to enable more seamless query handling, it could significantly improve the experience and user satisfaction, and placate most users until Apple can roll out the version of Siri it intended to build.
Misdirection
Apple might be able to fool... I mean... misdirect users for a time by focusing on the handful of AI features that matter most to people. Maybe most people do want a conversational Siri, or perhaps they want excellent summarization tools or the ability to know which grocery store has the best prices on eggs. If Apple can deliver the three or four features that make the most meaningful improvement for most users, it can prioritize quality of experience over breadth of AI capabilities.
Something Unique
Apple is racing to get on par with other AI services while balancing its privacy and security values. That duality weighs less on Amazon or Google, which are happy to slurp up your data and, as a result, have quickly upped their AI game. While Apple may struggle to match competitors in raw AI capability, it has the opportunity to develop AI in a way that feels uniquely Apple. Think of the Dynamic Island: instead of chasing the smallest camera cutout, Apple transformed its notch into a polished, functional UI feature. Apple's AI doesn't need to outmatch others in sheer power; it just needs to be more compelling. It might be a controversial example, but Genmoji is a small taste of that: people love emoji, so the ability to create images in that style for their specific situations is a unique and compelling use of AI.
In Sum
Apple's not in a great place with AI, but it's not curtains yet. So far, Apple Intelligence reminds me of hockey, when a team forgets its identity and starts playing a poorer version of its opponent's game. To remain somewhat competitive, Apple should focus on integrating and partnering with more AI services to give people that option, while also identifying the three or four AI-powered features that provide the most real-world value. Most importantly, Apple needs to figure out how it can provide experiences that "only Apple could deliver," something I haven't seen up to this point.