I tapped Siri and got the old silence—then a source on the phone whispered a date: June 8. The room tightened; if that’s right, this WWDC won’t be the same quiet developer show you expect. You and I are about to watch Apple try to put its AI assistant into nearly every corner of iOS.
I’ve read Mark Gurman’s Bloomberg report and talked to people who track these builds. What follows is a clear reading of the leaks and what they mean for how you’ll actually use your iPhone: more chat-like conversations, a separate Siri app, and tiny visual hints that swell into full answers.
Siri will appear as both a quick utility and a standalone app
A colleague pulled up Siri the way they always do—press and hold—and watched a tiny indicator animate. The leak describes two distinct experiences: the classic quick Siri utility you invoke with a press or “Hey Siri,” and a richer, dedicated Siri app that stores conversations.
Gurman’s sources say that the app will show prior chats in a list or a grid of rounded rectangles with text previews. Think threaded back-and-forth bubbles that resemble Messages, but built around an assistant that can accept voice and text. For you, that means past queries could be browsable, searchable, and resumable—no more starting from scratch every time.
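To make the leaked description concrete, here is a minimal sketch of how an app like that might model its conversations. Everything below is my own illustration, not Apple code: the type names (`SiriConversation`, `ConversationStore`) and methods are invented, and the only constraints I'm honoring are the ones in the report itself: threaded chats, text previews, and histories that are browsable, searchable, and resumable.

```swift
import Foundation

// One bubble in a threaded back-and-forth, Messages-style.
struct SiriMessage {
    let isUser: Bool
    let text: String
}

struct SiriConversation {
    let id: UUID = UUID()
    var messages: [SiriMessage]

    // The leaked grid of rounded rectangles reportedly shows a text
    // preview per chat; here that's just the opening query.
    var preview: String {
        messages.first?.text ?? ""
    }
}

struct ConversationStore {
    private(set) var conversations: [SiriConversation] = []

    mutating func start(with firstQuery: String) -> UUID {
        let convo = SiriConversation(
            messages: [SiriMessage(isUser: true, text: firstQuery)])
        conversations.append(convo)
        return convo.id
    }

    // "Resumable": append to an existing thread instead of starting fresh.
    mutating func resume(_ id: UUID, with text: String, fromUser: Bool) {
        guard let idx = conversations.firstIndex(where: { $0.id == id })
        else { return }
        conversations[idx].messages.append(
            SiriMessage(isUser: fromUser, text: text))
    }

    // "Searchable": a simple case-insensitive match across message text.
    func search(_ query: String) -> [SiriConversation] {
        conversations.filter { convo in
            convo.messages.contains {
                $0.text.localizedCaseInsensitiveContains(query)
            }
        }
    }
}
```

The interesting design decision, if the leak is accurate, is exactly this separation: the quick press-and-hold utility would write into the same store the standalone app reads from, so a query you fired off in passing becomes a thread you can reopen later.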
When will the new Siri be released?
Expect an announcement at WWDC on June 8, when Apple plans to show iOS 27. But several personalization features, the ones that would let Siri read what’s on your screen or pull from personal data, are likely to arrive later, possibly around September, according to the same reporting.
The interface may begin tiny and expand into richer responses
At a demo, someone triggered Siri and saw a little pill-shaped “Searching” marker before the response grew down the screen. That description matches a flow where Siri first hides in the Dynamic Island on higher-end iPhones, then expands into a Liquid Glass panel when it has an answer.
Gurman’s accounts include small UX details—a glowing Siri icon, a “Searching” label, a downward expansion—that signal Apple is trying to make interruptions smaller and the assistant feel less jarring. The same build reportedly offers a “Write with Siri” prompt above the keyboard and an “Ask Siri” menu inside stock apps, letting you send selected content into a fresh conversation.
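Those UX details read like a small state machine: hidden, then a compact glowing indicator, then the “Searching” pill, then the downward expansion into a panel. Here is a speculative sketch of that flow. The states and transitions are my guesses from the report; none of this is an Apple API, and the names are invented.

```swift
// Illustrative model of the leaked presentation flow. States follow the
// reported sequence: tucked into the Dynamic Island, then a pill-shaped
// "Searching" marker, then a Liquid Glass panel grown down the screen.
enum SiriPresentation: Equatable {
    case hidden
    case compactIndicator          // glowing Siri icon in the Dynamic Island
    case searching                 // pill-shaped "Searching" label
    case expanded(answer: String)  // full response panel
}

struct SiriSession {
    private(set) var state: SiriPresentation = .hidden

    mutating func invoke() { state = .compactIndicator }
    mutating func beginQuery() { state = .searching }

    // Only a session that is still searching expands; an answer that
    // arrives after dismissal is ignored, which is what keeps the
    // interruption small.
    mutating func deliver(answer: String) {
        if state == .searching { state = .expanded(answer: answer) }
    }

    mutating func dismiss() { state = .hidden }
}
```

The point of modeling it this way is the one Gurman’s details hint at: the expensive, attention-grabbing state is entered last, and only when there is actually something to show.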
How will Apple integrate AI into iOS 27?
Integration appears to be both horizontal and vertical: horizontal in that Siri-like features could show up in keyboards, apps, and the Dynamic Island; vertical in that there will be a single app collecting conversations and context. If you use Google products, you’ve seen a parallel in Gemini’s presence across Search, Gmail, and Docs—Apple seems to be exploring a similar ubiquity.
Personalization features are arriving more slowly than the UI
At a meeting, someone shrugged when I asked whether Siri would automatically pull personal files or read the screen—most of the insiders expect that to come later. Multiple sources told Gurman that screen awareness and deeper data access won’t land at launch.
Apple’s conservative timeline matters because it separates two things: the polished visual and conversational surface, and the privacy-sensitive plumbing that gives Siri personal memory and context. The first may show at WWDC; the second may not hit until autumn, after more testing.
Compare this to Google’s model: Gemini spread across apps quickly, and users got immediate utility. Apple appears to be moving more deliberately, shipping the chat layers now and saving the personal awareness features for a later update—an approach that could frustrate you if you want instant, Gmail-like integration, or comfort you if you prefer slower rollout of personal data features.
Siri’s presence across iOS could end up creeping through the system like ivy on a brick wall: in the keyboard, in stock apps, in the Dynamic Island, another tendril everywhere you look. The real question is whether that pervasiveness makes your phone smarter, or just noisier. Are you ready for Siri in every tap?