You can’t miss the wealth of new artificial intelligence tools being pushed on us by the likes of Google, Microsoft, and Samsung. Now Apple is joining the party—though Apple prefers the term Apple Intelligence, because it wants you to think its AI is better than everyone else’s AI.
Many of these Apple Intelligence upgrades are heading to Siri, the digital assistant that’s been part of the iPhone experience since 2011 and the iPhone 4S. The extra AI is intended to make Siri easier to interact with and more useful.
Apple says these new Siri and Apple Intelligence features will be available in beta form when iOS 18 arrives in September, with the caveat that “some features” may take up to a year to arrive, so expect further updates over the course of 2025. We’ve rounded up all of the improvements Siri is getting below.
More control of apps
Siri in iOS 18 is getting its hooks deeper into the operating system. You’ll be able to rename documents in Pages, close tabs in Safari, apply enhancements in Photos, and flip between the front and rear cameras (plus much more), all by chatting to Siri.
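For developers, Apple says these deeper hooks build on its existing App Intents framework, which is how apps describe their actions to the system so Siri (and the Shortcuts app) can trigger them. Here’s a minimal, hypothetical sketch of what that looks like; the browser scenario, the TabManager class, and the tabName parameter are invented for illustration, not Apple’s actual Safari code:

```swift
import AppIntents

// Hypothetical stand-in for a browser app's own tab-handling code.
final class TabManager {
    static let shared = TabManager()
    func closeTab(named name: String) {
        print("Closing tab: \(name)")
    }
}

// Describes a "close tab" action the system can surface to Siri.
struct CloseTabIntent: AppIntent {
    static var title: LocalizedStringResource = "Close Tab"
    static var description = IntentDescription("Closes a named tab in the browser.")

    // Siri can fill this in from what you say ("close the recipes tab").
    @Parameter(title: "Tab Name")
    var tabName: String

    func perform() async throws -> some IntentResult {
        TabManager.shared.closeTab(named: tabName)
        return .result()
    }
}
```

The point of the framework is that apps declare what they can do once, and the system decides when a spoken request should map onto one of those actions.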
What’s more, Siri is getting a better understanding of what’s currently on the screen. This screen awareness means you can interact with apps in new ways: If you’re looking at an address card on your iPhone, you’ll be able to say “add this address to my contacts” and Siri will know what you’re talking about and what you want to do.
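Apple hasn’t published how Siri performs that contact-saving step, but conceptually it maps onto the existing Contacts framework. This rough sketch, with placeholder name and address values standing in for whatever is on screen, shows the kind of work the assistant would be doing on your behalf (a real app would also need the user’s Contacts permission):

```swift
import Contacts

// Illustrative only: roughly the Contacts-framework work behind
// "add this address to my contacts". Name and address are placeholders.
func addAddressToContacts() throws {
    let contact = CNMutableContact()
    contact.givenName = "Olivia"
    contact.familyName = "Example"

    let home = CNMutablePostalAddress()
    home.street = "1 Infinite Loop"
    home.city = "Cupertino"
    home.state = "CA"
    home.postalCode = "95014"
    contact.postalAddresses = [CNLabeledValue<CNPostalAddress>(label: CNLabelHome, value: home)]

    // Save the new contact to the default container.
    let request = CNSaveRequest()
    request.add(contact, toContainerWithIdentifier: nil)
    try CNContactStore().execute(request)
}
```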
You’ll even be able to ask Siri how to do something on your device, if you’re not sure. Examples given by Apple include asking how to schedule a text message, and how to switch from light mode to dark mode on your iPhone. The instructions you need pop up at the top of the screen.
Natural language and context
The updated Siri should be better able to understand you when you stumble over your words or don’t say something exactly right, just like a real person would. The assistant will also remember previous context in a conversation, so you can ask for the forecast for a specific location and then set a reminder for a trip there, without having to say the name of the place twice.
Siri is also going to know more about you: If you ask when mom’s flight is landing, Siri knows who mom is and can pull the necessary details from your emails, texts, or the internet. You could also use a command such as “show me all the photos of mom, Olivia, and me” to get Siri to pick out the relevant images from your media library.
This kind of context-awareness also extends to commands such as “play the podcast my wife sent me yesterday” or “pull up the files James shared with me last week”. Thanks to Siri’s enhanced understanding, you could potentially save a lot of the time you’d otherwise spend digging through files, folders, and message threads.
Typing with Siri and more
If you want to have a text conversation with Siri rather than speaking to it, you can already do this through an Accessibility feature in iOS—very handy for those times when you need to be quiet. With Siri’s AI upgrade, this feature will be much easier to access: A double tap on the navigation handle at the bottom of the screen will bring up the type-to-Siri interface, where you can use the same commands as you would with your voice.
Siri will also be linked to the other Apple Intelligence features heading to the iPhone, such as the ability to generate AI images, make custom emojis, or rewrite text in a specific tone. These features can be controlled through Siri, as well as through taps and swipes inside whatever app you’re using.
There are even more Siri enhancements on the way, including the option to control compatible robot vacuum cleaners using your voice, but those are the main ones announced so far that make use of Apple Intelligence models and tools. It’s the biggest Siri upgrade ever, and it could transform how you get stuff done on an Apple phone.