The launch of the iPhone 16 has been one of the more unusual smartphone releases I’ve witnessed from Apple in nearly a decade of covering technology. A significant portion of the device’s new capabilities revolves around “Apple Intelligence,” a collection of smart features driven by artificial intelligence. However, these features are not available right out of the box; Apple plans to activate them with a software update in October.
Throughout my time testing the iPhone 16 Pro and iPhone 16 Pro Max, I used a developer beta of iOS 18.1, which already includes several of these Apple Intelligence features (though not all). Based on my experience, I feel confident in predicting how the phone will perform once these features officially arrive in October. That said, the beta came with a few glitches. For example, while testing the phone’s performance, the game “Resident Evil Village” repeatedly crashed, whether or not I maxed out the graphics settings. Apple couldn’t replicate the issue and suggested reverting to the stable iOS 18 release to resolve the problem.
In general, the iPhone 16 Pro and iPhone 16 Pro Max come with several impressive enhancements: top-tier performance, excellent battery life, and some of the best cameras available on a smartphone. While Apple Intelligence brings occasional benefits to day-to-day use, there isn’t yet a standout AI feature that pushes Apple’s technology ahead of the competition or makes it the sole reason to upgrade.
On Apple Intelligence
Apple Intelligence, in some respects, feels like the company is catching up to competitors. Take “smart replies” as an example—one of the new AI-driven features integrated into Apple’s Mail client and the Messages app. When someone messages you, a suggested response pops up above the keyboard, providing context-based replies.
For those familiar with Android, this isn’t groundbreaking. I’ve been using smart replies in Gmail and Google Messages for years and appreciate the time saved when I don’t have to type out simple responses like “Thank you!” or “Sounds good.” Now, iPhone users can enjoy this same convenience. However, in my experience, these smart replies only appeared in the Messages app (probably due to a beta bug).
Siri also plays a significant role in the Apple Intelligence suite. Now, when you activate Siri, a pleasant glow appears around the edges of the screen, and you can continue using your phone without Siri interrupting your workflow. If you stumble over your words or change your question mid-sentence, Siri can still figure out what you meant and respond accordingly.
Siri’s understanding of the iPhone’s internal settings has also improved. You can ask questions like, “How do I update apps?” and it will walk you through the process. Interestingly, when I asked how to type to Siri—a new feature that lets you interact with the assistant without speaking—it didn’t understand my question. (The answer is a double tap on the bottom of the screen.) Type-to-Siri is still a welcome upgrade; Google Assistant and Alexa have supported typed interactions for years. (To be fair, Siri offered this before, but it was buried in the accessibility settings and not widely advertised.)
Siri’s usefulness remains hit or miss. I appreciate being able to communicate more naturally and type to it when needed, but there have been times when Siri didn’t have an answer and just displayed Google Search results. In the future, Siri is expected to integrate with ChatGPT, leveraging large language models to assist with more complex queries—similar to what Google is doing with its Gemini feature on Android. Siri will also be able to interact with other apps, such as Mail and Messages, to answer personal questions like when your flight is scheduled to land.
Apple Intelligence also brings several “Writing Tools” throughout the operating system, powered by machine learning. Google and Samsung have also focused on similar features. When you select text on your iPhone, you’ll see the option for “Writing Tools” in the menu. From there, you can proofread, rewrite, summarize, and more. You can adjust the tone of an email to make it sound either more professional or more casual, check for grammatical errors, or summarize long blocks of text from a PDF with just a tap.
Using these tools will take some getting used to, as their presence isn’t immediately obvious, and how useful they are will depend on your needs. Personally, I rarely need to summarize large blocks of text, so I haven’t had much use for that feature. The “professional” tone adjustment can be a bit too formal for my taste. The proofreading tool is more practical, but I wish the option were easier to access.
Summarizing content seems to be a popular AI trend, and Apple Intelligence doesn’t shy away from this. You can summarize emails, messages, and even third-party notifications. This can be useful, for example, when the Mail app highlights an urgent-sounding email in its summary—one that I might have missed if I had simply skimmed through my inbox. However, more often than not, I just swipe away the summary and go through all the notifications myself.
Safari also has a summary feature, but it requires using Reader mode on web pages. These little details make it easy to forget about Apple’s smart features and hard to find them when needed. On the plus side, I managed to summarize an 11,000-word article and get the key points when I didn’t have time to read the whole thing. (Sorry about that.) Feel free to summarize this review as well!
As a journalist who regularly attends briefings, I found some of the most helpful Apple Intelligence features to be the new transcription tools in the Notes app, Voice Memos, and even the Phone app. Hit record in Voice Memos or Notes, and the app will transcribe conversations in real time. If you’re on a phone call, tapping the record button notifies both parties before starting the transcription, which gets saved to the Notes app.
The accuracy of these transcriptions largely depends on the microphone quality of the person you’re talking to, but it’s still better than not having any transcription at all. Unfortunately, there are no speaker labels, like in Google’s Recorder app, and you can’t search the transcriptions for specific quotes. (Technically, you can if you add the transcript to a note in the Notes app, but that adds an extra step.)
Camera Features
The Photos app has also received a boost from Apple Intelligence, with the standout feature being Clean Up. Similar to Google’s Magic Eraser, introduced more than three years ago on Pixel phones, it lets you remove unwanted objects from the background of your iPhone photos. I was surprised by how much freedom Apple gives you to erase anything: in one selfie, I managed to completely erase my bracelet. (Google’s version doesn’t allow you to erase facial features.)
I even erased a cup that was blocking my face in a photo, and Clean Up tried to reconstruct the hidden portion of my face—with somewhat creepy results. (For comparison, I tried this feature on a Pixel 9, and the outcome was similarly unsettling, though Google offered more options.) As one of my colleagues remarked, “It seems like both systems were trained on images of cartoon characters.”
There’s more to come with Apple Intelligence. A feature called Image Playground will allow you to generate images, and Genmoji will let you create custom emoji that currently only exist in your imagination. Siri is expected to improve by offering more contextually relevant information. I plan to revisit these updates when they’re rolled out later in the year. Just to clarify, Apple Intelligence will only be available on select devices, such as the iPhone 15 Pro, 15 Pro Max, and the full iPhone 16 lineup.
The security and privacy aspects of Apple Intelligence are another significant point. Apple is introducing these AI-driven features with a focus on privacy and security through its Private Cloud Compute technology. While security experts are still evaluating how well this approach works, Apple’s strategy of prioritizing user data protection might explain why some of these features are rolling out more slowly than expected.
The New Pro iPhones
Oh, and there’s a new phone too! While I won’t go into all the specs, I’d like to highlight a few changes. One request for Apple: Please start offering more vibrant color options for the Pro models. It’s disappointing that the iPhone 16 Pro is only available in neutral shades like Desert, with no options for bold colors like pink or ultramarine, which are available for the iPhone 16 and iPhone 16 Plus. (Of course, you can always opt for a flashy case!)
The Pro models are slightly larger than their predecessors. I initially assumed the bigger screens came entirely from thinner bezels, but it turns out the phones themselves are also a bit taller. This isn’t much of an issue with the iPhone 16 Pro, but the iPhone 16 Pro Max, which was already quite large, is now even harder to use one-handed, especially when reaching the top of the screen.
If you’re deciding between the iPhone 16 Pro and Pro Max, it comes down to comfort versus battery life. The Pro Max offers significantly longer battery life, often surpassing six hours of screen-on time with over 30 percent battery left by the end of the day; you could easily carry it over into the next morning. On the iPhone 16 Pro, I achieved over seven hours of screen time, but that left the battery at 15 percent after 12 hours of use. If you’re a heavy user, you might need to top it off before bedtime.
Wireless charging is also faster on these devices, reaching about 50 percent in 30 minutes when using a MagSafe Charger paired with a 30-watt power adapter. (Side note: Apple’s 30-watt adapter is still much bulkier than other options on the market.)