Apple Intelligence still has a long way to go
This is v1, and Apple has made it clear that more Apple Intelligence features will arrive over the next year. But Apple put a big bow on every new device, billing its products as built for intelligence, and hyped the features to impossible heights. That's why, right now, it's a letdown.
Other features, like AI-generated photo memories and smart replies, do what they're supposed to do but lack a certain human touch. I didn't send any of the AI-suggested replies to my messages, even though they conveyed the right sentiments. If I'm going to respond to a text, I might as well write it myself, you know what I mean? Isn't that part of the point of texting someone? Photos generated a memory of moments with my child but titled it, impersonally, "Joyous Moments with Child."
There's also, of course, an upgraded Siri. It looks different and adds a handy option to type queries, but you don't need to use it for long to realize it's the same old Siri with a new coat of paint. It understands natural language better and includes product knowledge to help you find settings on your iPhone, but that's about it. Apple plans big updates for its intelligent assistant, including ChatGPT integration by year's end. But the big stuff, contextual awareness and the ability to take action in apps, is all planned for 2025.
Source: Apple Intelligence is here, but it still has a lot to learn
Using AI to Clean Up Photos and Summarize Texts and Notifications: An Evaluation of Clean Up and Other Features
The new Clean Up tool is available in Photos' editing options. It's designed to quickly remove objects from a scene: you highlight or circle whatever you want gone, and because it runs on-device, you only have to wait a few moments before the object disappears.
The tool does a good-enough job, especially on smaller objects in the background. Sometimes it's better than Google's older Magic Eraser tool, but it's not as good as the new Magic Editor, which uses generative AI to remove objects. That tool runs in the cloud, so it's a bit apples to oranges, but still. I can use Magic Eraser on my current phone, and the results are close to what I get with Clean Up on the iPhone 16, which isn't a great argument for the upgrade cycle.
Notification summaries seem a little more promising to me; at the very least, it's pretty funny watching AI try to summarize a string of gossipy texts or a bunch of notifications from your doorbell. But it also surfaced a bit of genuinely important information in a string of texts from a friend, and had I not seen that summary while glancing at my phone, I might not have read the messages until much later. That was helpful.
When you open the Mail app, you'll see summaries in place of the first few lines of each email, as well as an option to summarize individual emails. I didn't find either of these features useful, perhaps because email is already fairly condensed. You know what feature already summarizes an email pretty well? The subject line. The majority of emails I get are short and to the point. Maybe Tim Cook has longer emails to read than I do, but I can live without an AI summary of an email from the DNC.
Apple uses AI to summarize groups of notifications so you can catch up on what you missed faster, and a new focus mode filters out unnecessary distractions. After using these things for a week, I didn't feel like I saved much time.
One time, it summarized my work emails with the phrase "Medical emergency," so I checked to see if there was anything new in the inbox. It turned out someone had said they were responding a day late due to a medical emergency but that they were fine. It wasn't an important work email (glad to hear they were fine), but the summary made me check my inbox when I didn't need to. More often than not, I clicked into my notifications because Apple Intelligence had highlighted something that seemed crucial but wasn't.
Summaries are another big part of Apple Intelligence. You can use them to get an overview of the web pages you visit, and if you have multiple messages in a group chat, the summary will highlight certain things that were said, with the option to tap through to the full details. My summaries have often come out garbled, so I haven't gotten much use out of this yet.
You can send quick Smart Replies based on the context of a conversation, like "Thank you" or "Sounds good," in Messages and Mail. While this can be helpful, it's hard to get excited about a feature that's been built into Gmail since 2017.
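Apple hasn't said how Smart Replies picks its suggestions, but the basic idea, scanning the last message for cues and offering a matching canned response, can be sketched in a few lines. This is a hypothetical toy illustration, not Apple's implementation; the function, cues, and reply strings are all invented.

```swift
import Foundation

// Hypothetical sketch of context-based reply suggestions.
// Not Apple's implementation; the cues and canned replies are
// invented to illustrate matching responses to a message's context.
func suggestReplies(for lastMessage: String) -> [String] {
    let text = lastMessage.lowercased()
    var suggestions: [String] = []

    // A question tends to invite confirmation or a decline.
    if text.contains("?") {
        suggestions += ["Sounds good", "Sorry, I can't"]
    }
    // A completed favor or delivery usually invites gratitude.
    if text.contains("sent") || text.contains("done") {
        suggestions.append("Thank you")
    }
    // Fall back to a generic acknowledgment.
    return suggestions.isEmpty ? ["OK"] : suggestions
}

// Example: suggestReplies(for: "I sent the files over") -> ["Thank you"]
```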
Typing queries isn't new, exactly: it existed for years as an accessibility setting, but it's now baked into the main experience, which finally catches Apple up to competitors that have offered the option for years. There's a new glowing effect around the edges of the screen, and Siri is better at understanding queries even if you stumble over your words. But despite the new paint, it feels the same, and that might be the biggest sign of a letdown.
So what can you do right now? Writing Tools can rewrite text almost anywhere in the operating system. Rewrite changes a sentence's tone from casual to professional, for example, while Proofread fixes typos and improves grammar. Too bad it's nearly impossible to remember the feature exists, because it only shows up when you highlight words. Writing Tools might get more use if a button were built into the virtual keyboard.
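For what it's worth, app developers don't have to add that keyboard button themselves: standard text views pick up Writing Tools automatically, and, if I'm remembering the WWDC24 API correctly, they can tune how it appears with a writingToolsBehavior trait. A minimal sketch, assuming that UIKit property; the property and case names are recalled from memory, so treat them as assumptions.

```swift
import UIKit

// A minimal sketch of tuning Writing Tools on a UITextView,
// assuming the writingToolsBehavior trait Apple described at WWDC24.
// Property and case names are from memory; treat them as assumptions.
final class NotesViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.font = .preferredFont(forTextStyle: .body)
        if #available(iOS 18.0, *) {
            // .complete allows full inline rewrites; .limited keeps
            // suggestions in an overlay; .none opts the view out.
            textView.writingToolsBehavior = .complete
        }
        view.addSubview(textView)
    }
}
```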