Saturday, January 18, 2025

10 features to try first with Apple AI


Although the iPhone 16 is built for Apple Intelligence, it has been on the market for over a month without any of these key features available. That’s finally changing with the release of iOS 18.1: Apple’s artificial intelligence features are now finally rolling out, after an extensive beta testing period.

Apple Intelligence aims to transform the way we interact with our iPhones, iPads and Mac computers. With new AI-based tools and features, Apple combines advanced machine learning and natural language processing with practical functionality.

In layman’s terms? Your phone becomes smarter, anticipating your needs to make using your device and completing tasks easier.

If you’ve been wondering what to try first now that Apple Intelligence is rolling out, here’s a look at the top 10 features worth exploring.

US English regions are getting these features starting now – and while the rest of the world will have to wait a little longer, here’s a quick pro tip: set your device’s language and region to US English, and you can use them all now…

1. Writing tools

One of the most useful additions in iOS 18.1 is the suite of system-wide writing tools. Apple says its writing tools will “refine their words by rewriting, proofreading, and summarizing text almost everywhere they write, including Mail, Notes, Pages, and third-party apps.” Whether you’re typing a message, composing an email, or taking notes, Apple Intelligence can suggest grammar corrections, provide rephrasing options, and help you polish your text. The promise is a writing assistant that brings accuracy, clarity, and style to all your apps.

Of course, this kind of AI tool is nothing new; companies have offered it for a while. OpenAI’s ChatGPT, Microsoft’s Bing, Google’s Gemini, and Anthropic’s Claude all provide writing assistance. However, Apple’s ability to integrate these writing tools into all your apps is unique. So far, only Google is positioned to replicate this level of integration.

2. Clean up in photos

Like Google before it, Apple is introducing a new Clean Up tool in the Photos app that can “identify and remove distracting objects in the background of a photo – without accidentally changing the subject.” If you’ve ever taken a photo on vacation and realized someone photobombed your perfect moment, Apple now offers a built-in tool to fix it.

Whether it’s a person, an animal or just an object you want to clean up or remove, Clean Up is the integrated tool to do it. While it’s not going to replace Photoshop, it gives most people 90% of what they need when editing photos in the Photos app itself.

3. Creating a Memory movie

One of the most interesting ideas in Apple’s Photos app is the Memories feature. While the feature has historically assembled these movies on its own, Apple Intelligence now lets you use prompts to generate your own. Apple says that “Users can now create the movies they want to watch with the Memories feature simply by typing a description.”

This feature allows users to type a theme or description, such as “summer vacation,” and Apple Intelligence will automatically merge related photos and videos into a polished movie. If you enjoyed the Memories feature but found that Apple missed moments that should have been included, you can now solve that problem by generating your own.

4. Natural language search in photos

Finding photos in the Photos app has traditionally been…not great. While machine learning has improved search recently, Google Photos search has been much better at finding specific things in your photos for a while now.

Fortunately, Apple uses its natural language understanding with Apple Intelligence to improve this feature. With descriptions like “birthday party in January” or “dog in the park,” Apple Intelligence can help you track down specific memories. The AI processes visual content and metadata, making photo searching more intuitive and accurate.

The company says that “natural language can be used to search for specific photos, and video searching becomes more powerful with the ability to find specific moments in clips.” The second one is especially great. Being able to search for something hidden in a video makes Photos searching much more powerful than what we’ve had before.

5. Priority notifications

If there’s one thing Android still has over iOS, it’s better notification management. Rather than giving us an endless list of settings to fine-tune, Apple hopes AI can solve this problem.

With Priority Notifications, your device should help “surface what’s most important, and summaries help users scan long or stacked notifications to bring up the most important details right on the lock screen.” The feature scans your messages and email to understand context and places the most important communications in their own section.

Apple now also uses AI to summarize your notifications. Instead of seeing the entire contents of a long message in your notification, Apple shows a summary of each message or piece of mail. So if you’re in a very active group chat, you’re more likely to get caught up.

6. Smart Reply

Another feature that Apple is finally catching up to Google with is Smart Reply. Gmail has for years offered quick replies that try to understand the context of your email exchanges, and Apple wants Apple Intelligence to provide the same functionality.

The company says: “Smart Reply in Mail offers users suggestions for a quick response and identifies questions in an email to ensure everything is answered.” While the first capability is table stakes for modern email, the second is interesting. Gmail always reminds me when I forget to attach a file; I’d like Apple Mail to remind me if I missed something someone asked me to do.

7. Focus Modes: Reduce interruptions

I’m a big fan of focus modes. From letting people know I’m driving to silencing non-urgent notifications when I’m sleeping, focus modes are one of my favorite features for staying on track and reducing anxiety when using my devices.

Now Apple wants to bring AI smarts to focus modes with the Reduce Interruptions feature within Focus Modes. This new focus mode helps users maintain concentration by filtering out non-essential notifications. Apple says it “only surfaces those notifications that may need immediate attention.”

Whether you’re working or relaxing, this feature ensures only the most critical alerts reach you, giving you more uninterrupted time. Of course, this is also the feature I’m most nervous about Apple getting right. If it suppresses a notification that is actually urgent, that could cause a real problem. We’ll see how smart it is!

8. Safari Summaries

Personally, I love Reader Mode in Safari when I’m trying to read articles. It removes all the clutter, which is mainly ads, and allows me to focus on the actual content. Now Apple is using AI to make scanning an article even faster.

Safari Summaries in iOS 18.1 are a standout feature for Reader Mode fans. When using the feature, users can request a summary of long articles or documents. Apple Intelligence condenses the most important points into a concise summary, so you can get the essential information without having to read through the entire article.


9. Transcription in Notes and Phone

Taking notes from meetings and interviews, both written and oral, is now easier with transcription in the Notes and Phone apps. Apple says: “In the Notes and Phone apps, users can record, transcribe, and summarize audio. When a recording is started during a call in the Phone app, participants are automatically notified, and once the call ends, Apple Intelligence also generates a summary to help remember key points.”

So if you’ve ever needed to pull information back out of a phone call, or to record and revisit your own thoughts, the Phone and Notes apps can finally help you do just that. I’ve long been jealous of how useful Google’s Voice Recorder app has been for journalists and writers, so I’m excited to draft first versions of my writing by talking to the Notes app and letting it transcribe what I say.

10. Improved Siri

While every other feature will be great, this is the one I’ve been most feverishly anticipating. The promise of Siri as a helpful assistant has been around for years, but Siri has never fully delivered on it. Apple seems confident that Apple Intelligence will finally close this gap substantially.

In iOS 18.1, Siri can “follow along when users stumble over their words and maintain context from one request to the next.” You can also now finally type to Siri and switch between typing and speech to interact with the assistant. I’m really hopeful that Siri will actually start to understand what I’m saying to it.


There’s more to come

While these are all the main AI features that will be rolling out with iOS 18.1, more will be added in the coming months. Image Playground allows users to generate images using text prompts. Image Wand turns rough sketches into polished images. And with Genmoji, you can use text prompts or a photo to create your own emojis that you can share with friends.

Siri will also be able to draw on personal context and on-screen awareness, and it will integrate with OpenAI’s ChatGPT: if Apple determines that what you need isn’t something Apple Intelligence can provide, you can enable ChatGPT to get you there – all within the Siri experience. The company plans to add more AI chatbots, such as Google Gemini, in the future.

There’s a lot to look forward to with Apple Intelligence in the coming weeks and months, so prepare for a lot of changes in the way we use our Apple devices.


