Apple Intelligence’s rollout has been slow, staggered, and steady since the company first unveiled its take on AI at WWDC this year. It continues today with the release of the latest developer betas for iOS 18, iPadOS 18, and macOS Sequoia. The updates in iOS 18.2, iPadOS 18.2, and macOS Sequoia 15.2 bring long-awaited features like Genmoji, Image Playground, Visual Intelligence, and ChatGPT integration for those using the preview software, as well as Image Wand for iPads and more writing tools.
This follows the announcement that iOS 18.1 would be available as a stable release to the public next week, bringing things like writing tools, notification summaries, and Apple’s hearing test to the masses.
That’s the first time people who haven’t signed up for beta software will try out Apple Intelligence, which the company has widely touted as the main feature of the devices it launched this year. The iPhone 16 series, for example, was announced as a lineup designed for Apple Intelligence, although it launched without these features.
Now that the next set of tools is ready for developer testing, it looks like it will be weeks before they become available to the general public. For those already on the developer beta, the update will land automatically. As always, a word of caution: if you’re not already familiar with it, beta software exists so users can test new features and check for compatibility issues. It may contain bugs, so always back up your data before installing previews. In this case, you’ll also need an Apple developer account to access it.
Genmoji arrives today
Today’s updates bring Genmoji, which lets you create custom emoji from your keyboard. You go to the emoji keyboard, tap the Genmoji button next to the description or search field, and enter what you want to create. Apple Intelligence generates a few options, which you can swipe through and select one to send. You can also use them as Tapback reactions to others’ messages. You can even create Genmoji from photos of your friends, letting you make emoji that actually resemble them. Since these are all rendered in emoji style, there’s no risk of them being mistaken for real images.
Apple is also releasing a Genmoji API today so that third-party messaging apps can read and display Genmoji, and people you text on WhatsApp or Telegram can see your hot new gym rat emoji.
Other previously announced features such as Image Playground and Image Wand are also available starting today. The first is both a standalone app and something you can access from the Messages app via the Plus button. When you open it in Messages, the system will quickly generate some suggestions based on your conversations. You can also type descriptions or select photos from your gallery for reference, and the system will present you with an image that you can then adjust. To avoid confusion, only a few art styles are available: Animation or Illustration. You can’t generate photorealistic images of people.
Image Wand is also out today as an update to the Apple Pencil tool palette, helping you transform your rough sketches into more polished works of art.
As announced at WWDC, Apple is bringing ChatGPT to Siri and Writing Tools, and whenever a request would be better served by OpenAI’s tools, the system will suggest handing it off. For example, if you ask Siri to generate an itinerary, an exercise routine, or even a meal plan, the assistant might suggest using ChatGPT and ask for your permission. You can choose to have the system ask you every time it calls on ChatGPT, or to show these requests less frequently.
It’s worth repeating that you don’t need a ChatGPT account to use these tools, and Apple has its own agreement with OpenAI so that when you use the latter’s services, your data, such as your IP address, is not stored or used to train models. However, if you link your ChatGPT account, your content will be subject to OpenAI’s policies.
Elsewhere, Apple Intelligence will also let you compose with ChatGPT within Writing Tools, where you’ll find options like Rewrite, Summarize, and Proofread. That area is getting another update with the developer beta: a new tool called “Describe your change.” This is essentially a command bar that lets you tell Apple exactly what you want done to your writing, for example, “Make it sound more enthusiastic” or “Check this for grammatical errors.” In short, it makes it a little easier to let the AI edit your work, since you don’t have to go to the individual Proofread or Summarize sections. You can also have it do things like “Turn this into a poem.”
Visual Intelligence is coming to iPhone 16 owners
Finally, if you have an iPhone 16 or iPhone 16 Pro and are on the developer beta, you can try out Visual Intelligence. It lets you point your camera at things around you and get answers about them, whether that’s a math problem in your textbook or the menu of a restaurant you pass by. It can also draw on third-party services such as Google and ChatGPT.
Outside of the iPhone 16 series, you’ll need a compatible device to use all the Apple Intelligence features. That means an iPhone 15 Pro or newer, or an iPad or Mac with an M-series chip.