The rollout of Apple Intelligence has been slow, staggered, and steady since the company first revealed its version of AI at this year's WWDC. It continues today with the release of the latest developer beta builds for iOS 18, iPadOS 18, and macOS Sequoia. Updates in iOS 18.2, iPadOS 18.2, and macOS Sequoia 15.2 bring long-awaited features like Genmoji, Image Playground, Visual Intelligence, and ChatGPT integration to those running the preview software, as well as Image Wand for iPads and more writing tools.
This follows the announcement that iOS 18.1 would be available to the public as a stable release next week, bringing things like writing tools, notification summaries, and Apple's hearing test to the masses.
That release will mark the first time people who haven't opted into the beta software can try Apple Intelligence, which the company has widely touted as the marquee feature of the devices it launched this year. The iPhone 16 series, for example, was marketed as phones designed for Apple Intelligence, although they were released without those features.
Now that the next set of tools is ready for developers to test, it looks like we're weeks away from them reaching the public. For those already on the developer beta, the update will arrive automatically. As always, a word of warning: beta software is meant for testing new features and checking for compatibility issues, and it may contain bugs, so always back up your data before installing previews. In this case, you'll also need an Apple developer account to gain access.
Genmoji arrives today
Today's updates bring Genmoji, which lets you create custom emoji from your keyboard. Go to the emoji keyboard, tap the Genmoji button next to the description or search field, then enter what you want to create. Apple Intelligence will generate a few options, which you can swipe through and pick one to send. You can also use them as Tapback reactions to other people's messages. Additionally, you can create Genmoji based on photos of your friends, producing more accurate likenesses of them. Since they're all rendered in an emoji style, there's little risk of confusing them with real images.
Apple is also launching a Genmoji API today so that third-party messaging apps can read and render Genmoji, meaning people you text on WhatsApp or Telegram can see your new gym rat emoji.
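The article doesn't detail the API, but based on the NSAdaptiveImageGlyph type Apple introduced for Genmoji at WWDC 2024, a minimal sketch of how a third-party app might opt in and read Genmoji from typed text could look something like this (a hedged sketch, not Apple's documented sample code):

```swift
import UIKit

// Sketch: opt a text view in to displaying Genmoji (adaptive image
// glyphs) inline on iOS 18.
let textView = UITextView()
textView.supportsAdaptiveImageGlyph = true

// Sketch: walk the attributed text the user typed and pull out any
// Genmoji so the app can serialize and send them.
let storage = textView.textStorage
let fullRange = NSRange(location: 0, length: storage.length)
storage.enumerateAttribute(.adaptiveImageGlyph, in: fullRange) { value, range, _ in
    if let glyph = value as? NSAdaptiveImageGlyph {
        // contentDescription is the prompt text behind the Genmoji;
        // imageContent is the image data a receiving app would render.
        print(glyph.contentDescription, glyph.imageContent.count)
    }
}
```

Apps that don't adopt the API would presumably fall back to a plain image or the descriptive text instead.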
Other previously announced features, such as Image Playground and Image Wand, are also available today. The former is a standalone app, and you can also reach it from the Messages app via the More button. In Messages, the system will quickly generate some suggestions based on your conversations. You can also type descriptions or select photos from your gallery as references, and the system will show you an image that you can then modify. To avoid confusion, only two art styles are available: Animation and Illustration. You won't be able to render photorealistic images of people.
Image Wand is also arriving today as an update to the Apple Pencil tool palette, helping turn your rough sketches into more polished works of art.
As announced at WWDC, Apple is bringing ChatGPT to Siri and its writing tools, and whenever OpenAI's tools are better suited to your request, the system will suggest heading there. For example, if you ask Siri to generate an itinerary, a workout routine, or even a meal plan, the assistant might suggest using ChatGPT for the task and ask for your permission. You can choose to have the system prompt you every time it accesses ChatGPT, or show these prompts less frequently.
It's worth reiterating that you don't need a ChatGPT account to use these tools, and Apple has its own agreement with OpenAI so that when you use the latter's services, your data, such as your IP address, is not stored or used to train models. However, if you connect your ChatGPT account, your content will be covered by OpenAI's policies instead.
Elsewhere, Apple Intelligence will also surface the option to compose with ChatGPT within Writing Tools, which is where you'll find things like Rewrite, Summarize, and Proofread. That's also another area getting an update in this developer beta: a new tool called "Describe your change." This is basically a command bar that lets you tell Apple exactly what you want done to your writing: "Make it sound more enthusiastic," for example, or "Check this for grammatical errors." In short, it makes AI-assisted editing a little easier, since you won't have to jump to individual tools like Proofread or Summarize. You can also have it do things like "Turn this into a poem."
Visual Intelligence comes to iPhone 16 owners
Finally, if you have an iPhone 16 or iPhone 16 Pro and are running the developer beta, you'll be able to try Visual Intelligence. It lets you point your camera at your surroundings and get answers about things like math problems in your textbook or the menu of a restaurant you pass by. You can also hand requests off to third-party services like Google and ChatGPT.
Outside of the iPhone 16 series, you'll need a compatible device to check out Apple Intelligence features. That means an iPhone 15 Pro or newer, or an M-series iPad or MacBook.