
Apple Intelligence: Everything you need to know about Apple’s take on AI

Type to Siri being used with Apple Intelligence in macOS Sequoia.
Apple

Apple Intelligence is Apple’s take on AI, and it looks to fundamentally change the way we interact with technology, blending advanced machine learning and AI capabilities into everyday devices.

Promising more conversational prose from Siri, automated proofreading and text summarization across apps, and lightning-fast image generation, Apple’s AI ecosystem is designed to enhance user experiences and streamline operations across its product lineup. Here’s everything you need to know about Apple’s transformational new AI.


Apple Intelligence release date and compatibility

Apple Intelligence was originally slated for formal release in September, coinciding with the initial rollout of iOS 18, iPadOS 18, and macOS Sequoia. However, as Bloomberg’s Mark Gurman reported, Apple subsequently decided to slightly delay the release of Apple Intelligence.

It was made available to developers as part of the iOS 18.1 beta release on September 19, and officially launched alongside the iOS 18.1 rollout in October. However, it wasn’t until the release of iOS 18.2 in December 2024 that many Apple Intelligence features, such as Genmoji, Image Playground, and Visual Intelligence, finally arrived for all users. Apple also released a bug fix for an issue in which Apple Intelligence caused devices to overheat.

NEW: Apple Intelligence will arrive later than anticipated, coming in iOS 18.1 and iPadOS 18.1 in October and missing the initial September releases. Still, 18.1 will go into beta for developers this week. https://t.co/LqXDvjO6ef

— Mark Gurman (@markgurman) July 28, 2024

These new AI features are available on the iPhone 15 Pro, iPhone 15 Pro Max, and the entire iPhone 16 lineup, as well as on iPads and Macs with M1 or newer chips.

Currently, the features are only available when the user language is set to English, though the company plans to add support for Chinese, English (India), English (Singapore), French, German, Italian, Japanese, Korean, Portuguese, Spanish, and Vietnamese in an update scheduled for April 2025.

New AI features

Math Notes feature in iPadOS 18.
Apple

No matter what device you’re using Apple Intelligence with, the AI focuses primarily on three functions: writing assistance, image creation and editing, and enhancing Siri’s cognitive capabilities.

Apple Intelligence is designed to span the full breadth of the company’s product line. As such, virtually every feature found in the macOS version of Apple Intelligence is mirrored in the iOS and iPadOS versions. That includes Writing Tools, Image Playground, Memories in Photos, and Siri’s improvements.

In addition, iPadOS, when paired with Apple Pencil, unlocks more features. Smart Script in the Notes app, for example, straightens and smooths handwritten text in real time. The new Math Notes calculator will automatically solve written equations in the user’s own handwriting and generate interactive graphs based on those equations with a single tap.

We at Digital Trends took an early version of Apple Intelligence for a spin using the macOS Sequoia beta, but came away rather disappointed with what we saw from the digital agent, a sentiment mirrored by many Apple Intelligence users. For one, only a fraction of the AI tools were actually available through the beta release. And the tools we did have access to, including the writing assistant, Siri, and audio transcription, proved buggy and unreliable.

By the time iOS 18.1 was released, Apple had thankfully addressed many of those issues, putting Apple Intelligence on par with more established AI assistants, such as Google’s Gemini.

Writing Tools

Apple Intelligence's Writing Tools being used in macOS Sequoia.
Apple

The new Writing Tools feature can proofread the user’s writing and rewrite sections as necessary, as well as summarize text across Apple’s application ecosystem, including Mail, Notes, and Pages. Third-party developers will also be able to leverage Writing Tools in their own apps via API calls.

For example, within the Mail app, Apple Intelligence will provide the user with short summaries of the contents of their inbox, rather than showing them the first couple of lines of the email itself (though if you aren’t a fan of that feature, it’s easy to disable). Smart Reply will suggest responses based on the contents of the message and ensure that the reply addresses all of the questions posed in the original email. The app even moves more timely and pertinent correspondence to the top of the inbox via Priority Messages.

The Notes app has received significant improvements as well. With Apple Intelligence, Notes offers audio transcription and summarization features, as well as an integrated calculator, dubbed Math Notes, that solves equations typed into the body of the note.

Image Playground

The Image Playground being used with Apple Intelligence in macOS Sequoia.
Apple

Image creation and editing functions are handled by the new Image Playground app, wherein users can spin up generated pictures within seconds and in one of three artistic styles: Animation, Illustration, and Sketch. Image Playground is a standalone app, although many of its features and functions have been integrated with other Apple apps like Messages.

Apple Intelligence is also improving your device’s camera roll. The Memories function in the Photos app was already capable of automatically identifying the most significant people, places, and pets in a user’s life, then curating that set of images into a coherent collection set to music. With Apple Intelligence, Memories is getting even better.

The AI can select photos and videos that best match the user’s input prompt (“best friends road trip to LA 2024,” for example), then generate a story line — including chapters based on themes the AI finds in the selected images — and assemble the whole thing into a short film. Photos users also now have access to Clean Up, a tool akin to Google’s Magic Eraser and Samsung’s Object Eraser, and improved Search functions.

Siri

Summoning Siri on an iPhone.
Nadeem Sarwar / Digital Trends

Perhaps the biggest beneficiary of Apple Intelligence’s new capabilities is Siri. Apple’s long-suffering digital assistant has been more deeply integrated into the operating system, with more conversational speech and improved natural language processing. You’ll have to manually enable the feature on your iPhone before you can use it, but doing so is a simple task.

What’s more, Siri’s memory now persists, allowing the agent to remember details from previous conversations, while the user can seamlessly switch between spoken and written prompts. Apple is reportedly working on an even more capable version of Siri, but its release may not come until 2026.

Apple is also expected to roll out a number of new capabilities as part of the iOS 18.4 update scheduled for around April 2025. These include Personal Context, which enables Siri to know both where a piece of content is on the device and how it got there. With it, you’ll be able to ask questions like “When is Mom’s flight landing?” or make requests like “Play that podcast that Jamie recommended.”

Siri will also reportedly gain a better understanding of what is happening on your device’s screen later this spring. If, for example, you’ve been texted an address, you’ll be able to simply say, “Add that to my contacts” and Siri should be able to complete the action without added clarification from you, the user. Both new features are part of Apple’s larger app integration initiative, which will enable Siri to interact natively with, and send commands to, a wide array of iOS and third-party applications.

Apple Intelligence privacy

A diagram showing Apple's entire setup for AI computing.
Apple

In order to avoid the costly and embarrassing data leaks that some of its competitors have suffered in recent months, Apple has put privacy at the front and center of the Apple Intelligence experience, even going so far as to build out its own private and secure AI compute cloud, named Private Cloud Compute (PCC), to handle complex user queries.

Most of Apple Intelligence’s routine operations are handled on-device, using the company’s most recent generations of A17 and M-family processors, said Craig Federighi, Apple’s senior vice president of Software Engineering, during WWDC 2024. “It’s aware of your personal data, without collecting your personal data,” he added.

“When you make a request, Apple Intelligence analyzes whether it can be processed on-device,” Federighi continued. “If it needs greater computational capacity, it can draw on Private Cloud Compute and send only the data that’s relevant to your task to be processed on Apple silicon servers.” This should drastically reduce the chances of private user data being hacked, intercepted, or otherwise snooped on while in transit between the device and PCC.

“Your data is never stored or made accessible to Apple,” he explained. “It’s used exclusively to fulfill your request and, just like your iPhone, independent experts can inspect the code that runs on these servers to verify this privacy promise.” The company is so confident in its cloud security that it is offering up to a million dollars to anyone able to actually hack it.

Apple Intelligence will defer to ChatGPT on complex queries

ChatGPT and Siri integration on iPhone.
Nadeem Sarwar / Digital Trends

Apple Intelligence isn’t the only cutting-edge generative AI taking up residence in your Apple devices. During WWDC 2024, Apple and OpenAI executives announced a partnership that will see ChatGPT functionality (powered by GPT-4o), including text generation and image analysis, integrated into Siri and Writing Tools. ChatGPT will step in if Siri’s onboard capabilities aren’t sufficient for the user’s query, though in that case the request is sent to OpenAI’s public compute cloud rather than to Apple’s PCC.

Users won’t have to navigate away from the Siri screen when leveraging ChatGPT’s capabilities. OpenAI’s chatbot functions in the background when it is called upon, and Siri will state the answer regardless of which AI handles the query. To ensure at least a semblance of privacy protections, the device will display a confirmation prompt to the user before transmitting their request, as well as for any documents or images the user has attached.


On Day Five of the 12 Days of OpenAI event in December, OpenAI CEO Sam Altman provided additional details about how the two systems will work together, noting that ChatGPT is accessible directly from the device’s user interface (whether it’s iOS, iPadOS, or macOS) and that users will have the option of either logging into their ChatGPT account or using it anonymously. You’ll also be able to invoke ChatGPT directly by telling Siri to have it handle the task (e.g., “Siri, have ChatGPT assemble a holiday music playlist”).

Apple Intelligence trained on Google’s Tensor Processing Units

Google's Tensor G2 chip.
Google

A research paper from Apple, published in July, reveals that the company opted to train key components of its Apple Intelligence models using Google’s Tensor Processing Units (TPUs) instead of Nvidia’s highly sought-after GPU-based systems. According to the research team, TPUs provided the computational power needed to train the company’s large language models, and allowed it to do so more energy-efficiently than comparable GPU-based systems would have.

This marks a significant departure from how business is typically done in AI training. Nvidia currently commands an estimated 70% to 95% of the AI chip market, so for Apple to opt instead for the product of Nvidia’s direct rival, and to reveal that fact publicly, is highly unusual, to say the least. It could also be a sign of things to come. Nvidia’s market dominance can’t last forever, and we’re already seeing today’s hyperscalers making moves into proprietary chip production.

Beyond Google’s ongoing TPU efforts, Amazon has announced that it’s working on its own line of AI chips, one that it claims will outperform Nvidia’s current offerings by 50% while consuming half as much power. Microsoft, meanwhile, announced in May 2024 that it will use AMD’s family of AI chips.

Andrew Tarantola
Andrew Tarantola is a journalist with more than a decade reporting on emerging technologies ranging from robotics and machine…