Apple on Thursday rolled out updated App Review Guidelines that directly target how apps handle user data when interacting with external AI providers. The revised rule now states that developers must “clearly disclose where personal data will be shared with third parties, including with third-party AI, and obtain explicit permission before doing so.”
This is a notable shift: it is the first time Apple has called out AI explicitly in its privacy rules. Until now, the company's long-standing 5.1.2(i) guideline required user consent before any personal data could be shared, but it made no specific mention of AI. That earlier language was designed to keep Apple aligned with privacy laws like the EU's GDPR and California's CCPA. Now, with AI systems becoming deeply embedded in apps, Apple is tightening the net.
The timing is strategic. Apple is gearing up for its own Siri overhaul in 2026, which will let the assistant perform actions across apps and use Google’s Gemini for some of its capabilities, according to Bloomberg. With Siri about to become far more powerful, Apple is also ensuring that third-party apps don’t quietly funnel personal data to outside AI models for training or personalisation without user awareness.
What remains unclear is how strictly Apple will enforce this rule, especially given how broad the term “AI” is. Everything from large language models to basic machine-learning features could fall under the definition.
Thursday’s update also included new guidelines around Apple’s Mini Apps Program, creator apps, loan apps, and a clarification that crypto exchanges fall into highly regulated categories.
