Apple has introduced a change to its App Store Review Guidelines that requires developers to clearly inform users and secure explicit permission before sharing personal data with third-party artificial-intelligence systems.
The update appears in the revised guideline section 5.1.2(i) and highlights “including with third-party AI” in the disclosure requirement.
Under the new directive, apps that send user data (such as photos, chat logs or app-usage details) to external AI providers must display transparent disclosures, obtain user consent and ensure the shared data is handled in line with Apple's privacy standards. Apps that fail to meet these criteria may face removal from the store.
The guideline update was posted November 13, 2025, according to developer documentation and coverage from multiple outlets. The timing is notable: it comes ahead of Apple's own rollout of expanded AI features expected in 2026.
Impacts For Developers And Users
The change effectively raises the bar for any app that uses machine-learning or generative-AI services from external providers. Developers who integrate third-party AI for chatbots, image generation, recommendations or voice assistants must now include clear, upfront disclosure and obtain permission.
General machine-learning features that do not transmit personal data remain subject to existing rules, but any sharing with external AI systems triggers the new obligation.
For users, the update brings greater clarity about how their personal data is used within apps. They will now see prompts that specify if their data is being processed by an AI provider outside the app’s native system. This increases transparency and can influence decision-making during onboarding or feature activation.
Why The Move Matters Now
Apple’s tightening of rules occurs amid a surge of interest in integrating AI across apps, particularly model-based features like personalisation, voice agents and in-app automation. As more apps feed user data to external models, Apple is positioning itself as a steward of privacy in the AI era.
At the same time, the update aligns with Apple's broader push into AI, such as its upcoming Siri upgrades and AI-enhanced services, where control of data and compute will be crucial.
The stricter guidelines may prompt some smaller developers to reconsider using external AI providers, or look for on-device or Apple-approved model options. For Apple, the move may reduce risk of uncontrolled data flows while the company builds out its own AI capabilities.
Evaluate Compliance Going Forward
App developers need to review how their apps handle third-party AI services, map data flows and update onboarding processes if necessary. They should assess whether any data sent to external AI systems meets the new disclosure rules and whether user consent flows meet Apple’s requirements.
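One way to implement such a consent flow is to record the user's explicit choice from a disclosure prompt and gate every call to an external AI provider on that record. The sketch below is illustrative only: the `AIConsent` and `AIConsentStore` names are hypothetical, not an Apple API, and the guideline itself does not prescribe any particular implementation.

```swift
import Foundation

// Hypothetical consent states; names are illustrative, not an Apple API.
enum AIConsent: String {
    case notAsked, granted, denied
}

// Persists the user's explicit choice from the third-party-AI disclosure prompt.
struct AIConsentStore {
    private let key = "thirdPartyAIConsent"
    private let defaults: UserDefaults

    init(defaults: UserDefaults = .standard) {
        self.defaults = defaults
    }

    var status: AIConsent {
        AIConsent(rawValue: defaults.string(forKey: key) ?? "") ?? .notAsked
    }

    // Call this from the disclosure prompt's accept/decline handlers.
    func record(_ choice: AIConsent) {
        defaults.set(choice.rawValue, forKey: key)
    }
}

// Gate any request to an external AI provider on recorded consent:
// only an explicit grant allows data to leave the device.
func mayShareWithThirdPartyAI(_ store: AIConsentStore) -> Bool {
    store.status == .granted
}
```

A mapping exercise like this also makes audits simpler: every network call that carries user data to an external model can be routed through one guard function, so reviewers (and App Review) see a single, testable consent check rather than scattered conditionals.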
Analysts and privacy advocates will watch how strictly Apple enforces the new rule: whether App Store removals increase, whether developer documentation changes, and whether Apple issues further clarifications.
The broader question is how the ecosystem adapts to the evolving intersection of AI features and privacy regulation in mobile apps.
