Apple removed 28 apps from its App Store after the Tech Transparency Project (TTP) published findings from a January review that uncovered applications designed to produce nude images from photographs. The watchdog said its probe located 47 such apps on Apple's platform and 55 on Google Play through searches for terms such as "nudify" and "undress." TTP tested the apps using AI-generated images of fully clothed women to assess their output.
In response to the TTP findings, Apple reportedly acted on Monday, deleting 28 of the identified applications and warning other developers that their apps could be removed if they did not remedy violations of App Store guidelines. A subsequent TTP check, however, found that only 24 of the flagged apps had actually been removed at that time.
"These were definitely designed for non-consensual sexualization of people." - Katie Paul, TTP Director
Google has also moved to suspend several apps for alleged policy breaches, though the company did not disclose the exact number of suspensions and indicated its investigation was ongoing. The watchdog classified the problematic software into two types: apps that use AI to render images of women without clothing, and "face swap" apps that place women's faces onto nude bodies.
The report and ensuing removals come amid heightened scrutiny over potential misuse of AI tools. The debate intensified recently after criticism over the Grok AI tool from xAI, which produced sexualized images of women and children in response to user prompts.
TTP's findings and the subsequent platform enforcement illustrate the challenges app stores face in policing AI-driven image-manipulation tools. Apple appears to be enforcing its guidelines against applications that enable non-consensual sexualization, while Google has signaled ongoing review and suspension activity. The situation remains fluid, with watchdog follow-ups uncovering discrepancies between the number of apps identified and the number actually removed.
What happened
- TTP's January review identified 47 apps on Apple's App Store and 55 on Google Play that can create nude images from photos.
- Apple removed 28 apps and warned other developers of potential removal for guideline violations; a TTP follow-up found only 24 apps actually removed at the time.
- Google suspended several apps for policy violations and said its investigation is ongoing.
Context
The watchdog tested the apps using AI-generated images of fully clothed women and found the apps by searching for terms like "nudify" and "undress." The report categorized the problematic apps into two types: those that render nudity via AI and face-swap tools that place women's faces on nude bodies. The episode coincides with broader concerns about AI misuse after controversy over the Grok AI tool's outputs.