Apple and Google have been found hosting more than 100 AI‑driven “nudify” apps that strip clothing from photos, in violation of each company’s store policies. [1][2]

These violations matter because they expose users to unwanted image manipulation, threaten personal privacy, and undermine confidence in the major app marketplaces. [1]

A report by the Tech Transparency Project identified over 100 such apps across the Apple App Store and Google Play Store, all of which use artificial‑intelligence algorithms to remove garments from uploaded pictures. The sheer volume suggests systemic lapses in enforcement. [1]

Both Apple and Google maintain rules that forbid apps from manipulating images to undress individuals without consent, labeling such behavior a direct violation of their content‑manipulation policies. Apple’s guidelines state that apps must not facilitate “non‑consensual nudity,” while Google’s developer policy bans “image‑alteration that results in sexual content.” [2]

Apple said it will remove apps that breach its policies and is improving automated detection tools. Google said it will enforce its rules more rigorously and work with developers to ensure compliance. [1]

The episode highlights a broader challenge: AI tools that can alter visual media are proliferating faster than platform oversight mechanisms can keep pace, prompting calls for stronger regulatory frameworks and industry‑wide standards. [2]

**What this means** The discovery of more than 100 policy‑breaking nudify apps on the two largest app stores signals that current review processes are insufficient for AI‑generated content. Users may face privacy violations, and the platforms risk eroding trust unless they adopt more aggressive detection and removal strategies, potentially influencing future policy reforms and inviting legislative scrutiny.