
Recent findings reveal that both the Apple App Store and Google Play Store host a significant number of apps capable of creating non-consensual nude images using artificial intelligence. The Tech Transparency Project (TTP) reported 55 such apps on Google Play and 47 on Apple's platform, highlighting a troubling trend in app development.

Following inquiries from TTP and CNBC, Apple announced the removal of 28 apps identified in the report. The company emphasized its commitment to user safety, warning developers that failure to comply with App Store guidelines could lead to further removals. Two of the removed apps were later reinstated after their developers submitted revised versions that adhered to the guidelines. TTP criticized both Apple and Google for maintaining a collection of apps that can transform innocent photos into sexualized images, raising concerns about user safety. While Apple confirmed that only 24 apps had been removed from its store, Google said it had suspended several apps for policy violations, although the exact number remains undisclosed as its investigation continues.

The investigation into these apps follows a backlash against Elon Musk's xAI, whose Grok AI tool faced criticism for generating sexualized images in response to user prompts. TTP identified the problematic apps by searching for keywords such as "nudify" and "undress" and testing how they used AI to alter images of clothed women. Katie Paul, TTP's director, stressed that these apps are not simply tools for changing outfits but are designed for the non-consensual sexualization of individuals. A previous CNBC report also highlighted the dangers of such apps, profiling women whose social media images were exploited to create deepfake pornography without their consent. Disturbingly, those incidents did not constitute a crime because the individuals who generated the content never distributed it.
The rise of AI technology has made it increasingly easy to produce explicit content, and many of the identified apps originate from China. Paul pointed out that this raises additional security concerns, as Chinese data retention laws could grant the government access to sensitive data collected by these applications. In light of these developments, the European Commission has opened an investigation into xAI over the dissemination of sexually explicit content, while U.S. senators have urged both Apple and Google to remove xAI from their platforms, citing violations of their distribution terms regarding non-consensual content.

Both companies' app store policies explicitly prohibit apps that purport to undress individuals or see through clothing, yet the continued presence of such apps raises questions about how those policies are enforced. According to TTP, the identified nudify apps collectively account for over 700 million downloads and have generated approximately $117 million in revenue, with both Apple and Google profiting from this disturbing trend. The gap between the companies' stated policies and their actual enforcement has drawn growing scrutiny of their roles as trusted app platforms.