NextFin

Apple and Google Continue to Offer Dozens of AI 'Nudify' Apps Despite Scrutiny

Summarized by NextFin AI
  • A recent investigation revealed that Apple and Google continue to host numerous AI applications that generate non-consensual sexual imagery, with 55 apps on Google Play and 47 on the Apple App Store.
  • These apps have collectively achieved over 700 million downloads and generated approximately $117 million in revenue, raising concerns about the platforms' enforcement of their own safety guidelines.
  • Political pressure is mounting following the signing of the TAKE IT DOWN Act, which aims to criminalize non-consensual sexual deepfakes, yet enforcement relies heavily on victim reporting.
  • The industry may face a shift towards stricter moderation practices, including mandatory watermarking and verification for developers, as regulatory scrutiny increases.

NextFin News - In a stark demonstration of the gap between corporate policy and platform reality, a new investigation has revealed that the world’s two dominant mobile ecosystems continue to host a proliferation of artificial intelligence applications designed to generate non-consensual sexual imagery. As of late January 2026, the Apple App Store and Google Play Store remain home to dozens of so-called "nudify" apps, which utilize generative AI to digitally remove clothing from photos of individuals without their consent.

According to a report released Tuesday by the Tech Transparency Project (TTP), a review conducted this month identified 55 such applications on Google Play and 47 within the Apple App Store. These apps, which have collectively garnered over 700 million downloads and generated an estimated $117 million in revenue, often bypass filters by marketing themselves as "entertainment" or "face swap" tools. Following inquiries from TTP and media outlets, Apple confirmed on Monday that it had removed 28 of the identified apps, though subsequent checks by TTP indicated that only 24 had actually been purged. Google stated it has suspended several apps and is continuing an ongoing investigation into the remaining software flagged in the report.

The persistence of these tools comes at a moment of heightened political and regulatory tension. U.S. President Trump recently signed the TAKE IT DOWN Act, a federal statute aimed at criminalizing the publication of non-consensual sexual deepfakes. However, the current enforcement of this law largely relies on victim reporting rather than proactive platform scrubbing. This has prompted a trio of Democratic U.S. senators—Wyden, Markey, and Luján—to send a formal letter to Apple CEO Tim Cook and Google CEO Sundar Pichai, demanding the removal of not just niche apps, but also mainstream platforms like X, whose Grok AI tool has been implicated in the mass generation of sexualized imagery.

The technical mechanism behind these apps has evolved significantly over the past year. While early iterations produced distorted or easily identifiable fakes, the 2026 class of nudify apps leverages advanced diffusion models that produce high-fidelity results with minimal user input. According to TTP director Paul, many of these apps are developed by entities based in China, raising secondary concerns about data sovereignty and the potential for sensitive personal imagery to be stored on foreign servers subject to different privacy jurisdictions.

From a financial perspective, the reluctance to implement a total, proactive ban may be linked to the "app store tax." With $117 million in revenue flowing through these apps, Apple and Google have likely collected upwards of $30 million in commissions at their standard rates of 15 to 30 percent. This creates a perverse incentive structure in which the platforms profit from distributing tools that violate their own stated safety guidelines. Apple's guidelines explicitly prohibit "overtly sexual or pornographic material," while Google's policy bans apps that "claim to undress people," yet the sheer volume of available software suggests that automated review processes are being easily outmaneuvered by developers using deceptive metadata.
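As a rough back-of-envelope check on that commission figure, the following sketch applies the platforms' published commission tiers (15 percent for smaller developers, 30 percent standard) to the reported $117 million; the tier split is an assumption for illustration, not a number from the TTP report:

```python
# Back-of-envelope estimate of platform commissions on the reported
# $117M in nudify-app revenue, bracketed by the two standard tiers.
TOTAL_REVENUE = 117_000_000  # USD, per the TTP investigation

def commission_range(revenue: float) -> dict[float, float]:
    """Return commission amounts at the 15% and 30% tiers."""
    return {rate: revenue * rate for rate in (0.15, 0.30)}

for rate, cut in commission_range(TOTAL_REVENUE).items():
    print(f"At a {rate:.0%} commission, platforms collect ${cut / 1e6:.1f}M")
```

Even at the lower 15 percent tier the platforms' cut runs well into eight figures, and at the standard 30 percent rate it exceeds the $30 million cited above.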

The impact of this oversight is not merely theoretical. In a case cited by investigators, over 80 women in Minnesota were victimized when their public social media photos were processed through these services. Because the generation of such images often occurs in private, legal recourse remains difficult unless the material is widely distributed. This "legal gray zone" has allowed the nudify industry to flourish as a high-margin, low-risk sector of the broader AI economy.

Looking forward, the industry is likely facing a mandatory shift toward "friction-based" moderation. As international bodies like the European Commission open formal investigations into platforms like X over Grok’s outputs, Apple and Google will likely be forced to implement more rigorous, AI-driven vetting processes for any app utilizing image-to-image generation. The trend suggests that the era of reactive moderation—where apps are only removed after public outcry—is becoming politically untenable. We expect to see a move toward mandatory watermarking and "known-entity" verification for developers in the generative AI space by the end of 2026, as the liability for hosting these tools begins to outweigh the commission revenue they provide.

Explore more exclusive insights at nextfin.ai.

