Meta files lawsuit against AI "nudify" app Crush AI
Meta Takes AI-Powered "Nudify" App Maker to Court for Skirting Its Ad Policies
Meta has filed a lawsuit against Joy Timeline HK Limited, the maker of the "Crush AI" app. The app, one of the so-called "nudify" or "undress" apps that generate fake nude images of real people, has been on Meta's radar for evading its ad review processes.
The lawsuit, filed in Hong Kong, accuses Joy Timeline HK of intentionally bypassing Meta's ad review system. According to the complaint, the company disguised ad content, registered new domain names, and created networks of advertiser accounts to keep the nudify app's AI-powered deepfake ads running on Meta's platforms despite repeated violations of its advertising policies [1][2][3].
Meta made its position clear in a recent press release: "This legal action highlights our commitment to combat this abuse and protect our community from it. We'll continue to take robust actions, involving legal measures, against those who abuse our platforms." [4]
While Meta has been criticized for its lax handling of nudify apps, the company's advertising policies strictly prohibit the spread of non-consensual explicit imagery, and it blocks the search terms "nudify," "undress," and "delete clothing" [5]. An analysis by Cornell researcher Alexios Mantzarlis found that Crush AI ran more than 8,000 ads across Meta platforms from fall 2024 through January 2025, with roughly 90% of its traffic coming from Meta [5].
AI-generated ad content has become harder to police as Meta relaxes its content moderation in favor of automated review and community-generated fact-checking. Victims of AI-generated non-consensual intimate imagery have been pushing for stricter regulations and clearer legal recourse. In May, President Trump signed the Take It Down Act, which criminalizes non-consensual intimate imagery and mandates takedown policies for online platforms [6]. AI-generated child sexual abuse material (CSAM) has also proliferated online, fueling concern over the safety and regulation of generative AI tools [7].
Alongside the legal action against Crush AI, Meta announced plans to roll out new detection technology to more accurately flag and remove ads promoting nudify apps. The company is also deepening its collaboration with the Tech Coalition's Lantern program, an industry-wide initiative focused on child online safety, and will continue to share intelligence on offending companies and products [4]. Since March, Meta says it has identified more than 3,800 unique URLs tied to nudify apps and websites and uncovered four separate networks attempting to advertise them [8].
Meta is taking deliberate steps to counter AI-generated explicit content, combining legal action, policy enforcement, improved detection technology, and industry collaboration to protect its users and uphold community standards. The cat-and-mouse game between Meta and the makers of nudify apps is far from over, but these moves signal that AI-generated non-consensual intimate imagery will not be allowed to remain a dark corner of its platforms.
Keywords: Artificial Intelligence, Social Justice, Meta
Insights:
- Joy Timeline HK Limited, the makers of Crush AI, deliberately attempted to evade Meta’s ad review processes by disguising ad content, using new domain names, and creating networks of advertiser accounts [1][2][3].
- Meta’s plans to combat AI-generated non-consensual intimate images include legal action, policy enforcement, enhanced detection technology, and proactive measures [1][5].
- According to an analysis by Cornell researcher Alexios Mantzarlis, Crush AI ran over 8,000 ads on Meta platforms between the fall of 2024 and January 2025, with 90% of its traffic coming from Meta platforms [5].
- While Meta faces criticism for its leniency in handling nudify apps, its advertising policies strictly prohibit the spread of non-consensual explicit imagery.
- Concerns over the safety and regulation of AI-generated content contributed to the Take It Down Act, signed by President Trump in May, which criminalizes non-consensual intimate imagery and mandates takedown policies for online platforms.