
Meta has initiated legal proceedings against Joy Timeline HK Limited, the Hong Kong-based developer of CrushAI, an application that generates explicit deepfake content. The lawsuit claims that the company has consistently evaded Meta's advertising regulations while promoting its controversial AI technology. This move is part of Meta's broader strategy to combat apps that create nude or sexualized images from user photos, often without consent.

According to documents filed in a Hong Kong district court, CrushAI has allegedly run over 87,000 ads on Meta's platforms that breach the company's guidelines. The lawsuit emphasizes that the app maker created an extensive network of at least 170 business accounts on Facebook and Instagram to facilitate these illegal advertisements. It was also reported that more than 55 active users managed over 135 Facebook pages displaying these ads, which primarily targeted audiences in the United States, Canada, Australia, Germany, and the United Kingdom. The advertisement content included AI-generated sexualized images accompanied by provocative captions, such as "upload a photo to strip for a minute" and "erase any clothes on girls." Meta's complaint seeks to address the apparent loopholes in its advertising system that allowed such content to proliferate.

Recently, prominent figures, including celebrities and public officials, have been victims of similar deepfake technologies. In response to growing scrutiny, including inquiries from lawmakers like Senator Dick Durbin, Meta has acknowledged the increasing pressure to take decisive action against non-consensual explicit deepfakes. The company's recent legal actions align with the newly enacted Take It Down Act, which prohibits the sharing of non-consensual deepfake content. Reports earlier this year revealed that CrushAI had published thousands of ads across Meta's platforms, with approximately 90% of its traffic sourced from Facebook and Instagram.
Despite Meta's strict policies against adult nudity and sexual exploitation, the company admitted it faced challenges in enforcing these rules effectively. In the lawsuit, Meta claims to have incurred $289,000 in expenses related to investigations and regulatory compliance while working to enhance its advertising detection capabilities. The company has developed new technology to identify problematic ads, even when they do not explicitly depict nudity, and is collaborating with external experts to improve its automated content moderation systems.

As part of its ongoing effort to combat the misuse of its platforms, Meta has begun sharing information about nudifying applications with other tech companies through a collective initiative called Lantern, which aims to tackle child sexual exploitation in the digital space. This legal action against CrushAI underscores Meta's commitment to reinforcing its advertising policies amid a rapidly evolving digital landscape.