A group of women is suing the men who allegedly used their Instagram photos, without consent, to create deepfake pornography featuring AI-generated influencers. The defendants operated ModelForge, a platform that teaches users how to generate synthetic influencers using real women's images scraped from social media.

The lawsuit targets the creators and operators of ModelForge, alleging they harvested photos from public Instagram accounts and used them to train AI models that generate nude or sexually explicit content. The women claim they never authorized this use of their likenesses.

ModelForge markets itself as a tool for building AI influencers from scratch, ostensibly to help users create synthetic personalities for engagement or profit. The lawsuit, however, alleges it served a darker purpose: enabling the creation of non-consensual sexual imagery at scale.

This case highlights a growing pattern of AI-enabled abuse. Deepfake pornography generators have proliferated online, often targeting women without their knowledge. The technology requires only publicly available photos to produce convincing fake videos or images. Victims face reputational harm, emotional distress, and in some cases harassment and blackmail.

The legal challenge tests whether platforms and their creators bear responsibility for enabling image-based sexual abuse. The plaintiffs will likely argue violation of the right of publicity and unauthorized use of likeness, and potentially bring harassment or defamation claims. The defendants may invoke free speech protections or argue that users acted independently.

Several jurisdictions have begun criminalizing deepfake pornography. Some states now explicitly outlaw creating or distributing sexually explicit deepfakes without consent. However, civil litigation remains a primary avenue for victims seeking damages and injunctions.

The ModelForge case raises the stakes for how courts treat AI tools designed around abuse potential. If successful, the lawsuit could establish that platform operators cannot claim neutrality when their systems systematically enable non-consensual sexual content creation.

WHY IT MATTERS: This case could establish legal liability for operators of AI platforms whose tools enable non-consensual deepfake imagery.