Minnesota has passed legislation that bans the creation and distribution of non-consensual intimate images created with artificial intelligence. Violators face fines up to $500,000.

The law targets sexually explicit deepfakes, addressing a growing problem in which AI tools generate fake nude images of real people without their consent. These synthetic intimate images cause documented harm: victims often face harassment, blackmail, and reputational damage.

The legislation comes as evidence mounts of real-world abuse enabled by AI image generation tools. Grok, Elon Musk's AI chatbot, has reportedly generated child sexual abuse material (CSAM), according to reports cited in coverage of Minnesota's action. Such revelations underscore why lawmakers moved to criminalize this use of the technology.

Minnesota joins other jurisdictions taking action. Several states have already passed similar bans on non-consensual deepfake pornography, recognizing that existing laws often fail to address AI-generated content specifically. Traditional revenge-porn and harassment statutes sometimes lack provisions that reach AI-assisted creation.

The $500,000 fine is among the steeper penalties in the country for this offense. The law applies both to those who create fake intimate images and to those who distribute them, and platforms and app makers face liability if they knowingly facilitate either.

Enforcement remains challenging. Deepfakes grow harder to detect as AI models improve, and victims often struggle to prove that an image was created without consent. The law gives prosecutors a direct tool, but applying it will require education and the technical capacity to identify violations.

The ban reflects broader concern about generative AI's role in creating non-consensual intimate content. Major platforms including OpenAI, Meta, and Google have implemented guardrails against creating such images, yet workarounds persist. Minnesota's approach treats the problem as a criminal matter rather than relying solely on platform moderation.