SAN FRANCISCO – San Francisco City Attorney David Chiu is suing 16 companies in the U.S. and abroad, alleging they create and distribute non-consensual AI-generated pornography.
In a statement released Thursday, Chiu’s office called the lawsuit, filed on behalf of the people of the State of California, “a first of its kind,” alleging violations of state and federal laws banning deepfake pornography, revenge pornography, and child sexual abuse material, as well as violations of California’s unfair competition law.
The lawsuit, the statement notes, “seeks the removal of Defendants’ websites, as well as injunctive relief to permanently restrain Defendants from engaging in this unlawful conduct. The lawsuit also seeks civil penalties and the costs of bringing the lawsuit.”
The statement claims that “the spread of non-consensual deepfake pornographic images has exploited real women and girls around the world,” prompting a lawsuit against the owners of what the city attorney calls “16 of the most visited websites” that invite users to create nude images of women and girls without their consent.
The lawsuit claims that the people behind these sites “violate state and federal laws prohibiting deepfake pornography, revenge pornography, and child pornography.”
“This investigation has taken us to the darkest corners of the internet, and I am absolutely shocked for the women and girls who have suffered this exploitation,” said City Attorney David Chiu. “Generative AI has enormous promise, but as with all new technologies, there are unintended consequences and criminals seek to exploit the new technology. We need to be very clear that this is not innovation – this is sexual abuse. This is a major, multifaceted problem that we as a society must solve as quickly as possible.”
The lawsuit alleges that the “bad actors” behind these deepfake sites have harmed women and girls in California and beyond, causing immeasurable damage to everyone from Taylor Swift and Hollywood celebrities to middle school and high school students.
According to the statement, deepfakes are “used to blackmail, bully, threaten and humiliate women and girls.”
The sixteen targeted websites, the statement alleges, offer to “undress” images of women and girls: “These websites provide easy-to-use interfaces for uploading clothed images of real people to generate realistic pornographic versions of those images. These websites require users to subscribe or pay to generate nude photos, profiting from non-consensual pornographic images of children and adults. Collectively, these websites were visited more than 200 million times in the first six months of 2024 alone.”
Chiu’s statement makes no mention of non-consensual AI-generated pornography targeting men.
A tangled web of obscure sites and companies
Chiu’s office confirmed the complaint – filed in San Francisco Superior Court as “People of the State of California v. Sol Ecom, Inc., et al.” – but redacted the names of the websites in the publicly released version.
The unredacted version of the complaint lists the targeted websites as follows: DrawNudes.io (operated by Sol Ecom); Porngen.art and Undresser.ai (operated by Briver); Undress.app, Undress.love, Undress.cc and Ai-nudes.app (operated by Itai Tech); Nudify.online (operated by Defirex); Undressing.io (operated by Itai OÜ); Undressai.com (operated by Gribinets); Deep-nude.ai (operated by someone identified only as Doe #1); Pornx.ai (operated by someone identified only as Doe #2); Deepnude.cc (operated by someone identified only as Doe #3); Ainude.ai (operated by someone identified only as Doe #4); and Clothoff.io (operated by someone identified only as Doe #5).
According to the lawsuit, several of these sites openly promote the non-consensual nature of their deepfakes.
For example, DrawNudes.io allows users to “deepnude girls for free” by uploading an image and using the website’s AI technology to “undress” it. Users are invited to upload a photo with the prompt “Anyone got undressed?”, and Sol Ecom provides step-by-step instructions for selecting images that will produce nudified results of “good” quality.
Several of these companies and sites were first investigated by journalist and researcher Kolina Koltai in a February exposé for the investigative journalism outlet Bellingcat.
Koltai’s article identified a “loosely connected network of similar platforms,” including Clothoff, Nudify, Undress and DrawNudes, which concealed their activities to evade crackdowns by financial and online service providers that ban adult content and non-consensual deepfakes.
Koltai reported that DrawNudes initially appeared to be affiliated with an obscure firm called GG Technology LTD, but during the investigation, DrawNudes changed “the company listed on its website to Sol Ecom Inc.”, the first defendant named in Chiu’s lawsuit.
Sol Ecom has listed addresses in Miami and Los Angeles, and both GG Technology and Sol Ecom list Ukrainian nationals as their operators.