CHICAGO — A lawsuit filed under an unusual Illinois state law regulating the collection of biometric data has spotlighted an AI program that X has used since 2015 to identify nudity and NSFW images.
The suit, a proposed class action filed in 2023 by attorneys on behalf of Chicago resident Mark Martell, alleges that X – the platform formerly known as Twitter – violated the Illinois Biometric Information Privacy Act, commonly known as BIPA.
As XBIZ reported, attorneys filing BIPA lawsuits in Illinois have been known to target major tech companies such as Google, OnlyFans, Shutterfly and others. In one of these actions, Facebook once agreed to pay a $650 million settlement over BIPA issues.
Martell’s complaint alleges that Twitter “has implemented software since approximately 2015 to monitor pornographic and other not-safe-for-work (‘NSFW’) images uploaded to the site. NSFW images are then ‘tagged’ as such by Twitter, preventing them from being viewed by people who don’t want to see them.”
The complaint adds that in analyzing each image uploaded to Twitter to determine “whether it contains nudity (or other characteristics that Twitter deems objectionable), Twitter actively collects, records and/or otherwise obtains; stores; and/or uses the biometric identifiers and biometric information of each individual appearing in each photograph.”
On Thursday, however, Illinois federal judge Sunil R. Harjani dismissed Martell’s lawsuit but left the door open for the plaintiff to file an amended complaint by June 27.
At the heart of Martell’s lawsuit is X’s use of the third-party software PhotoDNA, developed by Microsoft. The lawsuit also mentions a 2015 article from Wired magazine that references an internal AI technology that Twitter developed in 2014 after it acquired Madbits, a pioneering AI startup founded by NYU researcher Clément Farabet.
“When Farabet and his MadBits team joined Twitter last summer, Alex Roetter, the company’s head of engineering, said they should build a system that could automatically identify NSFW images on its popular social network,” Wired reported at the time. “A year later, that AI is here.”
Martell claims that X’s AI scans images “without first making certain disclosures and obtaining written, informed consent from Illinois users posting photos” as required under Illinois BIPA, Law360 reports.
U.S. District Judge Harjani ruled that PhotoDNA’s creation of a “hash” or unique digital signature of one of Martell’s images did not amount to “a scan of his facial geometry in violation of BIPA,” the report added.
The judge noted that BIPA defines “biometric identifiers” as “a retina or iris scan, fingerprint, voiceprint, or scan of hand or facial geometry,” a definition that expressly excludes photographs.
“Although the plaintiff alleged that PhotoDNA scanned the photo to create a unique hash, the plaintiff did not allege any facts indicating that the hash is a scan of facial geometry, as opposed to merely a record of the photo,” wrote Judge Harjani. “Plaintiff’s claims leave open the question of whether the hash is a unique representation of the entire photo or specific to the faces of the people in the photo.”
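The distinction the court drew, a hash as a record of the whole photo versus a scan of facial geometry, can be illustrated with a minimal sketch. PhotoDNA’s actual algorithm is proprietary, so the example below uses a simple “average hash,” a common perceptual-hashing technique that is only conceptually similar; the function name and the synthetic 8x8 image are illustrative assumptions, not X’s or Microsoft’s implementation.

```python
# Illustrative only: PhotoDNA is proprietary. This "average hash"
# derives a signature from the whole image's pixel statistics,
# not from any measurement of facial geometry.

def average_hash(pixels):
    """Compute a 64-bit perceptual hash of an 8x8 grayscale image.

    Each bit records whether a pixel is brighter than the image's
    mean brightness -- a property of the entire photo, not of any
    face that may appear in it.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = "".join("1" if p > mean else "0" for p in flat)
    return int(bits, 2)

# A synthetic 8x8 "image" of grayscale values 0-63.
img = [[r * 8 + c for c in range(8)] for r in range(8)]

# Identical images yield identical hashes, which is how matching
# an upload against a database of known hashes works.
assert average_hash(img) == average_hash([row[:] for row in img])
```

Under this kind of scheme, changing any region of the photo, face or background alike, changes the signature, which is the intuition behind the plaintiff’s unanswered question of whether the hash represents the entire photo or only the faces in it.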
Industry attorney Corey Silverstein of Silverstein Legal told XBIZ: “I absolutely love this decision. Judge Harjani has done a fantastic job analyzing a very difficult issue. BIPA has become a gold mine for class action claimants and their attorneys, and in my opinion the abuse of these types of claims has gotten completely out of hand. I hope this will cause potential plaintiffs and their counsel to pause before filing baseless lawsuits.”
Silverstein continued: “The plaintiff’s main contention is that creating hash values of photos violates BIPA. The plaintiff’s argument appears to be that services that use PhotoDNA must hash all photos, including images of people’s faces, to check whether those photos match a hash value in the PhotoDNA database, and that hashing a photo necessarily calculates the facial geometry of the subjects photographed. The court did not buy what the plaintiff was selling, although the judge did grant the plaintiff leave to amend his allegations and re-assert his claim.”