'Impulss' explores the red lines of AI use
This summer, the San Francisco city attorney filed charges against more than 50 individuals connected to websites that use artificial intelligence to let users create nude photos of anyone. Among those implicated are individuals with ties to Estonia, according to this week's episode of "Impulss."
Legal action in the United States against websites that misuse artificial intelligence is relatively new. According to IT-specialized attorney Rauno Kinkar, cases in which AI is used to create pornographic images of people without their consent are more unusual still.
"Such technology simply didn't exist before. New technology made it explosively accessible to anyone, anywhere, at virtually no cost," said Kinkar, a sworn advocate specializing in IT, intellectual property and data protection.
Around 16 websites have been created where users can upload a photo of any clothed person. Within seconds, the user receives an AI-generated fake image in which the person in the photo is entirely undressed.
"The issue with pornographic images is not just that an unknown person is being generated. It's that someone familiar, someone famous or someone real is being targeted and generated based on their likeness," said Tanel Tammet, a professor of applied artificial intelligence at TalTech.
Among the 56 defendants charged with running websites that created pornographic content, three have links to Estonia. Of these, two are Estonian-registered companies: Defirex OÜ, which is owned by an Armenian national, and ITAI OÜ, owned by a Serbian citizen.
"I'm not convinced that these entities are substantively connected to Estonia. It's possible they are merely legal shells," Kinkar commented.
However, the connection is clearer in the case of the third individual – 22-year-old Augustin Gribinets – whose listed place of residence is Estonia. His website was visited nearly six million times in the first half of this year.
"The apparent purpose of these websites is to create nude photos of women without their consent. And they don't hide it at all. To quote one of these sites: 'Why waste time taking a woman on a date when you can just use site X to get her nude photos,'" said San Francisco Deputy City Attorney Karun Tilak.
This very phrase was used by Gribinets to advertise his website, which has since been taken offline. However, "Impulss" managed to retrieve an archived version of the site. The restored page shows that the site guaranteed anonymity to all users who wanted to see the people in their photos undressed, and prominently promised the ability to strip anyone naked.
"I can safely say that none of these portals comply with any provisions of the General Data Protection Regulation (GDPR). Violations are clear," remarked Kinkar.
He pointed out that the individuals being digitally undressed by AI likely have no idea such revealing images of them have been created.
According to court documents, Gribinets asked users of the website to confirm whether the subject of their interest had consented to being undressed. However, as per the charges, the site accepted all images, even those depicting minors.
Rauno Kinkar suggested that the portal's creator might argue that they simply created a platform for self-expression. "I couldn't have assumed that some perverts or extreme perverts would come here and use it to produce child pornography," Kinkar reflected on the potential defense of the website creator.
"Impulss" also asked the website's owner directly about the charges brought against him. In response to the journalist's question about the website, Gribinets claimed he had no such site and ended the call.
On Gribinets' website, users could generate their first images for free, but to expand their portfolio, payment was required via credit card or cryptocurrency.
According to Tanel Tammet, it is easier to identify the real beneficiaries behind paid websites.
"As soon as money is involved, it becomes clear that the movement of funds can be tracked. At the same time, some creators of these tools have been cautious, requiring payments in cryptocurrency and making deliberate efforts to hide their identities," Tammet explained.
Tammet noted that applying AI does not require significant technological expertise, as website creators often use pre-existing AI models.
"This image-generation technology wasn't specifically developed for pornography. It's simply being used for that because it seems like a very attractive application to many people," Tammet added.
He clarified that such activity is not necessarily illegal. "In most cases, they're not distributing the content themselves. They're just providing a tool that you can use. The tool itself isn't inherently criminal."
In summary, Estonia lacks a clear consensus or legal framework to definitively state that creating fake nude images of individuals or developing websites for this purpose is a crime.
"Is it okay if I take a picture of my neighbor to a friendly artist and ask them to draw me a nude calendar of that person, paying them for it?" asked attorney Kinkar.
"Until now, as long as it hasn't been on a massive scale, the general answer has been that this is basically okay, provided you don't distribute it and it doesn't involve child pornography."
Kinkar suggested that the European Union's forthcoming AI regulation could help people distinguish real images from deepfakes in the future.
"If you launch a platform or system that allows for the generation of deepfakes, the AI regulation doesn't conceptually ban this activity, but you would need to label it accordingly," Kinkar added.
--
Editor: Marcus Turovski