In a small town in southern Spain, more than twenty schoolgirls were shocked to find AI-generated nude images of themselves on their mobile phones. The alarming incident came to light when the girls returned to school after the summer break.
The use of deepfake technology in this case has raised complex legal concerns and opened up discussions about potential consequences.
Although fake, the images appeared strikingly realistic. They were created by taking the girls’ photos from Instagram and running them through an AI application that transformed them into lifelike nudes, which were then circulated through WhatsApp groups.
Parents and legal authorities are now grappling with questions about whether a crime has been committed and if these images can be considered child pornography.
One of the concerned mothers, Miriam Al Adib, expressed her distress on Instagram, saying, “The montages are super realistic, it’s very disturbing and a real outrage.” She went on to reveal her daughter’s reaction to the images: “My daughter told me with great disgust: ‘Mum, look what they have done to me.’”
Al Adib also expressed fears that these images could have been shared on platforms like OnlyFans or pornographic websites.
The incident involves victims as young as 11 years old, some of whom have yet to enter high school. Another mother, Fátima Gómez, shared how her daughter was targeted for blackmail by a boy on social media. When her daughter refused, he retaliated by sending her a fake nude image.
In response to this distressing incident, the mothers of the victims joined forces to protest and demand action, leading to a police investigation. Several minors who were allegedly involved in distributing these deepfake images have already been identified, some of whom were classmates of the affected girls, according to a local politician.
The deepfake images were produced using the ClothOff app, which boasted the slogan “Undress anybody, undress girls for free.” The app allowed users to digitally remove clothing from individuals in their phone’s image gallery; for just $10, they could create up to 25 fake nude images.
Although the images were entirely fabricated, the emotional distress experienced by the girls was very real, as highlighted by the mothers.
The legal implications of deepfakes present a dilemma. Current laws in Spain and other European Union countries do not adequately address the issue. However, Manuel Cancio, a criminal law professor at the Autonomous University of Madrid, suggests that the act could potentially be classified as a crime against moral integrity, which would fill the legal gap for such offenses.
In 2022, the European Commission proposed legislation to criminalize these types of offenses in a directive on cybercrime. So far, the Dutch Criminal Code is the only legal framework that specifically addresses the issue.
Experts have differing opinions on whether this incident can be classified as the distribution of child pornography, which carries more severe penalties. Concerningly, the dark web is already flooded with AI-generated child pornography, according to a Guardian report.
The legal battle over deepfake technology is still ongoing, with many questions left unanswered.
One thing is certain: this issue isn’t going away, and it might only worsen in the future.