The Civil Guard is investigating three minors in Mallorca for allegedly creating pornographic montages of girls using artificial intelligence. The images, which show the girls, classmates at the minors' school, in sexual poses, were later shared on the internet, according to Ultima Hora. It is the first such case detected in Mallorca, although the Civil Guard is already aware of similar incidents in other parts of Spain.
The minors manipulated real images of their classmates obtained from social networks and then distributed the resulting content widely through social networks and messaging applications. The parents of the affected girls filed the complaint with the Civil Guard when they discovered the harassment their daughters were suffering.
The manipulated images had circulated widely among classmates at the school before the Civil Guard received the complaint. In them, thanks to the use of artificial intelligence, the girls appear in highly realistic sexualized poses, which made it difficult to discover that the images were fake. Following the complaint, the Civil Guard opened an investigation and concluded that the images came from people close to the girls.
It was eventually discovered that the images had been created by three classmates, two aged 16 and one aged 14. The Juvenile Prosecutor's Office has opened proceedings and the three minors have been summoned to testify, although the youngest cannot be charged because he is under 16.
Similar cases have occurred before in other parts of Spain, such as Zaragoza and Badajoz. In line with those precedents, criminal law experts explained that the companies that own the applications used to create these fake nudes would not bear criminal liability, although the Spanish Data Protection Agency could take administrative action against them for failing to prevent this use of their applications.
In October, the Senate approved a PP motion advocating that the legal definition of child pornography be expanded to include AI-generated images that depict minors in sexually explicit conduct or that violate their privacy.