
A sleepy town in southern Spain is in shock after it emerged that AI-generated naked images of young local girls had been circulating on social media without their knowledge.

The pictures were created using photos of the targeted girls fully clothed, many of them taken from their own social media accounts.

These were then processed by an application that generates an imagined image of the person without clothes on.

So far more than 20 girls, aged between 11 and 17, have come forward as victims of the app’s use in or near Almendralejo, in the south-western province of Badajoz.

“One day my daughter came out of school and she said ‘Mum there are photos circulating of me topless’,” says María Blanco Rayo, the mother of a 14-year-old.

“I asked her if she had taken any photos of herself nude, and she said, ‘No, Mum, these are fake photos of girls that are being created a lot right now and there are other girls in my class that this has happened to as well.’”

She says the parents of 28 girls affected have formed a support group in the town.

[Image: Spanish police press conference. Caption: Spanish authorities have launched an investigation into the images]

Police are now investigating and, according to reports, at least 11 local boys have been identified as being involved in either creating the images or circulating them via the WhatsApp and Telegram messaging apps.

Investigators are also looking into the claim that an attempt was made to extort one of the girls by using a fake image of her.

The impact the images’ circulation has had on the girls affected varies. Ms Blanco Rayo says her daughter is bearing up well, but that some girls “won’t even leave their house”.

Almendralejo is a picturesque town of just over 30,000 people, known for its olives and red wine. It is not used to the kind of attention this case has brought, which has made it national headline news.

That’s in great part because of the efforts of one of the girls’ mothers, Miriam Al Adib. She’s a gynaecologist who has used her already prominent social media profile to place this issue at the centre of Spanish public debate.

[Image: Miriam Al Adib. Caption: “I wanted to give the message: it’s not your fault,” Miriam Al Adib says]

Although many of the AI images are believed to have been created over the summer, the case only came to light in recent days, after Dr Al Adib posted a video reassuring the girls affected and their parents.

“We didn’t know how many children had the images, if they had been uploaded to pornographic sites - we had all those fears,” she says.

“When you are the victim of a crime, if you are robbed, for example, you file a complaint and you don’t hide because the other person has caused you harm. But with crimes of a sexual nature the victim often feels shame and hides and feels responsible. So I wanted to give that message: it’s not your fault.”

The suspects in the case are aged between 12 and 14. Spanish law does not specifically cover the generation of images of a sexual nature when it involves adults, although the creation of such material using minors could be deemed child pornography.

Another possible charge would be for breaching privacy laws. In Spain, minors can only face criminal charges from the age of 14 upwards.

The case has caused concern even for local people who are not involved.

“Those of us who have kids are very worried,” says Gema Lorenzo, a local woman who has a son, aged 16, and a daughter, aged 12.

“You’re worried about two things: if you have a son you worry he might have done something like this; and if you have a daughter, you’re even more worried, because it’s an act of violence.”

Francisco Javier Guerra, a local painter and decorator, says the parents of the boys involved are to blame. “They should have done something before, like take their phones away, or install an application that tells them what their children are doing with their phone.”

This is not the first time such a case has become news in Spain. Earlier this year, AI-generated topless images of the singer Rosalía were posted on social media.

“Women from different parts of the world have written to me explaining that this has happened to them and they don’t know what to do,” says Miriam Al Adib.

“Right now this is happening across the world. The only difference is that in Almendralejo we have made a fuss about it.”

The concern is that apps such as those used in Almendralejo are becoming increasingly commonplace.

Javier Izquierdo, head of children’s protection in the national police’s cyber-crime unit, told Spanish media that these kinds of crimes are no longer confined “to the guy who downloads child porn from the Dark Web or from some hidden internet forum”.

He added: “That obviously is still going on, but now the new challenges we are facing are the access that minors have at such an early age [to such technology], such as in this case.”

  • Chaos@lemmy.world

    They had the one chance to do this without harming anyone, and instead they generate images of real people… Truly vile

    • breathless_RACEHORSE@lemmy.world

      I admire your positive thinking, but it may also provide plausible deniability for legitimate CSAM, by your own logic. Either way, I see this being used to bully, blackmail, or worse. It’s not that we are going to stop AI development, nor that we should. Perhaps as it improves (remember, right now it’s the worst it will ever be), we can teach AI to recognize when it may be used for purposes like creating realistic CSAM or other such material, and have it log or report such uses.

      I honestly don’t know the solution, but I don’t see the world ever getting “ho-hum, it’s all fake anyway” about minor-involved pornography.
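
      A rough sketch of the kind of guardrail I mean is below. To be clear, `generate_image` and `safety_score` are placeholders for whatever model and classifier a real service would actually use - this is the shape of the idea, not a working safeguard.

          # Hypothetical sketch only: screen every generated image with a safety
          # classifier before returning it, and log refused attempts for audit.
          # Both callables passed in are placeholders, not a real model or classifier.
          import hashlib
          import logging
          from typing import Callable, Optional

          logging.basicConfig(level=logging.INFO)
          log = logging.getLogger("safety-gate")

          def gated_generate(
              prompt: str,
              generate_image: Callable[[str], bytes],   # placeholder generator
              safety_score: Callable[[bytes], float],   # placeholder: 0.0 safe .. 1.0 unsafe
              threshold: float = 0.5,
          ) -> Optional[bytes]:
              image = generate_image(prompt)
              score = safety_score(image)
              if score >= threshold:
                  # Refuse the request and keep an audit trail of the attempt.
                  log.warning("blocked generation (score=%.2f) prompt sha256=%s",
                              score, hashlib.sha256(prompt.encode()).hexdigest()[:12])
                  return None
              return image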

      • cactusupyourbutt@lemmy.world

        it will do neither. AI-generated pictures can usually be identified by eye alone, and some companies are starting to add an invisible watermark to their output.
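
        Just to give a feel for the idea (purely a toy illustration on my part - real schemes such as Google’s SynthID are far more sophisticated and are designed to survive compression and editing, which this is not), the simplest possible invisible watermark just hides a fixed bit pattern in the image’s least significant bits:

            import numpy as np
            from PIL import Image

            # Arbitrary 8-bit tag; a real watermark encodes far more information.
            MARK = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)

            def embed_mark(path_in: str, path_out: str) -> None:
                px = np.array(Image.open(path_in).convert("RGB"))
                tag = np.resize(MARK, px[..., 0].shape)        # tile the tag over the red channel
                px[..., 0] = (px[..., 0] & 0xFE) | tag         # overwrite the least significant bits
                Image.fromarray(px).save(path_out, format="PNG")  # lossless, so the bits survive

            def looks_marked(path: str, threshold: float = 0.95) -> bool:
                px = np.array(Image.open(path).convert("RGB"))
                tag = np.resize(MARK, px[..., 0].shape)
                match = np.mean((px[..., 0] & 1) == tag)       # unmarked images match ~50% by chance
                return float(match) >= threshold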

        • Eezyville@sh.itjust.works

          You guys are working with the idea that companies and organizations will be creating these AI tools. They will be bound by laws, but random people or groups of people will not. It will only get worse.

    • CeruleanRuin@lemmings.world

      But the overall impact of not being able to trust any image will not be positive. We already have widespread distrust of everything in the media, which leads people to form cults around their beliefs and around strong personalities immune to facts. That will only get worse as photos and videos can no longer be trusted as evidence of anything real.

    • Fisk400@feddit.nu

      Yeah, it might all work out after an undetermined time of unimaginable suffering on multiple fronts.

    • macallik@kbin.social

      Methinks you might want to give it space to breathe from the child porn scandal before you attempt a positive spin on truth becoming irrelevant in the future.

  • Beardedsausag3@kbin.social

    I’ve said for the longest time, the world needs wiping. A hard reset.

    Technology comes to the masses and shit like this is some people’s first thought? Constantly figuring out how to better kill each other, humiliate, bully. It blows my mind, even at the old old age of 34, how utterly depraved people can be.

    Tell a family member you love them today, give them a hug - create your own net positive.

    • thonofpy@lemmy.world

      The world needs building. A shared vision. People empowering and protecting each other. It is hard, but we must try.

      • Beardedsausag3@kbin.social

        This is the more reasonable and realistic approach, I’m just not even sure how that’d come about. I feel education from a young age on the dos and don’ts is an important factor - and maybe tools need to be developed that somehow don’t intrude on the right to privacy.

        It’s concerning. After reading this this morning, then speaking to some of the lads about it, me Mrs - even with me Mum when I was out for a walk with her… It’s leaving people feeling hella vulnerable. Nothing to stop anyone right now taking a pic of any one of us and slapping it on some weird shit, some incriminating shit, racial hatred shit. Anything.

        It’s a ticking time bomb imo

  • Pyr_Pressure@lemmy.ca

    For an AI generator to make convincing photos of CSAM, wouldn’t it have had to have access to real CSAM to “learn” how to make it?

    Wouldn’t the creators of the generator then have to have had access to it / download it or whatever they do when they feed material to the algorithm?

    • drislands@lemmy.world

      Not necessarily. I’m only speculating, which I don’t relish, but I imagine the service isn’t explicitly designed with CSAM as the intended result – if I’m reading it right, it’s billed as “upload a picture of clothed person X and receive a ‘naked’ picture of the same person”, and is almost certainly trained on appropriately-aged pornographic material.

      I don’t know the inner workings of technology like this, and I certainly don’t intend to critique the images talked about in this article, but the resulting picture doesn’t have to be an accurate representation of a minor’s body to be distressing to the people involved.

    • lud@lemm.ee

      Nah, I don’t think so. I haven’t tried and I won’t try, but these images are probably just made with inpainting and generic NSFW prompts.

      If the AI was trained on enough nudity it would probably just scale it down to child size or whatever.

      I can’t imagine there’s a huge difference in appearance.

      I am just guessing though.