• ssj2marx@lemmy.ml · 5 months ago

    there cannot be developed a scale or spectrum to judge where the fake stops and real starts

    Ah, but my definition didn’t at all rely on whether or not the images were “real” or “fake”, did it? An image is not merely an arrangement of pixels in a jpeg, you understand - an image has a social context that tells us what it is and why it was created. It doesn’t matter if there were real actors or not, if it’s an image of a child and it’s being sexualized, it should be considered CSAM.

    And yes, I understand that this will always be a subjective judgement with a grey area, but not every law needs a perfectly defined line where the legal becomes the illegal. A justice system should not be a computer program that simply runs the numbers and delivers an output.

    • TheAnonymouseJoker@lemmy.ml · 5 months ago

      An image is not merely an arrangement of pixels in a jpeg,

      I am not one of those “it’s just pixels on a screen” people. But if it was not recorded in the real world with a camera, it cannot be real.

      Who will be the judge? If some automated AI is created for this, who will be the one creating it? Will it be perfect? No. We will end up in the situation Google caused for its users, with doctors, married parents and other legitimate people being labelled as pedophiles or CSAM users. It has already happened to me in this thread, and you also said it. The only accurate way to judge this would be a very large team of forensic experts in image and video media, which is not feasible for the amount of data social media generates.

      not every law needs to have a perfectly defined line

      And this is where the abuse by elites, politicians and the establishment starts. Activists and dissidents can easily be jailed by planting CSAM, which in this case would be as simple as AI-generated pictures arriving as temporary drive-by downloads onto a target’s devices.

      • ssj2marx@lemmy.ml · 5 months ago

        Who will be the judge?

        The same people that should judge every criminal proceeding. Of course it’s not going to be perfect, but this is a case of not letting perfect be the enemy of good. Allowing generated or drawn images of sexualized children to exist has external costs to society in the form of normalizing the concept.

        The argument that making generated or drawn CSAM illegal is bad because the feds might plant such images on an activist is incoherent. If you’re worried about that, why not worry that they’ll plant actual CSAM on your computer?

        • TheAnonymouseJoker@lemmy.ml · 5 months ago

          Have you considered the problem of doctors, married parents and other legitimate people being labelled as CSAM users and pedophiles? This has already happened, and they are not obligated to take the brunt of the misjudgement of tools built to classify such media. This is not a hypothetical scenario; it has already played out in the real world and caused real damage to people.

          The argument about planted CSAM is not incoherent; it has already played out against many people, and it is one of the favourite tools of elites and ruling politicians. I am less worried about it only because, thankfully, no such law exists that would brutally misjudge the masses over fictional media.

          • ssj2marx@lemmy.ml · 5 months ago

            How many times can I say “social context” before you grok it? There’s a difference between a picture taken by a doctor for medical reasons and one taken by a pedo as CSAM. If doctors and parents are being nailed to the cross for totally legitimate images then that strikes me as evidence that the law is too rigid and needs more flexibility, not the other way around.

            • TheAnonymouseJoker@lemmy.ml · 5 months ago

              If a pedophile creates a hospital or clinic room setting and photographs a naked kid, will that be okay? Do you understand that these problems are impossible to solve just like that? Parents also take photos of their kids, and they do not take photos the way a doctor would; they take them in far more casual settings than a clinic. Would parents be considered pedophiles? According to the way you propose to judge, yes.

              You are basically implying that social defamation is what matters here, and that the trauma caused to the victim of such fictional media is a problem. However, this is exactly what anti-AI people like me were trying to warn against. And since these models are open source and in public hands, the cat is out of the bag. Stable Diffusion models run on potato computers and take at most 2-5 minutes to generate a photo, and 4chan has entire guides for uncensored models. This problem will be 100x worse in a couple of years, 1000x worse in the next 5 years, and infinitely worse in a decade. Nothing can be done about it. This is what the AI revolution is. Future generations of kids are fucked thanks to AI.

              The best thing one can do is protect their privacy and keep their photos from being out there. Nobody can win this battle, and even in the most dystopian hellhole with maximum surveillance, there will be gaps.

              • Todd Bonzalez@lemm.ee · 5 months ago

                These are some insane mental gymnastics.

                Congratulations on the power trip of purging every comment that calls you out.