• Allero@lemmy.today (+34/−1) · 2 days ago

    I’m afraid Europol is shooting themselves in the foot here.

    What should be done is better ways to mark and identify AI-generated content, not a carpet ban and criminalization.

    Let whoever happens to crave CSAM (remember: sexuality, however perverted or terrible it is, is not a choice) use the most harmless outlet - otherwise, they may just turn to the real materials, and as ongoing investigations suggest, there's no shortage of supply or demand on that front. If everything is illegal, and some of it is sought anyway, escalation becomes easier, and that's dangerous.

    As sickening as it may sound to us, these people often need something, or else things are quickly gonna go downhill. Give them their drawings.

    • raptir@lemmy.zip (+3) · 18 hours ago

      What would stop someone from creating a tool that tagged real images as AI generated?

      Have at it with drawings that are easily distinguished, but if anything is photorealistic I feel like it needs to be treated as real.

      • Allero@lemmy.today (+1) · 17 hours ago

        Some form of digital signatures for allowed services?

        Sure, it would limit the choice of where content can legally be generated, but it should work.
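        The signature idea above can be sketched minimally. This is an illustrative toy, not a real provenance scheme - the key name and the use of a symmetric HMAC are assumptions for brevity, and a real deployment would use public-key signatures (in the spirit of C2PA-style content credentials) so that anyone can verify without holding the signing key. The point it shows: an approved generation service tags each output's bytes, and only files carrying a valid tag verify as coming from that service, so a real photo relabeled "AI-generated" would fail verification.

```python
import hmac
import hashlib

# Hypothetical signing key held by the approved generation service.
# In practice this would be a private key, with a public key published
# for verifiers; HMAC is used here only to keep the sketch stdlib-only.
SERVICE_KEY = b"demo-key-held-by-the-service"

def sign_content(data: bytes) -> str:
    """Produce a provenance tag over the exact bytes of a generated file."""
    return hmac.new(SERVICE_KEY, data, hashlib.sha256).hexdigest()

def verify_content(data: bytes, tag: str) -> bool:
    """Check whether a file's bytes carry a valid tag from this service."""
    expected = hmac.new(SERVICE_KEY, data, hashlib.sha256).hexdigest()
    # compare_digest avoids timing side channels when comparing tags.
    return hmac.compare_digest(expected, tag)

image = b"\x89PNG...generated-bytes"   # stand-in for real image data
tag = sign_content(image)
print(verify_content(image, tag))         # valid tag on untouched bytes
print(verify_content(image + b"!", tag))  # any altered/foreign bytes fail
```

        A file without a valid tag simply proves nothing either way, which is raptir's objection below: the scheme can confirm provenance, not absence of it.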

        • raptir@lemmy.zip (+1) · 13 hours ago

          I highly doubt any commercially available service is going to get in on officially generating photorealistic CSAM.

    • Dr. Moose@lemmy.world (+4/−1) · 22 hours ago

      This relies on the idea that an "outlet" is not harmful - it might even be encouraging. And who do you think would ever study this to help us know? Imagine the scientists who'd have to lead studies like that: an incredibly grim and difficult subject, with a high likelihood that no one would listen to you anyway.

      • aidan@lemmy.world (+3/−1) · 19 hours ago

        IIRC there was actually a study finding that pedophiles with access to synthetic CSAM were less likely to victimize real children.

    • turnip@sh.itjust.works (+9) · 2 days ago

      You can download the models and run them yourself; banning that will be about as effective as the US government's attempts to ban encryption.

    • Fungah@lemmy.world (+4/−12) · 2 days ago

      I haven’t read any of this research because, like, the only feelings I have about pedophiles are outright contempt and a small amount of pity for the whole fucking destructive evilness of it all, but I’ve been told having access to drawings and images and whatnot makes people more likely to act on their impulses.

      And like. I don't think images of CSAM in any form, no matter how far removed they are from real people, actually contribute anything worthwhile at all to the world, so like. I dunno.

      Really couldn't give two squirts of piss about anything that makes a pedophile's life harder. Human garbage.

      • Allero@lemmy.today (+19) · 2 days ago

        As an advocate for the online and offline safety of children, I did read into the research. None of the research I've found confirms with any sort of evidence that AI-generated CSAM increases the risk of other illicit behavior. We need more evidence, and I do recommend exercising caution with such statements, but for the time being we can rely on studies of other illegal behaviors and the effects of their decriminalization, which paint a fairly positive picture. Generally, people tend to opt for what is legal and more readily accessible - and we can make AI CSAM exactly that.

        For now, people are being criminalized for a crime with zero evidence that it's even harmful, while I tend to look quite positively on what it could bring to the table instead.

        Also, pedophiles are not human trash, and that line of thinking is itself harmful: it drives more of them into hiding, never getting adequate help from a therapist, which increases their chances of offending. Which, well, harms children.

        They are regular people who, involuntarily, have their sexuality warped in a way that includes children. They never chose it, they cannot change it in itself, and can only decide what to do with it going forward. You could be one; I could be one. What matters is the decisions they make based on their sexuality. The correct path is celibacy and refusal of any source of direct harm to children, including the consumption of real CSAM. This can be hard on many, and to aid them, we can provide fictional materials so they can let off some steam. Otherwise, many are likely to turn to real CSAM as a source of satisfaction, or even to abusing children IRL.