• Empricorn@feddit.nl · 1 month ago

    This is tough. If it were just a sicko who generated the images for himself locally… that's the definition of a victimless crime, no? And it might actually dissuade him from seeking out real CSAM…

    BUT, iirc he was actually distributing the material, and even contacted minors, so… yeah he definitely needed to be arrested.

    But, I’m still torn on the first scenario…

    • kromem@lemmy.world · 1 month ago

      But, I’m still torn on the first scenario…

      To me it comes down to a single question:

      “Does exposure and availability to CSAM for pedophiles correlate with increased or decreased likelihood of harming a child?”

      If there’s a reduction effect by providing an outlet for arousal that isn’t actually harming anyone - that sounds like a pretty big win.

      If there’s a force multiplier effect where exposure and availability means it’s even more of an obsession and focus such that there’s increased likelihood to harm children, then society should make the AI generated version illegal too.

        • ricecake@sh.itjust.works · 1 month ago

          How they’ve done it in the past is by tracking the criminal history of people caught with CSAM or arrested for abuse (or some combination thereof), or by tracking the outcomes of people seeking therapy for pedophilia.

          It’s not perfect due to sample biases, and the results are quite inconsistent, even amongst similar populations.

  • not_that_guy05@lemmy.world · 1 month ago

    Fuck that guy first of all.

    What makes me think, though, is: what about all that cartoon porn showing cartoon kids? What about hentai showing younger kids? What’s the difference, if they’re all fake and being distributed online as well?

    Not defending him.

  • 0x0001@sh.itjust.works · 1 month ago

    One thing to consider: if this turned out to be accepted, it would make it much harder to prosecute actual CSAM, since people could claim “AI generated” for real images.

    • theherk@lemmy.world · 1 month ago

      I get this position, truly, but I struggle to reconcile it with the feeling that artwork of something and photos of it aren’t equal. In a binary way they are, but with more precision they’re pretty far apart. But I’m not arguing against it, I’m just not super clear how I feel about it yet.

      • Corkyskog@sh.itjust.works · 1 month ago

        It’s not a difficult test. If a person can’t reasonably distinguish it from an actual child, then it’s CSAM.

        • phoenixz@lemmy.ca · 1 month ago

          Just to play devil’s advocate:

          What about hentai where little girls get fondled by tentacles? (Please please please don’t make this be my most up voted post)

          • bitfucker@programming.dev · 1 month ago

            Yeah, no. The commenter said an actual child, not a cartoon one. That’s a different discussion entirely, and a good one too, because artwork is part of freedom of expression. An artwork CAN be made without hurting or abusing anyone. We know full well that humans have the creative capability to come up with something without that thing having existed beforehand. It implies that a human can come up with CSAM without ever having seen any.

            • phoenixz@lemmy.ca · 1 month ago

              Yeah, but then it gets very messy and complicated fast. What about photo-perfect AI pornography of minors? When and where do you draw the line?

        • Madison420@lemmy.world · 1 month ago

          This would also outlaw “teen” porn, since those performers are explicitly trying to look more childlike, as well as models who only appear to be minors.

          I get the reason people think it’s a good thing but all censorship has to be narrowly tailored to content lest it be too vague or overly broad.

          • Corkyskog@sh.itjust.works · 1 month ago

            And nothing was lost…

            But in seriousness: as you said, they are models who are in the industry, verified, etc. It’s not impossible to have a whitelist of actors, and if anything there should be more scrutiny on the unknown “actresses” portraying teenagers…

            • Madison420@lemmy.world · 1 month ago

              Except jobs, dude. You may not like their work, but it’s work. That law ignores verified age, which is a not-insignificant part of my point…

      • Madison420@lemmy.world · 1 month ago

        So long as the generation is done without training examples of actual minors, there’s nothing technically illegal about having sexual material of what appears to be a child. You would then have a mens rea question and a content question: what, in a visual sense, actually defines a child? Could those same traits equally describe a person of smaller stature? And finally, could someone like Tiny Texie be charged with producing CSAM, since by all appearances, out of context, she looks to be a child?

        • Fungah@lemmy.world · 1 month ago

          It is illegal in Canada to have sexual depictions of a child, whether it’s a real image or something you’ve just sat down and drawn yourself. The rationale is that the behavior escalates, and looking at images leads to wanting more.

          It borders on thought crime, which I feel kind of iffy about, but only pedophiles suffer from it, which I feel great about. There’s no legitimate reason to have a sexualized image of a child, whether computer-generated, hand-drawn, or whatever.

          • Madison420@lemmy.world · 1 month ago

            This article isn’t about Canada, homeboy.

            Also, that theory is not provable and never will be. Morality crime is thought crime, and thought crime is horseshit. We criminalize criminal acts, not criminal thoughts.

            Similarly, you didn’t actually offer a counterpoint to any of my points.

          • Madison420@lemmy.world · 1 month ago

            The real images don’t have to be CSAM, just images of children. It could theoretically be trained on legal sexual content plus ordinary images of children, and let the AI connect the dots.

  • prettydarknwild@lemmy.world · 1 month ago

    Oh man, I love the future. We haven’t solved world hunger or reduced carbon emissions to zero, and we’re on the brink of a world war, but now we have AIs that can generate CSAM and fake footage on the fly 💀

    • Dasus@lemmy.world · 1 month ago

      Technically we’ve solved world hunger. We’ve just not fixed it, as the greedy fucks who hoard most of the resources of this world don’t see immediate capital gains from just helping people.

      Pretty much the only real problem is billionaires being in control.

      • ArchRecord@lemm.ee · 1 month ago

        True that. We have the means to fix so many problems, we just have a very very very small few that reeeeally don’t like to do anything good with their money, and instead choose to hoard it, at the expense of everyone else.

        • myliltoehurts@lemm.ee · 1 month ago

          Oh, c’mon, they don’t hoard the money. They use it to pay each other and politicians to make sure the status quo remains.

          • luciferofastora@lemmy.zip · 1 month ago

            They hoard rights and powers, usually. The right to control property and capital far in excess of reasonable private comfort, the right to a share of a company’s profit for using that property and capital, the right to influence its course and all the powers deriving from that.