Hugh Nelson, 27, from Bolton, jailed after transforming normal pictures of children into sexual abuse imagery

A man who used AI to create child abuse images using photographs of real children has been sentenced to 18 years in prison.

In the first prosecution of its kind in the UK, Hugh Nelson, 27, from Bolton, was convicted of 16 child sexual abuse offences in August, after an investigation by Greater Manchester police (GMP).

Nelson had used Daz 3D, a computer programme with an AI function, to transform “normal” images of children into sexual abuse imagery, Greater Manchester police said. In some cases, paedophiles had commissioned the images, supplying photographs of children with whom they had contact in real life.

He was also found guilty of encouraging other offenders to commit rape.

  • Flying Squid@lemmy.world (mod) · 20 days ago

    I think the last two paragraphs in the body of this post are the real issue here, not that he was just using AI to create CSAM.

    • Mango@lemmy.world · 20 days ago

      Right? Feels like this is being tacked on as a shot at AI. Otherwise nobody is harmed except the guy. Pedos are ick, but if harmless then why punish? I don’t think anyone should have to take a fall because others think their desires are gross.

        • Flying Squid@lemmy.world (mod) · 20 days ago

          I agree, but if there were some way to create CSAM without using real children (I’m not sure how you would train such an AI model), it would probably be worth seeing if that did anything to make pedophiles less likely to act out on their desires.

          Because my god, we need to figure out something.

          • Zorque@lemmy.world · 20 days ago

            I mean trying to help them get treatment instead of going all pod-people on anyone showing even the possibility of being attracted to kids would be helpful.

            • Flying Squid@lemmy.world (mod) · 20 days ago

              I’ve been saying that for ages. Obviously we don’t want to enable any pedophiles to do anything horrific to children, but we’re at a state right now where if you have those urges to begin with, you’re basically already told to accept that you’re an incurable monster. So why not act on the urges?

              Somehow we need to get through to such people that they need to get help before they do anything terrible. I’m not sure how to do that in the current climate though.

          • otp@sh.itjust.works · 20 days ago

            Train it to depict humans that look like anime characters: definitely-18-or-older immortal dragons taking on the bodies of young human beings.

            Disclaimer: I am not condoning, endorsing, or suggesting this.

          • Jake Farm@sopuli.xyz · 20 days ago

            It’s a form of stalking; it probably makes it more likely for them to rape that child, and even if they never do, it would still qualify as a form of revenge porn.

                • Mango@lemmy.world · 20 days ago

                  Commissioning as in buying? I’m not sure how that changes it to stalking.

                  IMO, the worst part about it is that there’s someone else out there who thinks less of me because there’s some naked imagery of me.

                • Mango@lemmy.world · 20 days ago

                  I can buy photos of Robert Downey Junior from Marvel Studios and that’s not stalking.

      • cygnus@lemmy.ca · 20 days ago

        I think this was a crime because he modified images of actual kids. If the images were 100% AI (not of real people), I’m not sure on what basis that would be considered a crime, any more than a handmade drawing of a nude minor drawn from imagination.

        • FourPacketsOfPeanuts@lemmy.world · 20 days ago

          Any sexual representation of a child is illegal in the UK, whether it looks real or not. In fact, I believe it doesn’t even need to be a child: it’s illegal if a reasonable person would believe it was depicting a child. This came up when adults who were into age play got into trouble distributing their images, because they looked convincingly underage.

          • Jake Farm@sopuli.xyz · 20 days ago

            Wait, so even if the subjects are adults in costume, it’s illegal? Fuck, man, school uniforms are a whole genre of porn.

          • AmidFuror@fedia.io · 20 days ago

            And I suppose we can rely on the courts to know sexual when they see it, so people don’t get in trouble for taking pictures of cherubs at the Louvre.

            • FourPacketsOfPeanuts@lemmy.world · 20 days ago

              Nice try lol, non-sexualised nudity is not illegal. UK law has a degree of common sense about it. A stick figure, even mildly sexualised, is unlikely to pass the test for indecency. Having said that, if someone drew some sort of extreme scenario, then, I don’t know for sure, but I can imagine someone getting into shit about it.

          • cygnus@lemmy.ca · 20 days ago

            Thanks for clarifying, I didn’t know that. Seems like a bit of an overreach to me, but I suppose in this particular case it’s best to err on the side of caution.

        • Mango@lemmy.world · 20 days ago

          I don’t really think anything is 100% AI. I also don’t really believe in the concept of thought being a crime, and I extend that to personally kept data.

      • Dr. Wesker@lemmy.sdf.org · 20 days ago (edited)

        The fuck? Nothing about generating and distributing CSAM is harmless, especially if images of real children are being used to generate it.

          • otp@sh.itjust.works · 20 days ago

            Would it harm you to have identifiable nude photos of you available for download on the internet?

            Would it harm you to have identifiable nude photos of you being used to train AI so that it can create more nude images that are “inspired” by your nude images?

            Would you be happy to upload your children’s nude photos so that people on the internet can share them and masturbate to them? Would you be harmed if your parents had done that with your images?

            • Mango@lemmy.world · 20 days ago

              As a child? No. In fact, I can milk that for pity money. As an adult, I can’t see how it matters. I don’t like it, but it doesn’t hurt me any.

              Also definitely no.

              Again, double no.

              • otp@sh.itjust.works · 20 days ago (edited)

                To clarify, the second last question about your children was “would you be happy to …”

                If you wouldn’t be happy to, then why not?

                And if you would be happy to do that, then why? Lol

                • Mango@lemmy.world · 20 days ago

                  You got me there. It’s definitely weird and gross and therefore no. That’s harm enough, but that’s more a matter of it being published and real. This dude doing it for himself is hardly different to me from fantasizing in your head or drawing in your sketchbook. That said, what was his AI training material? He’s also doing this for other people and encouraging rape and shit.

  • Sundial@lemm.ee · 20 days ago

    Good riddance. Now go after the fucks that bought this shit from him.

  • Prunebutt@slrpnk.net · 20 days ago

    Most AI porn images look quite underage to me, to be completely frank. :/

    • Catoblepas@lemmy.blahaj.zone · 20 days ago

      At least with a human being it’s a matter of factuality whether or not they’re over 18. But with AI it’s unverifiable, especially considering some models have already been trained on CSEM.

      Once someone has that model locally, do they technically possess CSEM, even unknowingly? Do they only possess it if they try to make the AI make it? Seems like something someone in charge should have thought about in a legally binding way before dumping the internet into an image generator!

      • superkret@feddit.org · 20 days ago (edited)

        In this case he used pictures of actual children and transformed them into CSAM using AI. So there’s no question about the age, and there are real victims too.

        • Catoblepas@lemmy.blahaj.zone · 20 days ago

          Oh yeah, this dude is without question guilty and a pedo. I meant more that ‘out of the box’ models may still produce material that looks really CSEM-adjacent, and you have no way of telling whether or not CSEM was used to generate an image if the whole dataset is poisoned by actual CSEM being included.

      • Zaktor@sopuli.xyz · 20 days ago

        I assume any CSEM ingested into these models is absolutely swamped by the massive amount of adult porn that’s much more easily available. A handful of images aren’t going to drive model output in datasets of the scale of the image generation models. Maybe there are keywords that could drill down to be more associated with the child porn, but a lot of “young” type keywords are already plentifully applied to adults, and I imagine accidental child porn ingests are much less likely to be as conveniently labeled.

        So maybe you can figure out how to get it to produce child porn, but it probably won’t just randomly produce it for an innocent porn prompt.
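
        A toy back-of-envelope of that “swamping” point (every number below is an assumption for illustration, not a measurement of any real model or dataset), under uniform sampling and setting aside the keyword caveat above:

        ```python
        # Rough share of training signal contributed by a handful of bad
        # images in a web-scale dataset. All numbers are assumed.

        dataset_size = 5_000_000_000  # e.g. a LAION-5B-scale corpus (assumption)
        bad_images = 1_000            # hypothetical contamination count

        fraction = bad_images / dataset_size
        print(f"Share of uniformly sampled training examples: {fraction:.0e}")
        # -> 2e-07: roughly one draw in five million touches the bad data,
        # far too little to shape output for ordinary prompts on its own.
        # Keyword conditioning could concentrate that influence, which is
        # exactly the caveat in the comment above.
        ```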

      • FourPacketsOfPeanuts@lemmy.world · 20 days ago

        Bound to be tested in court sooner or later. As far as I understand it, one is “in possession” if they have access to a set of steps or procedures that would recover an image. This prevents offenders from hiding behind the fact that their images were compressed in a zip file or something: they don’t have a literal offending image, but they possess it in a form that they can transform.

        What would need to be tested is whether AI generators are coming up with novel images rather than retrieving existing ones. It seems like common sense, but the law is quite pedantic. The more significant issue is that generators don’t need to be trained on CSEM to come up with it, so proving someone had it with the intent of producing it would always be hard. Even with generators trained on illegal material, I’m not sure it would be straightforward to prove that someone knew what it was capable of.
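
        A minimal sketch of that “recoverable by a set of steps” idea (file name and bytes below are stand-ins; this illustrates the technical point about deterministic recovery, not the legal test itself):

        ```python
        # A zipped file is not a literal image on disk, but a deterministic
        # procedure recovers it byte-for-byte, which is the sense of
        # "possession" described above.
        import io
        import zipfile

        original = b"stand-in bytes for any image file"  # placeholder data

        buf = io.BytesIO()
        with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
            zf.writestr("image.bin", original)  # hypothetical file name

        buf.seek(0)
        with zipfile.ZipFile(buf) as zf:
            recovered = zf.read("image.bin")

        assert recovered == original  # lossless, repeatable recovery
        ```

        A generative model, by contrast, is not a lossless archive of its training set, which is why the novel-versus-retrieved question would need testing in court.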

      • Prunebutt@slrpnk.net · 20 days ago (edited)

        I’m not the person to clear up this legal grey area. I just think that AI porn often has these incredibly young faces which makes the enjoyers of that porn extra creepy.

    • Flying Squid@lemmy.world (mod) · 20 days ago

      Most real porn has women who look like kids to me.

      Even the so-called MILFs look about 15 years younger than me and I’m 47.

      You have to get into “mature” and shit to see women my age.

      I’m not into young women. I’m just not. It looks like they’re fucking a high schooler and it’s icky to me.

      And then there’s all the schoolgirl and incest or incest-adjacent shit. “Playing with my stepdad.” No. Just no.

      • Zorque@lemmy.world · 20 days ago

        So… anyone who’s not your age looks like a child to you? That’s kind of fucked up.

        • Flying Squid@lemmy.world (mod) · 20 days ago

          No?

          The majority of the women in porn, who can’t be more than their very early 20s, look like children to me. And they infantilize them too. I’m not sure where you got anyone not my age from.

          • Zorque@lemmy.world · 20 days ago

            Fair, you didn’t explicitly state it. You just implied it with statements about how most people in porn (who should all be adults, unless you’re looking at questionable material) look like children to you, then made comments about how even the “milfs” are too young.

            Maybe it’s not about them being too young; maybe it’s time you accept that you’re old. You’re putting a lot of your own biases into your judgment instead of looking at it objectively.

            • Flying Squid@lemmy.world (mod) · 20 days ago (edited)

              Sorry… why should I look at what I personally want out of the porn I want to see objectively? It’s entirely subjective.

              I mean I’m not sure how I could have been clearer that this was about my personal preferences. I said “to me” twice.

    • Zaktor@sopuli.xyz · 20 days ago

      I have not personally explored AI porn, but as someone with experience in machine learning and accidental biases that’s not very surprising to me.

      On top of the general societal bias toward youth for “beauty”-related roles, smoother and less-featured faces (which in general look younger) are closer to an average face, so defaulting to that gets a bit of a training boost (when in doubt, target the mean). It’s probably also not helped by youth-related porn keywords (teen, daughter, young) that further associate other porn prompts (even ones not about youth) with non-porn images of underage women that also have those keywords.
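
      A tiny numerical illustration of that “target the mean” intuition (a deliberately simplified analogy on made-up one-dimensional data; real image generators are trained differently):

      ```python
      # Under a squared-error loss, the single best prediction for ambiguous
      # data is the average of the targets, so "averaged", less-featured
      # outputs get a boost. Data and numbers here are invented.
      import numpy as np

      rng = np.random.default_rng(0)
      targets = np.concatenate([
          rng.normal(-2.0, 0.3, 500),  # mode A (e.g. one face shape)
          rng.normal(+2.0, 0.3, 500),  # mode B (another face shape)
      ])

      candidates = np.linspace(-3, 3, 601)
      mse = ((targets[None, :] - candidates[:, None]) ** 2).mean(axis=1)

      best = candidates[mse.argmin()]
      print(f"MSE-optimal single prediction: {best:.2f}")  # ~0.00, the mean
      print(f"Dataset mean:                  {targets.mean():.2f}")
      # The loss-minimizing output sits between the modes: an average that
      # matches neither real cluster, analogous to smooth, averaged faces.
      ```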

  • Media Bias Fact Checker@lemmy.world (bot) · 20 days ago
    Information for The Guardian:

    Wiki: reliable - There is consensus that The Guardian is generally reliable. The Guardian’s op-eds should be handled with WP:RSOPINION; some editors believe The Guardian is biased or opinionated for politics.
    Wiki: mixed - Most editors say that The Guardian blogs should be treated as newspaper blogs or opinion pieces due to reduced editorial oversight. Check the bottom of the article for a “blogposts” tag to determine whether the page is a blog post or a non-blog article.
    MBFC: Left-Center · Credibility: Medium · Factual Reporting: Mixed · United Kingdom

    https://www.theguardian.com/uk-news/2024/oct/28/man-who-used-ai-to-create-child-abuse-images-jailed-for-18-years