A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10, as long as customers send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nudes of celebrities, including images of minors in swimsuits. The account is particularly notable, though, because it plainly and openly demonstrates one of the most severe harms of generative AI tools: how easily they can be used to create nonconsensual pornography of ordinary people.

  • JackGreenEarth@lemm.ee · 6 months ago

    That’s a ripoff. It costs them at most $0.10 to do a simple Stable Diffusion img2img run, and most people could do it themselves; they’re purposely exploiting people who aren’t tech savvy.

    • Khrux@ttrpg.network · 6 months ago (edited)

      I have no sympathy for the people being scammed here; I hope they lose hundreds to it. Making fake porn of somebody without their consent, particularly porn that others could mistake for real, is awful.

      I wish everyone involved in this use of AI a very awful day.

    • echo64@lemmy.world · 6 months ago

      The people being exploited are the victims of this, not the people who paid for it.

        • sbv@sh.itjust.works · 6 months ago

          It seems like there’s a news story every month or two about a kid who kills themselves because videos of them are circulating, or because they’re being blackmailed.

          I have a really hard time thinking of the people who spend ten bucks making deepfakes of other people as victims.

          • Sentient Loom@sh.itjust.works · 6 months ago (edited)

            > I have a really hard time thinking

            Your lack of imagination doesn’t make the plight of nonconsensual AI-generated porn artists any less tragic.

        • Vanth@reddthat.com · 6 months ago (edited)

          How are the perpetrators victims?

          I could see an argument for someone in need of money making AI-generated porn of themselves. Like, don’t judge sex workers; they’re just trying to make money. But taking someone else’s image without their consent is more akin to Tate coercing his “girlfriends” into doing cam work, taking all the money, and making sure they can’t escape. He’s not a victim or a sex worker; he’s a criminal.

          • Sentient Loom@sh.itjust.works · 6 months ago

            Writing /s would have implied that my fellow lemurs don’t get jokes, and I give them more credit than that.

            • Vanth@reddthat.com · 6 months ago (edited)

              I would love to assume Lemmy users are intelligent enough to realize text-only sarcastic jokes about sex criminals are almost never a good idea, but alas, I’ve been on the internet longer than two weeks.

              • Sentient Loom@sh.itjust.works · 6 months ago

                Some people just don’t have a sense of humor.

                And those people are YOU!!

                Thanks for the finger-wagging, you moralistic rapist!

      • Dkarma@lemmy.world · 6 months ago

        No one’s a victim and no one’s being exploited. It’s the same as taping someone’s photo onto a porno mag.

    • sugar_in_your_tea@sh.itjust.works · 6 months ago

      IDK, $10 seems pretty reasonable for running a script for someone who doesn’t want to do it themselves. A lot of people have that type of arrangement for a job…

      That said, I would absolutely never do this for someone; I’m not making nudes of a real person.

    • IsThisAnAI@lemmy.world · 6 months ago (edited)

      The scam is another thing. Fuck these people selling this.

      But fuck, dude, they aren’t taking advantage of anyone buying the service. That’s not how the fucking world works. It turns out that if you have money, you can pay people to do shit like clean your house or change your oil.

      NOBODY on that side of the equation is being exploited 🤣

    • OKRainbowKid@feddit.de · 6 months ago

      In my experience with SD, getting images that aren’t obviously “wrong” in some way takes multiple iterations and quite some time spent tuning prompts and parameters.