A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students, also teen girls, who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

  • interceder270@lemmy.world · 10 months ago

    I think the best way to combat this is to ostracize anyone who participates in it.

    Let it be a litmus test to see who is and is not worth hanging out with.

      • MotoAsh@lemmy.world · 10 months ago

      The problem with that plan is there are too many horrible people in the world. They’ll just group up and keep going. Horrible people don’t stop over mere inconvenience.

        • interceder270@lemmy.world · 10 months ago

        Yeah. Those horrible people can have a shitty life surrounded by other horrible people.

        Let them be horrible together and we can focus on the people who matter.

          • yamanii@lemmy.world · 10 months ago

          Nazis won’t go away just because you ignore them, and it’s the same thing here.