In 2023, more deepfake abuse videos were shared than in every other year in history combined, according to an analysis by independent researcher Genevieve Oh. What used to take skillful, tech-savvy experts hours to Photoshop can now be whipped up at a moment’s notice with the help of an app. Some deepfake websites even offer tutorials on how to create AI pornography.

What happens if we don’t get this under control? It will further blur the lines between what’s real and what’s not — as politics become more and more polarized. What will happen when voters can’t separate truth from lies? And what are the stakes? As we get closer to the presidential election, democracy itself could be at risk. And, as Ocasio-Cortez points out in our conversation, it’s about much more than imaginary images.

“It’s so important to me that people understand that this is not just a form of interpersonal violence, it’s not just about the harm that’s done to the victim,” she says about nonconsensual deepfake porn. She puts down her spoon and leans forward. “Because this technology threatens to do it at scale — this is about class subjugation. It’s a subjugation of entire people. And then when you do intersect that with abortion, when you do intersect that with debates over bodily autonomy, when you are able to actively subjugate all women in society on a scale of millions, at once digitally, it’s a direct connection [with] taking their rights away.”

  • Vanth@reddthat.com · 3 months ago

    So basically this is like existing revenge-porn laws, which are so difficult and time-consuming to pursue, and so subject to “well, they were asking for it” decisions, that they are effectively worthless.

    We don’t have a consistent grasp on consent between two sober IRL adults, let alone when there is more complexity. Any judge or jury wanting to blame victims will still find plenty of leeway in this law.

    • afraid_of_zombies@lemmy.world · 3 months ago

      “Any judge or jury wanting to blame victims will still find plenty of leeway in this law.”

      The same ones that put guys on the sex offender registry for peeing outdoors?

  • RealFknNito@lemmy.world · 3 months ago

    Friendly reminder: we’ve had Photoshop for decades. Legislation can’t keep up with technology, and trying to make it do so will almost always come at the cost of constitutional rights, like freedom of expression.

    If I want to photoshop a dick on Trump’s face, nobody should be allowed to tell me no. It’s not fucking “interpersonal violence”.

    • Adalast@lemmy.world · 3 months ago

      Photoshopping a dick onto Trump’s face is 100% protected expression. Producing a photoreal deepfake of him balls deep in Lindsey Graham’s ass while Mitch McConnell can be seen holding the camera in a mirror wearing a ballgag and cuck strap then posting it online either without context or trying to pass it off as real is a problem.

  • nyan@lemmy.cafe · 3 months ago

    The real problem is that people automatically believe what they see online, no matter how ridiculous or outrageous, rather than thinking about probability and provenance and supporting evidence and all that stuff.

    Unfortunately, this problem is not likely to be solved any time soon, given that we’ve had more than a quarter-century (since the advent of image-editing software) to work on it. Hell, even further back than that, a certain percentage of the population could be fooled into believing in UFOs by a blurry black-and-white photograph of pie plates suspended from fishing line. We’re never gonna fix this.

    • WarlordSdocy@lemmy.world · 3 months ago

      The problem goes both ways, though. Not only does the technology create fake things, it also makes it much easier to discredit real things by simply claiming they’re deepfaked.

  • dsemy@lemm.ee · 3 months ago

    It’s too late at this point, IMO; you can make AI-generated porn on your own PC. How exactly are they going to stop it?

    • RainfallSonata@lemmy.world · 3 months ago

      “The legislation amends the Violence Against Women Act so that people can sue those who produce, distribute, or receive the deepfake pornography, if they ‘knew or recklessly disregarded’ that the victim did not consent to those images.” –The article

      • dsemy@lemm.ee · 3 months ago

        I read the article… amending a law doesn’t make the problem go away.

        Maybe if more attention had been given to the politicians who were talking about this half a decade ago (instead of focusing on AOC, who honestly realized this issue way too late), something more meaningful could have been done.

      • starman@programming.dev · 3 months ago

        “people can sue those who produce, distribute, or receive the deepfake pornography”

        So can I send someone deepfake porn and then sue them?

    • General_Effort@lemmy.world · 3 months ago

      The law creates a new kind of intellectual property, so one would expect the enforcement problems to be similar to copyright. However, there are some big differences.

      One is that the minimum damages are USD 150,000 plus attorney’s fees and costs. That’s going to unleash quite a bit of entrepreneurial zeal.

      To be on the hook, “possession with intent to distribute” is enough if one “recklessly disregards” that a depicted individual did not consent. E.g., if you come across nudes of some celebrity on your Lemmy instance, you had better delete them immediately: assuming that the celebrity consented to the images being shared sounds like “reckless disregard” to me. If it’s just some random person, then it’s no problem.

      This definitely will make some people quite a lot of money.

  • afraid_of_zombies@lemmy.world · 3 months ago

    Good. If you don’t own your body and reputation, you own nothing. There is a reason we have laws in place to protect people from false accusations. And since it is pretty believable that any given person has sex, we need to rule out any “reasonable person” exemption.

    • General_Effort@lemmy.world · 3 months ago

      You have never owned your reputation.

      And - while you sort of own your body - you have never owned depictions of your property (that someone else made with their labor).

      If you are wondering what I mean by “sort of owning your body”: you are not allowed to sell it, whole or in parts (i.e., organs). If you try to destroy or damage it, most governments will interfere. In fact, governments provide assistance to maintain that particular piece of property.

      This ownership-centric view is simply dystopian.

  • AutoTL;DR@lemmings.world (bot) · 3 months ago

    This is the best summary I could come up with:

    “It’s not a question of mental strength or fortitude — this is about neuroscience and our biology.” She tells me about scientific reports she’s read about how it’s difficult for our brains to separate visceral images on a phone from reality, even if we know they are fake.

    Ocasio-Cortez is one of the most visible politicians in the country right now — and she’s a young Latina woman up for reelection in 2024, which means she’s on the front lines of a disturbing, unpredictable era of being a public figure.

    She recently co-published a paper for UNESCO with research assistant Dhanya Lakshmi on ways generative AI will exacerbate what is referred to in the industry as technology-facilitated gender-based violence.

    Mary Anne Franks, a legal scholar specializing in free speech and online harassment, says it’s entirely possible to craft legislation that prohibits harmful and false information without infringing on the First Amendment.

    The legislation amends the Violence Against Women Act so that people can sue those who produce, distribute, or receive the deepfake pornography, if they “knew or recklessly disregarded” that the victim did not consent to those images.

    “Congress is spearheading this much-needed initiative to address gaps in our legal framework against digital exploitation,” says a spokesperson for Mace, who recently introduced her own bill criminalizing deepfake porn.

    The original article contains 4,747 words, the summary contains 218 words. Saved 95%. I’m a bot and I’m open source!

  • General_Effort@lemmy.world · edited · 3 months ago

    It’s disappointing that AOC supports this capitalist law. This law is not against harassment; the DEFIANCE Act creates a new kind of intellectual property.