• otp@sh.itjust.works (+78/−8) · edited · 5 months ago

    The laws regarding a lot of this stuff seem to ignore that people under 18 can and will be sexual.

    If we allow people to use this tech for adults (which we really shouldn’t), then we have to accept that people will use the same tech on minors. It isn’t even necessarily pedophilia in all cases (such as when the person making them is also a minor)*, but it’s still something that very obviously shouldn’t be happening.

    * we don’t need to get into semantics. I’m just saying it’s not abnormal (the way pedophilia is) for a 15-year-old to be attracted to another 15-year-old in a sexual way.

    Without checks in place, this technology will INEVITABLY be used to undress children. If the images are stored anywhere, then these companies will be storing/possessing child pornography.

    The only way I can see to counteract this would be to invade the privacy of users (and victims) to the point where nobody using them """legitimately""" would want to use it…or to just ban them outright.

    • micka190@lemmy.world (+19/−4) · 5 months ago

      such as when the person making them is also a minor

      I get the point you’re trying to make. But minors taking nudes of themselves is illegal in a lot of places, because it’s still possession.

      • BrianTheeBiscuiteer@lemmy.world (+45/−3) · 5 months ago

        And that’s still a bit messed up. It’s a felony for a teen to have nude pictures of themselves, and they’ll be registered sex offenders for life and probably ineligible for most professions. That seems like quite a gross overreaction. There needs to be a lot of reform in this area, but no politician wants to look like a “friend” to pedophiles.

      • Zorque@kbin.social (+17/−1) · 5 months ago

        Which is more of a “zero-tolerance” policy, akin to giving a student who defends themselves the same punishment as the person who initiated the attack.

      • otp@sh.itjust.works (+7) · 5 months ago

        I get the point you’re trying to make. But minors taking nudes of themselves is illegal in a lot of places, because it’s still possession.

        I agree. I understand why it could be good to consider it illegal (to prevent child porn from existing), but it also seems silly to treat it as a case of pedophilia.

    • vzq@lemmy.blahaj.zone (+5/−137) · 5 months ago

      That’s a lot of words to defend fake child porn made out of photos and videos of actual children.

      • NOT_RICK@lemmy.world (+89/−1) · 5 months ago

        Reading comprehension not a strong suit? Sounds to me like they’re arguing for protections for both adults AND minors.

      • Zorque@kbin.social (+41/−2) · 5 months ago

        That’s about the right amount of words to completely ignore the sentiment of a statement so you can make a vapid holier-than-thou statement based on purported moral superiority.

      • Fosheze@lemmy.world (+25/−1) · 5 months ago

        Have you tried actually reading what they said instead of just making shit up?

      • otp@sh.itjust.works (+13/−1) · 5 months ago

        That’s a lot of words to defend fake child porn made out of photos and videos of actual children.

        Uh…this is the second sentence or so (and the start of the second paragraph, I think)

        If we allow people to use this tech for adults (which we really shouldn’t)

        So I’m not sure where you got the idea that I’m defending AI-generated child porn.

        Unless you’re so adamant about AI porn generators existing that banning their usage on adults (or invading the privacy of the users and victims with oversight) is outright unthinkable? Lol

        I’m saying that IF the technology exists, people will be using it on pictures of children. We need to keep that in mind when we think about laws for this stuff. It’s not just adults uploading pictures of themselves (perfectly fine) or adult celebrities (not fine, but probably more common than any truly acceptable usage).

      • WallEx@feddit.de (+2/−1) · 5 months ago

        What a dumb take. And I do those myself, so I know one when I see one.

  • ristoril_zip@lemmy.zip (+37/−2) · 5 months ago

    This genie is probably impossible to get back in the bottle.

    People are just going to direct the imitative, so-called AI program to make the face different enough to have plausible deniability that it’s a fake of this person or that person. Or use existing tech to age them up to 18+ (or 30+, or whatever). Or darken or lighten their skin, or change their eye or hair color. Or add tattoos, piercings, or scars…

    I’m not saying we should be happy about it, but it is here, and I don’t think it’s going anywhere. Like, if you tell your so-called AI to give you a completely fictional nude image or animation of someone who looks similar to Taylor Swift but isn’t Taylor Swift, what’s the privacy (or other) violation, exactly?

    Does Taylor Swift own every likeness that looks somewhat like hers?

    • PM_Your_Nudes_Please@lemmy.world (+13/−1) · 5 months ago

      It’s also not a new thing. It’s just suddenly much easier for the layman to do. Previously, you needed some really good Photoshop skills to pull it off. But you could make fake nudes if you really wanted to and were willing to put in the time and effort.

    • TORFdot0@lemmy.world (+3/−3) · 5 months ago

      If the prompt includes “Taylor Swift” or an image of her, then it doesn’t matter if the AI slightly changed the result: it used her likeness to generate the image, so she should have rights to the image and the ability to claim damages.

      The same should apply to using deepfake porn AIs to make nonconsensual nudes of a private person. Heck, manually creating nonconsensual deepfake nudes should also fall under the same definition.

      • Saik0@lemmy.saik0.com (+3) · edited · 5 months ago

        That’s not how it works. Paparazzi who take her picture own the rights to the image, not Taylor Swift. They make money on the image when they sell it; Taylor Swift gets nothing from the sale and has no rights in that transaction. If you’re in public, you can be photographed. If a photographer takes an image and releases it into the public domain, the subjects of the image have no say in it unless the photographer broke some other law (e.g., peeping Tom or stalking laws).

  • themeatbridge@lemmy.world (+35/−8) · 5 months ago

    No reason not to ban them entirely.

    The problem is enforcing the ban. Would it be a crime to have access to the software, or would they need to catch the criminals with the images and video files? It would be trivial to host a site in a country without legal protections and make the software available from anywhere.

    • 520@kbin.social (+18/−2) · 5 months ago

      Would it be a crime to have access to the software, or would they need to catch the criminals with the images and video files?

      The problem with the former is that it would outlaw any self-hosted image generator. Any image generator is capable of being used for deepfake porn.

      • themeatbridge@lemmy.world (+5) · 5 months ago

        Right, this is my point. The toothpaste is out of the tube. So would simply having the software capable of making deepfake porn be a crime?

      • assassin_aragorn@lemmy.world (+0/−1) · 5 months ago

        Perhaps an unpopular opinion, but I’d be fine with that. I have yet to see a benefit or possible benefit that outweighs the costs.

    • Howdy@lemmy.zip (+10) · 5 months ago

      I feel like the sensible, realistic course of action is to target the act of sharing/distributing. Anything else would be way too broad, since the tools that generate this stuff have unlimited purposes. Obvious child cases should be dealt with at the point of production, but the enforcement mechanism needs to be on the sharing/distribution side. Unfortunately, the analogy here is “blame the person, not the tool.”

      • catloaf@lemm.ee (+5) · 5 months ago

        Right. And honestly, this should already be covered under existing harassment laws.

      • themeatbridge@lemmy.world (+1/−6) · 5 months ago

        Yeah, I feel like if you find this shit on someone’s computer, whether they shared it or not, there should be some consequences. Court-mandated counseling at a minimum.

  • Daft_ish@lemmy.world (+15/−1) · edited · 5 months ago

    This is probably not the best context, but I find it crazy how fast the government will get involved when it comes to lewd content, while children are getting murdered in school shootings and gun control is just a bridge too far.

    • pro_grammer@programming.dev (OP) (+5/−1) · edited · 5 months ago

      I think they act faster on those matters because, aside from being a very serious problem, it also fits their conservative agenda.

      It’s very easy to say: “LOOK, WE ARE DOING THIS TO PROTECT YOUR CHILDREN FROM PEDOPHILES!!!”

      But they can’t just go and say “let’s enforce gun safety in schools,” because a conservative voter merely reading “gun safety” would already go badly for them.

      They know they are sacrificing the well-being of children by not acting on school shootings, but for them that’s just the price of a few lives to stay in power.

  • NutWrench@lemmy.world (+11) · 5 months ago

    “We’re gonna ban Internet stuff” is something said by people who have no idea how the Internet works.

  • WhyDoYouPersist@lemmy.world (+15/−7) · 5 months ago

    For some reason I thought it was mainly to protect Taylor Swift, with teen girls being the afterthought.

  • hexdream@lemmy.world (+3) · 5 months ago

    And no chance it’s because they want to, uh, thoroughly investigate the evidence…

    • yokonzo@lemmy.world (+4/−2) · 5 months ago

      I don’t know why you got downvoted; you’re right. This is going to be ridiculously hard to enforce.

  • AutoTL;DR@lemmings.world (bot) (+2/−3) · 5 months ago

    This is the best summary I could come up with:


    Caroline Mullet, a ninth grader at Issaquah High School near Seattle, went to her first homecoming dance last fall, a James Bond-themed bash with blackjack tables attended by hundreds of girls dressed up in party frocks.

    Since early last year, at least two dozen states have introduced bills to combat A.I.-generated sexually explicit images — known as deepfakes — of people under 18, according to data compiled by the National Center for Missing & Exploited Children, a nonprofit organization.

    Nudification apps are enabling the mass production and distribution of false, graphic images that can potentially circulate online for a lifetime, threatening girls’ mental health, reputations and physical safety.

    A lawyer defending a male high school student in a deepfake lawsuit in New Jersey recently argued that the court should not temporarily restrain his client, who had created nude A.I. images.

    Under the new Louisiana law, any person who knowingly creates, distributes, promotes or sells sexually explicit deepfakes of minors can face a minimum prison sentence of five to 10 years.

    After learning of the incident at Issaquah High from his daughter, Senator Mullet reached out to Representative Orwall, an advocate for sexual assault survivors and a former social worker.


    The original article contains 1,288 words, the summary contains 198 words. Saved 85%. I’m a bot and I’m open source!