Godric@lemmy.world to Lemmy Shitpost@lemmy.world · 2 months ago
Ah, Yes! AI Will Surely Save Us All!
Daxtron2@startrek.website · 2 months ago
How can text ever possibly be CSAM when there's no child or sexual abuse involved?

Jimmyeatsausage@lemmy.world · 2 months ago
I didn't say anything about text?

Daxtron2@startrek.website · 2 months ago
What exactly do you think erotic roleplay means?

Jimmyeatsausage@lemmy.world · 2 months ago
Well, I honestly hadn't considered someone texting with an LLM; I was more thinking about AI-generated images.

weker01@feddit.de · 2 months ago
Text, even completely fictional text, can be CSAM depending on jurisdiction.

Daxtron2@startrek.website · 2 months ago
I've seen no evidence of that. There are cases tried under obscenity laws, but CSAM has a pretty clear definition of being visual.