misk@sopuli.xyz to Technology@lemmy.world · English · 10 months ago
Jailbroken AI Chatbots Can Jailbreak Other Chatbots (www.scientificamerican.com)
74 comments · 439 upvotes, 13 downvotes
Cross-posted to: hackernews@lemmy.smeargle.fans, hackernews@derp.foo
The Barto@sh.itjust.works · English · 3 points · edited · 10 months ago
Legitimate reason? No, but there’s always a reason to know how to make napalm.