return2ozma@lemmy.world to Technology@lemmy.world · English · 4 days ago
ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)
cross-posted to: fuck_ai@lemmy.world
RisingSwell@lemmy.dbzer0.com · English · 4 days ago
It’s really easy to make explosives. Making them stable and reliable is the hard part.