return2ozma@lemmy.world to Technology@lemmy.world · English · 7 months ago
ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)
33 comments · 210 upvotes · 7 downvotes
CodenameDarlen@lemmy.world · English · edited 7 months ago
deleted by creator