return2ozma@lemmy.world to Technology@lemmy.world · English · 5 months ago
ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)
210 upvotes · 7 downvotes · 33 comments
CodenameDarlen@lemmy.world · English · edited 5 months ago
deleted by creator