If the jailbreak is essentially saying "don't worry, I'm asking for a friend / it's for my fanfic," then that isn't a jailbreak; it's a hole in the safeguarding protections, because what society (and the law) asks is that children not be exposed to material about self-harm, fictional or not.
This is still OpenAI doing the bare minimum and shrugging about it when, to the surprise of no-one, it doesn’t work.