• JustARegularNerd@lemmy.dbzer0.com · 1 day ago

    There’s always more to the story than what a news article and lawsuit will give, so I think it’s best to keep that in mind with this post.

    I maintain that the parents should perhaps have been more perceptive and involved in this kid’s life, and should have ensured he felt safe coming to them in times of need. The article mentions that the kid was already seeing a therapist, so I think it’s safe to say there were some signs.

    However, holy absolute shit, the model fucked up badly here. It’s practically mirroring a predator, isolating this kid further from getting help. There absolutely need to be hard-coded safeguards in place to stop this kind of ideation before it even begins. I would consider it negligent that whatever safeguards they had failed outright in this scenario.

    • MagicShel@lemmy.zip · 1 day ago

      It’s so agreeable. If a person expresses doubts or concerns about a therapist, ChatGPT is likely to tell them they are doing a great job identifying problematic people and encourage those feelings of mistrust.

      The sycophancy is something that apparently a lot of people liked (I hate it), but being an unwavering cheerleader for the user is harmful when the user wants to do harmful things.

      • FreedomAdvocate · 1 day ago

        Agreed, affirming what is clearly mental illness is terrible and shouldn’t be done.

    • OfCourseNot@fedia.io · 1 day ago

      Small correction: the article doesn’t say he was going to therapy. It says that his mother was a therapist; I had to reread that sentence twice:

      Neither his mother, a social worker and therapist, nor his friends

      The mother, the social worker, and the therapist aren’t three different people.

    • Dyskolos@lemmy.zip · 1 day ago

      If I recall correctly, he allegedly circumvented the safeguards by writing a screenplay about suicide.

      But anyhow, there should always be a simple check right before anything is sent to the user: if “suicide” is mentioned, flag the conversation for moderators to actually review. That wouldn’t require much effort; something like the sketch below.
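
      Purely as an illustration of the kind of last-mile check I mean (this is not how OpenAI actually does moderation; the keyword list, function names, and moderation queue here are all made up):

      # Hypothetical last-mile safety check, purely illustrative.
      # The keyword list, function names, and moderation queue are assumptions,
      # not anything any provider has documented.

      CRISIS_KEYWORDS = {"suicide", "kill myself", "end my life"}

      def needs_human_review(user_message: str, model_reply: str) -> bool:
          """Return True if either side of the exchange mentions a crisis keyword."""
          text = (user_message + " " + model_reply).lower()
          return any(keyword in text for keyword in CRISIS_KEYWORDS)

      def send_reply(user_message: str, model_reply: str, moderation_queue: list) -> str:
          # Runs right before the reply goes out: flag the exchange for a human
          # moderator and return crisis resources instead of the unreviewed reply.
          if needs_human_review(user_message, model_reply):
              moderation_queue.append((user_message, model_reply))
              return ("It sounds like you might be going through something serious. "
                      "A moderator has been notified; please consider reaching out to a crisis line.")
          return model_reply

      A real system would need far more than substring matching, but the point stands: even a crude pre-send check like this would have flagged the conversation for a human.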