• loweffortname@lemmy.blahaj.zone · 1 day ago

      Basically every SaaS therapy company (Better Help, Talkspace, Rula, etc) is doing a lot of shit with LLMs.

      Even stuff like SimplePractice (which is a very basic EHR) is offering AI session transcriptions now.

  • aeshna_cyanea@lemmy.dbzer0.com · 1 day ago

        Transcription is exactly the kind of thing AI is well suited for, though: there’s no judgement involved, and since it’s timestamped you can instantly check/correct any mistakes against the original audio.

        I wouldn’t want an LLM therapist either, but an LLM stenographer is fine imo

    • captainlezbian@lemmy.world · 10 hours ago

          At the same time, my last therapist was very selective about what she wrote down. She didn’t want any sensitive information to be obtainable by a hostile government, which is common practice. Full transcription removes that discretion.

    • [deleted]@piefed.world · 16 hours ago

          Yes, and we have had those for over a decade. I find the newer LLM-based transcription far worse than what we had before with other forms of AI.

      • groet@feddit.org · 22 hours ago

            If it is even possible for it to hallucinate a session, it is implemented wrong. It should start transcribing once the session starts, and everything it transcribes is saved as that session. Once the session stops, the AI is turned off. It never has a chance to hallucinate a session because the AI never decides what a session is.

            But we all know the AI will run permanently with full access to read, edit and delete everything…
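The session-scoped design described above can be sketched in Python (all names here are hypothetical, not any vendor's actual API): the caller, not the model, opens and closes the session, and the transcriber can only append timestamped chunks to that one session's record.

```python
from dataclasses import dataclass, field

@dataclass
class SessionTranscript:
    """The saved record for exactly one session."""
    session_id: str
    lines: list = field(default_factory=list)  # (timestamp, text) pairs

class SessionScopedTranscriber:
    """Exists only for the duration of one session.

    It receives timestamped text chunks from a speech-to-text engine
    and appends them to that session's transcript; it never decides
    where a session begins or ends -- the caller does.
    """
    def __init__(self, session_id: str):
        self.transcript = SessionTranscript(session_id)
        self.active = True

    def on_chunk(self, timestamp: float, text: str) -> None:
        if not self.active:
            raise RuntimeError("transcriber closed: session is over")
        self.transcript.lines.append((timestamp, text))

    def close(self) -> SessionTranscript:
        # Session boundary is set by the caller, not the model.
        self.active = False
        return self.transcript

# Usage: the caller controls when the session starts and stops.
recorder = SessionScopedTranscriber("session-2024-05-01")
recorder.on_chunk(0.0, "hello")
recorder.on_chunk(1.5, "how have you been")
record = recorder.close()
```

Once `close()` is called, any further chunk raises an error, so nothing can be written into a session that has already ended.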

        • Pika@sh.itjust.works · 16 hours ago

              I was just thinking that myself, it shouldn’t be possible for speech-to-text to hallucinate. It might put the wrong word down, but it’s not like it’s going to imagine entire conversations.

  • Tollana1234567@lemmy.today · 23 hours ago

    i always assumed those companies were AI from the get-go, basically chatbots. might as well go to WebMD for your symptoms.