• groet@feddit.org
    22 hours ago

    If it is even possible for it to hallucinate a session, it is implemented wrong. It should start transcribing once the session starts, and everything it transcribes is saved as that session. Once the session stops, the AI is turned off. It never has a chance to hallucinate a session, because the AI never decides what a session is.

    But we all know the AI will run permanently with full access to read, edit and delete everything…
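    A minimal sketch of the design described above, assuming the session lifecycle is driven entirely by ordinary application code: the model only ever sees audio chunks inside an active session, so there is nothing for it to invent about session boundaries. `transcribe_chunk` is a hypothetical stand-in for whatever speech-to-text call is actually used.

    ```python
    class SessionRecorder:
        """Session boundaries are decided by the app, never by the model."""

        def __init__(self, transcribe_chunk):
            # transcribe_chunk: placeholder for the real speech-to-text call
            self.transcribe_chunk = transcribe_chunk
            self.active = False
            self.lines = []

        def start_session(self):
            # Triggered by the application (e.g. a button press), not the AI.
            self.active = True
            self.lines = []

        def feed(self, audio_chunk):
            if not self.active:
                return  # model is effectively "off" outside a session
            self.lines.append(self.transcribe_chunk(audio_chunk))

        def stop_session(self):
            # The app decides when the session ends; the saved transcript
            # is exactly what was transcribed between start and stop.
            self.active = False
            return list(self.lines)


    # Usage with a dummy transcriber standing in for the real model:
    recorder = SessionRecorder(transcribe_chunk=lambda chunk: f"text({chunk})")
    recorder.feed("ignored")          # no session yet: nothing is recorded
    recorder.start_session()
    recorder.feed("a")
    recorder.feed("b")
    transcript = recorder.stop_session()
    recorder.feed("too late")         # session over: also ignored
    print(transcript)                  # ["text(a)", "text(b)"]
    ```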

    • Pika@sh.itjust.works
      16 hours ago

      I was just thinking that myself: it shouldn’t be possible for a speech-to-text system to hallucinate. It might put the wrong word down, but it’s not like it’s going to imagine entire conversations.