If an LLM can’t be trusted with a fast food order, I can’t imagine what it is reliable enough for. I really was expecting this to be the easy use case for these things.

It sounds like most orders still worked, so I guess we’ll see if other chains come to the same conclusion.

  • Communist@lemmy.frozeninferno.xyz · 2 days ago

    This isn’t something you can input arbitrary text into; the set of accepted inputs is fixed, so that joke doesn’t apply. You can’t do an SQL injection here.

    • hark@lemmy.world · 9 hours ago

      I don’t know how you can think voice input is less versatile than text input, especially when a lot of voice input systems transform voice to text before processing. At least with text you get well-defined characters with a lot less variability.
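
      A minimal sketch of why that variability matters (the menu, the inputs, and the use of difflib are all my own assumptions, standing in for whatever matching a real system does): even with a fixed menu, the text coming out of speech-to-text is open-ended and has to be normalized somehow.

      ```python
      import difflib

      # Hypothetical fixed menu, for illustration only.
      MENU = ["large coke", "medium fries", "cheeseburger"]

      def match_order(transcript: str) -> str | None:
          """Map a free-form transcription to the closest menu item, or None."""
          hits = difflib.get_close_matches(transcript.lower(), MENU, n=1, cutoff=0.6)
          return hits[0] if hits else None

      # The same spoken order can arrive as many different strings:
      for heard in ["a large coke", "large cokes", "lark coke", "a lizard"]:
          print(repr(heard), "->", match_order(heard))
      ```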

        • hark@lemmy.world · 2 minutes ago

          Special characters are just one case to cover. If the user says they want “an elephant-sized drink”, what does that mean to your system? At least that is relevant to size. Now imagine complete nonsense input like the joke you responded to (“-1 beers” or “a lizard”). SQL injection isn’t the only risk in handling inputs; the person who ordered 18,000 waters didn’t do a SQL injection attack.
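
          A minimal sketch of the kind of non-SQL input handling that still has to exist somewhere (the item names, the quantity ceiling, and the validation structure are assumptions for illustration, not the chain’s actual system):

          ```python
          # Assumed per-item ceiling for a single drive-thru order.
          MAX_QTY = 20

          def validate_line(item: str, qty: int, menu: set[str]) -> list[str]:
              """Return the problems with one parsed order line, if any."""
              problems = []
              if item not in menu:
                  problems.append(f"unknown item: {item!r}")        # "a lizard"
              if qty < 1:
                  problems.append(f"non-positive quantity: {qty}")  # "-1 beers"
              elif qty > MAX_QTY:
                  problems.append(f"implausible quantity: {qty}")   # 18,000 waters
              return problems

          print(validate_line("water", 18000, {"water", "fries"}))
          # -> ['implausible quantity: 18000']
          ```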

    • betterdeadthanreddit@lemmy.world · 2 days ago

      Close one: a joke was related to, but not a perfect match for, the present situation. Something terrible could have happened, like… Uh…

      Let me get back to you on that.