If an LLM can’t be trusted with a fast food order, I can’t imagine what it is reliable enough for. I was really expecting this to be the easy use case for these things.

It sounds like most orders still worked, so I guess we’ll see if other chains come to the same conclusion.

  • leftzero@lemmy.dbzer0.com · 2 days ago

    A QA engineer walks into a bar and orders a beer.

    She orders 2 beers.

    She orders 0 beers.

    She orders -1 beers.

    She orders a lizard.

    She orders a NULLPTR.

    She tries to leave without paying.

    Satisfied, she declares the bar ready for business. The first customer comes in and orders a beer. They finish their drink, and then ask where the bathroom is.

    The bar explodes.
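
    In test code, the joke goes roughly like this (a minimal sketch; order() and its one-item menu are made up for illustration):

        # Stand-in for whatever the bar (or the drive-thru) actually runs.
        def order(item, qty):
            if item is None:
                raise TypeError("item must be a string")  # the NULLPTR round
            return item == "beer" and qty > 0

        def test_bar():
            assert order("beer", 1)        # happy path
            assert order("beer", 2)
            assert not order("beer", 0)    # boundary value
            assert not order("beer", -1)   # invalid quantity
            assert not order("lizard", 1)  # not on the menu
            try:
                order(None, 1)
            except TypeError:
                pass                       # rejected, as expected

        # No test covers "where's the bathroom?"; that's the path that explodes.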

    • Communist@lemmy.frozeninferno.xyz · 2 days ago

      This isn’t something you can input arbitrary text into; the input is fixed, so the joke doesn’t apply. You can’t do an SQL injection here.

      • hark@lemmy.world · 9 hours ago

        I don’t know how you can think voice input is less versatile than text input, especially when a lot of voice input systems transform voice to text before processing. At least with text you get well-defined characters with a lot less variability.
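
        Roughly what that pipeline looks like (a hypothetical sketch; transcribe() stands in for a real speech-to-text model, and the menu is made up):

            MENU = {"burger", "fries", "shake"}

            def transcribe(audio: bytes) -> str:
                return ""  # stand-in: a real model returns free-form text here

            def handle_order(audio: bytes) -> set[str]:
                text = transcribe(audio)  # could be anything the mic picked up
                # The menu is fixed; the input is not. Matching happens after the fact.
                return {w for w in text.lower().split() if w in MENU}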

      • betterdeadthanreddit@lemmy.world · 2 days ago

        Close one: a joke was related to, but not a perfect match for, the present situation. Something terrible could have happened, like… uh…

        Let me get back to you on that.