- cross-posted to:
- technology@lemmy.world
If an LLM can’t be trusted with a fast food order, I can’t imagine what it is reliable enough for. I really was expecting this to be the easy use case for these things.
It sounds like most orders still worked, so I guess we’ll see if other chains come to the same conclusion.
There’s a finite number of ways to mess with this. They just need a button that sends the engineers a report of how the order got messed up, and eventually they’ll have a complete list. I really doubt it’d take long to iron out the vast majority of the ways anyone can think of.
This isn’t something you can input arbitrary text into; the menu is fixed. That joke doesn’t apply, and you can’t do an SQL injection here.
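For what it’s worth, the Bobby Tables joke only bites if the transcribed text gets spliced straight into a SQL string. A minimal sketch, assuming a hypothetical SQLite backend, of the parameterized insert that makes the classic attack moot:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (item TEXT, qty INTEGER)")

def add_item(item: str, qty: int) -> None:
    # Parameterized query: the driver treats `item` as data, never as SQL,
    # so even a Bobby-Tables string is stored as a harmless literal.
    conn.execute("INSERT INTO orders (item, qty) VALUES (?, ?)", (item, qty))

add_item("Robert'); DROP TABLE orders;--", 1)
print(conn.execute("SELECT * FROM orders").fetchall())
# The table still exists; the "attack" is just a weird item name.
```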
I don’t know how you can think voice input is less versatile than text input, especially when a lot of voice input systems transform voice to text before processing. At least with text you get well-defined characters with a lot less variability.
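A toy illustration (the transcripts are made up, not from the article): the same spoken order can come out of the transcriber several different ways, so whatever parses it downstream is still dealing with free-form text, not well-defined tokens.

```python
import re

# Hypothetical transcripts one spoken order might produce: speech-to-text
# output is still free text, just with less predictable wording.
transcripts = [
    "I'll take two waters",
    "ill take 2 waters please",
    "to waters",  # "two" misheard as "to"
]

NUMBER_WORDS = {"one": 1, "two": 2, "to": 2, "three": 3}

def parse_qty(text: str):
    """Pull the first plausible quantity out of a raw transcript."""
    for tok in re.findall(r"[a-z0-9']+", text.lower()):
        if tok.isdigit():
            return int(tok)
        if tok in NUMBER_WORDS:
            return NUMBER_WORDS[tok]
    return None

for t in transcripts:
    print(repr(t), "->", parse_qty(t))
```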
No special characters; this is speech-to-text, so the inputs are inherently sanitized.
Special characters are just one case to cover. If the user says they want “an elephant-sized drink,” what does that mean to your system? At least that is relevant to size. Now imagine complete nonsense input like the joke you responded to (“-1 beers” or “a lizard”). SQL injection isn’t the only risk when handling inputs. The person who ordered 18,000 waters didn’t do a SQL injection attack.
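As a sketch (the menu and cap are invented for illustration) of the semantic checks that still have to sit behind a perfectly “sanitized” transcript:

```python
# Made-up menu and limits; the point is that stripping special characters
# does nothing about semantically nonsensical orders.
MENU = {"burger", "fries", "water", "beer"}
MAX_QTY = 20

def validate_line_item(item: str, qty: int):
    """Return an error message, or None if the line item is acceptable."""
    if item not in MENU:
        return f"unknown item: {item!r}"  # "a lizard"
    if qty < 1:
        return f"quantity must be positive: {qty}"  # "-1 beers"
    if qty > MAX_QTY:
        return f"quantity needs human review: {qty}"  # 18,000 waters
    return None

for item, qty in [("water", 18000), ("beer", -1), ("lizard", 1)]:
    print(item, qty, "->", validate_line_item(item, qty))
```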
Close one: a joke that was related to, but not a perfect match for, the present situation. Something terrible could have happened, like… uh…
Let me get back to you on that.
Sounds like you’ve never programmed before.
stuck in a loop
stuck in a
stuck in
stuck
If what I was saying were false, the touchscreen kiosks would have the same issues.