- cross-posted to:
- technology@lemmy.world
If an LLM can’t be trusted with a fast food order, I can’t imagine what it is reliable enough for. I really was expecting this was the easy use case for the things.
It sounds like most orders still worked, so I guess we’ll see if other chains come to the same conclusion.
This isn’t something you can input arbitrary text into; the options are fixed, so that joke doesn’t apply. You can’t do an SQL injection here.
I don’t know how you can think voice input is less versatile than text input, especially when a lot of voice input systems transform voice to text before processing. At least with text you get well-defined characters with a lot less variability.
No special characters: this is speech-to-text, so the inputs are inherently sanitized.
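For what it’s worth, “sanitized” in the SQL-injection sense isn’t about which characters show up; it’s about whether user text is spliced into the query or passed as a parameter. Speech-to-text can still emit apostrophes and punctuation. A minimal sketch (the table and the transcript are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (item TEXT)")

# Hypothetical speech-to-text transcript; STT output can still
# contain quotes and punctuation.
transcript = "Gilligan's Island burger"

# Unsafe: splicing the transcript into the SQL string breaks on the
# apostrophe (and is injectable in principle).
try:
    conn.execute(f"INSERT INTO orders VALUES ('{transcript}')")
except sqlite3.OperationalError as e:
    print("concatenation failed:", e)

# Safe: a parameterized query treats the transcript as data, not SQL.
conn.execute("INSERT INTO orders VALUES (?)", (transcript,))
print(conn.execute("SELECT item FROM orders").fetchall())
```

The point being that whether the text arrived by keyboard or microphone doesn’t matter; the placeholder does the sanitizing.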
Close one: a joke that was related to, but not a perfect match for, the present situation. Something terrible could have happened, like… uh…
Let me get back to you on that.