anyone using LLMs also has to check that incorrect information hasn’t been injected.
It seems reasonable, but it’s pretty easy to miss crucial mistakes when one sentence in 300 is wrong and there are 25 cases of technically correct but misleading information.
Your worry is only reasonable if it were commonplace to write 300-sentence Wikipedia articles from scratch lol
That’s like 5x as long as the average article. Anyone submitting that much at once will raise eyebrows.