- cross-posted to:
- technology@lemmy.zip
At least 80 million (3.3%) of Wikipedia’s facts are inconsistent; LLMs may help find them
A paper titled “Detecting Corpus-Level Knowledge Inconsistencies in Wikipedia with Large Language Models”,[[1]](https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2025-12-01/Recent_research#cite_note-1) presented earlier this month at the EMNLP conference, examines …


My knee-jerk reaction is no, because fuck AI, but LLMs are literally built to parse vast amounts of data quickly. The analysis and corrections need to be done manually, but finding these inconsistencies is exactly the kind of thing they were originally made to do.
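For the simplest case, the kind of corpus-level check being described doesn't even need a model: two articles stating different values for the same fact can be flagged mechanically, with the LLM reserved for the fuzzier cases. A minimal sketch (the articles and values below are invented, and a real pipeline would hand ambiguous pairs to an LLM rather than a regex):

```python
import re

def extract_years(text: str) -> set[int]:
    """Pull four-digit years (1800-2099) out of a snippet."""
    return {int(y) for y in re.findall(r"\b(1[89]\d{2}|20\d{2})\b", text)}

def inconsistent(snippet_a: str, snippet_b: str) -> bool:
    """Flag two snippets about the same fact as inconsistent
    if both state years but the year sets don't overlap."""
    a, b = extract_years(snippet_a), extract_years(snippet_b)
    return bool(a) and bool(b) and a.isdisjoint(b)

# Invented example: two articles disagree about a founding year.
article_1 = "The observatory was founded in 1873."
article_2 = "The observatory, founded in 1874, was completed in 1881."
print(inconsistent(article_1, article_2))  # True: 1873 vs {1874, 1881}
```

The hard part at Wikipedia scale is pairing up snippets that actually refer to the same fact, which is where an LLM earns its keep; the final edit would still be a human's call.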