We have all seen AI-based search tools on the web like Copilot, Perplexity, DuckAssist, etc., which scour the web for information, present it in summarized form, and cite sources in support of the summary.

But how do they know which sources are legitimate and which are simply BS? Do they exercise judgement while crawling, or do they have some kind of filter list around the “trustworthiness” of various web sources?

  • lucullus@discuss.tchncs.de · 4 days ago

    The hallucination rates with current models are quite high, especially for the reasoning ones, with rates around 70%. I wouldn’t call that accurate. I think most of the time we are just not interested enough to even check a random search result for accuracy. We often just accept the answer that is given without any further thought.

    • ikt@aussie.zone · 4 days ago

      Are you sure your settings are correct? What are you asking that gets a 70% hallucination rate?

      • lucullus@discuss.tchncs.de · 4 days ago

        I should have mentioned where I got this from. I’m not an AI researcher myself, so AINAAIR. I’m referencing this YouTube video from TheMorpheus (news, information, and tutorials about various IT topics, including AI research; the video is in German). See, for example, the diagram at 3:00.