Both Ubuntu and Fedora have made it official: support is coming soon for running local generative AI instances.

An epic and still-growing thread in the Fedora forums discusses one of the proposed goals for the next release: the Fedora AI Developer Desktop Objective. The proposal is causing some discontent, and at least one Fedora contributor, SUSE’s Fernando Mancera, has resigned over it.

  • Mokey Fraggle@therock.fraggle-rock.org
    2 days ago

    FOMO is a thing. Who’s gonna want to run the old version packaged in the distro anyway? Those things go stale pretty quickly, especially at the rate we’re seeing updates in local inference.

    • Luffy@lemmy.ml
      2 days ago

      Afaik it’s only gonna be used for stuff like screen reading, and if you’ve ever tried an open-source speech synthesis model, you’d know even an old lightweight LLM is better than that.

      I’d also argue that if you actually care about local LLMs, you can just set up Ollama and use that.

      • Mokey Fraggle@therock.fraggle-rock.org
        2 days ago

        I hope actual model weights are not packaged as part of the OS. The inference engine, sure, but again, it’s gonna get stale really fast.

        Friends don’t let friends run Ollama. There are much better options.