- cross-posted to:
- technology@lemmy.zip
Both Ubuntu and Fedora have made it official: support is coming soon for running local generative AI instances.
An epic, still-growing thread in the Fedora forums lays out one of the goals for the next version: the Fedora AI Developer Desktop Objective. It has caused some discontent, and at least one Fedora contributor, SUSE’s Fernando Mancera, has resigned.

AFAIK it’s only going to be used for things like screen reading, and if you’ve ever tried an open-source speech-synthesis model, you’d know that even an old, lightweight LLM beats it.
I’d also argue that if you actually care about local LLMs, you can just set up Ollama and use that.
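For anyone curious what “set up Ollama” amounts to: after installing it and pulling a model (e.g. `ollama pull llama3`), it serves a local HTTP API on port 11434. A minimal sketch, assuming the default endpoint and a pulled `llama3` model (both are assumptions on my part, adjust to taste):

```python
import json
import urllib.request

# Assumption: Ollama's default local endpoint.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON response instead of a
    newline-delimited stream of partial tokens.
    """
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")

def generate(model: str, prompt: str) -> str:
    """POST the prompt to a locally running `ollama serve` instance."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_generate_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires `ollama serve` running and the model pulled):
# print(generate("llama3", "Why run models locally?"))
```

No accounts, no cloud, and the weights live in your home directory where you can delete them, which is rather the point.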
I hope actual model weights are not packaged as part of the OS. The inference engine, sure, but the weights will go stale really fast.
Friends don’t let friends run Ollama. There are much better options.
It’s apparently a separate spin anyway, so your standard Fedora won’t have any of that shipped with it.
And if you’re using Ubuntu, I really can’t help you.
That was a good burn. I know I can do better with my distro choice. :)
Chin up, you could also do worse! Microslop are way ahead of Canonical or RedHat, and way less scrupulous about it…
Well yes. I miss my Solaris 9. It was the best OS I’ve had. But now illumos and its forks are … well. Too bad. Is there any decent distro that still follows System V the way it was?