

You can run your own LLM chatbot with https://ollama.com/
They have some really small ones that only require like 1GB of VRAM, but you’ll generally get better results if you pick the biggest model that fits on your GPU.
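If you want to try it, the basic workflow is just a couple of commands. A quick sketch (the model tag and its size are assumptions from memory, check https://ollama.com/library for current names):

```shell
# Pull a small model first -- llama3.2:1b is around 1.3 GB, so it fits
# in modest VRAM (tag and size are assumptions; check the Ollama library)
ollama pull llama3.2:1b

# One-off prompt straight from the command line
ollama run llama3.2:1b "Explain what VRAM is in one sentence."

# List installed models and their sizes to see what fits on your GPU
ollama list
```

Running `ollama run` with no prompt drops you into an interactive chat session instead.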
I was able to quiet mine with a bash script until eventually a software update changed the fan control to keep it quiet for me.
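For anyone curious, it was something along these lines, using ipmitool to override the BMC's fan curve. The raw byte sequences here are the commonly shared Dell iDRAC ones and are vendor-specific assumptions; other vendors use different commands, so check your BMC's docs before trying this:

```shell
#!/usr/bin/env bash
# Sketch of a fan-quieting script for a Dell server with iDRAC.
# The raw bytes below are vendor-specific assumptions -- verify them
# against your own BMC documentation before running.

# Take fan control away from the BMC's automatic curve
ipmitool raw 0x30 0x30 0x01 0x00

# Pin all fans (0xff = all) to roughly 20% duty cycle (0x14 = 20)
ipmitool raw 0x30 0x30 0x02 0xff 0x14
```

The catch is exactly what I ran into: a firmware update can change or re-enable the automatic behavior, so a script like this may need revisiting after updates.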
There are advantages to getting server-grade hardware. It's designed to run 24/7, often supports more hard drives, RAM sticks, processors, etc., and is often designed to make it very quick to replace things when they break.
You can find used servers on sites like eBay for reasonable prices. They typically come from businesses selling their old hardware after an upgrade.
However, for simple home use cases, an old regular desktop PC will be just fine. Run it until it breaks!
Slack
??? Slack works just fine on Linux
Honestly I don’t really want a smart context-aware Siri, I just want something I can give simple, straightforward voice commands to, and get predictable, reliable results.
Wait hasn’t DDG been the default in Safari for a few years now?