

I’m running a 4B model on one of my machines, an old Surface Book 1.
It’s a brutal machine: heat issues, and the GPU doesn’t work in Linux. But pick a minimal enough model and it’s good enough to give me LLM access in my Nextcloud if for some reason I wanted it.
The biggest constraint really seems to be memory: most cheaper GPUs don’t have enough VRAM to run a big model, and CPUs are dreadfully slow on larger models even if you can put enough RAM in one of them.
Always go with more RAM. I can say that from experience.
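For a rough sense of why memory is the bottleneck, here’s a back-of-the-envelope sketch (my own, not from any particular tool) of how much memory just the weights of a model need at common quantization levels; KV cache and activations add more on top:

```python
# Rough estimate of memory needed to hold dense model weights only.
# Real usage is higher (KV cache, activations, runtime overhead).

def model_memory_gb(params_billions: float, bytes_per_weight: float) -> float:
    """Approximate GB needed just to hold the weights."""
    return params_billions * 1e9 * bytes_per_weight / (1024 ** 3)

# A 4B model at common precisions:
for label, bpw in [("fp16", 2.0), ("q8", 1.0), ("q4", 0.5)]:
    print(f"4B @ {label}: ~{model_memory_gb(4, bpw):.1f} GB")
# 4B @ fp16: ~7.5 GB
# 4B @ q8: ~3.7 GB
# 4B @ q4: ~1.9 GB
```

This is why a 4B model at 4-bit squeezes into a modest machine, while anything much bigger quickly outgrows the VRAM on cheaper GPUs.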
I’m partial to fanless, but keep in mind my empire of dirt is almost entirely fanless so I’m just partial to it.