Hello,

I’ve been looking into new laptops, trying to get back into tech work after about ten years out due to a couple of spinal injuries, and I’m hoping for some advice.

TLDR: Laptop for coworking space, flexible AMD hardware preference, NPU for LLM work, Linux SysAdmin practice, metal case/chassis for heat dissipation, as budget friendly as possible.

I’d need a laptop due to limited space and to stay mobile for whatever job comes along; the work would be mostly remote, with occasional trips to a local coworking space.

I’m hesitant about LLM usage, but the technology is here now and I should at least be familiar with it.

I’m considering experimenting with mostly smaller-scale LLMs, plus whatever else I can feasibly get working locally:

  • Building my own models, as far as I’m able
  • Creating/curating training data sets
  • Training models on those as best I can
  • Experimenting with suitably sized models, either from my own training or pre-trained
  • Testing ‘agents’/methods/whatever else these quality-improvement approaches are being called
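For a sense of what “small scale” can actually mean, the tiniest trainable language model is something like a character-level bigram counter. This is just a toy sketch of my own in plain Python, not tied to any particular hardware or framework:

```python
from collections import Counter, defaultdict

def train_bigram(text):
    # "Training" a bigram model is just counting which character
    # follows which in the training text.
    counts = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def most_likely_next(model, ch):
    # Predict the character most often seen after `ch`.
    return model[ch].most_common(1)[0][0]

model = train_bigram("banana bandana")
print(most_likely_next(model, "a"))  # 'n' follows 'a' most often here
```

Anything an NPU would actually matter for sits many orders of magnitude above this, which is part of what I’m trying to calibrate.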

So, I’m looking at machines with NPUs for that, though I hadn’t heard of these until I started looking into machines just recently. Any advice, guidance, rants, lectures, preaches related to that are all welcome and appreciated. Truly.

I trend towards AMD hardware; it’s just a preference for which corporation to send my money and support to. Higher-performing, lower-priced, or more “ethical/customer-considerate” hardware may well exist. I’m admittedly working from preferences formed in younger years on minimal information, so please correct me as you see fit.

Additionally, I’ve had better results with Linux on AMD when I’ve needed it, whether for keeping older machines useful, Linux sysadmin practice, or Windows avoidance. Though LLM tooling on Linux might still be in its early stages.

I’m on the fence about, and generally lean against, HP given how they’ve been with their products. Any statements or experiences with HP’s computers (advising avoidance, saying they’re not as problematic as their printers, or anything else you feel is relevant or helpful) would be much appreciated.

Yet the lowest-priced laptop I’ve found that also has some local LLM options seems to be this HP Omnibook 3 14″ (AMD Ryzen AI 7 350, 16 GB).

HP Omnibook, HP Website

I also found this Lenovo Ideapad 5 with about the same hardware, but at a higher price even before tax.

Lenovo Ideapad, Costco Website

I’ve mostly found these CPUs paired with 16 GB of RAM. I’d prefer 32 GB for extra headroom with LLM training/usage practice; outside of that alone, I’d likely do fine with 16.
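For what it’s worth, my rough rule of thumb for sizing RAM against model size (my own back-of-the-envelope math, not vendor guidance): memory for the weights alone is roughly parameter count times bytes per parameter, before counting the KV cache or the OS itself.

```python
def weight_memory_gb(params_billions, bits_per_param):
    # Memory for model weights alone; KV cache, activations,
    # and the rest of the system need headroom on top of this.
    # params_billions * 1e9 params * (bits/8) bytes, expressed in GB.
    return params_billions * bits_per_param / 8

# An 8B-parameter model quantized to 4 bits needs ~4 GB just for weights,
# while the same model at 16-bit needs ~16 GB -- more than a 16 GB laptop has free.
print(weight_memory_gb(8, 4), weight_memory_gb(8, 16))
```

That arithmetic is why 16 GB feels tight and 32 GB feels like the practical floor for the practice I have in mind.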

I’d considered metal chassis for heat dissipation if I’m working the processors hard with LLM practice, likely adding a laptop stand and/or cooling pad (if not built into the stand).

Sorry for the novel, but hopefully that helps describe what I’m hoping for.

I very much appreciate you taking the time to read my post. Thank you very much for any guidance or statements you may feel able to make.

Have a great weekend.

  • artifex@piefed.social · 3 days ago

    “Local LLM” and “budget friendly” are mutually exclusive at the moment. Just running prompt processing and inference on a very small model can be slow. Training, even for a tiny classifier model, would be impossible. I would strongly recommend getting a workhorse like a slightly used Lenovo T series and renting GPU time for a few bucks an hour from someone like vast.ai before deciding whether you really want to get into it locally (which will cost $$$).

    • vimmiewimmie@slrpnk.net (OP) · 3 days ago

      That’s fair. I suppose I took the marketing statements about NPUs closer to face value than is warranted.

      I was hoping for something entry-level to work with/recreate/train smaller models on, to avoid paying for extra services. But outside of simply running the apps companies are pushing onto their devices, plus some community ones, local use maybe doesn’t offer much without massive hardware.

      Wdyt?

      • artifex@piefed.social · 3 days ago

        I haven’t seen any consumer NPUs that help with training. They’re mainly used for accelerating image effects in Photoshop or blurring your background in Zoom, and most aren’t even much good for inference offload. Inference, and especially training, takes a good GPU with a large amount of VRAM (expensive) or something like a Ryzen Strix Halo with a ton of system RAM (also expensive). With model quantization you might run modestly sized models, but you would be training tiny, tiny models at best: think thousands of parameters, not the billions or trillions used in the LLMs you know and love.
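        To illustrate what quantization actually does (a toy sketch of the idea, not any real library’s implementation): each weight gets mapped to a small integer plus a shared scale factor, trading a little precision for a lot of memory.

```python
def quantize_int8(weights):
    # Symmetric per-tensor quantization: scale floats into [-127, 127]
    # so each weight fits in one byte instead of four (float32).
    scale = max(abs(w) for w in weights) / 127
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    # Recover approximate float values from the stored integers.
    return [x * scale for x in q]

weights = [0.12, -0.5, 0.31, -0.07]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
# restored values differ from the originals by at most half a quantization step
```

        That rounding error is why heavily quantized models lose some quality, and why quantization helps you *run* a model on modest RAM but doesn’t make *training* one feasible.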