• nymnympseudonym@piefed.social
    2 days ago

    Ollama cloud

    In these times I can’t fault anyone for not having enough blood to sell to buy a GPU, but… IMO the utility of an open-weight model is that you can run it locally and privately.

    • unpossum@sh.itjust.works
      2 days ago

      I looked into running GLM 5.1 locally, but the cooling system alone would probably cost more than a car, so I shelved the idea 😅