• Endmaker@ani.social
      4 days ago

      Someone with the expertise should correct me if I am wrong; it’s been 4-5 years since I learnt about NPUs during my internship so I am very rusty:

      You don’t even need a GPU if all you want to do is to run - i.e. perform inference with - a neural network (abbreviating it to NN). Just a CPU would do if the NN is sufficiently lightweight. The GPU is only needed to speed up the training of NNs.
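      To make that concrete, here is a minimal sketch (my own toy example, not from any particular framework) of NN inference on nothing but a CPU: a one-hidden-layer network is just a couple of matrix multiplies, which plain numpy handles fine for a small model.

```python
import numpy as np

# Toy network with made-up random weights, purely for illustration:
# input dim 4 -> hidden dim 8 (ReLU) -> 3 output scores.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 8))
b1 = np.zeros(8)
W2 = rng.standard_normal((8, 3))
b2 = np.zeros(3)

def forward(x):
    # Inference is just matrix multiplies plus a nonlinearity -
    # nothing here requires a GPU.
    h = np.maximum(x @ W1 + b1, 0.0)  # hidden layer with ReLU
    return h @ W2 + b2                # raw output scores (logits)

x = rng.standard_normal(4)
scores = forward(x)
print(scores.shape)  # (3,)
```

      Training is where the GPU earns its keep: you would run forward passes like this millions of times plus backpropagation, and that is what parallel hardware accelerates.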

      The thing is, the CPU is a general-purpose processor, so it won’t be able to run the NN optimally / as efficiently as possible. Imagine you want to do something that requires the NN and, as a result, you can’t do anything else on your phone / laptop (it won’t be a problem for desktops with GPUs, though).

      Where an NPU really shines is when there are performance constraints on the model: it has to be fast (specifically, run in real time), lightweight, and memory-efficient. Use cases include mobile computing and IoT.

      In fact, there’s news about live translation on Apple AirPods. I think this may be the perfect scenario for using NPUs - ideally housed within the earphones directly, but if not, within the phone.

      Disclaimer: I am only familiar with NPUs in the context of “old-school” convolutional neural networks (boy, tech moves so quickly). I am not familiar with NPUs for transformers - and LLMs by extension - but I won’t be surprised if NPUs have been adapted to work with them.

      • CixoUwU@lemmy.cixoelectronic.pl
        17 hours ago

        Yes, that’s right, NN can be run on either the CPU or GPU, but a GPU isn’t required. My point is that an NPU isn’t very useful for the average laptop user right now, while a GPU (which also accelerates neural network execution) is more versatile. In my opinion, it’s better to increase the GPU’s power and run NN on it when necessary than to add an NPU that’s useless 95% of the time.

      • rumba@lemmy.zip
        4 days ago

        I’m not exactly an expert either, but I believe the NPUs we’re seeing in the wild here are more like efficiency cores for AI.

        Using the GPU would be faster but consume much more energy. NPUs are basically math coprocessors that are good at matrix calculations.
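        To illustrate the "good at matrix calculations" point: a big part of how NPUs get their efficiency is doing those matrix multiplies in low precision (int8 instead of float32). Here is a rough sketch of that idea in numpy - the quantization scheme (per-tensor, symmetric) is my own simplification, not any specific NPU's implementation.

```python
import numpy as np

# A dense layer is just W @ x. NPUs typically run this in int8
# with int32 accumulation, trading a little accuracy for much
# lower energy per multiply-accumulate.
rng = np.random.default_rng(1)
W = rng.standard_normal((256, 256)).astype(np.float32)
x = rng.standard_normal(256).astype(np.float32)

y_fp32 = W @ x  # full-precision reference result

# Simplified per-tensor symmetric quantization to int8.
w_scale = np.abs(W).max() / 127.0
x_scale = np.abs(x).max() / 127.0
W_q = np.round(W / w_scale).astype(np.int8)
x_q = np.round(x / x_scale).astype(np.int8)

# Integer matmul with int32 accumulation, then rescale to float.
y_int8 = (W_q.astype(np.int32) @ x_q.astype(np.int32)) * (w_scale * x_scale)

err = np.max(np.abs(y_fp32 - y_int8))  # small quantization error
```

        The accuracy loss is usually tolerable for inference, which is why int8 (and lower) is the bread and butter of NPU workloads.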