• tal@lemmy.today · 1 hour ago

    I don’t know if “GPUs” is the right term, but the only area where we’re seeing large gains in computational capacity now is in parallel compute, so I’d imagine that if Intel intends to be doing high performance computation stuff moving forward, they probably want to be doing parallel compute too.

  • Wioum@lemmy.world · 6 hours ago

    I had to check the date on the article. They’ve been making GPUs for 3 years now, but I guess this announcement, although weird, is a sign that Arc is here to stay, which is good news.

  • Goodeye8@piefed.social · 6 hours ago

    Well that article was a waste of space. Intel has already stepped into the GPU market with their ARC cards, so at the very least the article should contain a clarification on what the CEO meant.

    And I see people shitting on the Arc cards. The cards are not bad. Last time I checked, the B580 had performance comparable to the 4060 for half the cost. The hardware is good; it’s simply aimed at budget builds. And of course the drivers have been an issue, but drivers can be improved, and last I checked Intel is actually getting better with theirs. It’s not perfect, but we can’t expect perfect. Even the gold standard of drivers, Nvidia, has been slipping in the last year.

    All this is to say, I don’t understand the hate. Do we not want competition in the GPU space? Are we supposed to have Nvidia and AMD forever, until AMD gives up because it becomes too expensive to compete with Nvidia? I’d prefer it were someone other than Intel, but as long as the price comes down, I don’t care who brings it down.

    And to be clear, if Intel’s new strategy is keeping prices as they are, I’m all for “fuck Intel”.

      • gravitas_deficiency@sh.itjust.works · 5 hours ago

        This is a big part of it, imo. They kissed the ring.

        The other part of it is that, per the article, this is an “AI” pivot. This is not them making more consumer-oriented GPUs. Which is frustrating, because they absolutely could be a viable competitor in low-mid tier if they wanted to. But “AI” is (for now) much more lucrative. We’ll see how long that lasts.

    • ZeDoTelhado@lemmy.world · 5 hours ago

      The CPU overhead is quite well known, and it significantly hurts the Arc cards’ position in the budget class.

    • MentalEdge@sopuli.xyz · 3 hours ago

      Wut?

      Alchemist and Battlemage cards were fine.

      Edit: oh no. It’s a pivot to AI compute 🤦‍♂️

  • Devolution@lemmy.world · 3 hours ago

    You mean non-shit non-Arcs? They already tried, and already failed, with Battlemage.

  • Paragone@piefed.social · 5 hours ago

    From what I’ve read about the “quality” of their drivers, NVidia isn’t under any threat whatsoever.

    It takes years before bugs get fixed, etc.

    (Linux, not MS-Windows, but Linux is where the big compute gets done, so that’s what’s relevant.)

    See https://www.phoronix.com/review/llama-cpp-vulkan-eoy2025/5 for some relevant graphs: Intel isn’t a real competitor, and while they may work to change that, the lag behind NVidia is SERIOUSLY bad.

    _ /\ _

  • Erik@discuss.online · 7 hours ago

    It isn’t much of a challenge if they suck. Just planning to make them doesn’t mean shit.

    Also, why do none of these articles have a summary posted for them? These are some seriously low effort posts.

  • tidderuuf@lemmy.world · 7 hours ago

    At least they are admitting the Intel Arc was more of a joke than a graphics card.

    • RejZoR@lemmy.ml · 6 hours ago

      Intel Arc is no joke. Technologically it’s very capable; they just never really scaled it to compete at any higher level…