I haven’t thought about it in a while, but the premise of the article rings true. Desktops are, overall, disposable. GPU generations are only really significant alongside new CPU generations. CPUs are the same: real performance gains need a new chipset and motherboard. At that point you are replacing the whole system.

Is there a platform that challenges that trend?

  • lightnsfw@reddthat.com

    I have been Ship-of-Theseus-ing my desktop and server for 15 years. This article is fucking stupid.

  • A_norny_mousse@feddit.org

    CPUs are the same: real performance gains need a new chipset and motherboard. At that point you are replacing the whole system.

    I find the quoted statement untrue. You still keep all your peripherals, including the screen, plus the PSU and the case.

    You can replace components as and when it becomes necessary.

    You can add hard drives instead of replacing a smaller one with a larger one.

    Desktop mobos are usually more upgradeable with RAM than laptops.

    There are probably more arguments against the gist of this article.

  • Rimu@piefed.social

    Laptop CPUs are crippled garbage compared to desktop CPUs of the same generation. So there’s that.

  • fyrilsol@kbin.melroy.org

    Everything is disposable. I don’t think you or the author of that article has a clue. It’s a matter of buying things that’ll last longer than others and making financially wise purchasing decisions based on the needs of the moment.

    Like, I’m not spending $5 on a toothbrush when it needs replacing every 30 days; I buy the cheapest toothbrush I can, since they’re all made the same. I will spend more money on a computer component if I feel it will have a positive impact on my entire system. Replacing my entire system would set me back a lot and would waste the components already inside that are still good. Plus, if I decide to sell the old system, I’m not going to get good value back.

    The only thing I’ve yet to replace is the case. Why? Because it’s still serviceable to me.

    I just don’t get this stupid logic where you have to replace the entire system. For what? Just to keep up with the in-crowd of current technology trends? No thanks, I’ll build my PC based on what I want out of it.

  • brucethemoose@lemmy.world

    That’s a huge generalization, and it depends what you use your system for. Some people might be on old Threadripper workstations that work fine, for instance, and slap in a second GPU. Or maybe someone needs more cores for work; they can just swap their CPU out. Maybe your 4K gaming system can make do with an older CPU.

    I upgraded RAM and storage just before the RAMpocalypse, and that’s not possible on many laptops. And I can stuff a whole bunch of SSDs into the body and use them all at once.

    I’d also argue that ATX desktops are more protected from anti-consumer behavior, like soldered price-gouged SSDs, planned obsolescence, or a long list of things you see Apple do.

    …That being said, there’s a lot of trends going against people, especially for gaming:

    • There’s “initial build FOMO” where buyers max out their platform at the start, even if that’s financially unwise and they miss out on sales/deals.

    • We just went from DDR4 to DDR5, on top of some questionable segmentation from AMD/Intel. So yeah, sockets aren’t the longest lived.

    • Time gaps between generations are growing as silicon gets more expensive to design.

    • …Buyers are collectively stupid and bandwagon. See: the crazy low end Nvidia GPU sales when they have every reason to buy AMD/Intel/used Nvidia instead. So they are rewarding bad behavior from companies.

    • Individual parts are more repairable. If my 3090 or mobo dies, for instance, I can send it to a repairperson and have a good chance of saving it.

    You can still keep your PSU, case, CPU cooler, storage and such. It’s a drop in the bucket cost-wise, but it’s not nothing.

    IMO things would be a lot better if GPUs were socketable, with LPCAMM on a motherboard.

  • SuiXi3D@fedia.io

    Meanwhile I’ve been using an AM4 board and DDR4 for… well, it’s been a while now.

  • m-p{3}@lemmy.ca

    Personally I still prefer the desktop because I can choose exactly where I prefer performance, and where I can make some tradeoffs. Also, parts are easier to replace when they fail, making them more sustainable. You don’t have that choice with a laptop since it’s all prebuilt.

    • socphoenix@lemmy.world

      Desktops also offer better heat dissipation and easier peripheral replacement, extending the life of the unit. Frankly, it can be difficult for most folks to replace a laptop display or even a battery nowadays.

  • Cyv_@lemmy.blahaj.zone

    I disagree that you need to upgrade your CPU and GPU in lockstep. I almost always stagger those upgrades. Sure, I might have some degree of bottleneck, but it’s pretty minimal tbh.

    I also think it’s a bit funny that the article mentions upgrading every generation. I’ve never done that, and I don’t know a single person who does. Maybe I’m just too poor to hang with the rich fucks, but upgrading every generation was always a stupid idea.

    Repairability is a big deal too. If my GPU dies I can just replace that one card rather than buy an entire new laptop, since laptops tend to have everything soldered down.

    • stealth_cookies@lemmy.ca

      I typically build a whole new PC and then do a mid-life GPU upgrade after a couple of generations; e.g. I just upgraded the GPU I bought in late 2020. For most users there just isn’t a good reason to upgrade the CPU that frequently.

      I can see why some people would upgrade their GPU every generation. I was surprised at how much even two-generation-old cards go for on eBay; if you buy a new card and sell your old one every couple of years, the “net cost per year” of usage is pretty constant.
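
      That “net cost per year” idea is easy to sketch (all prices below are made up for illustration; real resale values vary):

```python
def net_cost_per_year(buy_price, resale_price, years_kept):
    """Average yearly cost of a card you buy new and later sell used.

    All figures passed in here are hypothetical examples, not market data.
    """
    return (buy_price - resale_price) / years_kept

# Upgrade every 2 years, selling the old card at ~50% of its price:
print(net_cost_per_year(700, 350, 2))           # 175.0 per year

# Keep one card for 6 years and scrap it (no resale):
print(round(net_cost_per_year(700, 0, 6), 1))   # 116.7 per year
```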

  • artyom@piefed.social

    Let’s say that you’ve just significantly upgraded your GPU. If you were getting the most out of your CPU with your previous GPU, there’s a good chance that your new GPU will be held back by that older component. So now, you need a new CPU or some percentage of your new GPU’s performance is wasted.

    There’s always an imbalance. It doesn’t mean it’s “wasted”. CPU and GPU do different things.
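
    As a toy illustration of the “wasted performance” claim quoted above (frame rates invented for the example): the system runs at whichever component is slower, and the gap is the unused share of the GPU.

```python
def effective_fps(cpu_fps, gpu_fps):
    # The delivered frame rate is capped by the slower component.
    return min(cpu_fps, gpu_fps)

def unused_gpu_fraction(cpu_fps, gpu_fps):
    # Share of the GPU's potential left idle when the CPU is the cap.
    return max(0.0, 1 - cpu_fps / gpu_fps)

# Hypothetical: old CPU sustains 90 fps, new GPU could render 144 fps.
print(effective_fps(90, 144))                    # 90
print(round(unused_gpu_fraction(90, 144), 3))    # 0.375
```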

    Except, getting a new CPU that’s worth the upgrade usually means getting a new motherboard

    Also not true. AM4 came out in 2016 and they are still making modern processors for it.

    Generational performance increases are too small

    Wrong again.

    Ask yourself this: how much of your current desktop computer has components from your PC from five years ago?

    Most of it.

    They’re also ignoring the concept of repairability. If my CPU dies? Buy another CPU. Maybe upgrade at the same time. CPU dies in your PS5? Fuck you, better throw the whole thing away and buy a new one.

  • saltesc@lemmy.world

    Let’s say that you’ve just significantly upgraded your GPU. If you were getting the most out of your CPU with your previous GPU, there’s a good chance that your new GPU will be held back by that older component. So now, you need a new CPU or some percentage of your new GPU’s performance is wasted. Except, getting a new CPU that’s worth the upgrade usually means getting a new motherboard, which might also require new RAM, and so on.

    This guy’s friends should keep him away from computers and just give him an iPad to play with.

  • RIotingPacifist@lemmy.world

    This has been true for a long time: CPU sockets don’t last long enough to make upgrades worth it unless you are constantly upgrading. Whenever I’ve built a “futureproof” desktop with a mid-to-high-end GPU, by the time I hit performance problems I needed a new motherboard to fit the new CPU anyway. The only really upgradeable components are storage and RAM, but you can do that in your laptop too.

    The main advantage of Desktops is still that you get much more performance for your money and can decide where it goes if you build it yourself.

  • JakoJakoJako13@piefed.social

    It rings true, but it’s not. It’s highly dependent on your upgrade plan. You can get a new CPU without a new mobo if you aren’t changing platforms, like jumping from AM4 to AM5. The idea that only the cheap parts last the longest isn’t true either. I’ve been on the same GPU for nearly 7 years. It’s getting long in the tooth, but when I do decide to upgrade, I’m not forced to upgrade anything else. The GPU is the bottleneck, but the bottleneck isn’t noticeable unless I’m playing some new AAA game that needs everything under the sun to run it.

    That last paragraph about parts that are 5 to 10 years old making up close to 0% of your build just isn’t true for me either. The newest parts in my PC are three years old at this point: the case, the CPU and mobo, RAM, and an NVMe drive. The case was purely for vanity reasons. I’ve got an old GPU, an old PSU, 1 NVMe drive, 2 SSDs, and 2 HDDs that are 10 years old. All those parts are older than 5 years. The argument that most people are using PCs that are less than 5 years old sounds like some phone FOMO shit. I don’t buy it.

  • Seefra 1@lemmy.zip

    Depends on what you use the computer for. For gaming, maybe you’re right, idk. I personally use the computer for 3D modeling, which mostly relies on the GPU.

    I’ve recently built a computer with a latest-gen GPU and got a nice 12th-gen i7 as a platform for it; the GPU is from 2025, but the CPU is like 4 years old.

    The thing is, I could have gotten a much older CPU had I not found the 12th gen for the same price. If I could just upgrade the GPU and RAM on my old laptop, I wouldn’t have bought a whole new computer.

    Besides, buying a laptop with 16 GB of VRAM would have been much more expensive than a desktop.

  • masterspace@lemmy.ca

    The main benefit of a desktop is the price/performance ratio, which is higher because you’re trading space and portability for easier thermal management and bigger components.

  • UnspecificGravity@piefed.social

    I think the real thing you’ve learned is that PC upgrades are largely unnecessary. Vendors are selling new hardware that’s only better on paper, so they create compatibility traps to make you upgrade a bunch of other shit to get that incremental improvement.

    I think a lot of people fail to analyze whether the thing they’re going to get is worth the cost. If you have a perfectly good DDR4 system, is it really worth a thousand dollars to upgrade every component in order to get what, an extra 5 FPS? People are spending a lot of money on upgrades and expecting the kind of improvements you got ten years ago, and it’s just not going to happen, because hardware hasn’t been improving at that rate for a long time.
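
    The “thousand dollars for 5 FPS” math can be made explicit (both numbers are illustrative, not benchmarks):

```python
def cost_per_extra_fps(upgrade_cost, old_fps, new_fps):
    """Dollars paid per frame per second gained (illustrative only)."""
    gain = new_fps - old_fps
    if gain <= 0:
        raise ValueError("upgrade gained nothing measurable")
    return upgrade_cost / gain

# A $1000 platform overhaul that lifts 100 fps to 105 fps:
print(cost_per_extra_fps(1000, 100, 105))   # 200.0 dollars per extra fps
```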

    Even still, there are a lot of components that are not cheap that you can reuse regardless of CPU socket and memory compatibility changes. I’ve used the same PSU and case and drives and network card for a decade. That’s all shit I would have had to pay for over and over again with a different type of system.