EDIT: This happened back in 2025. Leaving it up, as I’m sure I’m not the only one who didn’t know; I saw it on Hacker News and didn’t realize it was a year old. My bad.

In an odd approach to improving customer tech support, HP allegedly implemented mandatory 15-minute wait times for people calling the vendor for help with their computers and printers in certain geographies.

Callers from the United Kingdom, France, Germany, Ireland, and Italy were met with the forced holding periods, The Register reported on Thursday. The publication cited internal communications it saw from February 18 that reportedly said the wait times aimed to “influence customers to increase their adoption of digital self-solve, as a faster way to address their support question. This involves inserting a message of high call volumes, to expect a delay in connecting to an agent and offering digital self-solve solutions as an alternative.”

  • Taleya@aussie.zone · 18 hours ago

    Yes, because the #1 thing everyone wants to hear over and over is a voice saying “go to double u double u double u dot…”

    This is the fucking 21st century; if they could fix their shit on the internet they would have already done it.

    Especially pisses me off when the only reason you’re calling them is because their website / portal / app explicitly went “you can’t do that here, call us.”

    • Voroxpete@sh.itjust.works · 6 hours ago

      Even better than that is Siteground’s absolutely abysmal support system.

      In order to access support they force you to type your question into their chatbot first. This is not optional. It’s the only way to get support.

      Fools that we are, we actually tried the solution the chatbot offered. This resulted in a good amount of time wasted looking for settings that didn’t exist, because the solution was total bullshit. They claim they’ve customized this thing to give helpful outputs, but it’s clearly just ChatGPT with a custom prompt.

      When we finally spoke to an agent I pointed this out and they responded with the stock “You should always double check the output of AI” line.

      DOUBLE CHECK WITH WHOM, YOU MOUTH BREATHING MORON? THIS IS YOUR OFFICIAL FUCKING SUPPORT CHANNEL. YOU LITERALLY DIDN’T GIVE ME ACCESS TO ANY OTHER KIND OF SUPPORT UNTIL I USED THE CHATBOT FIRST, SO WHERE IN THE ACTUAL FUCK AM I SUPPOSED TO DOUBLE CHECK THE OUTPUT?

      Is it with a customer service agent? Is that what you’re saying?! That I should ignore whatever it tells me, wait until I can talk to a representative and then do whatever they say instead? Because if that’s the case, WHY IN THE FUCK ARE YOU FORCING EVERYONE TO TALK TO THE BOT FIRST??!!!

      Absolutely fucking asinine idiocy. Anyway, don’t use Siteground, they fucking suck.

      • FG_3479@lemmy.world · 6 hours ago

        You shouldn’t talk to customer support agents like that. They’re not responsible for the actions of the shitty company, and you are giving them a bad day for no reason.

        • Voroxpete@sh.itjust.works · 5 hours ago

          Jesus fucking Christ.

          OK little Timmy, today we’re going to learn that sometimes people express things in their “inner voice”, but they don’t share those things in their “outer voice”.

          And sometimes, later, they might share those “inner voice” thoughts with other people in an environment where it’s safe to do so. But it doesn’t mean they have to express those inner voice thoughts to the person that they were thinking them about.

          Does that help you understand better? Would you maybe like a juice box and a lie down to think about it?

    • KairuByte@lemmy.dbzer0.com · 18 hours ago

      Yeah, ran into that a month or so back with some service or other. My account was locked out, I told the prompt I was looking for an account unlock, and I got to listen to “you can do most things by logging into your account at” for 45 minutes.

      • Echo Dot@feddit.uk · 12 hours ago

        Online banking does this all of the time. It’s surprising how little you can actually do on their app; virtually every common banking task requires you to call them.

        I had to call them to set up an automatic payment on my credit card from my savings account, because I couldn’t work out how to do it on the app. I confirmed with the support agent that you can’t do it on the app.

      • hateisreality@lemmy.world · 17 hours ago

        I have found that if I yell or sound angry at the LLM prompt, I’ll get an agent faster than if I am a proper adult.

        • KairuByte@lemmy.dbzer0.com · 16 hours ago

          If you swear or use certain words/phrases/tones there are absolutely some that put you into a higher priority queue. There are also some that immediately kick you into that queue the moment you swear, bypassing any info gathering and such.

          I’ve had to use it for things like Verizon, which absolutely expects the LLM to be able to verify your account, but their account verification was broken. Swear at it a little and suddenly the account verification is no longer needed.

    • sqgl@sh.itjust.works · 16 hours ago

      Could be worse: You could be made to sit through aich tee tee pee colon slash slash double u double u double u dot…