A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.

“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.

  • finitebanjo@lemmy.world · 14 hours ago

    You should blame them, because the AI is not a solution to their problems; it will only create new ones and worsen those they already have.

    • stephen01king@lemmy.zip · 14 hours ago

      Ok, so for men who feel so scared to open up about their mental health that they resort to something as unreliable as ChatGPT, your solution is to victim-blame them?

      • finitebanjo@lemmy.world · 14 hours ago

        If some men thought the solution to loneliness was fucking a toaster, I’d think less of them too, yeah. At least find a tool that suits the purpose; talking to the AI is just self-harm.

      • stephen01king@lemmy.zip · 13 hours ago

        Again, so you think the solution for someone who self-harms is to blame them for performing the act. You’re a genius, you know that?