A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.

“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.

  • minorkeys@lemmy.world · 15 hours ago

    So they need somewhere they feel safe to do so. Says something pretty fucked up about our culture that men don’t feel safe to open up anywhere. And no, it’s not their own fault.

    • lightnsfw@reddthat.com · 3 hours ago

      I wouldn’t use AI but I certainly don’t have anyone to open up to really. Either they’d use what I tell them against me or just aren’t in a position to offer any real support. With my luck I’d end up institutionalized for saying some unhinged shit anyway.

    • peteyestee@feddit.org · 3 hours ago

      Every day it seems to become clearer that American culture as a whole is a problem. But that’s not something people are allowed to talk about.

    • 0x0@lemmy.zip · 12 hours ago

      And no, it’s not their own fault.

      Of course it is, men are cool targets to hate, get with the program.