• flamingo_pinyata@sopuli.xyz
    14 hours ago

    Idk what the OP meant, but I’m asking how?

    I wish I could use it to make loneliness easier, but talking to one as if it were a partner is impossible. It’s so obviously a dumb bot. It will say anything I want it to say. I can’t feel an emotional connection to that.

    • fizzle@quokk.au
      4 hours ago

      It’s not my thing, but I think it’s to do with the context window. Like, if you continue the same “chat,” its responses might feel more like it knows you.

      I remember there was an update or something and people lost that long-standing context window, and they were legit complaining like “you murdered my girlfriend.” The fix was to get the model to read the last/closed chat into the new chat, and people were like “wow, she came back.”
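      The “fix” described above amounts to prepending the old transcript to the new session. A minimal sketch of that idea (function and message format are illustrative, not any vendor’s actual API):

      ```python
      # Hypothetical sketch: carry a closed chat's transcript into a new
      # session so the model "remembers" the long-standing context.
      # The message format and helper name here are assumptions.

      def build_new_session(prior_transcript: list[dict], first_message: str) -> list[dict]:
          """Start a new chat with the old chat's messages prepended."""
          messages = list(prior_transcript)  # old context comes first
          messages.append({"role": "user", "content": first_message})
          return messages

      old_chat = [
          {"role": "user", "content": "My name is Sam."},
          {"role": "assistant", "content": "Nice to meet you, Sam!"},
      ]
      session = build_new_session(old_chat, "Do you remember my name?")
      ```

      Feeding `session` to the model instead of just the new message is what makes it seem to pick up where it left off.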

      Obviously, to a healthy, well-adjusted adult who engages in appropriate social interactions, this type of thing can’t emulate a real relationship. But I can see how it could be a soothing balm for loneliness in the future.

    • MagicShel@lemmy.zip
      edit-2
      12 hours ago

      If you are superficial and care more about validation and agreement than partnership and empathy, I imagine AI is great at that. I’ll bet there are a lot of folks who want nothing more from a relationship than what AI can give. And they are apparently happy, and probably making other people happy by sparing them from finding out the hard way what they’re about.