Holy shit. And I thought I was alone.

Where the hell are we heading as humanity?

    • Little8Lost@lemmy.world · 4 months ago

      When trying it: better to try it with a local LLM. You have more options, and it will probably give somewhat decent output if you choose a small, topic-fitting model.
      + you don't give money to the AI corps + privacy + probably more
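For anyone taking the local route, here is a minimal sketch of querying a locally running Ollama server through its HTTP API. The endpoint `/api/generate` and default port 11434 are Ollama's documented defaults; the model name `llama3.2` is just an assumption for illustration, so substitute whatever model you actually have pulled:

```python
import json
import urllib.request

def build_payload(prompt, model="llama3.2"):
    """Build the JSON body for Ollama's /api/generate endpoint.
    (model name is an assumed example, not a requirement)"""
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # one JSON object back instead of a token stream
    })

def ask_local_llm(prompt, model="llama3.2", host="http://localhost:11434"):
    """Send a prompt to a locally running Ollama server, return the reply text."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=build_payload(prompt, model).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires Ollama running locally with the chosen model pulled.
    print(ask_local_llm("Suggest a small local model for creative writing."))
```

Everything stays on your own machine, which is where the privacy point above comes from.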

    • brucethemoose@lemmy.world · 4 months ago

      I mean, you might as well do it right, then. Use free, crowd-hosted roleplaying finetunes, not a predatory OpenAI frontend.

      https://aihorde.net/

      https://lite.koboldai.net/

      Reply/PM me and I'll spin up a 32B or 49B instance myself and prioritize it for you, anytime. I'd suggest this over Ollama, as the bigger models are much, much smarter.