When trying it, you're better off with a local LLM: you get more options, and a small, topic-fitting model will probably give somewhat decent output.
Plus you don't give money to the AI corps, plus privacy, plus probably more.
Reply/PM me, and I’ll spin up a 32B or 49B instance myself and prioritize it for you, anytime. I would suggest this over ollama as the bigger models are much, much smarter.
I mean, you might as well do it right then: use free, crowd-hosted roleplaying finetunes, not a predatory OpenAI frontend.
https://aihorde.net/
https://lite.koboldai.net/
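For what it's worth, the Horde is just a REST API, so you can script against it directly instead of going through a frontend. Here's a minimal sketch in Python (stdlib only), assuming the AI Horde v2 endpoint names and anonymous key (`/generate/text/async`, apikey `0000000000`) still match what's documented on aihorde.net — double-check there before relying on it:

```python
import json
import urllib.request

HORDE_API = "https://aihorde.net/api/v2"
ANON_KEY = "0000000000"  # anonymous key; a registered key gets better queue priority

def build_text_request(prompt, models=None):
    """Build an async text-generation request for the AI Horde.

    Returns an unsent urllib Request; submitting it should return a job id
    you then poll for results.
    """
    payload = {
        "prompt": prompt,
        # Conservative defaults; workers reject requests above their limits.
        "params": {"max_length": 120, "max_context_length": 1024},
    }
    if models:
        # Optional: restrict to specific hosted models instead of any worker.
        payload["models"] = models
    return urllib.request.Request(
        f"{HORDE_API}/generate/text/async",
        data=json.dumps(payload).encode(),
        headers={"apikey": ANON_KEY, "Content-Type": "application/json"},
        method="POST",
    )

# Usage (performs a real network call, so not executed here):
#   with urllib.request.urlopen(build_text_request("Once upon a time,")) as resp:
#       job = json.load(resp)  # expect an id; poll /generate/text/status/<id>
```

Polling until `done` is true and reading the generations out of the status response is the only other piece you need; any Kobold-compatible client does exactly this under the hood.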