“It’s safe to say that the people who volunteered to ‘shape’ the initiative want it dead and buried. Of the 52 responses at the time of writing, all rejected the idea and asked Mozilla to stop shoving AI features into Firefox.”

  • sudo@programming.dev
    1 day ago

    “If I can pick my own API (including local) and sampling parameters”

    You can do this now:

    • Self-host Ollama.
    • Self-host Open WebUI and point it to Ollama.
    • Enable local models in about:config.
    • Select “local” instead of ChatGPT or whatever.

    The hardest part is hosting Open WebUI because, AFAIK, it only ships as a Docker image.

    Edit: s/openai/open-webui
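    The self-hosting steps above can be sketched with Docker, assuming the projects' published images and default ports (the model name is just an example; swap in whatever you want to run):

    ```shell
    # Run Ollama; it serves its API (including an OpenAI-compatible one) on port 11434
    docker run -d --name ollama -p 11434:11434 -v ollama:/root/.ollama ollama/ollama

    # Pull a model inside the container (example model; pick your own)
    docker exec ollama ollama pull llama3

    # Run Open WebUI, pointed at the Ollama container on the host
    docker run -d --name open-webui -p 3000:8080 \
      -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
      --add-host=host.docker.internal:host-gateway \
      ghcr.io/open-webui/open-webui:main
    ```

    After that, Open WebUI is reachable at http://localhost:3000 and Ollama at http://localhost:11434.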

    • brucethemoose@lemmy.world
      9 hours ago

      Open WebUI isn’t very ‘open’ and was kinda problematic last I saw. Same with Ollama; you should absolutely avoid both.

      …And actually, why is Open WebUI even needed here? For an embeddings model or something? All the browser should need is an OpenAI-compatible endpoint.
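
      A minimal sketch of what “OpenAI-compatible endpoint” means in practice: the request shape is the same everywhere, and swapping a hosted provider for a local server is just a different base URL. The host, port, and model name below are illustrative Ollama defaults, not anything Firefox-specific:

      ```python
      import json
      import urllib.request

      def chat_completion_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
          """Build a request for any OpenAI-compatible /v1/chat/completions endpoint."""
          payload = {
              "model": model,
              "messages": [{"role": "user", "content": prompt}],
          }
          return urllib.request.Request(
              f"{base_url}/v1/chat/completions",
              data=json.dumps(payload).encode("utf-8"),
              headers={"Content-Type": "application/json"},
              method="POST",
          )

      # Pointing at a local server is just a different base URL:
      req = chat_completion_request("http://localhost:11434", "llama3", "Hello")
      print(req.full_url)  # http://localhost:11434/v1/chat/completions
      ```

      Sending the request (urllib.request.urlopen(req)) requires a server actually listening at that address.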