Just a stranger trying things.

  • 0 Posts
  • 2 Comments
Joined 2 years ago
Cake day: July 16th, 2023


  • Ollama is very useful but also rather barebones. I recommend installing Open WebUI to manage models and conversations. It is also useful if you want to tweak more advanced settings like system prompts, seed, temperature and others.

    You can install Open WebUI using Docker or just pip; pip is enough if you only care about serving yourself.
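
    For reference, both install routes might look roughly like this (a sketch based on Open WebUI's documented setup; the exact image tag, port mapping, and volume name are the commonly shown defaults and may differ for your setup):

    ```shell
    # Option 1: Docker — runs Open WebUI on http://localhost:3000,
    # persisting data in a named volume.
    docker run -d -p 3000:8080 \
      -v open-webui:/app/backend/data \
      --name open-webui \
      ghcr.io/open-webui/open-webui:main

    # Option 2: pip — lighter-weight, fine for serving just yourself.
    # (Open WebUI requires a recent Python 3; a virtualenv is recommended.)
    pip install open-webui
    open-webui serve
    ```

    Either way, it will look for a local Ollama instance (on port 11434 by default) and pick up whatever models you have pulled.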

    Edit: Open WebUI also renders Markdown, which makes formatting and reading much more appealing and useful.

    Edit2: you can also plug Ollama into continue.dev, an extension for VS Code that brings LLM capabilities into your IDE.
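
    To connect Continue to a local Ollama instance, you add it as a model provider in Continue's config. A minimal sketch (the model name `llama3` is just an example; use whatever model you have pulled with `ollama pull`):

    ```json
    {
      "models": [
        {
          "title": "Llama 3 (local)",
          "provider": "ollama",
          "model": "llama3"
        }
      ]
    }
    ```

    Continue reads this from its config file (`~/.continue/config.json` in older versions; newer versions use a YAML config with the same idea), and will then route chat and autocomplete requests to your local Ollama server.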