How to use Ollama in the sidebar? #535
Unanswered
NextStep-IM asked this question in Q&A
Replies: 2 comments 4 replies
-
Under GPT, you should simply be able to select it from the drop-down menu, like any other compatible model. If you want it to be the permanent default, the easiest way (and what I do) is to place it first in the services/gpt.js file and rename it to openai (you'll probably have to comment out the actual openai service, though; I do, since I personally don't use it anyway).
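In case it helps, here's a rough sketch of what that edit could look like. This is a hypothetical example only: I'm assuming services/gpt.js exports an array of service configs with `name`, `baseUrl`, and `model` fields, which may not match the actual file's structure. The Ollama base URL and port are its standard OpenAI-compatible endpoint.

```javascript
// services/gpt.js — hypothetical sketch; check the real file's shape first.
// The Ollama entry goes first and is renamed to 'openai' so the app picks it
// up as the default; the real OpenAI entry is commented out to avoid having
// two services with the same name.
const services = [
  {
    name: 'openai', // renamed from 'ollama' so it acts as the default
    baseUrl: 'http://localhost:11434/v1', // Ollama's OpenAI-compatible API
    model: 'llama3',
  },
  // Original OpenAI service, commented out:
  // { name: 'openai', baseUrl: 'https://api.openai.com/v1', model: 'gpt-4o' },
];

module.exports = services;
```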
4 replies
-
Sorry if I missed any instructions, and/or if this is a stupid question. I ran `ollama run llama3` in the terminal and the model ran successfully, but I don't know what to do with the sidebar.