How to run llama.cpp without the HTTP server? (Do I need JSON or OpenWebUI for the interface?) #13235
Unanswered
hilmimusyafa asked this question in Q&A
Replies: 0 comments
Hey, can anyone help me? I want to run llama.cpp in my container, but without the default HTTP web UI (I tried the llama-server command). I need to interact with llama-server through JSON or a custom API server, or through a frontend like OpenWebUI or Newelle. Can anyone help, please?
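For what it's worth, one approach is to keep llama-server running but skip its web UI entirely and talk to its OpenAI-compatible JSON endpoint (/v1/chat/completions) directly; this is also how frontends like OpenWebUI connect to it. A minimal sketch, assuming the server runs on the default localhost:8080 and that your build supports the --no-webui flag (the model name in the payload is a placeholder; llama-server serves whatever model it was started with):

```python
# Sketch: query llama-server's OpenAI-compatible JSON API instead of the web UI.
# Assumes the server was started with something like:
#   llama-server -m model.gguf --port 8080 --no-webui
import json
import urllib.request


def build_chat_request(prompt: str) -> bytes:
    """Build the JSON body for the /v1/chat/completions endpoint."""
    payload = {
        "model": "local",  # placeholder; llama-server uses the loaded model
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return json.dumps(payload).encode("utf-8")


def chat(prompt: str, base_url: str = "http://localhost:8080") -> str:
    """Send a chat request and return the assistant's reply text."""
    req = urllib.request.Request(
        base_url + "/v1/chat/completions",
        data=build_chat_request(prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["choices"][0]["message"]["content"]
```

If you don't want any HTTP at all, the llama-cli binary runs the model interactively in the terminal with no server involved.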