Replies: 2 comments 1 reply
-
Never mind, I got it to print 'usage' output after trying to set an option with a non-server flag. The first time I tried something similar it just raised an error and did not give a list of acceptable flags.
-
Actually, no, that does not work: MODEL is in the path.
-
How do I load Llama 2 based 70B models with llama_cpp.server? We need to declare n_gqa=8, but as far as I can tell llama_cpp.server takes no arguments.
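For context, here is a hedged sketch of how such settings are usually passed to the server. The `--n_gqa` flag and the `N_GQA` environment variable are assumptions: `n_gqa` was only a temporary parameter in some llama-cpp-python releases from the Llama 2 launch period, and later builds detect grouped-query attention automatically, so check `python3 -m llama_cpp.server --help` against your installed version. The model path shown is a placeholder.

```shell
# Sketch, not verified against any particular release: llama_cpp.server
# takes its settings as command-line flags.
python3 -m llama_cpp.server --model ./models/llama-2-70b.q4_0.bin --n_gqa 8

# The server builds its settings with pydantic, so the same options can
# usually be supplied as uppercase environment variables instead:
MODEL=./models/llama-2-70b.q4_0.bin N_GQA=8 python3 -m llama_cpp.server
```

If `--n_gqa` is rejected, the `--help` output should list whatever flags the installed version does accept.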