Commit 52c1ebc

Merge pull request #142 from cagostino/chris/lite_llm
replacing inference calls with litellm, adding in /roll, upgrading /whisper
2 parents: 061af57 + 1bee2dc

18 files changed: +1718 additions, -2361 deletions

README.md

Lines changed: 17 additions & 1 deletion

```diff
@@ -66,17 +66,33 @@ response = get_llm_response("What is the capital of France? Respond with a json
                             model='llama3.2',
                             provider='ollama',
                             format='json')
+print(response)
+# assistant's response is contained in the 'response' key for easier access
+assistant_response = response['response']
+print(assistant_response)
+# access messages too
+messages = response['messages']
+print(messages)
+
+
 # openai's gpt-4o-mini
+from npcsh.llm_funcs import get_llm_response
+
 response = get_llm_response("What is the capital of France? Respond with a json object containing 'capital' as the key and the capital as the value.",
                             model='gpt-4o-mini',
                             provider='openai',
                             format='json')
+print(response)
 # anthropic's claude 3.5 haiku latest
+from npcsh.llm_funcs import get_llm_response
+
 response = get_llm_response("What is the capital of France? Respond with a json object containing 'capital' as the key and the capital as the value.",
-                            model='claude-haiku-3-5-latest',
+                            model='claude-3-5-haiku-latest',
                             provider='anthropic',
                             format='json')
 
+
+
 # alternatively, if you have NPCSH_CHAT_MODEL / NPCSH_CHAT_PROVIDER set in your ~/.npcshrc, it will use those values
 response = get_llm_response("What is the capital of France? Respond with a json object containing 'capital' as the key and the capital as the value.",
                             format='json')
```
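Per the README changes above, `get_llm_response` returns a dict whose `'response'` key holds the assistant's reply and whose `'messages'` key holds the conversation history. A minimal sketch of consuming a `format='json'` reply, using a hypothetical hard-coded response dict in place of a real model call (assuming, as the README suggests, that the reply arrives as a JSON string):

```python
import json

# Hypothetical response dict shaped like the one get_llm_response returns
# per the README diff above: assistant text under 'response', the full
# conversation under 'messages'. A real call would produce this instead.
response = {
    "response": '{"capital": "Paris"}',
    "messages": [
        {"role": "user",
         "content": "What is the capital of France? Respond with a json object."},
        {"role": "assistant", "content": '{"capital": "Paris"}'},
    ],
}

# With format='json', the assistant's reply should be parseable JSON,
# so the requested value can be extracted directly.
capital = json.loads(response["response"])["capital"]
print(capital)  # Paris
```

If a provider returns an already-parsed dict rather than a string, a `json.loads` call would fail with `TypeError`, so production code may want to branch on the reply's type first.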
