docs/how_to/local_llms/ #27637
Replies: 2 comments
-
Nice guide! However, I wonder if there's a way to use a generic LLM locally, and not just the ones covered in the guide.
-
I found that to create an `OllamaLLM` you must pass a `base_url` parameter, e.g. `llm = OllamaLLM(model="llama3.1:8b", base_url="http://localhost:11434")`; otherwise a `ConnectError: [WinError 10049]` exception is thrown.
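A minimal sketch of the workaround described above, assuming the `langchain-ollama` package is installed and an Ollama server is running on its default port (11434). The model tag and prompt are illustrative; any locally pulled model works. Since this needs a live server, no output is shown.

```python
# Sketch: passing base_url explicitly to OllamaLLM, as suggested in the
# comment above, to avoid the ConnectError ([WinError 10049]) some Windows
# setups hit when the default address resolution goes wrong.
# Assumes: `pip install -U langchain-ollama` and a running Ollama server.
try:
    from langchain_ollama import OllamaLLM

    llm = OllamaLLM(
        model="llama3.1:8b",                # any model you have pulled locally
        base_url="http://localhost:11434",  # Ollama's default listen address
    )
    print(llm.invoke("Say hello in one word."))
except ImportError:
    print("Install the integration first: pip install -U langchain-ollama")
```

Whether `base_url` is strictly required may depend on your OS and network configuration; on setups where `localhost` resolves cleanly, the default often works without it.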
-
Original post (Use case): https://python.langchain.com/docs/how_to/local_llms/