Local setup with Ollama: getting LanceError(IO) when trying to chat with an indexed document #638
Comments
I mistyped the version; it is 0.9.11
@ykiran it seems an error occurred when you indexed this document. Please delete it and upload it again on the Files tab to see the error. You can try another document as well.
I have set up a new local install of Kotaemon using a local Ollama model. All tests work and I can chat with the local model. I have set Cohere for embeddings. When I upload a file (the Kotaemon README.md) and try to chat with it, I get the same error. Following the suggestion above, I uploaded the file via the Files tab and it was processed with no errors, but I still get the same error. Console output: `Session reasoning type None use mindmap False use citation (default) language (default)`
@petervines if possible, please try the Docker installation method as well. I suspect this might be a file permission issue. I will investigate the local install script in more detail.
It works with the Docker installation method.
Also seeing this issue on my local setup with local models. Normal chats work fine, but attempting to pull from documents causes the same Lance SQL parser error. Windows 11, Conda env.
@petervines @EthanRush @ykiran Can you confirm whether local Ollama works with a Linux-based setup like this Colab example?
I am getting the same error with Ollama on a Linux machine after using the local install script (not Docker). The files were uploaded to the File Collection. Chatting without any files selected in the File Collection works fine.
In general, it is hard to pinpoint the issue without reproducing it on my side. You can re-run the Colab setup to verify the package versions and the Ollama setup process, and compare them with your local machine. My tests so far with the Colab setup and the Linux installation script are still working as expected.
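When comparing a working Colab environment against a failing local install, a quick version dump of the likely-involved packages makes the diff concrete. A minimal sketch, using only the standard library; the package names checked below are assumptions about what matters here, not a list from the thread:

```python
# Hypothetical version check: run the same snippet in the working
# environment (e.g. Colab) and in the failing local install, then diff
# the output. Package names below are assumptions.
from importlib import metadata

def installed_version(pkg: str) -> str:
    """Return the installed version of `pkg`, or 'not installed'."""
    try:
        return metadata.version(pkg)
    except metadata.PackageNotFoundError:
        return "not installed"

for pkg in ("lancedb", "llama-index", "ollama", "kotaemon"):
    print(pkg, installed_version(pkg))
```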
Hi! I have the same issue with a simple setup, Ollama running locally. I managed to fix the error by adding a guard, but it doesn't seem to search in the files anymore. I really need to make this RAG work; I hope this can be solved soon. (PS: I'm using macOS)
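The commenter's actual guard was not included. As an illustration only (this is a hypothetical sketch, not the commenter's code or Kotaemon's implementation): Lance SQL parser errors of this kind are often triggered by an empty `in ()` filter clause, so a defensive check of this general shape would avoid emitting one:

```python
# Hypothetical guard: skip the LanceDB-style `where` filter entirely when
# no document ids are available, instead of emitting an invalid `in ()`
# clause. The function name and column name are assumptions.
def build_where_clause(doc_ids):
    """Return a SQL-style filter string, or None when there is nothing to filter on."""
    if not doc_ids:
        # An empty list would produce `doc_id in ()`, which a SQL parser rejects.
        return None
    quoted = ", ".join(f"'{d}'" for d in doc_ids)
    return f"doc_id in ({quoted})"
```

Note the trade-off the commenter observed: if the id list is empty because retrieval upstream failed, returning `None` silently disables the file filter, which would explain why it "doesn't seem to search in files anymore."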
I am trying to run Kotaemon locally using the following Docker command:
After running this, I can successfully log into Kotaemon, and the test connection with Ollama works for both LLM and Embedding. I also switched the File Collection to use Ollama. However, when trying to proceed, I encounter the following error:
Has anyone else experienced this issue, or does anyone know a possible solution? Thanks in advance! (PS: I'm using macOS)
I managed to get around this error by changing the flowsettings.py file.
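The exact flowsettings.py change was not included in the comment. For illustration only: workarounds for LanceDB errors in Kotaemon setups typically involve switching the vector store backend away from LanceDB in flowsettings.py. The fragment below is a hypothetical sketch following Kotaemon's settings-dict convention; the key names, class path, and data directory are assumptions and may differ between versions, so compare against your own flowsettings.py:

```python
# Hypothetical flowsettings.py fragment -- NOT the commenter's actual change.
# Points the vector store at Chroma instead of LanceDB.
from pathlib import Path

KH_USER_DATA_DIR = Path("./ktem_app_data/user_data")  # assumed location

KH_VECTORSTORE = {
    "__type__": "kotaemon.storages.ChromaVectorStore",  # assumed class path
    "path": str(KH_USER_DATA_DIR / "vectorstore"),
}
```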
Hi, I have the same problem with every file. Console output: `25-02-23 22:30:07 Session reasoning type complex use mindmap True use citation highlight language zh`
Description
I've set up the latest version (0.9.9) locally with Ollama as the backend.
Using deepseek-r1:14b for the LLM and nomic-embed-text for the embeddings.
After successfully indexing any document/URL and trying to chat, I get the following error:
This is the partial log from the console after typing in a prompt.
Reproduction steps
Screenshots
Logs
Browsers
Firefox
OS
Windows
Additional information
No response