Exllamav2 installed but alway got "AssertionError: Please install ExllamaV2 or LLama CPP Python as backend" #37
Hi, I recall having fixed this bug previously. Can you try updating to the latest version?
Hi,
Can you try the following: 2/ Try running the following script; you can also use a Jupyter notebook to run it line by line if that helps. This is the part of the script that raises the error. As you can see from the script, it just imports ExLlamaV2 and then checks whether the import succeeded.
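The check described above can be sketched as a standalone script. This is a hedged illustration, not gallama's actual code: `exllamav2` and `llama_cpp` are the real backend package names, but the `available_backends` helper is made up here for clarity.

```python
import importlib


def available_backends(module_names):
    """Return the subset of module_names that can actually be imported."""
    found = []
    for name in module_names:
        try:
            importlib.import_module(name)
            found.append(name)
        except ImportError:
            pass  # this backend is not installed; keep checking the rest
    return found


# gallama needs at least one of these two backends to be importable:
backends = available_backends(["exllamav2", "llama_cpp"])
if not backends:
    print("Neither backend imports - this reproduces the AssertionError")
else:
    print(f"Importable backends: {backends}")
```

Running this line by line in a notebook shows exactly which of the two imports fails and with what error, which is more informative than the bare `AssertionError`.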
Hi,
For some strange reason, a clean install from source doesn't work for version 0.2.3 for me either, and I am using a prebuilt wheel. It looks like your prebuilt wheel URL was for the outdated version v0.1.8 instead of v0.2.3. You can look through all the versions here; ideally match both the Python version and the CUDA version.

Another thing you can try: still build from source, but install v0.2.0 instead of v0.2.3. Inside the exllamav2 folder that you cloned from GitHub, you can run the following command to check out a specific version: git checkout tags/v0.2.0. We skip v0.2.1 and v0.2.2 because they have known bugs.

I will check the exllamav2 install again later and open an issue on the exllamav2 repo if it is confirmed to be an issue with their install script.
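Since the wheel URL above turned out to be stale, it can help to confirm which exllamav2 version actually ended up installed. A minimal sketch using only the standard library's `importlib.metadata` (the `installed_version` helper is illustrative, not part of gallama):

```python
from importlib import metadata


def installed_version(package_name):
    """Return the installed version string for a package, or None if absent."""
    try:
        return metadata.version(package_name)
    except metadata.PackageNotFoundError:
        return None


# Example: check whether the exllamav2 you installed is really v0.2.x
version = installed_version("exllamav2")
if version is None:
    print("exllamav2 is not installed in this environment")
else:
    print(f"exllamav2 version: {version}")
```

If this reports v0.1.8 even though you think you installed v0.2.3, the stale wheel is the culprit.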
Hi,
I think I will wait a while before trying again.
Hi Quang, I have clarified with turboderp himself and couldn't figure out the issue. I have the exact same issue as you, where I can only install from the wheel instead of building from source. I resolved it by creating a new Python env in conda and reinstalling everything from scratch. I believe another library I was testing messed up my environment. Hope that you manage to sort out the issue on your end. Thanks
Hi,
I installed ExllamaV2 via (install was successful)
However, I always get:
File "/home/user/micromamba/envs/genv/lib/python3.12/site-packages/gallama/backend/model.py", line 39, in
assert ExLlamaV2 or Llama, "Please install ExllamaV2 or LLama CPP Python as backend"
Please help, thank you.