Does it support Mixtral 8x7B? #15

Open
iMountTai opened this issue Jan 14, 2024 · 1 comment

Comments


iMountTai commented Jan 14, 2024

After I modified the code to support Mixtral 8x7B, there was a problem with the size of the LoRA weights for the gate layers. After loading, I found that lora_A had the same shape as base_layer, and a size_mismatch error occurred. Thanks!
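For reference, here is a minimal shape sanity check one could run to pin down that kind of mismatch. This is only a sketch, not the modified code from the report: it assumes a standard PEFT-style LoRA layout for a linear layer, and the layer sizes and rank are hypothetical placeholders.

```python
# Sketch: expected LoRA factor shapes for a linear layer of shape
# (out_features, in_features). A size_mismatch where lora_A matches the
# base layer usually means lora_A was saved/initialized with the full
# base weight shape instead of the rank-r shape (r, in_features).
import torch

in_features, out_features, r = 4096, 8, 16  # hypothetical Mixtral gate/router sizes

base_weight = torch.empty(out_features, in_features)  # base_layer.weight
lora_A = torch.empty(r, in_features)                  # expected low-rank factor A
lora_B = torch.empty(out_features, r)                 # expected low-rank factor B

# The LoRA update lora_B @ lora_A must compose back to the base weight's shape.
assert (lora_B @ lora_A).shape == base_weight.shape
```

If lora_A in the checkpoint instead has shape (out_features, in_features), the load will fail with exactly this kind of size mismatch.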

Owner

yxli2123 commented Mar 5, 2024

Hi @iMountTai, have you resolved this issue? Could you please provide the code you modified?
