First, thanks for your kindness in making the code alpaca dataset publicly available. 🙏
Building on your open-source code instruction data, I developed CodeUp, a multilingual code-generation Llama 2 model trained with parameter-efficient instruction tuning on a single RTX 3090. If anyone is interested, please visit https://github.com/juyongjiang/CodeUp. Thanks. 😄
[Figure: CodeUp — Training Pipeline]
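Parameter-efficient instruction tuning on a single RTX 3090 typically means LoRA-style adapters: the pretrained weights stay frozen and only small low-rank factors are trained. This is an assumption about CodeUp's setup (see the repo for the actual pipeline); the sketch below just illustrates why the trainable-parameter count drops so sharply for a single `d × k` weight matrix.

```python
# Illustrative sketch of the LoRA idea (assumption: CodeUp uses LoRA-style
# adapters; the function name and numbers here are hypothetical examples).

def lora_param_counts(d: int, k: int, r: int) -> tuple[int, int]:
    """Trainable parameters for one d x k weight: full fine-tuning vs. rank-r LoRA."""
    full = d * k        # full fine-tuning updates the entire weight matrix
    lora = r * (d + k)  # LoRA trains only the factors B (d x r) and A (r x k)
    return full, lora

# A Llama-2-sized attention projection (4096 x 4096) with a rank-8 adapter:
full, lora = lora_param_counts(4096, 4096, r=8)
print(full, lora)  # 16777216 65536 -> roughly 256x fewer trainable parameters
```

At training time the frozen weight and the adapter are combined as `W + (alpha / r) * B @ A`, so the memory needed for optimizer states scales with the adapter size, which is what makes a single consumer GPU sufficient.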