1 file changed: +19 -2 lines changed

@@ -36,8 +36,6 @@ git clone https://github.com/OpenAccess-AI-Collective/axolotl
 pip3 install -e .
 pip3 install -U git+https://github.com/huggingface/peft.git
 
-accelerate config
-
 # finetune lora
 accelerate launch scripts/finetune.py examples/openllama-3b/lora.yml
 
@@ -525,6 +523,21 @@
 accelerate launch scripts/finetune.py configs/your_config.yml
 ```
 
+#### Multi-GPU Config
+
+- llama FSDP
+```yaml
+fsdp:
+  - full_shard
+  - auto_wrap
+fsdp_config:
+  fsdp_offload_params: true
+  fsdp_state_dict_type: FULL_STATE_DICT
+  fsdp_transformer_layer_cls_to_wrap: LlamaDecoderLayer
+```
+
+- llama Deepspeed: prepend `ACCELERATE_USE_DEEPSPEED=true` to the finetune command
+
 ### Inference
 
 Pass the appropriate flag to the train command:
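The DeepSpeed line in the hunk above amounts to setting a single environment variable for one invocation. A minimal sketch, assuming the placeholder config path already used in the README:

```shell
# Enable accelerate's DeepSpeed integration for this one launch by
# prefixing the environment variable (configs/your_config.yml is the
# README's placeholder path, not a file shipped with the repo):
ACCELERATE_USE_DEEPSPEED=true accelerate launch scripts/finetune.py configs/your_config.yml
```

Assigning the variable inline scopes it to that single `accelerate launch` process, so other runs in the same shell are unaffected.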
@@ -575,6 +588,10 @@ Try set `fp16: true`
 
 Try to turn off xformers.
 
+> Message about accelerate config missing
+
+It's safe to ignore it.
+
 ## Need help? 🙋♂️
 
 Join our [Discord server](https://discord.gg/HhrNrHJPRb) where we can help you