Commit: remove notes.
sayakpaul committed Jan 5, 2025
1 parent 0e89cf5 commit 051f234
Showing 1 changed file with 0 additions and 7 deletions.
README.md: 7 changes (0 additions & 7 deletions)
@@ -141,13 +141,6 @@ Note that the memory consumption in the table is reported with most of the optio

If you would like to use a custom dataset, refer to the dataset preparation guide [here](./docs/dataset/README.md).

## Notes

* The example training commands are written with memory-optimized runs in mind:
  * they don't include any arguments that enable validation inference.
  * gradient checkpointing and gradient accumulation are enabled.
  * precomputation is enabled.

## Acknowledgements

* `finetrainers` builds on top of a body of great open-source libraries: `transformers`, `accelerate`, `peft`, `diffusers`, `bitsandbytes`, `torchao`, `deepspeed` -- to name a few.
