[chore] relax requirements a bit. #253

Closed · sayakpaul wants to merge 1 commit.
Conversation

sayakpaul (Collaborator)

Unless and until a user actually needs them, I think it is okay not to install torchao and bitsandbytes by default.
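
For illustration, the usual way to relax requirements like this is to drop the heavy packages from the default install and guard their imports at the point of use. The sketch below is hypothetical (the helper names are not from this repository) and uses bitsandbytes as the example optional package:

```python
# Minimal optional-dependency pattern (hypothetical helpers, not this repo's
# actual code): detect the package cheaply, import it only on demand, and
# fail with an actionable message otherwise.
import importlib.util


def is_bitsandbytes_available() -> bool:
    # find_spec checks installation without importing bitsandbytes,
    # which would otherwise initialize CUDA state eagerly.
    return importlib.util.find_spec("bitsandbytes") is not None


def get_8bit_adamw_cls():
    if not is_bitsandbytes_available():
        raise ImportError(
            "8-bit optimizers require bitsandbytes. Install it with "
            "`pip install bitsandbytes`."
        )
    import bitsandbytes as bnb  # deferred import

    return bnb.optim.AdamW8bit
```

The packages could then be exposed behind a pip extra (an assumed `[quant]` extra name, for example) rather than as hard requirements.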

sayakpaul requested a review from a-r-r-o-w on January 30, 2025 at 07:22.
a-r-r-o-w (Owner)

Will keep this open for a bit if you don't mind. In #245, I'm also exploring using torchao fp8 directly, since that would be true fp8 training instead of what we have at the moment, so it may not be too problematic to have it as a direct dependency. Our layerwise implementation works on most GPUs but is ultimately unstable. TorchAO fp8 would be true fp8 and more stable, though I think it is currently limited to Ada and Hopper.
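
For reference, here is a rough sketch of what true fp8 training via torchao might look like, under the assumption that `torchao.float8.convert_to_float8_training` is the entry point (verify against the installed torchao version; this is not the code from #245):

```python
# Hedged sketch of true fp8 training with torchao's float8 API. The exact
# surface may differ across torchao versions; hardware with fp8 matmul
# support (Ada/Hopper, SM 8.9+) is required.
import torch
import torch.nn as nn
from torchao.float8 import convert_to_float8_training


def module_filter_fn(module: nn.Module, fqn: str) -> bool:
    # Convert only Linear layers with fp8-friendly shapes
    # (dims divisible by 16 for the fp8 matmul kernels).
    return (
        isinstance(module, nn.Linear)
        and module.in_features % 16 == 0
        and module.out_features % 16 == 0
    )


model = nn.Sequential(nn.Linear(1024, 1024), nn.GELU(), nn.Linear(1024, 1024)).cuda()
convert_to_float8_training(model, module_filter_fn=module_filter_fn)

# The training loop is unchanged; matmuls in converted layers now run in
# fp8, rather than only storing weights in fp8 as layerwise upcasting does.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
x = torch.randn(8, 1024, device="cuda")
loss = model(x).pow(2).mean()
loss.backward()
optimizer.step()
```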

sayakpaul closed this on Feb 3, 2025.