forked from Dao-AILab/flash-attention
Issues: ROCm/flash-attention
[Issue]: FA2 test failed when building the CK/Triton backend on top of an official ROCm Docker image (Under Investigation)
#143, opened May 23, 2025 by DerienFe

[Issue]: ModuleNotFoundError: No module named 'rotary_emb' (Under Investigation)
#136, opened Apr 13, 2025 by arunhk3

[Issue]: Test failing with ROCm 6.3.1 on MI250X (Under Investigation)
#120, opened Jan 29, 2025 by al-rigazzi

Replace kernel implementation with performant CK tile-programming kernels (4 tasks)
#33, opened Jan 10, 2024 by carlushuang

Feature request: Sliding Window Attention (Feature Request, function)
#22, opened Nov 29, 2023 by tjtanaa