forked from Dao-AILab/flash-attention
Pull requests: ROCm/flash-attention
[Windows] Add hasattr checks for distributed to improve compatibility (see the sketch after this list)
#171 opened Jan 24, 2026 by 0xDELUXA
Update fake tensor wrapper for flash_attn and falash_attn_varlen_func…
#163 opened Oct 20, 2025 by sahirema (1 task)
[Do not merge] vllm layout varlen (Draft, labeled WIP: work in progress)
#106 opened Dec 3, 2024 by rocking5566
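
PR #171's title describes adding hasattr checks around torch.distributed for better Windows compatibility. The patch itself is not shown on this page; the snippet below is only a minimal sketch of that kind of defensive guard, assuming the goal is to avoid touching torch.distributed when the package is missing or not compiled in. The helper name _distributed_is_usable and the call site are hypothetical, not taken from the PR.

```python
import torch


def _distributed_is_usable() -> bool:
    """Hypothetical guard sketching the kind of hasattr check PR #171 describes."""
    dist = getattr(torch, "distributed", None)  # module may be absent on some builds
    if dist is None or not hasattr(dist, "is_available"):
        return False
    return bool(dist.is_available())


# Example call site: fall back to single-process behavior when the
# distributed package cannot be used (e.g. on certain Windows installs).
world_size = (
    torch.distributed.get_world_size()
    if _distributed_is_usable() and torch.distributed.is_initialized()
    else 1
)
```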