
Conversation


@dependabot dependabot bot commented on behalf of github Jun 30, 2025

Bumps flash-attn from 2.7.4.post1 to 2.8.0.post2.
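In a typical pip-based setup this bump corresponds to a one-line change in the project's requirements file (a sketch; the exact filename is an assumption, though the `direct:development` dependency type below suggests a development requirements file):

```diff
-flash-attn==2.7.4.post1
+flash-attn==2.8.0.post2
```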

Commits
  • de79b13 [CI] Build with NVCC_THREADS=2 to avoid OOM
  • 71f7ac2 [CI] Compile with ubuntu-22.04 instead of ubuntu-20.04
  • 6f8f040 Bump to v2.8.0
  • d738303 Update Cutlass to 4.0
  • d417a5b [CI] Compile with nvcc 12.9.0
  • d31da73 [Cute] Implement PackGQA for attn fwd Sm90
  • a737ade [Cute] Use TMA for O when not varlen
  • 9a79170 [Cute] Implement varlen_q and varlen_k for attn fwd Sm90
  • 8ede036 [Cute] Refactor Softmax and BlockInfo objects
  • 69133f8 [Cute] Use TMA to store O in attn fwd epilogue
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
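Any of the commands above can also be posted from a terminal with the GitHub CLI rather than through the web UI (a sketch assuming `gh` is installed and authenticated for this repository; the PR number is illustrative):

```shell
# Post "@dependabot rebase" as a comment on a Dependabot PR.
# Replace 1234 with the actual pull request number.
gh pr comment 1234 --body "@dependabot rebase"
```

Dependabot reacts to the comment the same way regardless of how it was posted.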

Bumps [flash-attn](https://github.com/Dao-AILab/flash-attention) from 2.7.4.post1 to 2.8.0.post2.
- [Release notes](https://github.com/Dao-AILab/flash-attention/releases)
- [Commits](https://github.com/Dao-AILab/flash-attention/compare/v2.7.4.post1...v2.8.0.post2)

---
updated-dependencies:
- dependency-name: flash-attn
  dependency-version: 2.8.0.post2
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <[email protected]>
@dependabot dependabot bot added dependencies Pull requests that update a dependency file python Pull requests that update python code labels Jun 30, 2025
@dependabot dependabot bot requested a review from a team as a code owner June 30, 2025 00:32

dependabot bot commented on behalf of github Jul 14, 2025

Superseded by #1856.

@dependabot dependabot bot closed this Jul 14, 2025
@dependabot dependabot bot deleted the dependabot/pip/flash-attn-2.8.0.post2 branch July 14, 2025 00:31
