# Triton Flash Attention
Supported Python versions: 3.8, 3.9, 3.10, 3.11, 3.12, 3.13.
Flash Attention coded from scratch using Triton and PyTorch.
- Clone the repository and go to the `triton-flash-attn` directory:

  ```bash
  git clone https://github.com/eljandoubi/triton-flash-attn.git && cd triton-flash-attn
  ```
- Build the environment:

  ```bash
  make build
  ```

- Clean up when done:

  ```bash
  make clean
  ```
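The repository's kernels are written in Triton, but the core idea behind Flash Attention is the online-softmax trick: process keys and values in blocks while keeping a running row maximum and softmax denominator, so the full attention matrix is never materialized. As a rough illustration (not the repo's actual API; function name and block size are made up for this sketch), the same computation can be written in plain PyTorch:

```python
import torch

def tiled_attention_sketch(q, k, v, block_size=4):
    """Attention with online softmax over key/value blocks.

    Illustrative sketch of the Flash Attention recurrence in plain
    PyTorch; the real repo implements this as a Triton kernel.
    q, k, v: (seq_len, head_dim) tensors.
    """
    seq_len, head_dim = q.shape
    scale = head_dim ** -0.5
    out = torch.zeros_like(q)
    # Running per-row statistics: max logit and softmax denominator.
    row_max = torch.full((seq_len, 1), float("-inf"))
    row_sum = torch.zeros(seq_len, 1)
    for start in range(0, seq_len, block_size):
        kb = k[start:start + block_size]
        vb = v[start:start + block_size]
        s = (q @ kb.T) * scale                              # block of logits
        new_max = torch.maximum(row_max, s.max(dim=-1, keepdim=True).values)
        correction = torch.exp(row_max - new_max)           # rescale old state
        p = torch.exp(s - new_max)                          # block softmax numerator
        row_sum = row_sum * correction + p.sum(dim=-1, keepdim=True)
        out = out * correction + p @ vb
        row_max = new_max
    return out / row_sum
```

The output matches `softmax(q @ k.T / sqrt(d)) @ v` computed all at once, which is what makes the blockwise formulation safe to fuse into a single kernel.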