
FlashAttention-3 pre-built wheels for H100

This repository hosts pre-built Python wheels for flash-attn v3, compiled for NVIDIA H100 GPUs.

Available wheels

  • flash_attn_3-3.0.0b1-cp39-abi3-linux_x86_64.whl
    • Python 3.9+ (abi3)
    • Linux x86_64
    • Built with CUDA support for H100 (sm_90)
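The bullets above are encoded directly in the wheel's filename, following the standard `name-version-pythontag-abitag-platformtag.whl` convention. As a quick illustration (the `parse_wheel_tags` helper below is hypothetical, not part of flash-attn), the tags can be decoded with nothing but the standard library:

```python
def parse_wheel_tags(filename: str) -> dict:
    """Decode a standard wheel filename into its compatibility tags.

    Hypothetical helper for illustration; assumes the common
    name-version-pythontag-abitag-platformtag.whl layout with no build tag.
    """
    stem = filename.removesuffix(".whl")
    parts = stem.split("-")
    name, version = parts[0], parts[1]
    python_tag, abi_tag, platform_tag = parts[-3:]
    return {
        "name": name,
        "version": version,
        "python_tag": python_tag,      # cp39  -> built against CPython 3.9
        "abi_tag": abi_tag,            # abi3  -> stable ABI, works on 3.9+
        "platform_tag": platform_tag,  # linux_x86_64 -> Linux on x86_64
    }

tags = parse_wheel_tags("flash_attn_3-3.0.0b1-cp39-abi3-linux_x86_64.whl")
print(tags["python_tag"], tags["abi_tag"], tags["platform_tag"])
# -> cp39 abi3 linux_x86_64
```

The `abi3` tag is why a single wheel covers Python 3.9 and every later CPython release: it links only against CPython's stable ABI.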

Installation

Download the wheel from the Releases page, then install with:

pip install flash_attn_3-3.0.0b1-cp39-abi3-linux_x86_64.whl
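Before installing, it can help to confirm the local environment matches the wheel's tags. The sketch below (a hypothetical check, not shipped with the wheel) verifies the interpreter and platform side; the GPU side, confirming an H100 / compute capability 9.0 device, needs CUDA tooling such as `nvidia-smi` and is left out here:

```python
import platform
import sys

def environment_matches_wheel() -> bool:
    """Check this interpreter/OS against the cp39-abi3-linux_x86_64 tags.

    Hypothetical pre-install check; does not verify the GPU (sm_90),
    which requires CUDA tooling outside the standard library.
    """
    return (
        platform.python_implementation() == "CPython"  # cp/abi3 tags imply CPython
        and sys.version_info >= (3, 9)                 # cp39 + abi3: 3.9 or newer
        and platform.system() == "Linux"               # linux platform tag
        and platform.machine() == "x86_64"             # x86_64 platform tag
    )

print(environment_matches_wheel())
```

If this prints `False`, `pip` will typically refuse the wheel anyway with an "is not a supported wheel on this platform" error; the check just surfaces the reason up front.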
