CI workflow (GitHub Actions build matrix):

@@ -80,6 +80,11 @@
             cuda-version: '11.7.1'
           - torch-version: '2.1.0.dev20230731'
             cuda-version: '11.8.0'
+          # PyTorch >= 2.1 with nvcc 12.1.0 segfaults during compilation, so
+          # we only use CUDA 12.2. setup.py has a special case that will
+          # download the wheel for CUDA 12.2 instead.
+          - torch-version: '2.1.0.dev20230731'
+            cuda-version: '12.1.0'
 
   steps:
     - name: Checkout
flash_attn/__init__.py:

@@ -1,4 +1,4 @@
-__version__ = "2.2.3.post1"
+__version__ = "2.2.3.post2"
 
 from flash_attn.flash_attn_interface import (
     flash_attn_func,
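As a quick sanity check of the bumped package, the re-exported flash_attn_func can be smoke-tested along these lines — a minimal sketch, assuming a CUDA GPU, fp16 inputs, and the documented (batch, seqlen, nheads, headdim) layout:

# Minimal smoke test (illustrative): requires a CUDA GPU and flash-attn
# installed; tensors use the (batch, seqlen, nheads, headdim) layout.
import torch
from flash_attn import flash_attn_func

q = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)
v = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)

out = flash_attn_func(q, k, v, causal=True)  # output has the same shape as q
print(out.shape)  # torch.Size([2, 1024, 8, 64])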
setup.py:

@@ -223,6 +223,8 @@ def get_wheel_url():
     # _, cuda_version_raw = get_cuda_bare_metal_version(CUDA_HOME)
     torch_cuda_version = parse(torch.version.cuda)
     torch_version_raw = parse(torch.__version__)
+    if torch_version_raw.major == 2 and torch_version_raw.minor == 1:
+        torch_cuda_version = parse("12.2")
     python_version = f"cp{sys.version_info.major}{sys.version_info.minor}"
     platform_name = get_platform()
     flash_version = get_package_version()
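The effect of this special case is easier to see in isolation. Below is a minimal sketch — wheel_cuda_version is a hypothetical helper, not the real get_wheel_url — of how any torch 2.1.x install gets pointed at the CUDA 12.2 wheel regardless of the local CUDA toolkit, matching the workflow comment above:

# Sketch of the override above (hypothetical helper, not the real
# get_wheel_url): nvcc 12.1 builds segfault on torch 2.1, so torch 2.1.x
# is always mapped to the CUDA 12.2 wheel in the wheel filename.
from packaging.version import parse


def wheel_cuda_version(torch_version: str, local_cuda: str) -> str:
    """Return the CUDA version used to pick the prebuilt wheel."""
    tv = parse(torch_version)
    if tv.major == 2 and tv.minor == 1:
        return "12.2"  # special case from the diff above
    lc = parse(local_cuda)
    return f"{lc.major}.{lc.minor}"


assert wheel_cuda_version("2.1.0.dev20230731", "12.1") == "12.2"
assert wheel_cuda_version("2.0.1", "11.8") == "11.8"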
Dockerfile (training image):

@@ -85,14 +85,11 @@ RUN pip install transformers==4.25.1 datasets==2.8.0 pytorch-lightning==1.8.6 tr
 RUN pip install git+https://github.com/mlcommons/logging.git@2.1.0
 
 # Install FlashAttention
-RUN pip install flash-attn==2.2.3.post1
+RUN pip install flash-attn==2.2.3.post2
 
 # Install CUDA extensions for cross-entropy, fused dense, layer norm
 RUN git clone https://github.com/HazyResearch/flash-attention \
-    && cd flash-attention && git checkout v2.2.3.post1 \
-    && cd csrc/fused_softmax && pip install . && cd ../../ \
-    && cd csrc/rotary && pip install . && cd ../../ \
+    && cd flash-attention && git checkout v2.2.3.post2 \
     && cd csrc/layer_norm && pip install . && cd ../../ \
     && cd csrc/fused_dense_lib && pip install . && cd ../../ \
-    && cd csrc/ft_attention && pip install . && cd ../../ \
     && cd .. && rm -rf flash-attention
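Once the image is built, the pinned install can be smoke-tested inside the container along these lines. A sketch under stated assumptions: the extension module names (flash_attn_2_cuda, dropout_layer_norm, fused_dense_lib) are what flash-attn 2.2.x builds from the wheel, csrc/layer_norm, and csrc/fused_dense_lib respectively — treat them as assumptions rather than a documented API:

# Illustrative post-build check; run inside the built container.
# Module names are assumptions based on flash-attn 2.2.x sources.
import flash_attn
import flash_attn_2_cuda   # core CUDA extension shipped in the wheel
import dropout_layer_norm  # built from csrc/layer_norm
import fused_dense_lib     # built from csrc/fused_dense_lib

assert flash_attn.__version__ == "2.2.3.post2"
print("flash-attn", flash_attn.__version__, "and CUDA extensions import OK")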