
Commit 73a5fff

kyle-256wenxie-amd authored and committed

format

1 parent 1812fa3 commit 73a5fff

File tree

1 file changed: +2 −1 lines changed
  • primus/backends/torchtitan/models/llama3/model


primus/backends/torchtitan/models/llama3/model/model.py
Lines changed: 2 additions & 1 deletion

@@ -5,12 +5,13 @@
 ###############################################################################
 
 import torch
+from torch.nn.attention.flex_attention import BlockMask
 from torchtitan.models.llama3.model.model import Attention as TTAttention
 from torchtitan.models.llama3.model.model import apply_rotary_emb
-from torch.nn.attention.flex_attention import BlockMask
 
 AttentionMasksType = dict[str, BlockMask] | BlockMask
 
+
 class Attention(TTAttention):
     def forward(
         self,

0 commit comments