Commit 1bb3ce5

selfattention block: Remove the fc linear layer if it is not used
Signed-off-by: John Zielke <[email protected]>
Parent: 8dcb9dc

File tree: 1 file changed (+4 −1 lines)

monai/networks/blocks/selfattention.py (+4 −1)
```diff
@@ -106,7 +106,10 @@ def __init__(
 
         self.num_heads = num_heads
         self.hidden_input_size = hidden_input_size if hidden_input_size else hidden_size
-        self.out_proj = nn.Linear(self.inner_dim, self.hidden_input_size)
+        if include_fc:
+            self.out_proj = nn.Linear(self.inner_dim, self.hidden_input_size)
+        else:
+            self.out_proj = nn.Identity()
 
         self.qkv: Union[nn.Linear, nn.Identity]
         self.to_q: Union[nn.Linear, nn.Identity]
```
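For context, here is a minimal self-contained sketch of the pattern this commit applies. It is not MONAI's actual `SABlock` (see `monai/networks/blocks/selfattention.py` for that); the names `include_fc`, `inner_dim`, `hidden_input_size`, `out_proj`, and `qkv` follow the diff, while the class itself and everything else are illustrative assumptions:

```python
from typing import Optional, Union

import torch
import torch.nn as nn


class MiniAttentionBlock(nn.Module):
    """Illustrative reduction of the commit's pattern: build the output
    projection only when it will be used, otherwise substitute nn.Identity
    so no dead parameters are allocated and forward() stays uniform.
    Not MONAI's SABlock; multi-head splitting is omitted for brevity."""

    def __init__(
        self,
        hidden_size: int,
        include_fc: bool = True,
        hidden_input_size: Optional[int] = None,
    ) -> None:
        super().__init__()
        self.inner_dim = hidden_size
        self.hidden_input_size = hidden_input_size if hidden_input_size else hidden_size

        # The change from the diff: out_proj is a Linear only if include_fc.
        self.out_proj: Union[nn.Linear, nn.Identity]
        if include_fc:
            self.out_proj = nn.Linear(self.inner_dim, self.hidden_input_size)
        else:
            self.out_proj = nn.Identity()

        self.qkv = nn.Linear(self.hidden_input_size, self.inner_dim * 3)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        attn = torch.softmax(q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5, dim=-1)
        # With include_fc=False, nn.Identity passes the result through as-is.
        return self.out_proj(attn @ v)


# The Identity variant carries no projection weights at all.
block = MiniAttentionBlock(hidden_size=64, include_fc=False)
assert sum(p.numel() for p in block.out_proj.parameters()) == 0
print(block(torch.randn(2, 16, 64)).shape)  # torch.Size([2, 16, 64])
```

Swapping in `nn.Identity` rather than branching inside `forward()` keeps the forward code path identical for both configurations, whereas the pre-commit code always allocated the `nn.Linear` even when it was never used.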
