Commit 547ac94

selfattention block: Remove the fc linear layer if it is not used
Signed-off-by: John Zielke <[email protected]>
1 parent: 8dcb9dc

1 file changed (+5, -1 lines)

monai/networks/blocks/selfattention.py

@@ -106,7 +106,11 @@ def __init__(
 
         self.num_heads = num_heads
         self.hidden_input_size = hidden_input_size if hidden_input_size else hidden_size
-        self.out_proj = nn.Linear(self.inner_dim, self.hidden_input_size)
+        self.out_proj: Union[nn.Linear, nn.Identity]
+        if include_fc:
+            self.out_proj = nn.Linear(self.inner_dim, self.hidden_input_size)
+        else:
+            self.out_proj = nn.Identity()
 
         self.qkv: Union[nn.Linear, nn.Identity]
         self.to_q: Union[nn.Linear, nn.Identity]
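
Below is a minimal usage sketch (not part of the commit) illustrating the effect of the change. It assumes SABlock's constructor accepts the include_fc flag referenced in the diff and that inner_dim defaults to hidden_size, so the output shape is unchanged when out_proj is replaced by nn.Identity:

    import torch
    from monai.networks.blocks.selfattention import SABlock

    # Assumption: include_fc is a constructor argument, as the diff suggests.
    # With include_fc=False, out_proj becomes nn.Identity, so no parameters
    # are allocated for a final linear layer that would otherwise go unused.
    block = SABlock(hidden_size=64, num_heads=4, include_fc=False)
    print(type(block.out_proj).__name__)  # Identity

    x = torch.randn(2, 16, 64)  # (batch, sequence length, hidden_size)
    print(block(x).shape)  # torch.Size([2, 16, 64]) when inner_dim == hidden_size

Using nn.Identity keeps forward() free of branching, and the Union[nn.Linear, nn.Identity] annotation makes the conditional reassignment acceptable to static type checkers.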
