Description
It seems that the last LIF node (proj_lif) in Spiking_Self_Attention gets input with shape
The reshape operator should be executed after proj_bn and before proj_lif, so that proj_lif sees the time dimension T as its leading axis.
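To illustrate why the order matters, here is a minimal NumPy sketch of a multi-step LIF neuron (a simplified stand-in for the multi-step LIF used in the model; the tau, v_th, and hard-reset choices are illustrative assumptions). The neuron integrates along the leading axis, so feeding it the flattened (T*B, C, H, W) tensor carries membrane potential across different batch samples, while reshaping to (T, B, C, H, W) first keeps each sample's temporal dynamics separate:

```python
import numpy as np

def lif_forward(x, tau=2.0, v_th=0.5):
    """Minimal multi-step LIF: integrates along axis 0 (time), hard reset."""
    v = np.zeros_like(x[0])                # membrane potential
    spikes = []
    for x_t in x:                          # axis 0 is treated as time
        v = v + (x_t - v) / tau            # leaky integration
        s = (v >= v_th).astype(x.dtype)    # fire where threshold is reached
        v = v * (1.0 - s)                  # hard reset of fired neurons
        spikes.append(s)
    return np.stack(spikes)

T, B, C, H, W = 4, 2, 1, 2, 2
x = np.empty((T, B, C, H, W), dtype=np.float32)
x[:, 0] = 0.9                              # sample 0: strong constant input
x[:, 1] = 0.4                              # sample 1: weak constant input

# Correct order: reshape to (T, B, ...) BEFORE the LIF, so axis 0 is time.
out_ok = lif_forward(x)

# Buggy order: the LIF sees (T*B, ...) and treats batch entries as extra
# time steps, mixing both samples into one membrane trace.
out_bad = lif_forward(x.reshape(T * B, C, H, W))

print(out_ok[:, 0, 0, 0, 0])               # sample 0 spike train: [0. 1. 0. 1.]
print(out_bad[:, 0, 0, 0])                 # interleaved: [0. 0. 1. 0. 1. 0. 1. 0.]
```

With batch entries interleaved into the time axis, the spike trains change (here sample 0's membrane is perturbed by sample 1's input between its own time steps), which is consistent with the accuracy drop reported below.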
The accuracy of the trained HST-10-384 is 78.732% when batch_size is set to
When fixing this problem by setting batch_size to 1, the accuracy drops to 74.202% (-4.530%).
When fixing this problem by replacing the code with the following, the accuracy drops to 74.196% (-4.536%).
```python
x = self.proj_lif(self.proj_bn(self.proj_conv(x)).reshape(T, B, C, W, H))
```

Lines 143 to 146 in 43f0adf

```python
x = x.transpose(3, 4).reshape(T, B, C, N).contiguous()
x = self.attn_lif(x)
x = x.flatten(0, 1)
x = self.proj_lif(self.proj_bn(self.proj_conv(x))).reshape(T, B, C, W, H)
```
Lines 115 to 118 in 43f0adf

```python
x = x.transpose(3, 4).reshape(T, B, C, N).contiguous()
x = self.attn_lif(x)
x = x.flatten(0, 1)
x = self.proj_lif(self.proj_bn(self.proj_conv(x))).reshape(T, B, C, W, H)
```
QKFormer/dvs128-gesture/model.py
Lines 143 to 146 in 43f0adf

```python
x = x.transpose(3, 4).reshape(T, B, C, N).contiguous()
x = self.attn_lif(x)
x = x.flatten(0, 1)
x = self.proj_lif(self.proj_bn(self.proj_conv(x))).reshape(T, B, C, W, H)
```
Lines 155 to 158 in 43f0adf

```python
x = x.transpose(3, 4).reshape(T, B, C, N).contiguous()
x = self.attn_lif(x)
x = x.flatten(0, 1)
x = self.proj_lif(self.proj_bn(self.proj_conv(x))).reshape(T, B, C, W, H)
```