
Membrane potential leakage caused by the attention mechanism #17

@hhx0320

Description


In the second attention block, `x = self.proj_lif(self.proj_bn(self.proj_conv(x))).reshape(T, B, C, W, H)`
should be changed to `x = self.proj_lif(self.proj_bn(self.proj_conv(x)).reshape(T, B, C, W, H))`.
The original order causes membrane potential leakage: the membrane potentials of different samples in the same batch get mixed together.
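A multi-step spiking neuron treats the leading dimension of its input as time and carries membrane potential across those steps, so if the reshape to `(T, B, C, W, H)` happens after the neuron instead of before, the neuron's time loop runs over the `T*B` flattened entries and state from one sample bleeds into the next. The sketch below reproduces this with a toy NumPy LIF neuron (`multistep_lif` is a hypothetical stand-in for the real `proj_lif`, with made-up `tau`/`v_th` values; it is not the repository's implementation):

```python
import numpy as np

def multistep_lif(x, tau=2.0, v_th=0.5):
    """Toy multi-step LIF: iterates over dim 0 as time, carrying
    membrane potential v between steps (hard reset on spike)."""
    v = np.zeros(x.shape[1:], dtype=x.dtype)
    out = []
    for t in range(x.shape[0]):
        v = v + (x[t] - v) / tau    # leaky integration
        spike = (v >= v_th).astype(x.dtype)
        v = v * (1.0 - spike)       # reset where a spike fired
        out.append(spike)
    return np.stack(out)

T, B, C = 4, 2, 1
x = np.full((T, B, C), 0.6, dtype=np.float32)  # (T, B, C) input
flat = x.reshape(T * B, C)  # flattened shape the conv/BN pipeline produces

# Buggy order (reshape AFTER the neuron): the time loop spans T*B entries,
# so sample 0's membrane potential leaks into sample 1, and so on.
buggy = multistep_lif(flat).reshape(T, B, C)

# Fixed order (reshape BEFORE the neuron): the time loop spans only T,
# and each batch sample integrates its own potential from v = 0.
fixed = multistep_lif(x)

print(np.array_equal(buggy, fixed))  # False: leaked state shifts spike times
```

With this constant input, each sample in the fixed version spikes at t = 2, while in the buggy version the carried-over potential makes sample 0 spike at t = 1 and sample 1 at t = 2, so the outputs disagree even though the per-sample inputs are identical.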
