
Why is sigmoid activation for LRP not allowed? #1361

Open · opened by @CloseChoice

Description

❓ Questions and Help

I tried LRP on a small model with a sigmoid activation, but it's actually tested here that this does not work. Is there a specific reason for that? IMO, since sigmoid is a scalar (element-wise) operation, it should work analogously to ReLU and Tanh, which can be used with LRP.
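For reference, here is a minimal sketch of the failure I mean (the model shapes are arbitrary; the point is that LRP rejects nn.Sigmoid while accepting nn.ReLU and nn.Tanh in the same position):

```python
import torch
import torch.nn as nn
from captum.attr import LRP

# Tiny model with a sigmoid activation between two linear layers.
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.Sigmoid(),  # swapping this for nn.ReLU() or nn.Tanh() works
    nn.Linear(8, 2),
)
model.eval()

inputs = torch.randn(1, 4)
lrp = LRP(model)

# Raises, because nn.Sigmoid has no LRP rule assigned, even though it
# is an element-wise activation just like ReLU and Tanh.
attributions = lrp.attribute(inputs, target=0)
```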

Simply adding sigmoid here yields the expected result, so why not just do that?
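Concretely, the change I have in mind is a one-liner. A sketch of it, assuming the list of supported element-wise activations is the SUPPORTED_NON_LINEAR_LAYERS list in captum/attr/_core/lrp.py, where ReLU and Tanh are registered:

```python
import torch.nn as nn

# Element-wise activations that LRP treats as relevance pass-throughs.
SUPPORTED_NON_LINEAR_LAYERS = [
    nn.ReLU,
    nn.Dropout,
    nn.Tanh,
    nn.Sigmoid,  # proposed addition: element-wise, so the same treatment applies
]
```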

I would be willing to create a PR and add a test for this if there is no reason not to.
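The accompanying test could look roughly like this (a sketch only; the test name and checks are placeholders I made up, not existing Captum test code):

```python
import torch
import torch.nn as nn
from captum.attr import LRP

def test_lrp_with_sigmoid_activation():
    # Hypothetical test: once Sigmoid is supported, LRP should return
    # attributions with the input's shape and no NaNs/Infs, just as it
    # does for the equivalent ReLU/Tanh models.
    torch.manual_seed(0)
    model = nn.Sequential(nn.Linear(3, 3), nn.Sigmoid(), nn.Linear(3, 1))
    model.eval()
    inputs = torch.randn(2, 3)
    attributions = LRP(model).attribute(inputs, target=0)
    assert attributions.shape == inputs.shape
    assert torch.isfinite(attributions).all()
```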
