Disable patch_tensor_ops for modules with custom_modules_hooks #139

Closed
@Rikorose

Description

I have some custom layers for which I implemented the flop counting manually via custom_modules_hooks. This enables nice output from print_per_layer_stats, so I know the flops and params for the corresponding layers in a larger model.

However, some of these ops are also counted by patch_tensor_ops, so the final total comes out twice as large as the values printed in the per-layer stats.

Ideally, patch_tensor_ops would not be applied within modules that already have a custom hook.
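A guard along these lines could avoid the double counting. This is only a sketch with made-up names (`FlopCounter`, `patched_op`, `run_custom_module`), not the library's actual API; it just illustrates suppressing the patched tensor-op counters while a custom module hook is active:

```python
class FlopCounter:
    """Hypothetical counter mimicking both counting paths of the profiler."""

    def __init__(self):
        self.total = 0
        self.in_custom_hook = False  # proposed guard flag

    def patched_op(self, flops):
        # patch_tensor_ops-style counting: fires for every tensor op.
        # Proposed fix: skip while a custom module hook is active.
        if not self.in_custom_hook:
            self.total += flops

    def run_custom_module(self, flops):
        # custom_modules_hooks-style counting: the hook reports the
        # module's flops itself, then the module's tensor ops still run
        # through the patched ops underneath.
        self.total += flops
        self.in_custom_hook = True
        try:
            self.patched_op(flops)  # would be counted a second time
        finally:
            self.in_custom_hook = False


counter = FlopCounter()
counter.run_custom_module(100)
print(counter.total)  # 100 with the guard; 200 without it
```

Without the `in_custom_hook` check, the inner `patched_op` call adds the same 100 flops again, reproducing the "final output twice as large" behavior described above.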

Metadata

Labels: bug (Something isn't working)