Freezing weights  #60

Open
Description

@sebffischer
  • The problem here is that the trained weights do not live in the PipeOpTorch objects.
  • PipeOpTorch should probably keep a link to the nn_module's parameters (torch tensors have reference semantics, in a way), anticipating problems with cloning here. It should have a hyperparameter 'fix_weights' or something like that, in which case the new nn_module it creates reuses these weights and keeps them fixed.
  • Maybe also add a 'relative learning rate' hyperparameter that scales the learning rate for this PipeOp's parameters relative to the global one.
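Both proposed hyperparameters map onto standard torch mechanisms. A minimal sketch in PyTorch terms (not mlr3torch API; the module names, the `fix_weights`/relative-lr wiring, and the 0.1 factor are illustrative assumptions), showing freezing via `requires_grad_`, the reference semantics that would let a PipeOp keep a live link to the parameters, and per-parameter-group learning rates:

```python
import torch
import torch.nn as nn

# Two stages standing in for modules built by PipeOpTorch (names illustrative).
pretrained = nn.Linear(4, 8)
head = nn.Linear(8, 2)

# 'fix_weights': exclude the pretrained parameters from gradient tracking,
# so the optimizer leaves them untouched.
for p in pretrained.parameters():
    p.requires_grad_(False)

# Reference semantics: this is the very tensor object the module holds, so a
# PipeOp keeping such a link sees in-place updates. A clone() would break the
# link -- the anticipated cloning problem.
w = pretrained.weight
assert w is pretrained.weight
assert w.clone() is not pretrained.weight

# 'relative learning rate': instead of fully freezing, give the pretrained
# part its own parameter group with a scaled-down lr (factor 0.1 is arbitrary).
base_lr = 1e-3
optimizer = torch.optim.SGD([
    {"params": head.parameters(), "lr": base_lr},
    {"params": pretrained.parameters(), "lr": base_lr * 0.1},
])
```

The optimizer simply skips parameters that never receive gradients, so full freezing and a relative learning rate can coexist as alternative settings of the same hyperparameter surface.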
