
Conversation

@junyann commented Apr 25, 2021

(Contributed by @zuluzazu) The concept embeddings are already frozen when the model is instantiated with `freeze_ent_emb=args.freeze_ent_emb`. Adding `freeze_net(model.decoder.concept_emb)` would additionally freeze `self.cpt_transform = nn.Linear(concept_in_dim, concept_out_dim)` inside the `CustomizedEmbedding` object, which is not desired.
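
To illustrate the point, here is a minimal PyTorch sketch. The simplified `CustomizedEmbedding` class and the `freeze_net` helper below are assumptions for illustration, not the repo's exact code: with `freeze_ent_emb=True` only the raw concept embedding weights are frozen, while calling `freeze_net` on the whole module would also stop gradients for `cpt_transform`.

```python
import torch.nn as nn

def freeze_net(module):
    # Assumed helper: disables gradients for every parameter in the module.
    for p in module.parameters():
        p.requires_grad = False

class CustomizedEmbedding(nn.Module):
    # Simplified stand-in for the repo's CustomizedEmbedding.
    def __init__(self, concept_num, concept_in_dim, concept_out_dim, freeze_ent_emb=True):
        super().__init__()
        self.emb = nn.Embedding(concept_num, concept_in_dim)
        if freeze_ent_emb:
            # freeze_ent_emb already freezes the raw concept embeddings here.
            freeze_net(self.emb)
        # The linear transform is meant to stay trainable.
        self.cpt_transform = nn.Linear(concept_in_dim, concept_out_dim)

    def forward(self, ids):
        return self.cpt_transform(self.emb(ids))

concept_emb = CustomizedEmbedding(10, 8, 4, freeze_ent_emb=True)

# Calling freeze_net on the whole module would also freeze cpt_transform,
# which is the behavior described above as not desired:
# freeze_net(concept_emb)

trainable = [n for n, p in concept_emb.named_parameters() if p.requires_grad]
print(trainable)  # ['cpt_transform.weight', 'cpt_transform.bias'] -- emb.weight stays frozen
```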
