This repository has been archived by the owner on Nov 3, 2023. It is now read-only.
Support tied positional weights in transformers #2184
Labels: donotreap (avoid automatically marking as stale), Enhancement, H1 2020 (tasks to be completed in H1 2020), Help Wanted, Medium
Add a flag to enable tied positional embeddings in transformer/generator and transformer/retrieval, and implement the weight tying. The flag should default to False for backwards compatibility, and upgrade_opt should be used to migrate old models.
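A minimal sketch of what this could look like. The class names, the `tie_positional_embeddings` flag name, and the opt keys below are illustrative assumptions, not the project's actual API; tying is modeled as the decoder sharing the encoder's weight object, and `upgrade_opt` backfills the flag as False so old checkpoints keep their current (untied) behavior.

```python
import random

class Embedding:
    """Toy embedding table standing in for a real positional embedding."""
    def __init__(self, num_positions, dim):
        self.weight = [[random.random() for _ in range(dim)]
                       for _ in range(num_positions)]

class TransformerModel:
    """Hypothetical model illustrating the proposed flag."""
    def __init__(self, opt):
        n_pos, dim = opt['n_positions'], opt['embedding_size']
        self.enc_pos = Embedding(n_pos, dim)
        if opt.get('tie_positional_embeddings', False):
            # Tied: decoder references the same weight object, so any
            # update to one is immediately visible in the other.
            self.dec_pos = self.enc_pos
        else:
            # Untied: decoder gets its own independent table.
            self.dec_pos = Embedding(n_pos, dim)

def upgrade_opt(opt):
    # Old models predate the flag; default it to False so they load
    # with untied weights and behave exactly as before.
    opt.setdefault('tie_positional_embeddings', False)
    return opt
```

With the flag on, `model.enc_pos is model.dec_pos` holds; with the flag off (or after migrating an old opt dict through `upgrade_opt`), the two tables are separate objects.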