Fix Failed tests with mobile bert resize tokens embedding #33950
base: main
Conversation
SG to me, why were the tests skipped?
Ah could you update
Ah and
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
@ArthurZucker I have addressed the mobilebert test, GitModeltest, and the recurrent_gemma test. The recurrent_gemma test failed because an outlier was sampled, so I multiplied the covariance by 1e-9 instead of 1e-5.
I feel bad about those test failures, actually; I wanted to deliver good code, but the tests didn't help me. 😅 The recurrent_gemma outlier only got sampled after the code was merged, haha. And I am not sure whether the other tests were actually skipped before merging. It's weird.
Yep! They are not part of the important models, and the test fetcher seems to be misbehaving! It should have picked up the whole dependency chain! No worries, we are the ones who set you up for failure in that case! cc @ydshieh if you can have a look at why this was not fetched when you have time!
thanks 🤗
if is_covariance_psd:
    # If the covariance is positive definite, a distribution can be created
    # and we can sample new weights from it.
    distribution = torch.distributions.multivariate_normal.MultivariateNormal(
        mean_embeddings, covariance_matrix=1e-9 * covariance
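The 1e-9 factor shrinks the covariance so that sampled rows land very close to the mean embedding, which is what prevents the outliers that broke the recurrent_gemma test. A minimal NumPy sketch (hypothetical shapes, not the PR's actual code) of that effect:

```python
import numpy as np

# Hypothetical toy setup: 10 existing token embeddings of hidden size 4,
# from which we sample 2 new rows.
rng = np.random.default_rng(0)
old_embeddings = rng.normal(size=(10, 4))

mean = old_embeddings.mean(axis=0)
# Scaling the covariance by 1e-9 (the PR lowers it from 1e-5) keeps the
# per-dimension standard deviation tiny, so samples hug the mean.
covariance = 1e-9 * np.cov(old_embeddings, rowvar=False)

new_embeddings = rng.multivariate_normal(mean, covariance, size=2)
print(new_embeddings.shape)  # prints (2, 4)
```

With the 1e-5 factor the tail of the distribution is wide enough that a rare outlier can still be drawn, which is exactly what the failing test hit.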
We could have reduced the strictness of the test as well; not sure what's best?
config.vocab_size = 4
config.pad_token_id = 3
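These two values go together: once the vocab is shrunk to 4, the pad token id must still be a valid index into the resized embedding table. A small illustrative sketch (hypothetical arrays, not the actual test code):

```python
import numpy as np

# With vocab_size=4, pad_token_id=3 is the last valid token id, so it still
# indexes into the resized embedding table.
vocab_size, pad_token_id, hidden = 4, 3, 8
old_embeddings = np.random.default_rng(1).normal(size=(100, hidden))

resized = old_embeddings[:vocab_size]  # shrink: keep the first vocab_size rows
assert pad_token_id < vocab_size       # pad id must stay in range
print(resized.shape)  # prints (4, 8)
```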
why was this failing?
The tests failed with mobilebert because of a missing transpose for the `old_lm_head`. This PR fixes that. I have run the two failed tests locally. It's weird that all tests passed before merging. EDIT: I see now, some tests were skipped.
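The shape bug can be sketched like this (hypothetical dimensions and variable names, not the PR's actual code): some heads store their weight as (hidden, vocab) rather than (vocab, hidden), so the per-token statistics must be computed after transposing.

```python
import numpy as np

# Hypothetical head weight stored as (hidden, vocab) instead of (vocab, hidden).
hidden, vocab = 8, 16
old_lm_head = np.random.default_rng(2).normal(size=(hidden, vocab))

# Transpose first so each row is one token's output vector; only then is the
# mean over axis 0 a per-hidden-dimension statistic over tokens.
tokens_first = old_lm_head.T                 # (vocab, hidden)
mean_embedding = tokens_first.mean(axis=0)   # one value per hidden dimension
print(mean_embedding.shape)  # prints (8,)
```

Without the transpose, the mean would be taken over hidden dimensions instead of over tokens, producing a vector of the wrong length.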
I have also changed the logic for when the covariance matrix is not positive definite: in that case, just initialize the new embeddings with the mean.
cc @ArthurZucker
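The fallback described above can be sketched as follows (a NumPy sketch under assumed shapes; `init_new_rows` is a hypothetical helper, not the PR's actual function):

```python
import numpy as np

def init_new_rows(old_embeddings: np.ndarray, num_new: int) -> np.ndarray:
    """Sample new rows from a scaled multivariate normal when the covariance
    is positive semi-definite; otherwise fall back to repeating the mean."""
    mean = old_embeddings.mean(axis=0)
    covariance = 1e-9 * np.cov(old_embeddings, rowvar=False)
    # Eigenvalues of a symmetric matrix are real; all non-negative means PSD.
    if np.all(np.linalg.eigvalsh(covariance) >= 0):
        rng = np.random.default_rng(0)
        return rng.multivariate_normal(mean, covariance, size=num_new)
    # Fallback: every new token starts at the mean embedding.
    return np.tile(mean, (num_new, 1))

rows = init_new_rows(np.random.default_rng(3).normal(size=(50, 6)), num_new=3)
print(rows.shape)  # prints (3, 6)
```

The mean fallback is deterministic, which also makes the resize path reproducible when the covariance check fails.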