Fix Failed tests with mobile bert resize tokens embedding #33950

Open · wants to merge 8 commits into main
Conversation

@abuelnasr0 (Contributor) commented Oct 4, 2024

The tests failed with mobilebert because of a missing transpose of the old_lm_head. This PR fixes that. I have run the two failing tests locally.
It's weird that all tests passed before merging. EDIT: I see now, some tests were skipped.
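For illustration, a minimal sketch of the transpose issue, assuming a MobileBERT-style head that stores its weight as [hidden_size, vocab_size]; the transposed flag mirrors the one on _get_resized_lm_head, and all shapes are toy values:

import torch

# Hedged sketch: some heads (MobileBERT-style) store the lm_head weight
# transposed, i.e. [hidden_size, vocab_size] instead of [vocab_size, hidden_size].
# The mean/covariance-based resizing expects one token vector per row, so the
# old head must be transposed first. Shapes below are toy values.
hidden_size, vocab_size = 32, 100
old_lm_head = torch.randn(hidden_size, vocab_size)  # transposed storage

transposed = True  # would be False for most other models
old_weights = old_lm_head.T if transposed else old_lm_head  # rows = token vectors
mean_embeddings = old_weights.mean(dim=0)  # [hidden_size]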

I have also changed the logic for when the covariance matrix is not positive definite: in that case, just initialize the new embeddings with the mean.
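A minimal sketch of that fallback, assuming row-oriented embeddings; variable names are illustrative, not the exact ones in modeling_utils.py:

import torch
from torch.distributions import constraints
from torch.distributions.multivariate_normal import MultivariateNormal

# Toy shapes; names illustrative.
old_embeddings = torch.randn(100, 32)  # [old_vocab_size, hidden_size]
num_added_tokens = 4

mean_embeddings = old_embeddings.mean(dim=0)
centered = old_embeddings - mean_embeddings
covariance = centered.T @ centered / old_embeddings.shape[0]

# Check positive definiteness before building the distribution.
is_covariance_psd = bool(constraints.positive_definite.check(1e-9 * covariance).all())

if is_covariance_psd:
    # Sample new rows around the mean with a tightly scaled covariance.
    distribution = MultivariateNormal(mean_embeddings, covariance_matrix=1e-9 * covariance)
    new_rows = distribution.sample((num_added_tokens,))
else:
    # Fallback described above: every new token embedding starts at the mean.
    new_rows = mean_embeddings.unsqueeze(0).repeat(num_added_tokens, 1)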

cc @ArthurZucker

@abuelnasr0 changed the title from "Fix Failed tests with mobile bert" to "Fix Failed tests with mobile bert resize tokens embedding" on Oct 4, 2024
@ArthurZucker (Collaborator) left a comment:
SG to me, why were the tests skipped?

@ArthurZucker (Collaborator):

Ah, could you update tests/models/recurrent_gemma/test_modeling_recurrent_gemma.py::RecurrentGemmaModelTest::test_resize_tokens_embeddings as well?

@ArthurZucker (Collaborator):

Ah, and tests/models/git/test_modeling_git.py::GitModelTest::test_resize_tokens_embeddings as well; it fails with AssertionError: Padding_idx must be within num_embeddings.


@abuelnasr0 (Contributor, Author):

@ArthurZucker I have addressed the mobilebert test, GitModelTest, and the recurrent_gemma test.

The recurrent_gemma test failed because an outlier was sampled, so I now multiply the covariance by 1e-9 instead of 1e-5.
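A toy illustration of the effect (numbers made up, just to show the spread shrinking):

import torch
from torch.distributions.multivariate_normal import MultivariateNormal

# A smaller covariance scale keeps sampled embeddings much closer to the mean,
# so extreme outliers that trip the test's tolerance become very unlikely.
mean = torch.zeros(8)
cov = torch.eye(8)

loose = MultivariateNormal(mean, covariance_matrix=1e-5 * cov)
tight = MultivariateNormal(mean, covariance_matrix=1e-9 * cov)

print(loose.sample((10000,)).abs().max())  # noticeably larger extremes
print(tight.sample((10000,)).abs().max())  # tightly clustered near the mean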
The Git model tests failed because the config was overwritten during the first resizing test and then reused to initialize the model again, so I now create a fresh copy for each new model initialization.
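A hedged sketch of that test-side fix, using GitConfig/GitModel directly rather than the actual test scaffolding:

import copy
from transformers import GitConfig, GitModel

# resize_token_embeddings mutates the model's config; reusing a mutated config
# (e.g. vocab_size shrunk below pad_token_id) to build the next model triggers
# "Padding_idx must be within num_embeddings". Fresh copies keep runs independent.
base_config = GitConfig()

model = GitModel(copy.deepcopy(base_config))
model.resize_token_embeddings(base_config.vocab_size - 15)

# The next model starts from a pristine copy, not the mutated one.
model2 = GitModel(copy.deepcopy(base_config))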

@abuelnasr0 (Contributor, Author) commented Oct 4, 2024

I feel bad about those test failures, actually. I wanted to deliver good code, but the tests didn't help me. 😅

The outlier of recurrent_gemma got sampled only after merging the code haha.

And I am not sure whether the other tests were actually skipped before merging. It's weird.
Do you know why the mobilebert and GitModelTest failures only appeared after merging?

@ArthurZucker (Collaborator):

Yep! They are not part of the important models; the test fetcher seems to be misbehaving. It should have found the whole dependency chain!

No worries, we are the ones who set you up for failure in that case!

cc @ydshieh if you can have a look at the reasons why this was not fetched when you have time!

@ArthurZucker (Collaborator) left a comment:
thanks 🤗

if is_covariance_psd:
    # If the covariance is positive definite, a distribution can be created
    # and we can sample new weights from it.
    distribution = torch.distributions.multivariate_normal.MultivariateNormal(
        mean_embeddings, covariance_matrix=1e-9 * covariance
    )
@ArthurZucker (Collaborator) commented on this diff:

Could have reduced the strictness of the test as well, not sure what's best?

config.vocab_size = 4
config.pad_token_id = 3
@ArthurZucker (Collaborator) commented on this diff:

Why was this failing?
