Fix DeepSpeed auto batch size crash for DataLoader(batch_size=None) #21669
Open
trentisiete wants to merge 2 commits into Lightning-AI:master from
Conversation
`DataLoader(batch_size=None)` sets `batch_sampler` to `None`, but the previous `hasattr` check still returned `True` and dereferencing `.batch_size` raised `AttributeError`. Use an explicit `None` check and warn when the batch size cannot be inferred, falling back to 1 (matching the behavior before Lightning-AI#19209). Fixes Lightning-AI#19460
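A minimal illustration of why the bare `hasattr` check is insufficient. This uses a stand-in class rather than a real `torch.utils.data.DataLoader` so the snippet has no torch dependency; it mirrors the documented behavior that `DataLoader` always defines `batch_sampler` but sets it to `None` when automatic batching is disabled:

```python
class FakeDataLoader:
    """Stand-in mimicking torch's DataLoader attribute behavior."""

    def __init__(self, batch_size=None):
        # DataLoader defines `batch_sampler` either way; with
        # batch_size=None (automatic batching disabled) it is None.
        self.batch_sampler = None if batch_size is None else object()


loader = FakeDataLoader(batch_size=None)
print(hasattr(loader, "batch_sampler"))  # True -> the old check passes
print(loader.batch_sampler is None)      # True -> .batch_size would raise AttributeError
```

So the check must distinguish "attribute missing" from "attribute present but `None`", which `hasattr` alone cannot do.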
Codecov Report ❌ Patch coverage is
Additional details and impacted files

```
@@           Coverage Diff            @@
##           master   #21669    +/-   ##
=========================================
- Coverage      87%      79%       -8%
=========================================
  Files         270      267        -3
  Lines       23973    23916       -57
=========================================
- Hits        20751    18810     -1941
- Misses       3222     5106     +1884
```
What does this PR do?
Fixes #19460.
`DeepSpeedStrategy._auto_select_batch_size` does `hasattr(train_dataloader, "batch_sampler")` and then dereferences `train_dataloader.batch_sampler.batch_size`. When the user passes `DataLoader(batch_size=None)` (common for iterable datasets that yield pre-batched tensors), PyTorch sets `batch_sampler` to `None` rather than omitting the attribute, so `hasattr` still returns `True` and the dereference raises `AttributeError: 'NoneType' object has no attribute 'batch_size'`.

Before #19209 the code was wrapped in a broad `try/except` that silently swallowed the error and fell back to `1`. Per @awaelchli's suggestion on the issue, this PR replaces that with an explicit `None` check and a `rank_zero_warn` pointing the user to `DeepSpeedStrategy(logging_batch_size_per_gpu=...)` so they can set the logging batch size themselves if the default of 1 is not appropriate.

Added a unit test that mocks the data source to return `DataLoader(batch_size=None)` and asserts both the returned value and the warning. The test is gated on `@RunIf(deepspeed=True)` like the rest of the file.

Before submitting

- Linked issue: `batch_sampler.batch_size` is None with deepspeed and `DataLoader(batch_size=None)` #19460 (maintainer invited a PR).
- Added test: `test_deepspeed_auto_batch_size_none_batch_sampler`.
- Ran `ruff`/`mypy` on the changed files; the new test is skipped on my machine (Windows, no `deepspeed`) like the rest of the file, so I also reproduced the regression and verified the fix path by executing `_auto_select_batch_size` directly on a `DataLoader(batch_size=None)`.
- Added a `### Fixed` entry in `src/lightning/pytorch/CHANGELOG.md`.

PR review
Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines. In short, see the following bullet-list:
Reviewer checklist
📚 Documentation preview 📚: https://pytorch-lightning--21669.org.readthedocs.build/en/21669/