Fix FutureWarning: Replace torch.cuda.amp.GradScaler with torch.amp.GradScaler #3458
Conversation
Co-authored-by: vfdev-5 <[email protected]>
@copilot address the review comments
…e PyTorch version requirements, revert notebook changes Co-authored-by: vfdev-5 <[email protected]>
…iations and revert to correct PyTorch version requirements Co-authored-by: vfdev-5 <[email protected]>
Fixed in commit 67c8572. The CI failures were caused by:
Both issues have been corrected.
@copilot apply also the replacements
…in README, docs, and notebooks Co-authored-by: vfdev-5 <[email protected]>
Done in commit 8d96541. Applied `torch.cuda.amp.GradScaler` → `torch.amp.GradScaler` replacements in:
No changes needed for examples/notebooks/Cifar100_bench_amp.ipynb, as it had no GradScaler class references.
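As a rough illustration of the kind of textual replacement applied across README, docs, and notebooks, the rename can be sketched as a small script; the `migrate_gradscaler` helper name is hypothetical and not part of this PR:

```python
import re

def migrate_gradscaler(source: str) -> str:
    """Rewrite deprecated torch.cuda.amp.GradScaler references (sketch)."""
    # Constructor calls gain an explicit 'cuda' device-type argument,
    # e.g. torch.cuda.amp.GradScaler(2**10) -> torch.amp.GradScaler('cuda', 2**10)
    source = re.sub(
        r"torch\.cuda\.amp\.GradScaler\(",
        "torch.amp.GradScaler('cuda', ",
        source,
    )
    # Remaining bare references (docstrings, string literals) simply
    # drop the .cuda segment.
    return source.replace("torch.cuda.amp.GradScaler", "torch.amp.GradScaler")
```

A real migration would also handle import statements and the `enabled=`-only calls mentioned below; this sketch only covers the two most common patterns.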
@copilot Code style check CI job is failing with the following message:
Co-authored-by: vfdev-5 <[email protected]>
… Fixed in commit 7ec5149. Added a missing newline at the end of the notebook file to satisfy the pre-commit prettier hook.
Plan to Fix FutureWarning for torch.cuda.amp.GradScaler
- Replace `torch.cuda.amp.GradScaler` with `torch.amp.GradScaler`
- Replace the string `"torch.cuda.amp.GradScaler"` with `"torch.amp.GradScaler"`
- Replace `torch.cuda.amp.GradScaler(2**10)` with `torch.amp.GradScaler('cuda', 2**10)`
- Replace `torch.cuda.amp` with `torch.amp`
- Update imports to `from torch.amp import autocast` and `from torch.amp import GradScaler`

Summary
Successfully replaced all occurrences of the deprecated `torch.cuda.amp.GradScaler` with `torch.amp.GradScaler`:
- Imports consolidated to `from torch.amp import autocast, GradScaler`
- The string `"torch.cuda.amp.GradScaler"` replaced with `"torch.amp.GradScaler"`
- `GradScaler(enabled=...)` used without an explicit device parameter (auto-detected)

The changes have been tested and validated:
Original prompt
Fixes #3435