
Conversation

@Aaraviitkgp

Fixes #3435

Description:
Changed the import from `from torch.cuda.amp import GradScaler` to `from torch.amp import GradScaler`.
• torch.amp provides a unified AMP interface across device types.
• torch.cuda.amp restricts AMP usage to CUDA-only environments and is deprecated.
• This change helps Ignite better support PyTorch's full AMP ecosystem in a clean, future-proof way.

No new functionality is added; this is a safe refactor with no effect on runtime behavior.
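The import change described above can be sketched as follows. This is a minimal illustration, assuming PyTorch 2.3+ where the unified `torch.amp.GradScaler` takes the device type as a string argument; the try/except guard is only there so the snippet runs even without PyTorch installed.

```python
# Sketch of the refactor in this PR: replace the CUDA-only GradScaler
# import with the unified torch.amp API.
try:
    import torch

    # Old, deprecated style (emits a FutureWarning on recent PyTorch):
    #   from torch.cuda.amp import GradScaler
    #   scaler = GradScaler()

    # New, unified style: the device type is passed explicitly.
    from torch.amp import GradScaler

    # enabled=False keeps this runnable on CPU-only machines.
    scaler = GradScaler("cuda", enabled=torch.cuda.is_available())
except ImportError:
    scaler = None  # PyTorch not installed; nothing to demonstrate
```

The rest of the training loop (`scaler.scale(loss).backward()`, `scaler.step(optimizer)`, `scaler.update()`) is unchanged, which is why the refactor does not affect runtime behavior.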

github-actions bot added the labels "module: engine" and "examples" on Jul 26, 2025
@vfdev-5
Collaborator

vfdev-5 commented Oct 14, 2025

Closing this PR in favor of #3458

vfdev-5 closed this on Oct 14, 2025


Development

Successfully merging this pull request may close these issues.

Fix FutureWarning: torch.cuda.amp.GradScaler(args...) is deprecated. Please use torch.amp.GradScaler('cuda', args...) instead.
