PyTorch Lightning DDP Sampling - How to correctly do it with torchgeo #1402
Unanswered
TolgaAktas asked this question in Q&A

I am migrating my codebase to PyTorch Lightning for easier experimentation and multi-GPU training, but I learned that Lightning's DDP strategy requires special handling of the sampler. I believe I need to provide a custom sampler to the DDP Trainer, which in my case would be one of torchgeo's GeoSamplers. Is that so? I have the following sampler and dataloader in my vanilla PyTorch script, and I was wondering whether I could just pass them to the Lightning Trainer for DDP.
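For context, a representative vanilla-PyTorch setup of the kind described might look like the following (an illustrative sketch with assumed names — a hypothetical NAIP dataset path and patch parameters — not the asker's actual snippet):

```python
# Illustrative sketch only: assumed dataset, path, and parameters.
from torch.utils.data import DataLoader
from torchgeo.datasets import NAIP, stack_samples
from torchgeo.samplers import RandomGeoSampler

dataset = NAIP("data/naip")  # hypothetical path to NAIP tiles
# Randomly sample 256x256-pixel patches; one "epoch" = 10,000 patches
sampler = RandomGeoSampler(dataset, size=256, length=10_000)
dataloader = DataLoader(
    dataset,
    batch_size=16,
    sampler=sampler,
    collate_fn=stack_samples,  # collates torchgeo's dict-style samples
    num_workers=4,
)
```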
2 comments · 4 replies
-
Our samplers don't currently support distributed training, but this is on our roadmap.
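Until then, one commonly suggested stopgap (an assumption here, not an official torchgeo recipe) is to tell Lightning not to replace the sampler with a `DistributedSampler` and let each DDP rank draw its own random patches. The Trainer flag depends on the Lightning version: `replace_sampler_ddp=False` on 1.x, `use_distributed_sampler=False` on 2.x. A minimal sketch, with hypothetical names (`NAIPDataModule`, paths, sizes):

```python
# Sketch only: keep torchgeo's GeoSampler under DDP by disabling Lightning's
# automatic DistributedSampler injection. With RandomGeoSampler each rank
# then draws its own `length` random patches (overlap across ranks possible).
import lightning.pytorch as pl
from torch.utils.data import DataLoader
from torchgeo.datasets import NAIP, stack_samples
from torchgeo.samplers import RandomGeoSampler


class NAIPDataModule(pl.LightningDataModule):  # hypothetical module name
    def __init__(self, root: str = "data/naip") -> None:
        super().__init__()
        self.root = root

    def train_dataloader(self) -> DataLoader:
        dataset = NAIP(self.root)
        sampler = RandomGeoSampler(dataset, size=256, length=10_000)
        return DataLoader(
            dataset, batch_size=16, sampler=sampler, collate_fn=stack_samples
        )


# `use_distributed_sampler` is the Lightning >= 2.0 spelling; on 1.x pass
# Trainer(..., replace_sampler_ddp=False) instead.
trainer = pl.Trainer(
    accelerator="gpu",
    devices=2,
    strategy="ddp",
    use_distributed_sampler=False,
)
# trainer.fit(model, datamodule=NAIPDataModule())
```

The caveat: because nothing coordinates the ranks, an epoch covers roughly `world_size * length` patches with possible duplicates, and a deterministic sampler such as `GridGeoSampler` would process identical patches on every rank, so this stopgap really only suits random sampling.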
-
Related to #305