Why do we need the `batch_idx` argument for the `training_step` method? #19707
Unanswered
flourish-727 asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment
- Did you get an answer? I'm curious too.
- Hello all. I am a rookie and just started learning Lightning.
I don't really understand the role of the `batch_idx` argument in methods like `training_step` or `validation_step`.
The method does not actually use the argument (see this example from the official docs):

```python
def training_step(self, batch, batch_idx):
    x, y, z = batch
    out = self.encoder(x)
    loss = self.loss(out, x)
    return loss
```
See? `batch_idx` goes nowhere!
Could someone please give some hints?
Thanks!
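For what it's worth, one way to see why the argument exists is to sketch what a trainer loop does: it enumerates the dataloader and passes each batch's position within the epoch to `training_step`, so the hook can react to specific batches (e.g. log only the first batch, or only every Nth batch). The sketch below is plain Python, not Lightning's actual internals; `fake_dataloader` and the free-standing `training_step` are hypothetical stand-ins.

```python
def training_step(batch, batch_idx):
    # A user may ignore batch_idx entirely, or use it, e.g. to log
    # something only on the first batch of each epoch.
    if batch_idx == 0:
        print("first batch of the epoch:", batch)
    return sum(batch)  # stand-in for a computed loss

# Hypothetical dataloader: three batches of numbers.
fake_dataloader = [[1, 2], [3, 4], [5, 6]]

# The "trainer" supplies batch_idx simply by enumerating the dataloader.
losses = [training_step(batch, i) for i, batch in enumerate(fake_dataloader)]
print(losses)  # [3, 7, 11]
```

So even when your own `training_step` never reads `batch_idx`, the framework still passes it, because the signature is the same for every user and some users do need the index.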